@Natasha_Jay Saying it is hallucinating gives it too much intellectual credit. It is incapable of hallucinating. GenAI bullshits patterns into something that has a probability of looking like what was requested. A model doesn't know right from wrong. It doesn't understand anything. It doesn't think. It bullshits. This firm implemented a bullshitter that has no intelligence and is somehow surprised that it has been producing bullshit.
36pickledeggs@famichiki.jp