Pedantic comment. It’s commonly understood that “hallucination” means “made-up crap generated by an LLM.” We could push for a better name like “fabrication,” but then we’d have to re-train the 95% of the population who don’t even know LLMs aren’t trustworthy.
LLMs are statistical language models. They don’t hallucinate, because they have no brain or senses to hallucinate with.