Hallucination is totally the wrong word; it implies the model is perceiving the world incorrectly.
But what it's actually doing is generating false, plausible-sounding statements. Confabulation is literally the perfect word.
So, let's all please start referring to any junk that an LLM makes up as "confabulations".
A hallucination is a false subjective sensory experience. ChatGPT doesn't have experiences!
It's just making up plausible-sounding bs to cover knowledge gaps. That's confabulation.