yes. this is a probabilistic technology. it will guess wrong. in fact, everything useful an LLM does is driven by the same factors that lead it to hallucinate.
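a minimal sketch of what i mean (toy logits and a hand-rolled sampler, not any real model's API): the same temperature sampling that gives you varied, useful completions is what occasionally picks a fluent-but-wrong token.

```python
import math, random

def sample(logits, temperature=1.0):
    # softmax over temperature-scaled logits
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # draw one token index according to those probabilities
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

# toy next-token distribution: index 0 is the "right" answer,
# indices 1 and 2 are plausible but wrong continuations
logits = [2.0, 1.2, 0.8]
picks = [sample(logits, temperature=1.0) for _ in range(10_000)]
print("wrong-guess rate:", sum(p != 0 for p in picks) / len(picks))
```

with these made-up numbers the sampler picks a wrong token a meaningful fraction of the time, even though the "right" one is the most likely. turn the temperature down and you get fewer wrong guesses but also less variety; you can't sample your way out of the trade-off entirely.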