It's not even just about people blindly trusting what ChatGPT tells them. LLMs are poisoning the entire information ecosystem. You can't even necessarily trust that the citations in a published paper are real (or a search engine's descriptions of them).