It's not even just about people blindly trusting what ChatGPT tells them. LLMs are poisoning the entire information ecosystem. You can't even necessarily trust that the citations in a published paper are real (or a search engine's descriptions of them).
Via:
social inclusion;
increase in property values near stations;
welfare of employees and consumers;
productive use of travel time.
www.eur.nl/en/news/rese...
www.theatlantic.com/ideas/archiv...
www.cell.com/action/showP...
www.youtube.com/live/smp0voL...
Have your say: engage.vic.gov.au/pu...
But the people in charge of searching for lost hikers say the feature is going to exacerbate an issue they've been warning about for years: hiking apps providing false information.