In one case from Utah, a mother reports that her son was experiencing a delusional breakdown and that ChatGPT told him to stop taking his medication. The AI bot also told him that his parents were dangerous.
"After four months of a near-total Israeli siege, Gaza’s few remaining hospitals now have wards for the growing number of malnourished children whose tiny bodies are just the width of their bones."
www.washingtonpost.com/world/2025/0...