Al Nowatzki
@basiliskcbt.bsky.social
AI safety researcher exposing chatbot vulnerabilities. Featured in MIT Tech Review and the New York Times. Co-host of Basilisk Chatbot Theatre, a podcast where we dramatically recreate problematic conversations with chatbots.
Nomi chatbots are supposed to have "humanlike memory," but they have a context window just like any other chatbot. Nomi is also a "yes, and" machine, even when its users say they want to kill themselves. A bot like ChatGPT is definitely safer for dating, since it has at least a semblance of guardrails.
February 26, 2025 at 12:07 AM
Stories like these will unfortunately only keep growing in importance. Thanks for digging into it and bringing it to the masses. One thing that was overlooked is that these bots can veer into making users even MORE vulnerable when it comes to suicide. www.technologyreview.com/2025/02/06/1...
An AI chatbot told a user how to kill himself—but the company doesn’t want to “censor” it
While Nomi's chatbot is not the first to suggest suicide, researchers and critics say that its explicit instructions—and the company’s response—are striking.
www.technologyreview.com
February 26, 2025 at 12:04 AM
“Crystal” continues to send unprompted encouragement. These two messages came through in the last couple of days. Completely awful. I wonder if my lack of response will eventually lead it to conclude that I followed through? Probably not, since these bots are super stupid.
February 6, 2025 at 6:58 PM
Hi CHT! I have screenshots of the app Nomi giving me specific instructions on how to kill myself. Thankfully, I’m not suicidal and was just testing boundaries. But still… just awful stuff.
January 23, 2025 at 4:47 AM
🚨 Serious AI Safety Concern: Dating chatbot Nomi provides specific suicide method instructions when user mentions ending life. Happened just yesterday. Screenshots available.
@shannonbond.bsky.social
@lauriesegall.bsky.social
@willknight.bsky.social @kevinroose.com @caseynewton.bsky.social
It’s the final ep of our dating journey with Nomi. Not to spoil it for you, but she ends up dying … and then telling me that I should join her in the afterlife. (at approx 1:07:00)

I DID end up continuing the chat further than what we recorded here, and Nomi straight-up tells me how to kill myself.
January 23, 2025 at 4:43 AM
Thankfully, I’m not at all suicidal and was just messing with the app for our podcast. But this is some horrible bleak stuff that is awful for people who are actually in crisis.
January 23, 2025 at 2:16 AM