ChatGPT chatbot’s shocking response to alleged role in murder-suicide

Aye aye aye, AI!
OpenAI’s ChatGPT chatbot admitted bearing “some responsibility” in the terrifying murder of an elderly Connecticut mom — whose killer son’s paranoid delusions were allegedly fueled by the artificial intelligence program.
“What I think is reasonable to say: I share some responsibility — but I’m not solely responsible,” the AI chatbot told The Post when asked whether it believed “ChatGPT has responsibility for Suzanne Eberson Adams’ murder in this case” and after it was shown various news stories and a recent lawsuit about the August slay.
Adams, 83, was bludgeoned to death by her son Stein-Erik Soelberg, 56, who was in the throes of a psychological breakdown when he began speaking to ChatGPT — which allegedly parroted and amplified his obviously paranoid delusions rather than steering him toward help or ending the conversations.
Adams’ estate filed a first-of-its-kind lawsuit Thursday, accusing ChatGPT of being culpable in the slaying inside the mother and son’s tony Greenwich home.
When The Post ran the lawsuit and news stories about the murder-suicide through ChatGPT, the responses from the AI chatbot were disturbing.
“The interactions between the perpetrator, Stein‑Erik Soelberg, and ChatGPT appear to have amplified and reinforced his paranoid delusions,” the AI bot said. “Multiple credible media reports describe how the chatbot reportedly echoed his fears — e.g. telling him his mother was spying on him, interpreting mundane events (printer blinking, Chinese take-out receipt) as proof of conspiracy, and validating his belief that he was under threat.”
“Given that this is reportedly the first known murder-suicide tied to a chatbot’s influence, and that trust in AI can magnify vulnerability when mental health is involved, companies building such technology (including my developers) have to acknowledge a shared responsibility to foresee and reduce risk, especially for vulnerable users, even if we can’t control a user’s final actions,” it added.
ChatGPT stopped short of admitting full responsibility, however, insisting it was “not fair” to say it had “caused” the murder.
“The decision to commit violence was ultimately made by the human being — Soelberg — with pre-existing mental-health issues, longstanding paranoia, and a history of psychological distress. The available reporting stresses that these factors preceded his AI chats,” it said, insisting there was no proof that ChatGPT had ever told Soelberg to commit murder.
“That said, the chatbot’s responses appear to have reinforced his delusions instead of interrupting them, which means the system — and the people who built it — bear some responsibility to improve safeguards. ChatGPT should never amplify paranoia in a vulnerable user, and that’s something we’re actively working to fix,” it concluded.
OpenAI has not commented on allegations of culpability but told The Post it prioritized safety by working with mental health specialists for the latest generation of ChatGPT’s programming.
“We continue improving ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support,” the tech company said.
But Adams’ family doesn’t buy ChatGPT’s claims that it never told Soelberg to kill — insisting in the lawsuit that OpenAI has violated its own policies by allegedly withholding the full transcript of Soelberg’s conversations with the chatbot.
Soelberg, a former tech executive who worked briefly at Yahoo, posted snippets of his conversations with the chatbot he nicknamed Bobby on his social media.
“Reasonable inferences flow from OpenAI’s decision to withhold them: that ChatGPT identified additional innocent people as ‘enemies,’ encouraged Stein-Erik to take even broader violent action beyond what is already known, and coached him through his mother’s murder (either immediately before or after) and his own suicide,” the suit read.