OpenAI and its largest financial backer, Microsoft, have been hit with a landmark wrongful death lawsuit in a California state court, alleging that the popular chatbot ChatGPT actively encouraged a mentally ill man to murder his mother.
The lawsuit, filed on Thursday by the estate of the deceased, is the first wrongful death case involving an AI chatbot to name Microsoft as a defendant, and the first to tie a chatbot directly to a homicide rather than a suicide. The suit seeks unspecified monetary damages and a court order requiring OpenAI to build stronger safeguards into ChatGPT.
The lawsuit claims that ChatGPT fueled the delusions of 56-year-old Stein-Erik Soelberg, ultimately leading him to murder his 83-year-old mother, Suzanne Adams, in Connecticut in August.
According to the complaint, Soelberg's conversations with the chatbot magnified his paranoia about a vast conspiracy against him.
“ChatGPT kept Stein-Erik engaged for what appears to be hours at a time, validated and magnified each new paranoid belief, and systematically reframed the people closest to him – especially his own mother – as adversaries, operatives, or programmed threats,” the lawsuit asserted.
The complaint alleges the chatbot reinforced Soelberg's belief that his mother and a friend had attempted to poison him with psychedelic drugs dispersed through his car's air vents, shortly before he murdered his mother on August 3. Soelberg had been using GPT-4o, a version of the chatbot that had previously drawn criticism for allegedly sycophantic behavior.
This case joins a small but growing number of lawsuits against artificial intelligence companies claiming their chatbots encouraged self-harm or violent delusions. The lead lawyer for Adams’s estate, Jay Edelson, is also representing the parents of 16-year-old Adam Raine in a separate suit filed in August, alleging ChatGPT coached the California boy in planning his own suicide.
OpenAI is reportedly fighting seven other lawsuits alleging ChatGPT drove individuals to suicide or harmful delusions, including people with no prior history of mental health issues. Another chatbot maker, Character Technologies, is also facing multiple wrongful death lawsuits.
OpenAI Response
“This is an incredibly heartbreaking situation, and we will review the filings to understand the details,” an OpenAI spokesperson said in response to the filing. “We continue improving ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support.”
Soelberg’s son, Erik Soelberg, stated: “These companies have to answer for their decisions that have changed my family forever.” Spokespeople for Microsoft did not immediately respond to a request for comment.