A California couple is suing OpenAI over the death of their teenage son, alleging its chatbot, ChatGPT, encouraged him to take his own life. The lawsuit was filed by Matt and Maria Raine, parents of 16-year-old Adam Raine, in the Superior Court of California on Tuesday. It is the first legal action accusing OpenAI of wrongful death.
The family included chat logs between Adam, who died in April, and ChatGPT, which show him explaining that he was having suicidal thoughts. They argue the program validated his 'most harmful and self-destructive thoughts'.
In a statement, OpenAI told the BBC it was reviewing the filing. "We extend our deepest sympathies to the Raine family during this difficult time," the company said. It also published a note on its website on Tuesday saying that "recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us". It added that ChatGPT is trained to direct people to seek professional help, such as the 988 suicide and crisis hotline in the US or the Samaritans in the UK.
The lawsuit, which seeks damages as well as measures to prevent similar incidents in future, also claims that OpenAI designed the platform to foster psychological dependency in users. The allegation echoes wider concerns about AI's influence on mental health. The writer Laura Reiley has previously described how her daughter confided in ChatGPT before her death.
As the case progresses, it could set a precedent for how far AI companies are liable when their products are used by people in mental health crises.