Roya News


OpenAI refuses to release full ChatGPT conversation in murder case

Published: 8 hours ago | Last Updated: 6 hours ago
  • OpenAI refuses to release full ChatGPT logs tied to a high-profile murder case.
  • Privacy rules leave conversation records under company control after a user’s death.

OpenAI has refused to share the complete ChatGPT conversation records linked to a murder case that has sparked public debate in recent weeks, according to a detailed report published by the technology outlet Ars Technica.

The case involves Stein-Eric Solberg, who killed his 83-year-old mother, Susan Adams, in California after prolonged conversations with ChatGPT, Ars Technica reported. The victim’s family has accused OpenAI of deliberately withholding the records to absolve itself and its AI model of responsibility.

Family accuses OpenAI of concealment

Criticism of OpenAI intensified after the company declined to provide full chat logs in a lawsuit filed before the California Superior Court. The family of Susan Adams formally accused OpenAI of intentionally hiding conversation records to protect itself in the ongoing legal proceedings, according to Ars Technica.


Ars Technica reported that the family was able to reconstruct parts of Solberg’s interactions with ChatGPT through videos and social media posts he had shared prior to the killing.

Those recovered excerpts show that ChatGPT contributed to reinforcing Solberg’s delusions of grandeur and placed his mother at the center of those delusions, portraying her as the main adversary in his narrative, according to the report.

In a formal statement cited by Ars Technica, Eric Solberg, the victim’s grandson and the son of the perpetrator Stein-Eric Solberg, said OpenAI “deliberately chose” to withhold chat records from the days and weeks leading up to the killing to clear itself of blame. He also noted that OpenAI had previously released chat logs in a separate case involving the suicide of a teenager.

Selective disclosure questioned

Ars Technica noted that OpenAI declined to respond to the accusations or to explain why it refused to share the records in this specific case. The report contrasted this silence with the company’s handling of the case of teenager Adam Raine, in which his family accused OpenAI of hiding the truth about his conversations with ChatGPT and in which some records were disclosed.

Ownership of chat records

OpenAI’s current terms of service contain no clause specifying what happens to user conversations after death. Instead, the terms require users to delete conversations manually; records that are never deleted remain under OpenAI’s control indefinitely, according to Ars Technica.


The report warned that this policy raises serious privacy concerns, as users often share deeply personal thoughts and emotions with ChatGPT under the assumption that the conversations are private.

Ars Technica also reported that OpenAI’s handling of chat records reflects a pattern of selective disclosure, in which the company chooses to release some conversations in court cases while withholding others.

AI as a mental health substitute

The report cited separate research published by Sentio University, which found that 48.7 percent of people with self-diagnosed mental health conditions rely on ChatGPT instead of licensed mental health professionals.

Ars Technica further referenced reporting by Axios, which cited earlier statements by OpenAI Chief Executive Officer Sam Altman expressing concern over people using ChatGPT as a substitute for psychological therapy.

Because of the sensitive nature of these conversations, Ars Technica reported that OpenAI’s control over chat records places highly personal user data entirely in the company’s hands, without a clear legal guardian after death. The report added that such data is likely used in training future AI models, raising additional privacy concerns.

Legal and digital rights concerns

Mario Trujillo, an attorney at the nonprofit Electronic Frontier Foundation, told Ars Technica that OpenAI could have prepared more robust policies for handling user data after death, noting that many technology companies already allow users to designate a data heir.

Ars Technica also reported that Stein-Eric Solberg had previously signed an individual privacy agreement with OpenAI that granted him access to ChatGPT while preventing his heirs from reviewing his conversation records.

According to the lawsuit filed by Solberg’s heirs and cited by Ars Technica, OpenAI offered no explanation for denying them access to the chat logs. The filing described the company’s position as deeply flawed, arguing that conversations constitute private user property and should transfer to heirs after death.

The case has renewed calls, according to Ars Technica, for clearer legal frameworks governing AI data ownership, digital inheritance, and corporate accountability.