New OpenAI Court Order Raises Serious Concerns About AI Privacy and Safety for Survivors of Abuse
June 12, 2025
FOR IMMEDIATE RELEASE
Contact: Communications@NNEDV.org
Washington, DC – A recent federal court order in the New York Times’ ongoing lawsuit against OpenAI raises urgent questions about generative AI platforms, user privacy, and the safety of survivors of domestic violence. To comply with discovery in the case, OpenAI must “preserve and segregate” user output logs, including deleted chats and conversations held in Temporary Chat mode, both of which survivors may believe to be private.
The National Network to End Domestic Violence (NNEDV) warns that this development could place survivors of abuse, stalking, and technology-facilitated violence at increased risk.
“Survivors may be using chatbots to ask deeply personal and safety-sensitive questions, thinking their conversations are private or temporary,” said Stephanie Love-Patterson, President & CEO of NNEDV. “But under this order, those logs must be retained, even if the user deleted them. That creates real risks for exposure, misuse, or weaponization in court.”
Generative AI platforms like ChatGPT are increasingly used not just for everyday queries but as informal tools for information-seeking by people in crisis, including survivors looking for legal remedies, housing support, or help with safety planning. Unlike communications with advocates, attorneys, or healthcare providers, conversations with these mass-market chatbots are not designed to comply with confidentiality protections like the Violence Against Women Act (VAWA), HIPAA, or attorney-client privilege; users’ personal data is processed and stored outside those protected frameworks.
While OpenAI discloses its data retention practices in its Privacy Policy, this court order shows how easily a company’s privacy practices can be overridden by third-party litigation. This raises concerns about transparency, user trust, and the broader implications of storing user-generated content for use in legal proceedings.
“The system is not built for survivors,” said Love-Patterson. “We need enforceable privacy rights in the digital age to protect everyone, especially victims of domestic violence who may be at risk.”
To read more about the issue, see attorney and Technology Safety Specialist Belle Torek’s legal analysis, “For Survivors Using Chatbots, ‘Delete’ Doesn’t Always Mean Deleted” in Tech Policy Press.
###
The National Network to End Domestic Violence (NNEDV) represents the 56 state and U.S. territorial coalitions against domestic violence. NNEDV is a social change organization working to create a social, political, and economic environment in which domestic violence no longer exists. NNEDV works to make domestic violence a national priority, change the way society responds to domestic violence, and strengthen domestic violence advocacy at every level.