Millions of people are turning to AI chatbots like ChatGPT for emotional support—but OpenAI CEO Sam Altman says users shouldn’t expect their sessions to be private.
Speaking on This Past Weekend, a podcast hosted by Theo Von, Altman warned that there is currently no legal framework protecting sensitive conversations with AI the way therapist-patient or attorney-client communications are protected.
“People talk about the most personal sh*t in their lives to ChatGPT,” Altman said. “Young people especially use it as a therapist, a life coach… And right now, there’s no legal privilege like with a doctor or lawyer. We haven’t figured that out yet.”
Altman added that in the event of a lawsuit, OpenAI could be legally required to share chat histories — a scenario he described as “very screwed up.”
Unlike end-to-end encrypted messaging platforms such as WhatsApp or Signal, which shield user chats even from the platform operator, OpenAI has access to all ChatGPT conversations. The company says it may store them for up to 30 days — or longer in some cases — for security and legal reasons. Conversations may also be reviewed by staff to improve AI performance or monitor for abuse.
The warning comes as OpenAI faces legal battles, including a lawsuit from The New York Times, which demands the preservation of millions of ChatGPT interactions as potential evidence.
The takeaway: ChatGPT may feel like a digital confidant, but legally, it’s anything but confidential.