OpenAI CEO Sam Altman has made a crucial clarification: ChatGPT conversations are not confidential. Unlike conversations with a therapist, lawyer, or doctor, your private chats with ChatGPT carry no legal privilege or privacy protection.
In an interview with podcaster Theo Von, Altman said, “People talk about the most personal sh*t in their lives to ChatGPT. Young people use it like a therapist or a life coach.” But he added, “If you talk to a therapist or a lawyer, there’s confidentiality. We haven’t figured that out for ChatGPT yet.”
Altman’s warning doesn’t end with privacy concerns. He also confirmed that deleted chats may not be permanently erased. For legal or security reasons, records might still be retrieved, raising alarms over data safety and personal privacy. While many users assume the “delete” option wipes the slate clean, OpenAI reserves the right to retain certain information for investigation or compliance purposes. “No one had to think about that even a year ago,” Altman admitted. “Now it’s a huge issue.”
A Stanford University study has added to the concern. Researchers tested AI bots in simulated therapy scenarios and found that chatbots often responded inappropriately to serious mental health conditions. The bots failed to recognize crises and even reinforced stigmas around schizophrenia and addiction. “We find that these chatbots encourage delusions and show harmful biases,” the study noted. While licensed therapists are trained to respond ethically and without bias, chatbots do not follow clinical standards.
Millions worldwide use ChatGPT for emotional support, relationship advice, and life decisions. But Altman's admission changes the conversation. Users need to understand that ChatGPT is a tool, not a substitute for confidential, professional help. Until regulations evolve and protections are built in, assume anything you share with ChatGPT can be accessed, reviewed, or stored.