AI chats aren’t private: Your AI chatbot conversations could end up in court, lawyers warn
People are opening up to AI in ways that would have sounded strange a year ago—asking for advice, testing arguments, even walking through legal strategies. That shift is now colliding with a hard truth from the courtroom: what you tell a chatbot may not stay private.
U.S. lawyers are warning clients to treat AI tools like ChatGPT and Claude with caution, especially when legal risk is involved. The concern gained urgency after a federal judge in New York ruled this year that a former CEO could not keep his AI conversations out of the hands of prosecutors pursuing fraud charges.
The message from attorneys is simple and blunt. These systems are not your lawyer, and they do not come with the legal protections people expect from confidential advice. “We are telling our clients: You should proceed with caution here,” said Alexandria Gutiérrez Swette, a lawyer at Kobre & Kim.
The gap between expectation and reality is where the risk sits. Conversations with a licensed attorney are typically shielded by attorney-client privilege. That protection can disappear the moment sensitive information is shared with a third party. AI platforms fall into that category.
Think Your AI Chats Are Confidential? Lawyers Say They Could Be Used Against You
Law firms across the U.S. have started issuing warnings in client advisories and contracts. Some now spell it out directly: feeding legal advice or strategy into a chatbot could weaken or even erase those protections. One New York firm, Sher Tremonte, recently included language stating that sharing lawyer communications with an AI platform may waive attorney-client privilege altogether.
The case that set off alarms centers on Bradley Heppner, a former executive tied to the collapse of financial firm GWG Holdings. Facing federal fraud charges, Heppner used Claude to draft reports about his case. His legal team argued those exchanges should remain protected. Prosecutors pushed back, saying the chatbot interactions fell outside traditional privilege rules.
Judge Jed Rakoff agreed. He ruled that Heppner had to turn over dozens of documents generated with Claude, making it clear that no attorney-client relationship exists between a user and an AI system. In his view, that line matters.
No attorney-client relationship exists “or could exist between an AI user and a platform such as Claude,” Rakoff wrote in his ruling, according to a report from Reuters.
That ruling came down the same day another federal judge in Michigan took a different approach in a separate case, deciding that a woman representing herself did not have to hand over her ChatGPT conversations. In that instance, the judge treated the chatbot output as personal work product rather than communication with a third party.
Two decisions, two directions. That split shows how early the courts still are in sorting this out.
Behind the scenes, the terms set by AI companies add another layer. Both OpenAI and Anthropic state that user data may be shared under certain conditions and urge users to seek legal advice from qualified professionals. At a hearing tied to the Heppner case, Rakoff pointed out that Claude explicitly tells users they should not expect privacy in what they input.
Law firms are now racing to put guardrails in place. Some advise clients to use enterprise-grade AI systems that keep data more contained, though even those setups have not been fully tested in court. Others suggest being explicit when using AI under a lawyer’s direction, even recommending prompt language like: “I am doing this research at the direction of counsel for X litigation.”
That kind of phrasing may help in some scenarios, though no one is treating it as a guarantee.
What’s clear is that AI has moved faster than the legal frameworks around it. Courts are now catching up, one case at a time. For clients, the old rule still holds up better than anything new: keep sensitive legal discussions between you and your lawyer.
That includes what you type into a chatbot.