New judge’s ruling brings OpenAI keeping a record of all your ChatGPT chats one step closer to reality

- A federal judge rejected a ChatGPT user’s petition against her order that OpenAI preserve all ChatGPT chats
- The order follows a request from The New York Times as part of its lawsuit against OpenAI and Microsoft
- OpenAI plans to keep fighting the decision
OpenAI will keep all your conversations with ChatGPT, even the ones you thought you deleted, and may have to share them with a lot of lawyers. That is the result of an order from the federal judge overseeing the copyright lawsuit brought against OpenAI by The New York Times. Magistrate Judge Ona Wang upheld her earlier order to preserve all ChatGPT conversations as evidence after rejecting a motion from ChatGPT user Aidan Hunt, one of several ChatGPT users asking her to vacate the order over privacy and other concerns.
Judge Wang told OpenAI to preserve ChatGPT output “indefinitely” after the Times argued that it would be a way to determine whether the chatbot had illegally reproduced articles without paying the original publishers. But finding those examples means holding on to every intimate, awkward, or simply private conversation anyone has had with the chatbot. And although what users write is not part of the order, it is not hard to imagine working out who was talking to ChatGPT about a personal subject based on what the AI wrote back. In fact, the more personal the discussion, the easier it would be to identify the user.
Hunt pointed out that he had no warning this could happen until he saw a report about the order in an online forum, and he is now concerned that his conversations with ChatGPT could be disclosed, including “highly sensitive personal and commercial information.” He asked the judge to vacate the order or modify it to carve out especially private content, such as conversations held in private mode, or those involving medical or legal matters.
According to Hunt, the judge overstepped her bounds with the order because “this case involves significant, novel constitutional questions about the privacy rights incident to artificial intelligence usage, a rapidly developing area of law, and the ability of a magistrate [judge] to institute a nationwide mass surveillance program by means of a discovery order in a civil case.”
Judge Wang denied his request because it was not related to the copyright question at hand. She noted that the order is about preservation, not disclosure, and that it is hardly unique or rare for courts to tell a private company to retain certain records for litigation. That is technically correct, but, of course, an everyday ChatGPT user may not feel that way.
She also seemed to take particular exception to the mass surveillance accusation, quoting that section of Hunt’s petition and knocking it down with the legal language equivalent of a diss track. Judge Wang added a “[sic]” to the quote from Hunt’s filing, along with a footnote pointing out that the petition “does not explain how a court’s document retention order that directs the preservation, segregation, and retention of certain privately held data by a private company for the limited purposes of litigation is, or could be, a ‘nationwide mass surveillance program.’ It is not. The judiciary is not a law enforcement agency.”
Sic burn aside, there is still a chance the order will be rescinded or modified after OpenAI goes to court this week to push back as part of the larger paperwork battle surrounding the lawsuit.
Deleted but not gone
Hunt’s other concern is that, however this case plays out, OpenAI will now have the ability to retain chats that users believed were deleted, and could be made to produce them in the future. The fear is that OpenAI will prioritize legal expediency over protecting user privacy. So far, OpenAI has advocated for that privacy and has asked the court for the oral arguments to challenge the retention order that will take place this week. The company has said it wants to push back hard on behalf of its users. But in the meantime, your chat logs are in limbo.
Many may have treated writing to ChatGPT like talking to a friend who can keep a secret. Perhaps more people now understand that it still acts like a computer program, and that the equivalent of your browser history and Google search terms is still in there. At the very least, there will hopefully be more transparency going forward. Even when it is the courts demanding that AI companies retain sensitive data, users should hear about it from the companies themselves. We should not find out by chance on a web forum.
And if OpenAI really wants to protect its users, it could start offering more granular controls: clear indicators of anonymous mode, stronger deletion guarantees, and alerts when conversations are being preserved for legal reasons. Until then, it might be wise to treat ChatGPT a little less like a therapist and a little more like a coworker who might be wearing a wire.