OpenAI Fights The New York Times Lawsuit to Save Deleted User Chats
Key Takeaways
- OpenAI is challenging a court order that requires it to save all user data, even deleted chats, by citing privacy concerns;
- The data retention order is tied to a copyright case from The New York Times, which claims its content was used without permission;
- The case centers on whether training AI with news articles qualifies as fair use or violates copyright laws.
OpenAI is attempting to overturn a court order that requires the company to retain copies of all user interactions, including those that have been deleted.
The request, published on June 5, is part of a copyright case filed by The New York Times.
The legal order, issued on May 13, tells OpenAI to stop removing output logs and to store them separately until further notice from the court. This includes logs that would normally be erased as part of routine system processes.
OpenAI’s Chief Operating Officer, Brad Lightcap, said the company believes the order goes too far. He explained that OpenAI is challenging the ruling to protect users’ privacy and trust.
The New York Times filed its lawsuit in December 2023. It claims OpenAI and Microsoft used its content without permission to train their artificial intelligence (AI) tools, including ChatGPT and Bing Chat.
OpenAI said its actions fall under "fair use", a legal concept that allows limited use of copyrighted work under certain conditions. The New York Times disagreed, especially when the AI can recreate full sentences from its articles or help people avoid the site’s paywall by summarizing locked content.
The New York Times stated that it is standing up for the value of journalism and protecting the rights of writers. However, OpenAI claimed that The New York Times has been selective in what examples it used in the lawsuit and is resisting innovation.
Meanwhile, Anthropic, another AI company, has been accused of using song lyrics without permission to train its AI model, Claude.