Published in News

OpenAI claims New York Times paid hackers

28 February 2024

Court case depends on it

OpenAI has asked a judge to chuck out parts of the copyright lawsuit filed against it, saying that Apple's favourite paper paid someone to hack OpenAI's products to manufacture grounds for suing.

OpenAI said there is no way an average person would use OpenAI's products in such a manner and break the terms of use.

According to OpenAI: "The truth, which will come out in this case, is that the Times paid someone to hack OpenAI's products. It took them tens of thousands of tries to generate the odd results that make up Exhibit J to the Complaint.

They could do so only by targeting and exploiting a bug (which OpenAI has said it will fix) with dodgy prompts that break OpenAI's terms of use."

OpenAI goes on to say that the NYT took extreme steps far outside the standard way of using OpenAI's products to extract "word-for-word passages" from The New York Times, including feeding in parts of the very text it wanted OpenAI to copy.

They called the Times's claims that the news industry is in danger from OpenAI "pure fiction", saying: "Normal people do not use OpenAI's products in this way. The Times's idea that the made-up attacks of its paid thug show that the Fourth Estate is somehow at risk from this tech is pure fiction. So too is its suggestion that the public might copy its agent's weird activity."

The bit about "its agent's weird activity" refers to the "paid thug" that OpenAI says The New York Times used to manufacture a situation in which OpenAI copied its text.

OpenAI's filing suggests that the Times is trying to claim ownership of facts and the "rules of language", a nod to the argument that using text data to train AI models to generate new content is a transformative use rather than copying.

Artists are having a hard time arguing copyright infringement in court because AI training is increasingly seen as transformative use, the principle under which copied material is changed with a new meaning or put to a different purpose, such as in a parody, a commentary, or the creation of something new from it.
