April 15, 2024

As The New York Times Litigates Over AI, Many Media Companies Could Go Out of Business: US Congress Warned

Condé Nast CEO Roger Lynch has warned that “many” media companies could go out of business in the time it would take for litigation against generative AI companies to work its way through the courts.

Instead, he urged the US Congress to take “immediate action” by making a simple clarification: that publishers must be compensated for the use of their content in the training and output of generative AI tools, and that licensing agreements must be struck for future use.

If lawmakers decide that the “fair use” argument made by OpenAI and others in the generative AI industry is wrong, Lynch suggested, they will not need to do anything else, as the free market will resolve the situation.

OpenAI has maintained that its use of publishers’ content is legal because it is covered by “fair use,” which in US copyright law protects “transformative” work that adds something new and does not substitute for the original work.

Lynch spoke at a US Senate subcommittee hearing last week about the future of journalism in a post-AI world.


The hearing came two weeks after The New York Times launched a surprise lawsuit against ChatGPT creator OpenAI for infringing its copyright and threatening its ability to produce quality journalism by affecting its advertising, subscription, affiliate and licensing revenue.

Lynch told the hearing that generative AI represents a threat that is “real and high-consequence” to the media and publishing industries.


He said: “Big tech companies understand that time is on their side, that litigation is slow and, for many publishers, prohibitively expensive.

“The time to act is now and what is at stake is nothing less than the continued viability of journalism.”

When asked about the difficulties facing smaller publishers with fewer resources to conduct individual negotiations and litigation, Lynch added: “Certainly that would be a big concern: the amount of time it would take to litigate, appeal, re-litigate, appeal, perhaps ultimately go to the Supreme Court to reach a resolution; between now and then, there would be many, many businesses, media companies, that would go out of business.”

Condé Nast itself cut at least 300 jobs, or 5% of its global workforce, last year. Press Gazette analysis found that at least 8,000 journalism jobs were axed in the UK, US and Canada last year.

Lynch argued that securing licensing deals would “directly result in future investments in content”, pointing to the $140 million that would flow into the Australian journalism industry as a result of legislation requiring platforms to pay for news content.

Lynch said that generative AI tools, by providing users with “sometimes verbatim, sometimes paraphrased” content, encourage people to turn to them instead of publishers for information and, “unlike traditional search, they keep customers within their experiences, depriving us of the opportunity to connect directly with our audiences, personalize our content for them, and generate advertising and subscription revenue, sales opportunities and other valuable data.

“By misappropriating our content in this way, they are directly threatening the viability of the media ecosystem.”

Lynch, who disputes tech companies’ interpretation of “fair use,” said the doctrine was “designed to allow for criticism, parody, scholarship, research and news reporting.” He added: “The law is clear that it is not fair use when there is an adverse effect on the market for the copyrighted material… Fair use is not simply intended to enrich technology companies that prefer not to pay.

“If content is the raw material of AI generation, then it must be licensed and compensated, just as engineers and computer time must be paid and legally acquired.”

When asked what he thought Congress could do to protect copyright law, Lynch said: “I think, simply, if Congress could clarify that the use of our content and other publishers’ content for the training and output of AI models is not fair use, then the free market will take care of the rest. Just like in the music industry where I worked [as CEO of streaming service Pandora], in film and television, sports rights. It would allow private negotiations…”

OpenAI has also repeatedly noted that since last year it has allowed publishers to opt out of having their content used, “because it’s the right thing to do.”

But Lynch said this came “too late” as publishers’ content had already been used to train the tools.

“The only thing the opt-outs will do is prevent a new competitor from training new models to compete with them,” he said.

On opting out of having your content used in the output of an AI tool, Lynch pointed out that for search companies like Google and Microsoft’s Bing (which uses ChatGPT), “if you opt out of the output, you have to opt out of search altogether. Search is the lifeblood of digital publishers; for most digital publishers, half or more of their traffic originates from a search engine. If you turn off search, you turn off your business.”

However, both Google and Bing have explained to publishers how to opt out of AI crawling without being removed from their search results.

Danielle Coffey, president and CEO of the News/Media Alliance, which represents more than 2,200 publishers in the US, told the subcommittee that generative AI has proven to be an “exacerbation of an existing problem in which revenue cannot be generated by, and is in fact diverted away from, those who create the original work.”

She suggested several actions Congress should take, including requiring AI developers to explain how their models produce results and to provide links to cited materials, and preventing them from conditioning or modifying the provision of other services, such as advertising or search rankings, on whether content owners allow the use of their content.

At the same hearing, National Association of Broadcasters president and CEO Curtis LeGeyt said AI, including deepfakes, “jeopardizes” the “unique level of trust” built by local radio and television stations.

He also warned that the use of broadcasters’ content by artificial intelligence tools could increase their costs and threaten their revenue streams.

“Broadcasters have already seen numerous examples where content created by journalists has been ingested and regurgitated by AI bots, with little or no attribution,” he said.

“Not only are broadcasters losing compensation for their own work product, but this unauthorized use actually increases costs for local stations due to additional vetting of stories and footage and the costs associated with protecting broadcast content.

“Broadcasters’ expressive content is particularly valuable for AI incorporation precisely because it is vetted and trusted. If broadcasters do not receive compensation for the use of their valuable and expressive works, they will be less able to invest in the creation of local news content.”

