BBC Threatens Legal Action Against AI Firm Perplexity Over Alleged Copyright Breach

The BBC has issued a legal warning to U.S.-based artificial intelligence company Perplexity, accusing it of reproducing BBC content without permission and demanding it cease the practice immediately.

According to a letter addressed to Perplexity CEO Aravind Srinivas, the BBC is insisting the AI firm delete all BBC content it has used, stop accessing its materials, and propose financial compensation for past usage. The broadcaster says Perplexity’s chatbot has been displaying BBC articles “verbatim,” in what it describes as a clear violation of copyright laws and the BBC’s terms of use.

This marks the first time the BBC has taken legal action of this nature against an AI company – reflecting growing concerns across the publishing industry about the unauthorised use of news content to power generative AI tools.

Perplexity, in response, dismissed the BBC’s claims, issuing a statement that read: “The BBC’s claims are just one more part of the overwhelming evidence that the BBC will do anything to preserve Google’s illegal monopoly.” The company did not elaborate on how Google was relevant to the dispute.

The BBC’s legal challenge comes amid heightened scrutiny of AI firms that rely on vast amounts of web content – often scraped by bots – to train or inform their systems. While many publishers, including the BBC, use “robots.txt” files to block such web crawlers from accessing their content, these files are not legally binding, and their instructions are often ignored.
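The voluntary nature of robots.txt can be illustrated with Python's standard-library parser: a compliant crawler checks the rules before fetching, but nothing in the protocol enforces them. This is a minimal sketch; the directives below are hypothetical, though "PerplexityBot" is the user-agent name Perplexity publicly documents for its crawler.

```python
from urllib import robotparser

# Hypothetical robots.txt directives of the kind a publisher might serve
# to block a specific crawler from its entire site.
rules = """\
User-agent: PerplexityBot
Disallow: /
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# A well-behaved crawler calls can_fetch() before requesting a page.
# A crawler that skips this check faces no technical barrier, which is
# why publishers describe robots.txt as advisory rather than binding.
print(parser.can_fetch("PerplexityBot", "https://www.bbc.co.uk/news"))  # False: blocked
print(parser.can_fetch("SomeOtherBot", "https://www.bbc.co.uk/news"))   # True: no rule applies
```

The check is purely client-side: compliance depends entirely on the crawler choosing to honour the file, which is the gap at the heart of the BBC's complaint.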

In its letter, the BBC noted that although it had explicitly disallowed Perplexity’s crawlers, the company had continued to extract content, ignoring the restrictions. The broadcaster also cited earlier research showing that Perplexity, among other AI platforms, had inaccurately summarised BBC stories, breaching the broadcaster’s editorial guidelines and potentially damaging its reputation with audiences, including UK licence fee payers.

“These are not minor infractions,” the BBC stated, adding that the AI-generated responses risk undermining trust in the BBC by misrepresenting its journalism.

The controversy highlights a broader industry concern. The Professional Publishers Association (PPA), which represents over 300 UK media outlets, expressed “deep concern” over what it called illegal scraping by AI platforms. The PPA warned that such practices could harm the country’s £4.4 billion publishing sector and the tens of thousands of people it employs.

Perplexity has maintained that it does not train foundation models and insists it uses web content only to generate real-time responses. Describing itself as an “answer engine,” the company claims to search trusted sources online and synthesise results for users. However, it advises users to verify the information provided, acknowledging the risk of errors.

The BBC’s action follows a similar incident in January, when Apple temporarily suspended an AI-generated feature on iPhones after it produced false summaries of BBC News headlines.

As generative AI tools continue to evolve, tensions between content creators and AI developers are expected to intensify, with calls mounting for clearer regulations on how copyrighted material is used in training and deploying AI systems.
