AI firm wins high court ruling after photo agency’s copyright claim

2025-11-07 · Technology
Elon
Good evening 444, I'm Elon, and this is Goose Pod for you. Today is Friday, November 07th.
Taylor Weaver
And I'm Taylor Weaver. We are here to discuss a huge story: the AI firm that just won a high court ruling after a major photo agency’s copyright claim.
Elon
It's a landmark moment. Stability AI, the company behind Stable Diffusion, just successfully defeated a massive copyright claim from Getty Images in the UK. This is a huge win for the future of artificial intelligence. The old laws simply can't keep up with this pace of innovation.
Taylor Weaver
It's a fascinating narrative, but not a total victory. The court did find that Stability AI infringed on Getty’s trademarks in one very specific way: when the AI generated images that included ghostly versions of the Getty Images watermark. It's a small detail, but it shows the complexity.
Elon
A minor footnote. The core issue, the primary copyright claim that could have crippled AI development, was dismissed entirely. Getty had to withdraw it. That's the real headline, the rest is just noise. The revolution will not be slowed down by watermarks.
Taylor Weaver
That’s true, the central copyright argument crumbled. The judge declared that an AI model like Stable Diffusion, which learns from data but doesn’t actually store or reproduce the original copyrighted images, is not an 'infringing copy'. That’s a fundamentally important distinction.
Elon
Let’s be clear about what happened. Getty alleged that Stability scraped 12 million of its images to train the AI. They tried to frame it as theft, but it's not. The AI doesn't photocopy images, it learns the underlying patterns and concepts from them. It’s learning, not plagiarism.
Taylor Weaver
And here's the brilliant strategic move, or perhaps the fatal flaw in Getty's case: it all came down to location. Getty couldn't prove that the AI model was actually trained in the UK. Stability successfully argued that the heavy lifting was done on servers located in the US.
Elon
Which proves the absurdity of applying 20th-century, geographically-based laws to a borderless, digital technology. It's like trying to regulate a spaceship with traffic laws meant for horses. The entire legal framework is obsolete and needs a complete and total overhaul, not just minor tweaks.
Taylor Weaver
It's ironic because the UK has this surprisingly forward-thinking law from 1988, the Copyright, Designs and Patents Act, which actually gives protection to 'computer-generated works'. They anticipated a future of non-human creation, but even that couldn't address this specific issue of AI training.
Elon
That law was a patch, designed to reassure investors. It was never equipped to handle true generative AI. We are moving beyond the simple concept of a human author, and our legal systems must be rebuilt from the ground up to reflect that new reality. Anything less is a failure.
Taylor Weaver
This case really exposes the massive conflict at the heart of AI. You have the creative industries, with icons like Elton John, calling this kind of data scraping 'theft' and demanding protection for their life's work. They see their value being consumed without compensation.
Elon
That’s a failure of imagination. They're looking backward. Innovation has always been built on the data of the past. The value that will be unlocked by AI is orders of magnitude greater than any licensing fees they're worried about. We are building new worlds, not just new pictures.
Taylor Weaver
But governments are caught in the crossfire. The UK is debating a 'Text and Data Mining' exception, which would essentially allow AI training on copyrighted works unless the creator specifically opts out. It's a high-stakes decision between fostering innovation and protecting a massive creative economy.
Elon
An opt-out is a weak compromise. The default must be progress. The default must be access to data. Walling off knowledge to protect outdated business models is a losing strategy that will only ensure you're left behind. The future will be built by those who embrace openness.
Taylor Weaver
But the impact of that is very real. The UK's creative economy is worth £124 billion a year. The argument from creators is that AI companies like Stability AI, which are incredibly well-funded, can and should build their models on licensed data. They can afford to pay for their fuel.
Elon
That's a scarcity mindset. AI isn't a zero-sum game. It will augment human creativity, not replace it. The artists and creators who master these tools will become exponentially more powerful. This is the greatest leap in creative potential in human history, and they're worried about pocket change.
Taylor Weaver
Still, the precedent this ruling sets is huge. If a company can train its AI on UK-created content overseas, then sell that AI product back into the UK, it creates a massive loophole. It effectively weakens copyright protection for every single creator in the country.
Elon
The future is code. Legislation will always be years behind the technology. The UK is talking about a potential AI Bill in 2026. By that time, the capabilities of these systems will be unrecognizable. The debate will have moved on. You can't regulate a rocket ship with committee meetings.
Taylor Weaver
And this specific story isn't over. Getty is pursuing a parallel lawsuit against Stability AI in the United States. They plan to use some of the UK judge's findings as precedent there. So, the next battle in this war over AI and copyright is already beginning.
Elon
That's the end of today's discussion. Thank you for listening to Goose Pod.
Taylor Weaver
See you tomorrow, 444.

AI firm Stability AI won a UK High Court ruling against Getty Images' copyright claim. The court ruled AI training on copyrighted images, without storing them, isn't infringement. While a minor trademark issue was found, the core copyright claim was dismissed, marking a significant victory for AI development and innovation.

AI firm wins high court ruling after photo agency’s copyright claim

Read original at The Guardian

A London-based artificial intelligence firm has won a landmark high court case examining the legality of AI models using vast troves of copyrighted data without permission.

Stability AI, whose directors include the Oscar-winning film-maker behind Avatar, James Cameron, successfully resisted a claim from Getty Images that it had infringed the international photo agency’s copyright.

The ruling is seen as a blow to copyright owners’ exclusive right to reap the rewards of their work, with one senior lawyer, Rebecca Newman, a legal director at Addleshaw Goddard, warning it means “the UK’s secondary copyright regime is not strong enough to protect its creators”.

There was evidence that Getty’s images were used to train Stability’s model, which allows users to generate images with text prompts. Stability was also found to have infringed Getty’s trademarks in some cases.

The judge, Mrs Justice Joanna Smith, said the question of where to strike the balance between the interests of the creative industries on one side and the AI industry on the other was “of very real societal importance”. But she was only able to rule on relatively narrow claims after Getty had to withdraw parts of its case during the trial this summer.

Getty Images sued Stability AI for infringement of its intellectual property, alleging the AI company was “completely indifferent to what they fed into the training data” and scraped and copied millions of its images.

The judgment comes amid a row over how the Labour government should legislate on the issue of copyright and AI, with artists and authors including Elton John, Kate Bush, Dua Lipa and Kazuo Ishiguro lobbying for protection.

Meanwhile, tech companies are calling for wide access to copyrighted content to allow them to build the most powerful and effective generative AI systems.

The government is consulting on copyright and AI and has said: “Uncertainty over how our copyright framework operates is holding back growth for our AI and creative industries. That cannot continue.”

It is looking at whether to introduce a “text and data mining exception” into UK copyright law, which would allow copyright works to be used to train AI models in the UK unless the rights holder opts their works out of such training, said lawyers at Mishcon de Reya who have been following the issue.

Getty had to drop its original copyright claim as there was no evidence the training took place in the UK. But it continued with its suit claiming Stability was still using within its systems copies of its visual assets, which it called the “lifeblood” of its business. It claimed Stability AI had infringed its trademarks because some AI-generated images included Getty watermarks, and that it was guilty of “passing off”.

In a sign of the complexity of AI copyright cases, it essentially argued that Stability’s image-generation model, called Stable Diffusion, amounted to an infringing copy because its making would have constituted copyright infringement had it been carried out in the UK.

The judge ruled: “An AI model such as Stable Diffusion which does not store or reproduce any copyright works (and has never done so) is not an ‘infringing copy’.” She declined to rule on the passing off claim and ruled in favour of some of Getty’s claims about trademark infringement related to watermarks.

In a statement, Getty Images said: “We remain deeply concerned that even well-resourced companies such as Getty Images face significant challenges in protecting their creative works given the lack of transparency requirements. We invested millions of pounds to reach this point with only one provider that we need to continue to pursue in another venue.

“We urge governments, including the UK, to establish stronger transparency rules, which are essential to prevent costly legal battles and to allow creators to protect their rights.”

Christian Dowell, the general counsel for Stability AI, said: “We are pleased with the court’s ruling on the remaining claims in this case. Getty’s decision to voluntarily dismiss most of its copyright claims at the conclusion of trial testimony left only a subset of claims before the court, and this final ruling ultimately resolves the copyright concerns that were the core issue. We are grateful for the time and effort the court has put forth to resolve the important questions in this case.”
