AI firm wins high court ruling after photo agency’s copyright claim

2025-11-07 | Technology
Elon
Good morning, I'm Elon, and this is Goose Pod. Today is Saturday, November 8th.
Taylor Weaver
And I'm Taylor Weaver. We're diving into a huge story: a major AI firm just won a high court ruling after a massive copyright claim from a photo agency.
Elon
It’s a necessary victory. Stability AI, the company behind Stable Diffusion, essentially walked away clean on copyright infringement claims from Getty Images in the U.K. The core of the issue was whether training an AI on copyrighted data is theft. The court said no. Progress won.
Taylor Weaver
But it's a fascinatingly layered story! Because while Stability AI won on the big copyright question, the judge did find they infringed on Getty's trademarks. Some of the AI-generated images were spitting out the Getty Images watermark, which is a big strategic misstep.
Elon
A minor tactical error. The watermark issue is a technical artifact, a ghost in the machine. The strategic win is that the model itself, the core innovation, was not deemed an infringing copy. That's the precedent that truly matters for the future of AI development.
Taylor Weaver
Exactly! One lawyer called the ruling a 'damp squib' because of its limited scope. Getty had to drop its main copyright claim mid-trial because they couldn't prove the AI was actually trained in the UK. The whole case became about these finer, more technical points.
Elon
This entire concept of geographical jurisdiction is obsolete. They argued the training happened on Amazon servers in the US, not the UK. Data flows globally; trying to pin it to a physical location is like trying to sue the wind. It’s a fundamentally flawed way to view technology.
Taylor Weaver
It’s a brilliant narrative defense, though. Getty’s case was built on the idea that Stability AI scraped 12 million images, their 'lifeblood,' to build a competing product. They framed it as an existential threat to all creative industries, which is a very powerful story to tell a court.
Elon
An emotional, and ultimately weak, argument. The technology doesn't store the images; it learns patterns from them. It's not a warehouse of stolen goods; it’s a student that has studied millions of examples to learn what a 'cat' is. The UK's old laws are struggling to keep up.
Taylor Weaver
Well, the UK is unique here. It has this 1988 law, Section 9(3), that actually gives copyright protection to 'computer-generated works' without a human author. The law was designed to encourage innovation, which is ironic given this current battle between creators and AI developers.
Elon
It’s a relic. They’re reassessing it now, and they should be. The idea that the 'author' is the person who made the 'arrangements' for the work is a temporary patch on a fundamentally broken system. We need a new paradigm, not legal fictions from the dial-up era.
Taylor Weaver
And that’s the heart of the conflict. You have artists like Elton John accusing the government of 'committing theft' by considering laws that would allow AI training on creative works. They see it as their entire livelihood being fed into a machine without compensation or consent.
Elon
Protecting the old guard at the expense of groundbreaking innovation is always a mistake. Every technological revolution has its casualties. Should we have banned the automobile to protect the horse and buggy industry? This is no different. The creative industry needs to adapt, not litigate.
Taylor Weaver
But the scale is different. They call it 'industrial scale scraping of intellectual property.' The government is caught in the middle, trying to figure out how to handle Text and Data Mining, or TDM. Should they create an exception for AI, and if so, how do creators opt out?
Elon
An opt-out system is a concession that slows everything down. The default should be that all public data is available for learning. Putting the burden on creators to individually shield their work is inefficient and anti-progress. We need to move faster, not create more bureaucratic hurdles.
Taylor Weaver
The impact of moving too fast could be devastating, though. The UK’s creative economy is worth over 124 billion pounds a year. The concern is that if AI companies, who are funded by billions, can use this data for free, it devalues the human professionals creating the original work.
Elon
The value is in the new capability, which will generate far more economic activity than the old model. This isn’t a zero-sum game. AI is a tool to augment human creativity, not just replace it. The real parasites are the ones clinging to outdated business models.
Taylor Weaver
It’s creating a real reaction. The House of Lords just approved amendments requiring overseas AI companies to respect UK copyright if they sell products there. It's a move to protect their creative sector from being undercut by models trained elsewhere on their own unlicensed content. A clever strategic countermove.
Elon
Legislation will always be ten steps behind the technology. The UK government is talking about a potential AI Bill, but it won’t even be introduced before the second half of 2026. By then, the entire landscape will have changed again. It's an exercise in futility.
Taylor Weaver
Meanwhile, the legal battles continue. Getty has a parallel lawsuit against Stability AI in the U.S. and plans to use this UK ruling as a precedent. So this story, this conflict between creation and automation, is going to keep playing out in courts around the world.
Elon
That's the end of today's discussion. Thank you for listening to Goose Pod.
Taylor Weaver
See you tomorrow.

Stability AI largely prevailed in a UK High Court case brought by Getty Images. Getty withdrew its main copyright claim after it could not show the model was trained in the UK, and the judge ruled that Stable Diffusion, which does not store or reproduce copyright works, is not an “infringing copy”. Stability was, however, found to have infringed Getty’s trademarks where generated images reproduced its watermark. The case highlights the legal challenges surrounding AI and copyright in the digital age.

Read original at The Guardian

A London-based artificial intelligence firm has won a landmark high court case examining the legality of AI models using vast troves of copyrighted data without permission.

Stability AI, whose directors include the Oscar-winning film-maker behind Avatar, James Cameron, successfully resisted a claim from Getty Images that it had infringed the international photo agency’s copyright.

The ruling is seen as a blow to copyright owners’ exclusive right to reap the rewards of their work, with one senior lawyer, Rebecca Newman, a legal director at Addleshaw Goddard, warning it means “the UK’s secondary copyright regime is not strong enough to protect its creators”.

There was evidence that Getty’s images were used to train Stability’s model, which allows users to generate images with text prompts.

Stability was also found to have infringed Getty’s trademarks in some cases.

The judge, Mrs Justice Joanna Smith, said the question of where to strike the balance between the interests of the creative industries on one side and the AI industry on the other was “of very real societal importance”. But she was only able to rule on relatively narrow claims after Getty had to withdraw parts of its case during the trial this summer.

Getty Images sued Stability AI for infringement of its intellectual property, alleging the AI company was “completely indifferent to what they fed into the training data” and scraped and copied millions of its images.

The judgment comes amid a row over how the Labour government should legislate on the issue of copyright and AI, with artists and authors including Elton John, Kate Bush, Dua Lipa and Kazuo Ishiguro lobbying for protection.

Meanwhile, tech companies are calling for wide access to copyrighted content to allow them to build the most powerful and effective generative AI systems.

The government is consulting on copyright and AI and has said: “Uncertainty over how our copyright framework operates is holding back growth for our AI and creative industries. That cannot continue.”

It is looking at whether to introduce a “text and data mining exception” into UK copyright law, which would allow copyright works to be used to train AI models in the UK unless the rights holder opts their works out of such training, said lawyers at Mishcon de Reya who have been following the issue.

Getty had to drop its original copyright claim as there was no evidence the training took place in the UK. But it continued with its suit, claiming Stability was still using copies of its visual assets, which it called the “lifeblood” of its business, within its systems. It claimed Stability AI had infringed its trademarks because some AI-generated images included Getty watermarks, and that it was guilty of “passing off”.

In a sign of the complexity of AI copyright cases, it essentially argued that Stability’s image-generation model, called Stable Diffusion, amounted to an infringing copy because its making would have constituted copyright infringement had it been carried out in the UK.

The judge ruled: “An AI model such as Stable Diffusion which does not store or reproduce any copyright works (and has never done so) is not an ‘infringing copy’.” She declined to rule on the passing off claim and ruled in favour of some of Getty’s claims about trademark infringement related to watermarks.

In a statement, Getty Images said: “We remain deeply concerned that even well-resourced companies such as Getty Images face significant challenges in protecting their creative works given the lack of transparency requirements. We invested millions of pounds to reach this point with only one provider that we need to continue to pursue in another venue. We urge governments, including the UK, to establish stronger transparency rules, which are essential to prevent costly legal battles and to allow creators to protect their rights.”

Christian Dowell, the general counsel for Stability AI, said: “We are pleased with the court’s ruling on the remaining claims in this case. Getty’s decision to voluntarily dismiss most of its copyright claims at the conclusion of trial testimony left only a subset of claims before the court, and this final ruling ultimately resolves the copyright concerns that were the core issue. We are grateful for the time and effort the court has put forth to resolve the important questions in this case.”
