Technology moves fast. Legislation can be slow. For decades, that has felt like a fundamental fact of public life. But the gap between our laws and the world they are supposed to govern feels wider than ever. While the internet has transformed every element of our society, the state has not kept up.
Most of the laws that prohibit the creation and distribution of child sexual abuse imagery have been in place since the 1990s. Back then, Photoshop was in its infancy. The physical photographs that paedophiles shared were no less vile, but they were easier for the police to seize and destroy.
Since then, technology has completely changed the way these people operate, making it possible to create and distribute horrifyingly lifelike images and videos of children in seconds. In just one month, the Internet Watch Foundation identified more than 20,000 AI-generated images on one dark web forum.
I firmly believe that AI will be a force for good in our society, helping British businesses grow and saving our doctors, teachers and nurses time so they can provide support to those who need it most. I have committed to bringing forward legislation to ensure that we can safely realise the extraordinary opportunities this technology offers in the decades to come. But the safety of children must always come first.
That is why today, as part of our Plan for Change to deliver safer streets and protect the most vulnerable in society, we are announcing four new offences that will close the legal loopholes that are letting paedophiles escape punishment.
In the past couple of years, we have seen a growing number of people facing prison for creating and distributing this kind of content. But we know that, for every person put behind bars, there are many more who are acting with impunity. In the outdated eyes of the law, some aren’t considered criminals at all. Today, it is actually legal to possess a “paedophile manual” that tells you – in horrifying detail – how you can use AI to sexually abuse children.
With the emergence of AI models capable of creating this kind of heinous content faster than we could have ever imagined, some might say it was inevitable that our laws fell behind. But we cannot pretend that all of this is new, nor can we say that we couldn’t have seen it coming. The fact that previous governments failed to close the loopholes that allow this technology to be used in such a vile way is unforgivable. Because every day of delay has come at a heartbreaking human cost – a cost shouldered by our children.
A 15-year-old girl rang the NSPCC recently. An online stranger had edited photos from her social media to make fake nude images. The images showed her face and, in the background, you could see her bedroom. The girl was terrified that someone would send them to her parents and, worse still, the pictures were so convincing that she was scared her parents wouldn’t believe that they were fake.
There are thousands of stories like this happening behind bedroom doors across Britain. Children being exploited. Parents who lack the knowledge or the power to stop it. Every one of them is evidence of the catastrophic social and legal failures of the past decade.
That is why we are taking urgent action through a raft of new offences that will finally close the legal loopholes that paedophiles are exploiting to ruin young lives. If you are found in possession of a “paedophile manual”, you will now face years in jail. For the first time, we’ll imprison the people who are making the AI models that generate child sexual abuse material. Those who run or moderate websites where paedophiles share advice on how to groom children and avoid detection will spend a decade behind bars.
But we also know that this is a global problem, with British criminals travelling abroad to film child sexual abuse in places where the police won’t stop them and bringing images and videos back across our borders on their phones and laptops. That is why we’re giving Border Force the power to check the devices of those they suspect of possessing child sexual abuse material against the Child Abuse Images Database. This won’t just see more paedophiles put in prison; it will stem the tidal wave of illegal content inundating our country.
These measures are the first of their kind in the world. We are going further than any other country in the battle against AI-generated child sexual abuse material. But we are also sending a clear message to anyone who seeks to do children harm or believes that the law is too slow to catch them: you won’t be able to cower behind a keyboard any more. Soon there will be nowhere to hide.
-
Peter Kyle is secretary of state for science, innovation and technology