Combating Online Child Sex Abuse and Exploitation in the United Kingdom with a Focus on Artificial Intelligence
Every child with access to the internet or a mobile device is vulnerable to exploitation and abuse. The United Kingdom has nearly 98% internet penetration (2024), according to Statista.com, and 121 mobile devices per 100 people (2022), according to the CIA World Factbook. While the UK has a high risk of Online Child Sexual Exploitation and Abuse (OCSEA), it is also a leading force in the fight against this phenomenon.
In 2019, law enforcement agencies in the UK were arresting around 450 individuals and safeguarding over 600 children each month through their efforts to combat OCSEA. It was also estimated that, in the UK alone, 80,000 people presented a sexual threat to children online. The National Crime Agency (NCA) reported that in 2018, 2.88 million accounts were registered globally across the most harmful child sexual abuse dark web sites, with at least 5% believed to be registered in the UK.
To tackle OCSEA globally, the UK government, with support from other national governments, leading technology companies, INTERPOL, UN agencies, and civil society organisations, established the WeProtect initiative in 2014. The initiative developed the WeProtect Global Alliance Model National Response (MNR), which provides a comprehensive blueprint for effectively tackling child sexual exploitation and abuse at the national level.
Notably, the Online Safety Act 2023 (the Act) is recent legislation that protects children and adults online. It places a range of new duties on social media companies and search services, making them more responsible for their users’ safety on their platforms. The Act’s strong protections for children have been advertised by the UK government as making ‘the UK the safest place in the world to be a child online’. Platforms, for example, are required to prevent children from accessing harmful and age-inappropriate content, and to provide parents and children with clear and accessible ways to report problems online when they do arise.
The criminal offences introduced by the Act came into effect on 31 January 2024. These offences cover:
• encouraging or assisting serious self-harm;
• cyberflashing;
• sending false information intended to cause non-trivial harm;
• threatening communications;
• intimate image abuse; and
• epilepsy trolling (sending flashing images to people with epilepsy).
These new offences apply directly to the individuals sending the communications, and convictions have already been secured under the cyberflashing and threatening communications offences. The creation of sexually explicit ‘deepfake’ images is also soon to become a criminal offence in the UK.
The categories of harmful content that platforms need to protect children from encountering are set out in the Act. Children must be prevented from accessing Primary Priority Content, and should be given age-appropriate access to Priority Content. The types of content which fall into these categories are set out below.
Primary Priority Content
• Pornographic content
• Content which encourages, promotes or provides instructions for suicide
• Content which encourages, promotes or provides instructions for deliberate self-injury
• Content which encourages, promotes or provides instructions for an eating disorder or behaviours associated with an eating disorder
Priority Content
• Bullying content
• Abusive or hateful content
• Content which depicts or encourages serious violence or injury
• Content which encourages dangerous stunts and challenges
• Content which encourages the self-administration of harmful substances
Ofcom is now the independent regulator of online safety. It is tasked with setting out, in codes of practice, the steps providers can take to fulfil their safety duties, and it has a broad range of powers to assess and enforce providers’ compliance with the framework. Ofcom is required to take users’ rights into account when setting out those steps.
Under the Act, companies can be fined up to £18 million or ten percent of their qualifying worldwide revenue, whichever is greater. Criminal action can be taken against senior managers who fail to ensure that companies comply with information requests from Ofcom. Ofcom will also be able to hold companies and senior managers (where they are at fault) criminally liable if the provider fails to comply with Ofcom’s enforcement notices in relation to specific child safety duties or to child sexual abuse and exploitation on their service.
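For illustration: for a provider with £1 billion in qualifying worldwide revenue, ten percent would be £100 million, so that higher figure would set the maximum fine; for a provider with £100 million in revenue, ten percent would be only £10 million, so the £18 million figure would apply instead.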
In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers and internet service providers to stop working with a site, preventing it from generating money or being accessed from the UK.
Challenges
According to the International Centre for Missing and Exploited Children (ICMEC)’s ‘Child Sexual Abuse Material (CSAM): Model Legislation & Global Review’, the UK meets four of the five criteria: it has legislation specific to CSAM, it defines CSAM, it defines technology-facilitated CSAM offences, and it criminalises simple possession; it does not, however, make Internet Service Provider (ISP) reporting mandatory. UK law does not explicitly state that ISPs must report suspected child abuse images to law enforcement or to a mandated agency. ISPs may nonetheless be held liable for third-party content if they host or cache that content on their servers, and possession may occur in the jurisdiction where the server is located. Because possession is an offence in the United Kingdom, ISPs will report suspected child abuse material to law enforcement once they are aware of it.
CyberTips from the U.S. National Center for Missing and Exploited Children (NCMEC) are the most standardised worldwide measurement of Online Child Sexual Exploitation and Abuse. Most major platforms, such as Meta, X, Snapchat, Google, and Microsoft, are based in the United States and must therefore report suspected Child Sexual Abuse Material files to NCMEC, which they do largely through automated means. CyberTips are reports of files of interest that represent potential cases of Child Sexual Abuse Material, and each report is designated by country. The UK saw a 140% increase in suspected CSAM reports between 2019 and 2023, with 178,648 CyberTips in 2023, which represents 262 CyberTips per 100,000 of its population.
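That rate follows from simple arithmetic: assuming a UK population of roughly 68.2 million (the precise denominator is not given above), 178,648 ÷ 68,200,000 × 100,000 ≈ 262 CyberTips per 100,000 people.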
Artificial Intelligence Focus
A series of recent research reports by the Internet Watch Foundation (IWF) has highlighted the increasing use of Artificial Intelligence (AI) to generate CSAM, as well as the continuing improvement of the technology. In October 2023, the IWF surveyed a dark web child sexual abuse forum, and it revisited the forum in 2024 for new analysis. More criminal AI CSAM images were being shared – a total of 3,512 – of which 90% were judged by IWF analysts to be realistic enough to be assessed under the same law as real CSAM. These images also included a higher proportion in the most severe category of CSAM in the UK (Category A, which covers penetrative sexual activity, bestiality, or sadism) than in October 2023: this time, 32% of the criminal pseudo-photographs were Category A, indicating that perpetrators are generating more complex ‘hardcore’ scenarios.
Other findings from the IWF include that the first AI CSAM videos are now in circulation. These are mostly partially synthetic – ‘deepfake’ – videos, though some primitive fully synthetic videos also exist. The IWF has also been encountering an increasing amount of AI-generated content, including AI CSAM, on the clear web, and it has identified extensive evidence of the sharing of AI models for generating images of specific children, including known victims of CSAM and famous children.
At the time of writing of the IWF reports, the UK prohibition on paedophile manuals continued to exclude pseudo-photographs of children – a category which necessarily encompasses all AI CSAM. Tutorials and guides shared among members of these communities detailing how to generate realistic AI CSAM therefore remained legal.
As outlined in the IWF October 2023 report, AI CSAM in the UK falls under two different laws, which have different criteria and sentencing guidelines:
• The Protection of Children Act 1978 (as amended by the Criminal Justice and Public Order Act 1994). This law criminalises the taking, distribution and possession of an “indecent photograph or pseudo-photograph of a child”.
• The Coroners and Justice Act 2009. This law criminalises the possession of “a prohibited image of a child”. These images are non-photographic – generally cartoons, drawings, animations or similar. By contrast, the key criterion for classification as criminal under the Protection of Children Act 1978 is that the image “appears to be a photograph”.
Legislation
The UK's four nations – England, Northern Ireland, Scotland, and Wales – each have their own framework of child protection legislation, guidance, and practice designed to keep children safe.
Although the child protection systems are different in each nation, they are all based on similar principles.
Each UK nation is responsible for its own policies and laws for education, health and social welfare. This covers most aspects of safeguarding and child protection. Laws are passed to prevent behaviour that can harm children or require action to protect children. Guidance sets out what organisations should do to help keep children safe.
Some rights are recognised at international level through agreements between governments. The UK has signed up to the United Nations Convention on the Rights of the Child (UNCRC) and the European Convention on Human Rights (ECHR), both of which set out a number of children’s rights.
The Human Rights Act 1998 sets out the fundamental rights and freedoms that everyone in the UK is entitled to. It incorporates the rights set out in the ECHR into domestic British law. The Human Rights Act came into force in the UK in October 2000.
The Equality Act 2010 protects children, young people and adults against discrimination, harassment and victimisation in relation to housing, education, clubs, the provision of services and work. The Act applies to England, Scotland and Wales.
Northern Ireland has a number of anti-discrimination laws relating to the provision of services.
Each of the four nations in the UK has a Children's Commissioner who is responsible for promoting and protecting the rights and best interests of children and young people.
The UK is part of the INHOPE network of hotlines, with the Internet Watch Foundation providing an internet hotline for the public and IT professionals to report potentially criminal online content within its remit, and acting as the 'notice and takedown' body for this content.
A virtual roundtable scheduled for September 10, 2024, funded by the U.S. Department of State's Office to Monitor and Combat Trafficking in Persons (J/TIP), aims to raise awareness and catalyze dialogue and action on this topic in the UK.