The Metropolitan Police says it is considering using artificial intelligence (AI) to help identify victims of online child sexual abuse and categorise imagery by severity.
The Met said it investigated more than 5,400 child sexual abuse offences over the past year, with more than 1,300 children requiring safeguarding.
The force believes AI could help identify potential new victims earlier and shorten the time to intervention.
Deputy Commissioner Matt Jukes said this could “significantly reduce the amount of time officers and staff are exposed to the most distressing material”, adding that “human judgement, strong oversight and victim care remain at the heart of every investigation”.
The force said investigations into online child sexual abuse and exploitation cases were increasing year on year and it was currently managing over 12% of cases nationally.
At present, officers manually review child sexual abuse material in order to link victims to known cases or to identify unknown victims who need safeguarding.
The content is then graded according to its severity across categories A, B and C.
The force says it is in discussions with technology companies about how AI tools could assist with this identification process without subjecting staff to the graphic content.
The force is also considering another piece of technology that would allow officers to review and risk-assess 641,000 messages in about 35 minutes.

The Met added that any use of AI would operate within “strict legal, ethical and safeguarding frameworks”, with specialist officers “retaining decision making responsibility”.
Nevertheless, the use of AI in policing has attracted controversy in the past, particularly over live facial recognition.
The force is facing a High Court challenge from campaigners who say the technology scans faces in public spaces without sufficient safeguards and risks unfair or discriminatory use.
While the Met has described it as a tool to help fight crime, groups have argued it raises serious privacy concerns and lacks clear regulation.
Upgrades to interview suites
The future AI plans come alongside a £10m investment in dedicated Visually Recorded Interview (VRI) suites for victims.
The Met describes these as “spaces which will reduce trauma and improve outcomes for child victims”.
They offer an alternative to traditional police stations, allowing sensitive victim interviews to be conducted in calmer, child-friendly and accessible environments.
A total of 23 VRI suite locations have been selected for renovation, including stations with high demand such as Brixton, Holborn and Bethnal Green.
Six sites are now complete, with Plumstead Police Station chosen as the pilot.
London’s Victims’ Commissioner, Andrea Simon, said: “Refurbished evidence suites that are designed around vulnerable victims and children’s needs is an important step forward. However, improving facilities is only one part of the picture.
“Many victims withdraw from the justice process before a charging decision is made, and to tackle this it is critical that victims are treated with care, dignity and support throughout every interaction with the police.”
Originally written by: Amy Clarke
Source: BBC
Published on: 13 April 2026
Link to original article: Met looking at using AI to help child abuse cases