A new artificial intelligence (AI) toolkit, ‘iCOP’, has been designed to identify child sexual abuse content online and help police catch the abusers, researchers report.
According to the study published in the journal Digital Investigation, the ‘iCOP’ toolkit automatically identifies new or previously unknown child sexual abuse media using AI.
“With iCOP, we hope we are giving police the tools they need to catch child sexual abusers early based on what they are sharing online,” said Claudia Peersman from Lancaster University.
“Because originators of such media can be hands-on abusers, their early detection and apprehension can safeguard their victims from further abuse,” Peersman added.
When tested on real-life cases, ‘iCOP’ proved highly accurate, with a false positive rate of only 7.9 per cent for images and 4.3 per cent for videos.
The research showed that hundreds of searches for child abuse images are made every second worldwide, resulting in hundreds of thousands of child sexual abuse images and videos being shared every year.
“People who produce child sexual abuse media are often abusers themselves — the US National Center for Missing and Exploited Children found that 16 per cent of the people who possess such media had directly and physically abused children,” the study noted.