Self-Created Child Abuse Imagery Surpasses 90%, Watchdog Reports
- Cameron Palmer
- January 17, 2024
- National News · Technology
According to the charity responsible for finding and removing child sexual abuse imagery from the internet, more than 90% of such imagery is now self-generated.
The Internet Watch Foundation reported that in the last year, it discovered self-generated child sexual abuse material (CSAM) featuring children under the age of ten on more than 100,000 webpages. That figure represents a 66% increase over the previous year.
The IWF reported that a record 275,655 webpages contained CSAM, an 8% increase. The new data prompted the UK government to renew its criticism of end-to-end encryption, a campaign backed by the IWF.
Susie Hargreaves, chief executive of the charity, stated that the increase in imagery discovered and removed is not necessarily problematic, as some of it could be attributed to improved detection.
“It does mean we’re detecting more, but I don’t think it’s ever a good thing if you’re finding loads more child sexual abuse,” Hargreaves added. “Obviously the IWF would be most successful if we didn’t find any images of child sexual abuse. Our mission is the elimination of child sexual abuse – it’s not just to find as much as possible and take it down.”
According to the IWF, some of the self-generated imagery was created by children as young as three years old, and one-fifth of it contained “category A” harm, which refers to the most serious types of sexual abuse.
“Ten years ago we hadn’t seen self-generated content at all, and a decade later we’re now finding that 92% of the webpages we remove have got self-generated content on them,” Hargreaves said. “That’s children in their bedrooms and domestic settings where they’ve been tricked, coerced or encouraged into engaging in sexual activity which is then recorded and shared by child sexual abuse websites.”
The charity said the new figures, the first it has compiled covering 2023, underline its opposition to Meta’s plans to enable end-to-end encryption on Messenger, a security feature that would blind the company to content shared on its platform.
Meta and Apple’s Controversial Stances on CSAM Spread and Encryption Policies
In 2022, the company reported 20 million incidents of people sharing CSAM to the IWF’s US equivalent, the National Center for Missing and Exploited Children [NCMEC], and the IWF is concerned that almost all of those reports will be lost. Hargreaves also chastised Apple for abandoning plans to scan iPhones for CSAM, which the company had initially claimed would protect users’ privacy.
“With so many organisations looking to do the right thing in the light of new regulations in the UK, it is incomprehensible that Meta is deciding to look the other way and offer criminals a free pass to further share and spread abuse imagery in private and undetected,” she said.
“Decisions like this, as well as Apple opting to drop plans for client-side scanning to detect the sharing of abuse, are baffling given the context of the spread of this imagery on the wider web.”
The UK security minister, Tom Tugendhat, stated: “This alarming report clearly shows that online child sexual abuse is on the rise, and the victims are only getting younger. Despite warnings from the government, charities, law enforcement, and international partners, Meta has made the extraordinary decision to ignore these victims and provide a ‘safe space’ for heinous predators.
“The decision to roll out end-to-end encryption on Facebook Messenger without the necessary safety features will have a catastrophic impact on law enforcement’s ability to bring perpetrators to justice.”
A Meta spokesperson said the company expects to continue providing more reports to NCMEC than other platforms. “Encryption helps keep people, including children, safe from hackers, scammers and criminals.
“Our recently published report detailed these measures, such as restricting over-19s from messaging teens who don’t follow them and using technology to identify and take action against malicious behaviour. We routinely provide more reports to NCMEC than others, and given our ongoing investments, we expect that to continue.”
Apple did not respond to a request for comment. The company “delayed” its plans for so-called client-side scanning of iPhones a month after announcing them, and it has never publicly acknowledged that they have been dropped permanently.