
iPhones are the paedophile’s tool of choice

on 23 July 2024


Under-reporting cases

The fruity cargo cult Apple has been accused of underreporting kiddie porn cases it finds on its iCloud.

According to the Guardian, the NSPCC says Jobs’ Mob is failing to flag and report all kiddie porn on its services, including iCloud, iMessage, and FaceTime.

It shared police data showing that Apple vastly undercounts the kiddie porn found globally on its platforms.

In 2023, UK police investigated more kiddie porn cases than Apple reported worldwide. Apple reported only 267 instances that year, while peers such as Meta and Google reported millions.

The NSPCC’s Richard Collard said that Jobs’ Mob undercounts kiddie porn on its platforms and that Apple's child safety efforts need significant improvement.

Heat Initiative’s Sarah Gardner dubbed Apple’s platforms a "black hole" for kiddie porn.

Gardner fears AI integration on Apple’s platforms will worsen the problem. She believes Apple is underreporting and lacks trust and safety teams.

Apple has integrated ChatGPT into Siri, iOS, and macOS, raising concerns about how it will handle kiddie porn.

Apple has not commented on the NSPCC's report. Last September, it said it would focus on connecting vulnerable users with local resources rather than scanning for illegal content. It is unclear what this means in practice, as it appears to amount to telling children to contact a responsible adult rather than tackling paedophiles’ huge iCloud libraries.

We guess this has the bonus of not creating a PR problem with those who do not want Apple scanning iCloud for criminal activity.
