The National Society for the Prevention of Cruelty to Children (NSPCC), a UK charity focused on child protection, has publicly accused Apple of not reporting all the situations in which it detects photos and videos of child sexual abuse on its platforms.
The British organization says that last year Apple reported only 267 cases in which it detected this type of material, far fewer than the 1.47 million cases reported by Google or the 30.6 million reported by Meta (Facebook and Instagram). Apple's figure is also well below the numbers reported by TikTok, X, Snapchat, Xbox and PlayStation.
‘There is a worrying discrepancy between the number of child abuse image crimes in the UK that occur on Apple's services and the almost negligible number of global reports of abuse content that they make to the authorities,’ said one of the NSPCC's officials, Richard Collard. ‘Apple is clearly lagging behind many of its peers in the fight against child sexual abuse.’
In response to these statements, Apple told The Guardian that, instead of analyzing users' personal data, it is opting for a different strategy.
