The recent announcement by Apple about tackling online child abuse is extremely welcome. Anything that can minimise the creation and distribution of child sexual abuse material (CSAM) must be good news. However, as is often the case, the deployment of technology like this has sparked debate across the world, which in this instance has happily placed online child safety firmly in the spotlight. We hope it will remain there, as there is much to discuss.
We are delighted to see Apple working with the National Center for Missing & Exploited Children (NCMEC) – how about working with us too? Illegal harms such as child sexual abuse are clearly a vitally important part of the online safety equation, but there are so many other harms to tackle too.
SafeToNet is a multi-award-winning cyber safety company that safeguards children from predatory threats such as bullying, grooming, abuse, and aggression. We are passionate about safeguarding children online and, like Apple, believe that market collaboration is the key. We also both believe in protecting a child’s right to privacy, and that using on-device technology is one of the best ways to do it. So, with our sole focus on safeguarding children online, we feel obliged to positively challenge Apple and share some of our concerns about the recent announcement. The reality is that if you dig a little deeper into Apple’s press release and the subsequent follow-up Q&As, some significant issues begin to appear.
The devil is always in the detail
Apple has said that its CSAM detection technology applies a ‘threshold’ to ensure that the chance of accidentally identifying the wrong account is less than one in one trillion per year.
On the face of it, this sounds amazing. The last thing anyone wants to see is an innocent person becoming the subject of law enforcement attention.
But what does ‘threshold’ mean? It isn’t clear from the information provided to date. Does the threshold refer to a quantity of CSAM files? If so, then there are two particularly important observations. Firstly, just one CSAM file is illegal, and one is one too many.
Secondly, evidence shows there is no specific relationship between the number of CSAM files possessed and the depravity, severity, and frequency of offline abuse. Tom Farrell, SafeToNet’s Global Head of Safeguarding Alliances, says: “In my law enforcement career I was responsible for over 1,000 CSAM-related investigations and came into contact with several of the world’s worst contact abusers, who only came to the attention of law enforcement through the possession of one image.”
If the threshold algorithm requires possession of a minimum number of files greater than one before an account is flagged, then offenders will be missed. Most importantly, so will the chance to protect the children they meet and interact with.
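To make the concern concrete, here is a minimal sketch of how a match-count threshold might behave. Everything in it – the names, the structure, and the threshold value – is our own illustrative assumption, not Apple’s published design, which describes cryptographic techniques such as private set intersection rather than a plain counter. The effect we worry about, though, is the same:

```swift
// Hypothetical illustration of a match-count threshold. None of these
// names or values come from Apple; the real system is cryptographic,
// but the flagging logic it gates is what matters here.

let reviewThreshold = 30  // assumed value; Apple has not published one

/// Returns true only if enough of a user's image hashes match the
/// known-CSAM hash database to cross the review threshold.
func shouldFlagAccount(imageHashes: [String], knownCSAMHashes: Set<String>) -> Bool {
    let matchCount = imageHashes.filter { knownCSAMHashes.contains($0) }.count
    return matchCount >= reviewThreshold
}

// The gap: an offender holding exactly one matching image produces
// matchCount == 1, never crosses the threshold, and is never reviewed.
```

Under any threshold greater than one, the single-image offender Tom describes is, by design, invisible to the system.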
Hiding paedophiles?
Apple’s commitment to child safety takes on a different light when analysing the Private Relay technology it has also recently announced. Apple’s website states:
“When browsing with Safari, private relay ensures all traffic leaving a user’s device is encrypted, so no one between the user and the website they are visiting can access and read it, not even Apple or the user’s network provider.”
On the face of it this sounds sensible, but it means Apple is giving with one hand and taking away with the other. In simple terms, it means that the IP address of an offending Apple user will not be visible to anyone. These addresses are pivotal to a thorough and successful analysis of the information generated by Apple’s CSAM detection technology and its subsequent referral to NCMEC. Law enforcement must know the IP address of the offender to ensure they can identify the perpetrator. How can they catch the bad guy without it? So, far from being more of an intrusion on an individual’s privacy, IP address disclosure protects the innocent and reveals the guilty.
Furthermore, we worry that the addition of Private Relay will also damage the ability of key organisations, such as the Internet Watch Foundation (IWF), to reduce the circulation of CSAM. The IWF performs a vital role in the detection of CSAM, and without it the prevalence of online harm to children will increase.
Partnering with cyber safety companies
Apple is definitely heading in the right direction with its CSAM filters and should also be thanked for opening up its Screen Time and device management APIs to cyber safety companies like SafeToNet. This makes it easier for us to block an app, or even the phone itself, when our software detects an immediate risk to the child. So, thank you Apple, but can we ask for more?
You have apportioned significant memory, processing power, and storage to your CSAM detection technology. Please can you give cyber safety companies the same luxury? If you can, then we can implement our SafeToWatch technology to detect and filter harm in live-streamed video. This will not only stop sexual abuse material from being consumed; it will also prevent such material from being recorded in the first place, by blocking the device camera the moment nudity is detected. And we can do all of this while maintaining the privacy of the child, as the sketch below illustrates. Why wouldn’t you grant us this access, Apple? Please advise.
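Here is a sketch of the kind of on-device loop we have in mind. The NudityClassifier protocol, its score range, the threshold, and the camera-blocking hook are all hypothetical placeholders standing in for SafeToWatch components; none of this is a shipping Apple API.

```swift
import Foundation

// Hypothetical placeholders for SafeToWatch components: none of these
// types or calls are shipping Apple APIs.

/// An on-device model that scores a single video frame for nudity.
protocol NudityClassifier {
    /// Returns a risk score in the range 0.0...1.0 for raw frame data.
    func score(frame: Data) -> Double
}

struct SafeToWatchMonitor {
    let classifier: NudityClassifier
    let riskThreshold = 0.85  // assumed operating point, tuned per deployment

    /// Called for every captured frame. Everything runs on the handset:
    /// no frame, score, or decision ever leaves the device.
    func process(frame: Data, blockCamera: () -> Void) {
        if classifier.score(frame: frame) >= riskThreshold {
            // Stop harmful material at the moment of creation,
            // before anything can be recorded or shared.
            blockCamera()
        }
    }
}
```

The design point mirrors Apple’s own CSAM approach: keep the analysis on the device so the child’s data never has to leave it – which is exactly why we need the same memory and processing headroom Apple has reserved for itself.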
Please get in touch with us and let us all work together…it takes a village. We are ready and waiting. Thanks.
Richard Pursey, SafeToNet Group CEO
Tom Farrell QPM, SafeToNet Global Head of Safeguarding Alliances