The Bookseller and the Publisher

I listened with increasing concern to the latest safeguarding podcast published by the SafeToNet Foundation. Safeguarding children online is a complex mix of technology, law, ethics and culture. The podcast episode “Agents of the State” focusses on a decades-old and convoluted piece of US legislation known as “Section 230”, which might sound like an obscure MI5 department in a John le Carré spy thriller, but is actually the single most important internet-related law ever passed.

The Wikipedia entry for this law, frequently referred to as “The Twenty-Six Words That Created the Internet”, says this: 

“Section 230 of the Communications Decency Act (CDA) of 1996 (a common name for Title V of the Telecommunications Act of 1996) is a landmark piece of Internet legislation in the United States, codified at 47 U.S.C. § 230. Section 230(c)(1) provides immunity from liability for providers and users of an ‘interactive computer service’ who publish information provided by third-party users.”

And, critically (the famous 26 words):

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

All of which is a bit indigestible, so I’ll try to break it down into bite-sized chunks.

Unintended radioactive content

It was recognised early on that the internet, or more precisely the World Wide Web, would be used to disseminate what many people call “child pornography”, but which is more accurately termed Child Sexual Abuse Material (CSAM), and which was then, as now, illegal. The problem the service providers, the CompuServes and Prodigys of the day, had was that from a legal standpoint they could not handle or touch this content; it was illegal for them to do so.

This may sound bizarre, but let’s think it through. It was illegal for individuals to distribute this material online, but the cloak of anonymity that the internet services gave them enabled them to do it with little to no risk. Just as it was illegal for individuals to distribute this content online, so it was for organisations.

Why would an organisation want to distribute this content? Because the law said it should, so as to assist law enforcement. More precisely, the law said that once a provider became aware of this content, it had to report it to an American organisation called the National Center for Missing & Exploited Children (NCMEC), and NCMEC would then deal with it.

But forwarding this material to NCMEC, even with the best of intentions and as the law required, would still be regarded as distributing child sexual abuse material. An individual might be prepared to take the risk of distributing CSAM, but an organisation could not place itself at risk of legal action, even when it was following “the law” with the best of intent.

You can see why many regard CSAM as “radioactive content”. Something had to change.

The bookseller vs the publisher

Plenty of laws existed at the time which governed content. For example, if a newspaper published an article which defamed someone, then the newspaper could be sued. And earlier in the 20th century, the book Lady Chatterley’s Lover was judged too obscene under the laws of the 1920s and could only be published in an expurgated form, a restriction not resolved until the 1960s. In neither case could the bookseller, the bookshop, be sued for selling these publications.

So what were these new online services? Were they publishers of content, or were they more akin to bookstores where people paid to access content supplied by third parties (US services at the time being subscription-based)? In the end, the law decided they were more akin to booksellers and could not be held liable for the content on their platforms or services, in the same way a bookseller couldn’t be sued for the content of a book. This is the famous (or infamous) Section 230.

And this made sense in one way: it allowed the service providers to forward otherwise illegal CSAM to NCMEC so that it could be dealt with appropriately. So… a result?

Unintended consequences 

Not quite. The business model for internet services switched from monthly subscriptions, where reputation was all-important because customers were paying for access, to an advertising model where access was free and the number of adverts shown was king. And the easiest way to maximise the number of ads served was simply to allow the highest possible number of accounts to be created on any given platform, whether they were fake (anonymous), nefarious accounts or not. The profit motive held sway.

Section 230, remember, gave them immunity from liability for the third-party content published on and distributed through their online “bookstore”. This immunity was unconditional. The law did not say they must take proactive steps to find and remove this content. Section 230 did not say something like: “You have this immunity from liability as long as you take proactive steps to find, remove and report CSAM, and if you don’t follow best practice in doing so you’ll lose your immunity.”

Which is a shame because if it had, then we wouldn’t be in the mess we’re now in.

How does SafeToNet help?

We often say that our safeguarding and wellbeing app is “platform independent”. Because we contextualise the outbound messages a child types before they leave the child’s device, before they are even encrypted and before they reach the social media platform, we help safeguard children irrespective of what the platform does.

From the SafeToNet Foundation’s podcast, it seems that the social media platforms are fighting tooth and nail against any change to Section 230, against any suggestion that their immunity from liability should be conditional, and that they should have to earn that immunity by implementing, following and being held to account for best practice in child online safety.

SafeToNet’s software itself is immune from these legal shenanigans. It will continue to disrupt, in real time, conversations that are harmful to the child, whether cyberbullying from peers or grooming by a sexual predator who may be disguised as a child. It will continue to offer real-time advice and guidance so that children are better equipped to deal with online life, and it will continue to offer emotion management through our private-to-the-child wellbeing features.

You can download our software from the app store of your choice and start safeguarding your children today.


Richard Pursey, Group CEO SafeToNet

Richard is a serial entrepreneur with a background in behavioural analytics, having successfully started and sold a number of technology companies. Prior to co-founding SafeToNet, Richard spent time working in the voluntary sector, driving children suffering from cancer to hospital for treatment. He learned much by talking to these children about their lives and, in particular, their online experiences. Richard also previously served on the board of the West Berkshire NHS Primary Care Trust, where he was exposed to the brutal reality of being a child in today’s online world and the mental health issues associated with online harms.
