Answer me this (a simple yes or no will suffice): If there was a product that had the potential to cause your child serious and demonstrable harm, would you expect the Government to place controls on it?
The logical answer is yes. And there are many such products that the Government does control to prevent harm to children. Age restrictions on the sale of alcohol and tobacco are the most obvious examples.
Three news items I saw in the past week convinced me that the Government – and society as a whole – is falling tragically short in the control of one product that does more harm to young people than liquor and cigarettes combined. It is social media.
No parent could have read the lead story in the Weekend Herald without feeling enormous empathy for Cambridge Middle School principal Daryl Gibbs. The headline on that story was ‘I put the tool in her hand’: Principal shares daughter’s online ordeal. It told of how, within three weeks of giving his 13-year-old a smartphone, she had downloaded Snapchat and received her first ‘You should kill yourself’ message.
He shared his feelings of guilt and admitted he was naïve to think that placing limits on her connections would keep her safe. It did not. She suffered anxiety, depression, and absenteeism from school as a result of what she saw on her phone. What was most disturbing was that her contacts were being monitored by her parents, yet the harm was coming through people she knew.
The second news item was a story I saw on 3 News/Stuff and the RNZ website about the Worst Children’s Library. It highlighted an upcoming event at Auckland Normal Intermediate School where a section of its library would have its books replaced by titles covering the range of harmful content children can readily access on the Internet. Parents were invited to hear a range of speakers on the dangers their children face online.
Among the speakers would be Rob Cope, the founder of Our Kids Online and the author of a Bill he proposed to Government in March titled the New Zealand Child Internet Safety Act. You can read the draft here.
The illustration above shows some of the milder titles. The exhibition had been given an R18 restriction for good reason. Cope told RNZ about the harmful content children were finding online.
“There’s beheadings, disembowelment, there’s rapes, torture. Your worst nightmare is sitting there on every single device, just waiting for our kids to find and that is what a lot of our kids are actually finding.”
The third news item was a sickening story in The Guardian about men drawn into child pornography by algorithms that take them into increasingly dark spaces. Here is a link to the story, but I warn you it is a gut-wrenching read.
The Guardian’s story stated that, in England and Wales, 850 men a month are arrested for online child abuse offences and the UK’s Internet Watch Foundation last year acted to remove 300,000 web pages “each containing at least one, if not hundreds or thousands, of illegal images and videos”. The story noted that pornographic material that would have been considered extreme a generation ago “is now readily available on iPads, desktops and the phones in teenagers’ pockets”.
There are a multitude of issues surrounding the production and distribution of pornography but my focus here is on children. They are victims in two ways – exploited to provide vile content and courted as consumers. However, the harm they face in a digital world extends far beyond pornography.
These three stories weighed on my mind and, clearly, I am not alone in my concerns. The screening of the British mini-series Adolescence has prompted a wide reaction and led the Herald on Sunday to call in an editorial for “a greater community approach to the issue and this includes contributions from retailers, internet services providers and the Government”.
Too many of our current legislative protections rely on the public to report transgressions. Just look at the Department of Internal Affairs’ Keep It Real Online website to see what I mean. And far too many of the ‘protective’ measures are after the fact. The damage has been done by the time it has been reported.
Rob Cope’s proposed Bill contains a multi-faceted approach to protecting children, aimed at stopping young people under the age of 18 from being able to access potentially harmful content. It calls for actions by both internet service providers (ISPs) and retailers. However, quite understandably, the draft Bill focusses on only one section of the community – children – while the problems spread far wider than our tamariki and rangatahi.
Law enforcement agencies are active in searching out illegal Internet content, particularly material related to child abuse. At a lower level, for the past decade we have had the Harmful Digital Communications Act that has afforded a limited (and now obviously inadequate) measure of protection. Again, its ‘protections’ are largely after the fact.
Significantly, Section 24 of that Act effectively absolves content hosts (platforms or ISPs) from liability under the Act if they comply with a complaint notification system and take-down provisions.
And that is the heart of the problem.
ISPs are seen to have secondary, not primary responsibility. They are regarded as providing a distribution service for groups of self-selecting individuals who produce all of the content (apart from the lucrative advertisements that are the real reason the service is there).
If it were that straightforward, such a level of responsibility might be appropriate. However, it is not as simple as that. ISPs use algorithms and artificial intelligence to manipulate groupings, content, and advertisements. They already have a measure of control over who sees what, and they exercise it to improve their bottom lines.
The supersonic advances in artificial intelligence will give ISPs the ability to interrogate all forms of content in real time (if they don’t have it already). They will have the tools to ensure that their sites are free from any harmful content directed at children. Those same tools presumably would have the capacity to clean up social media in general.
I have been a longtime advocate of declaring social media platforms to be publishers under law and subject to the same responsibilities and liabilities as the people who produce newspapers or broadcasts. Those three stories and the British mini-series leave me in absolutely no doubt that such a designation cannot come soon enough. It would force the platforms to use all of the tools at their disposal to minimise harm.
Australia’s ban on social media for children under the age of 16 is due to come into effect in December. New Zealand health professionals say the ban is worth considering here but should be coordinated with a range of other initiatives.
Tragically, our politicians have been spooked by the might of transnational platforms determined to minimise or deny responsibility, and those platforms now also have the mailed fist of a White House bully to reinforce their power.
But what makes those platforms so special? If a massive American chemical conglomerate had been selling a garden spray that was found to be carcinogenic, our Government would have demanded its withdrawal from the New Zealand market or removal of the dangerous element from its formula.
I see no difference between a chemical that can kill and social media that can destroy the lives of our young people.