Enough is enough: regulation is vital to protect children on social networks
Recent media coverage has revealed the true extent to which social media providers have treated child safeguarding as an optional extra, rather than a necessity. As the debate around how to tackle this problem rages on, the NSPCC is clear that tough but proportionate regulation of social networks is the only solution.
The inaction from social media providers has actively fuelled the scale and extent of the risks of online abuse that children face: from the production and distribution of child abuse images, to the failure to tackle harmful and inappropriate content on suicide and self-harm, and the growing scale of technology-facilitated grooming.
The NSPCC's Flaw in the Law campaign secured a new criminal offence of an adult sending a sexual message to a child. In the first eighteen months after the new law came into force, there were over 5,000 police-recorded offences of sexual communication with a child in England and Wales. Two-thirds of these offences took place on Facebook, Snapchat or Instagram.
As the Government publishes its Online Harms White Paper, we have an unprecedented opportunity to protect children from abuse. Social networks can no longer be given the benefit of the doubt. Since 2005 there have been 13 self-regulatory initiatives, but every one of these has ultimately failed to keep children safe from the threat of online abuse.
This failure to keep children safe online has resulted in considerable appetite for reform across the nation. A recent survey carried out by the NSPCC found that nine out of ten adults want to see statutory regulation that makes social media providers legally responsible for the risks on their sites. We must act on this growing consensus.
The NSPCC is clear that the Government will only deliver on its ambition to make Britain the safest place in the world to be online by introducing statutory regulation of social networks. Sites should be required to adhere to a set of minimum safeguarding standards, including a requirement to use technology to identify and prevent grooming.
Social media and online platforms must be subject to a legally enforceable duty of care. This means sites must proactively identify foreseeable risks on their platforms and take steps to mitigate them through the design and function of their technology. This would ensure existing sites become safe and those of the future are built to be safe from day one. Other things available to our children – food, toys and clothes, for example – all have to meet standards that ensure they are safe for children to own and use. Social networks must be held to the same expectation.
The regulator should be given information disclosure powers to enable it to assess the scale and extent of the risks that children face. Platforms should be legally required to proactively disclose safety breaches to the regulator and risk assess new products and services offered to children.
A regulator is the right solution because a body with the right technological expertise will give certainty to the industry. However, the regulator must have the powers necessary to do its job. It must be able to apply measures that incentivise compliance, including the ability to levy fines and to refer platforms and named directors for potential prosecution.
For too long, children have paid the price for social networks failing to tackle abuse occurring on their sites. The Government must now decide whether it will be remembered for protecting children for generations to come. This is an opportunity to be on the right side of history. It’s time to ensure every child is finally kept safe online.