Kids And Social Media: Who Is Responsible For Policing The Age Restriction?
It’s no secret that despite social media platforms stating kids under 13 can’t sign up, they are still creating profiles. In November 2017, a study from regulator Ofcom found underage access was “on the rise” and that half of 11- to 12-year-olds had a social media profile.
So whose responsibility is it to reduce the number of children signing up to social media before they reach their teenage years? Should the government intervene or should companies like Snapchat and Facebook be doing more to ensure kids can’t sign up? Or is it down to parents to make sure their children don’t have access?
What’s perhaps surprising is that no one is shirking responsibility. When we asked parents, the government and social media networks, they all said the liability should be shared.
The majority (80%) of parents of six- to 16-year-olds believe mums and dads have a ‘high level’ of responsibility for keeping their children safe online, according to a survey of 2,000 parents by nonprofit organisation Internet Matters. More than half (61%) thought this responsibility extends to social media companies too, and almost 50% believed app creators and internet service providers are also responsible. “Collectively, we all have a responsibility to help children navigate the digital world both safely and smartly - this includes making sure age-restrictions are adhered to,” said Carolyn Bunting, CEO of Internet Matters.
Siobhan Freegard, founder of ChannelMum.com, agreed, adding: “Keeping kids safe is everyone’s responsibility. Whether it’s playing in the park or socialising on social media, we have a collective duty to ensure children are protected.”
We contacted three major social media companies - Facebook, Instagram and Snapchat - to ask what they do to enforce the age restrictions. Snapchat said all new users are required to provide an age when they register and that, if it becomes aware a user is under the age of 13, it will terminate their account. The company said it uses the “best available technology” to prevent under-13s from registering, and ensures its app isn’t available in the “kids” or “family” sections of the app store.
In a similar vein, Instagram said if an underage user is reported, it removes the account. If someone sees an account they think is run by someone under the age of 13, they are encouraged to report this via a form on the Help Centre. However, Instagram does allow under-13s to have accounts managed by an adult - e.g. a parent or manager.
Despite these precautions being taken, some argue that social media companies should still be doing more to combat underage sign-ups. “A tick box to say you’re the right age simply isn’t enough,” said Freegard. “Social media companies are under scrutiny for data handling and need to enforce age restrictions much more rigidly as part of this.”
Andy Burrows, NSPCC’s associate head of child safety online, also said social networks should do more to make their sites safer for children and to ensure young users are not exposed to inappropriate or dangerous content - including making age restrictions much clearer on sign-up pages and building child safety measures into site design from the start.
Given that more than half of 12-year-olds already have a social media account, Burrows said there’s an urgent need to make these sites as safe as possible, which ultimately means it’s not only down to the social media platforms themselves: “That’s why we want the government to enforce a mandatory social network code, backed by a regulator, with a set of child safety measures that all social networks must have in place,” he said.
In response to these comments a Department for Digital, Culture, Media & Sport (DCMS) spokesperson said: “Social media companies have a duty to make their platforms safer, including taking a stronger role in closing down underage accounts. Through our Internet Safety Strategy, we are working closely with the industry to encourage solutions that will increase online safety in the UK, but haven’t ruled out further regulation if significant progress is not made.”
It is clear that no one believes parents should be doing this alone; however, they do have a part to play in ensuring their kids aren’t signing up to these networks underage. “Parents can help keep their children safe online by ensuring they are familiar with the age restrictions on the apps their children want to use and by also having open, honest and regular conversations about the online world and allowing them to think critically about why those age-restrictions may apply,” said Bunting.
Burrows said he also believes parents should be “making these conversations part of everyday life” in the same way you would ask about your child’s day at school, adding that the NSPCC and O2’s Net Aware tool can give parents practical tips on the different apps and sites that children use.
Freegard said mums and dads should educate their child on sensible use, the dangers of social media and the risks of giving information away to strangers. “Sure it’s tough to police and your kids may not thank you for it - but it’s better than your child being harmed by inappropriate use,” she said.
She also argued that education should continue at school: “Most schools are already doing an excellent job with safer internet use policies and restricting the use of phones on school property. Finally, we also need a shift in our thinking in wider society. Let’s see social media as a helpful tool for our lives, but get back to real-life interactions being equally valuable. Social media should lead to social meet-ups and then we’d feel happier all-round.”