Those lost in the never-ending scroll of TikTok may occasionally find themselves on the wrong side of the internet. But while most adults can judge which videos are spouting misinformation, children don’t necessarily have the same gift.
According to NewsGuard, as reported by The Guardian, while other social media platforms have been scrubbing themselves of COVID misinformation, this type of content can still be found on TikTok, where it can pop up on the screens of children as young as nine.
NewsGuard fears children will access anti-vaxx content on TikTok
NewsGuard, an organisation that tracks misinformation online, noted that anti-vaxx videos have slipped through the cracks on TikTok, with accounts boasting thousands of followers spouting myths about COVID statistics and posting content that discourages people from getting vaccinated.
The misinformation watchdog reported the content to the World Health Organisation (WHO), but the clips remained on the platform, where they were potentially accessible to children.
As part of NewsGuard’s research, it found that while TikTok only permits full access to the app for those aged 13 and older, some participants younger than 13 were able to create full-access accounts using fake ages.
A TikTok spokesperson said: ‘Our community guidelines make clear that we do not allow medical misinformation, including misinformation relating to COVID-19 vaccines. We work diligently to take action on content and accounts that spread misinformation while also promoting authoritative content about COVID-19 and directly supporting the vaccine effort in the UK.’
However, the UK Managing Director for NewsGuard, Alex Cadier, believes that while the site is trying to take anti-vaxx content down, many videos remain, and ones that have been deleted may still have received thousands of views:
‘TikTok’s failure to stop the spread of dangerous health misinformation on their app is unsustainable, bordering on dangerous. Despite claims of taking action against misinformation, the app still allows anti-vaccine content and health hoaxes to spread relatively unimpeded.’
Cadier continued: ‘This is made worse by the fact that the more anti-vaccine content kids interact with, the more anti-vaccine content they’ll be shown. If self-regulation isn’t working for social media platforms, then regulation, like the online safety bill, has to be the way forward to keep young people safe online.’
Demands grow for social media to be held responsible for its effects on children
Issues around social media and children have resurfaced after leaked internal reports detailed how Facebook and Instagram can negatively affect the mental health of teenagers.
On Friday, The Financial Times also reported that the digital rights charity 5Rights has launched an investigation into a number of online companies, including Twitter, TikTok, Snapchat and Instagram, for breaching the Children’s Code, which aims to protect children’s privacy online.
Violations of the Children’s Code include using subtle tricks to nudge children into sharing their location or into receiving targeted ads and potentially harmful content, as well as failing to properly vet children’s ages before allowing them to video chat with strangers.
This week it was also announced that TikTok and Snapchat will need to adhere to new Ofcom rules shielding children from harmful content, or risk fines or suspension.
Ofcom’s tough new conditions mean that video-sharing platforms (VSPs) will be responsible for ensuring that children don’t come across footage depicting or inciting violence, hate, extremism or x-rated content. Those who fail to uphold the new standard risk being fined or having their services suspended.
Ofcom chief executive Dame Melanie Dawes explained: ‘Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them.’
She continued: ‘The platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.’