Revealed: anti-vaccine TikTok videos being viewed by children as young as nine

Clips peddling Covid misinformation remain on site months after being flagged to TikTok by NewsGuard

Last modified on Fri 8 Oct 2021 13.26 EDT

Lies and conspiracy theories about Covid-19, which have amassed millions of views and are accessible to young children, have remained on the social media platform TikTok for months after it was alerted to them, the Guardian has learned.

TikTok accounts with hundreds of thousands of followers that discourage vaccination and peddle myths about Covid survival rates were uncovered by NewsGuard, an organisation that monitors online misinformation.

NewsGuard said it had flagged the dangerous content to TikTok in June but many of the accounts remained active on the platform.

The revelation comes amid renewed concern about the impact that social media is having on young people, after it was reported that Instagram, which is owned by Facebook, had internal research showing its app was harming teenagers.

As part of its investigation, Newsguard said children as young as nine had been able to access the content, despite TikTok only permitting full access to the app for those aged 13 and over. Three participants in the organisation’s research who were under 13 were able to create accounts on the app by entering fake dates of birth.

TikTok told the Guardian it worked diligently to take action on content and accounts that spread misinformation.

Some of the accounts seen by the Guardian had posted individual videos containing Covid misinformation that had attracted up to 9.2m views. The misinformation included false comments about dangerous side-effects of specific brands of Covid vaccine and misleading comparisons between Covid survival rates and vaccine efficacy rates.

Alex Cadier, UK managing director for NewsGuard, said: “TikTok’s failure to stop the spread of dangerous health misinformation on their app is unsustainable, bordering on dangerous. Despite claims of taking action against misinformation, the app still allows anti-vaccine content and health hoaxes to spread relatively unimpeded.

“This is made worse by the fact that the more anti-vaccine content kids interact with, the more anti-vaccine content they’ll be shown. If self-regulation isn’t working for social media platforms, then regulation, like the online safety bill, has to be the way forward to keep young people safe online.”

Published in May, the draft online safety bill imposes a “duty of care” on social media companies, and some other platforms that allow users to share and post material, to remove “harmful content”. This can include content that is legal but still judged to be harmful, such as abuse that doesn’t reach the threshold of criminality, encouragement of self-harm, and misinformation.

Cadier added: “The difficulty in really knowing the scale of this problem is that TikTok hold all the information and get to mark their own homework.

“They say they’ve taken down 30,000 videos containing Covid-19 misinformation in the first quarter of 2021, which is a good step, but how many are left? Of the ones they deleted, how many views did each get? Who shared them? Where did they spread? Where did they come from? How many users mostly see misinformation when they see Covid-19 related content?”

On Friday, the Financial Times reported that an investigation by the digital rights charity 5Rights had alleged that dozens of tech companies, including TikTok, Snapchat, Twitter and Instagram, were breaching the UK’s new children’s code, which protects children’s privacy online.

The research was submitted to the Information Commissioner’s Office as part of a complaint written by Beeban Kidron, the charity’s chair and the member of the House of Lords who originally proposed the code.

Violations of the code alleged by 5Rights include design tricks and nudges that encourage children to share their locations or receive personalised advertising, data-driven features that serve harmful material including on eating disorders, self-harm and suicide, and insufficient assurance of a child’s age before allowing inappropriate actions such as video-chatting with strangers.

TikTok uses a small notification at the bottom of the screen that says “learn more about Covid-19 vaccines” and links directly to the NHS coronavirus vaccines page.

One-quarter of TikTok’s 130 million monthly active users in the US were aged 10 to 19 as of March 2021 and nearly half of the total users were under 30, the data company Statista reported. In the UK, according to Statista, people under 25 represent 24% of all users.

TikTok has begun to eclipse other well-established social media platforms in popularity, having overtaken YouTube in average viewing time for Android users in the US and UK, according to the app analytics firm App Annie. TikTok was the world’s most downloaded app in 2020, App Annie reported.

TikTok is owned by ByteDance, an internet conglomerate based in China and partially owned by the Chinese government.

A TikTok spokesperson said: “Our community guidelines make clear that we do not allow medical misinformation, including misinformation relating to Covid-19 vaccines. We work diligently to take action on content and accounts that spread misinformation while also promoting authoritative content about Covid-19 and directly supporting the vaccine effort in the UK.”

The debate over younger people and their interaction with social media platforms has reignited over the past month following the revelations that Instagram knew via internal research that its app was harming the mental health of some teenage girls.

Facebook has described the revelations, published in the Wall Street Journal following a document leak by the whistleblower Frances Haugen, as a “mischaracterisation” of its work. The documents include a survey result estimating that 30% of teenage girls felt Instagram made dissatisfaction with their bodies worse.

The research about vaccination misinformation on TikTok comes after parents and teaching unions raised concerns that the jab rollout to children in the UK was “haphazard” and “incredibly slow”. Fewer than one in 10 (9%) 12- to 15-year-olds had been vaccinated by last Sunday, while new data released on Friday showed one in 14 had Covid last week.

All children in the UK aged 12 to 15 are eligible for a Covid jab following a decision made by the UK’s chief medical officers. Healthy 12- to 15-year-olds are being offered one Covid jab at the moment, but those vulnerable to the virus, or living with someone who is, will be offered two doses eight weeks apart.
