
Should TikTok be banned?

TikTok seems at first sight an innocent sort of thing: an app loved by children and young people, designed for making and sharing short, funny videos.

Yet it has become highly controversial. India has already banned it. Other countries, including the US and Australia, claim to be considering a ban. In the UK, TikTok is under investigation by the Information Commissioner’s Office (ICO) for allegedly mishandling children’s data.

US Secretary of State Mike Pompeo, currently in Britain for talks with British politicians, has said people should only use TikTok – or let their children use it – if they’re happy for their data to be collected by the Chinese state. Members of the US army and navy have been forbidden to use it.

So should parents in the UK be concerned? The short answer is yes – although it remains to be seen whether the bad smell that hangs around TikTok is mainly to do with its past.

The company claims that it does want to protect children’s data. But it has flouted the law in several countries. In May this year, the Dutch data protection authority announced it was launching a probe into whether the company sufficiently protects children’s data under both national law and the EU’s GDPR.

Recently, though, the company has made some moves to mitigate accusations of bad faith, with efforts to prevent children aged under 13 from signing up and the introduction of a form of parental controls.

So should we give TikTok the benefit of the doubt?

Collecting children’s data without permission

In 2019, TikTok was fined a record $5.7 million by the US Federal Trade Commission for flouting US law by collecting data on children under 13 without their parents’ permission.

To be fair, the fine related to TikTok’s previous incarnation as Musical.ly (also a Chinese-owned app), before the two companies merged and TikTok effectively took over.

More worryingly, perhaps, as recently as this month South Korean authorities fined TikTok £123,000 for illegally collecting data on children under the age of 14 without their parents’ permission.

Direct messaging

The Telegraph reported earlier this month that adults grooming children on TikTok through private messages were not being removed from the platform. Instead, they were banned for a week for the first offence, a month for the second, and removed only after a third.

The report, based on whistleblowing former TikTok moderators, claimed that around one in 10 direct messages flagged for review involved adults inappropriately messaging children. (TikTok replied to these allegations that the flagging process was to allow its child protection teams in China and Dublin to investigate properly.)

It is not clear, however, when the whistleblowers were working on the site. At the end of April this year TikTok stopped under-16s from being able to receive direct messages.

Nevertheless, the Children’s Charities’ Coalition on Internet Safety has called for TikTok to be banned from the UK until it can prove it is safe for children.

It’s Chinese

TikTok has become caught up in the war of words between China and the UK and US. Where you stand on whether it’s unsafe will depend partly on whether you think the undoubted human rights abuses in China (of Uighurs and democracy activists in Hong Kong) extend to everything that China touches. And whether TikTok is, as it claims, separate from the Chinese state.

Erring on the side of caution, there are several issues to be concerned about:

Data collection (again)

TikTok maintains that data on its users is held in Singapore, the US and elsewhere around the world, but not in China. It insists it is not an arm of the Chinese state. But there are legitimate reasons to think that if China wanted access to the data, ByteDance, TikTok’s parent company, might find it very hard to resist.

This may seem a remote concern to UK parents now – but if in a decade’s time your child happened to become, say, a democracy activist, they might not want the Chinese state to know everything about them.


Censorship

Documents leaked to the Guardian last year showed that TikTok had censored political statements, including videos that mention Tiananmen Square or Tibetan independence. The company claims the instructions revealed in these documents have since been withdrawn. But fears remain that an app with its roots in a system that controls flows of information will censor dissent where it finds it politically uncomfortable.

For You page

TikTok’s For You page is a curated feed of videos, and users need never click away from it. The algorithm that determines what appears there, like the curation algorithms on other platforms, is a closely guarded secret.

Curation on social media sites has been blamed for many things, including the rise of anti-vaxxers and the current fragility of democracy. It’s not only Chinese companies that channel or restrict information. But there are fears that TikTok’s curation allows anything that is politically inconvenient simply to disappear.

What is TikTok doing to show its concern about children’s privacy?

In February last year, TikTok introduced Family Safety Mode, which allowed parents to link their children’s accounts with their own. This gives parents the ability to control screen time, block adult content and see their children’s direct messages. In January this year, it tightened the privacy controls on accounts for those aged under 16, making them private by default and blocking the downloading of videos.

This is all a move in the right direction – although, as ever, you could argue that putting all the responsibility on parents lets the company itself off the hook.

Clear as mud?

Making sense of TikTok is complicated by the fact that it has been caught up in a geopolitical war of words which seeks to portray China as invariably a bad actor.

In the recent past, TikTok has unquestionably fallen foul of the law over its handling of children’s data.

It remains to be seen whether more cases will come to light, but we look forward to the Dutch report. And to the report of the ICO investigation, which has now been running for a year. We also hope to see legislation brought forward, based on the government’s White Paper on digital harms, which would impose a duty of care for children on social media companies.

There is more of this story to come. TikTok may be sincere in its stated wish to adhere to accepted standards of child protection. But parents in the UK will feel they still need quite a lot of reassurance.

Image: kovop58/


The conversation: TikTok bans

TikTok: everything you need to know about the video production app

TikTok update bans under-16s from private messaging