While social networks like Facebook and Twitter are stagnating, TikTok is growing fast. With such growth comes plenty of attention, and TikTok is now facing scrutiny in several areas, from the algorithmic distribution of harmful content to how it handles user data.

The US has already banned the viral video app on government devices in more than 20 states due to spying allegations stemming from its Chinese ownership. Universities in Oklahoma, Alabama, and Texas have followed suit by blocking access to the app over campus wi-fi networks.

While TikTok narrowly avoided an outright ban under former US president Donald Trump, it is facing growing pressure to prove its security credentials, with specific fears over Chinese government access to user data – a problem that doesn’t impact comparable western social networks. It also faces the same content clean-up challenges as other platforms, where millions of users are able to post videos instantly and moderators struggle to keep pace.

So should you quit TikTok or curb your kids’ usage? Below we’ve outlined the arguments so that you can make up your own mind.

Is TikTok harmful?

The case against TikTok can broadly be split into two categories: harmful content and privacy concerns due to its Chinese ownership.

For the former, there are good reasons to be alarmed. The Center for Countering Digital Hate has found TikTok will show children harmful content as soon as they show an interest in related topics.

Its researchers generated accounts in the US, UK, Canada, and Australia, on behalf of fictional 13-year-olds. They “liked” and interacted with videos related to mental health and body image, to assess how this would affect the content shown in the app’s For You feed.

The accounts were shown self-harm or eating disorder content every 206 seconds on average, and more extreme content was shown to accounts intended to represent vulnerable youths, with references to weight loss in their usernames.

TikTok has also been shown to be a hive of misinformation. In September 2022, Newsguard, a service that rates news and information websites on how trustworthy they are, found that 20 per cent of the TikTok posts returned for searches on “2022 election”, “mRNA vaccine”, and “Uvalde tx conspiracy” contained false or misleading information.

The concern about Chinese ownership is the bigger issue. Governments, including that of the United States, where a campaign to ban TikTok is gathering pace, see the app as a national security risk because it is owned by the Chinese company ByteDance.

<img src="https://www.socialmediaasia.com/wp-content/uploads/trump-tiktok-07-08.jpg" alt="

Trump wants a ban on TikTok

” height=”1644″ width=”2467″ srcset=”https://static.standard.co.uk/s3fs-public/thumbnails/image/2020/08/07/06/trump-tiktok-07-08.jpg?width=320&auto=webp&quality=50&crop=2467%3A1644%2Csmart 320w, https://www.socialmediaasia.com/wp-content/uploads/trump-tiktok-07-08.jpg 640w” layout=”responsive” class=”i-amphtml-layout-responsive i-amphtml-layout-size-defined” i-amphtml-layout=”responsive”>

Trump wants a ban on TikTok

/ Lionel Bonaventure and Jim Watson / AFP via Getty Images

“There is clearly bipartisan support to do something about TikTok, and the continued reports about harmful content and misinformation being served to users – particularly young people – will only add fuel to the fire,” says Insider Intelligence principal analyst Jasmine Enberg.

A paper by cybersecurity firm Internet 2.0 claims the TikTok app engages in “excessive” data harvesting, collecting information on user location, the contents of direct messages, and more, and storing some of it on servers in mainland China.

TikTok admitted in November that Chinese staff can and do access user data. But a spokesperson reiterated: “We have never provided any data to the Chinese government. We believe in the importance of storing European user data in Europe; keeping data flows outside of the region to a minimum.”

A Forbes report claimed ByteDance planned to “monitor the personal location of some specific American citizens”. TikTok denied the claims made in the article, but later sacked four employees for accessing the personal data of journalists in an attempt to track down sources. That was enough for Alicia Kearns, Conservative MP for Rutland and Melton and chair of the Foreign Affairs Select Committee, to urge Britons to delete the app.

“What TikTok does is it gives away the data that makes you most vulnerable: who are you friends with; what are your interests; what are the interests you have that you may not want publicly disclosed; who you are having private conversations with; the locations you go to,” she told Sophy Ridge of Sky News.

“Our data is a key vulnerability and China is building a tech totalitarian state on the back of our data.”

Where the two concerns, harmful content and Chinese ownership, meet is the question of whether the Chinese government has any say in how the algorithm surfaces content. Could it pressure ByteDance into spreading propaganda or harmful content to British teens? It’s not completely far-fetched, given what we know about Russian troll farms spreading disinformation in the West and the alarming fact that seven per cent of UK adults now get their news from TikTok.

Is the criticism fair? Is TikTok safe?

These are serious concerns, but it’s worth noting how many of them apply to other companies, too.

If you side with US politicians who seek to ban TikTok and effectively booted Huawei out of the US in 2019, you should probably also avoid a host of other brands. These include Honor, Xiaomi, OnePlus, OPPO, Lenovo, Realme, and ZTE, among others. They are all Chinese.

As for harmful content, it’s not like Facebook, YouTube, and Twitter haven’t had their fair share of content scandals where fake news, dangerous disinformation, and scams spread freely with devastating real-world consequences. Social media algorithms value attention and engagement above all else, and that leads to a dark place.

TikTok seems to get special attention because of its Chinese ownership, which can feel somewhat Sinophobic. If you are willing to discount the idea the Chinese government looms over the app 24/7, TikTok starts to look like just another social network.

In October, even GCHQ director Sir Jeremy Fleming said that he wouldn’t be concerned if his own children used TikTok.

But his follow-up was just as important. He said he would “speak to my child about the way in which they think about their personal data on their device”. That’s useful advice whether about TikTok or any other part of the social web.

<img src="https://www.socialmediaasia.com/wp-content/uploads/04743f9f9e78d4b4a48dfbd605b05cd8Y29udGVudHNlYXJjaGFwaSwxNjY1NTYwNzg2-2.48019227.jpg" alt="

Sir Jeremy Fleming

” height=”2333″ width=”3499″ srcset=”https://static.standard.co.uk/2022/10/11/09/04743f9f9e78d4b4a48dfbd605b05cd8Y29udGVudHNlYXJjaGFwaSwxNjY1NTYwNzg2-2.48019227.jpg?width=320&auto=webp&quality=50&crop=3499%3A2333%2Csmart 320w, https://www.socialmediaasia.com/wp-content/uploads/04743f9f9e78d4b4a48dfbd605b05cd8Y29udGVudHNlYXJjaGFwaSwxNjY1NTYwNzg2-2.48019227.jpg 640w” layout=”responsive” class=”i-amphtml-layout-responsive i-amphtml-layout-size-defined” i-amphtml-layout=”responsive”>

Sir Jeremy Fleming

/ Joe Giddens / PA

How to make TikTok safer for your children

For all its problems, the upcoming Online Safety Bill at least takes the issue of harmful content seriously. If the legislation passes, social-media companies will have to actively look for illegal content, rather than relying on a reporting system to dig it out. However, provisions on “legal but harmful” content, which includes some videos related to self-harm, have been removed.

But what about the here and now? Blocking your children from using TikTok isn’t a problem-free solution. If all their friends use the platform, then you’re making their social life harder, after all. So what can you do to make it as safe as possible?

One tip is to ensure they register with their real date of birth in the app. TikTok won’t allow those under the age of 13 to sign up, and you shouldn’t let them sidestep that restriction: it’s there for a reason, after all.

But there’s another reason to be truthful about ages. Accounts for those aged 13 to 15 are set to “private” by default, meaning only approved followers can see what they post. And, even if these accounts are made public, their content will not be recommended in the For You feed.

Direct messaging is switched off for under-16s, and those under 18 cannot live stream or receive Virtual Gifts, which make up TikTok’s tipping economy.