Australia recently joined the United States, Canada, Britain and New Zealand in banning TikTok from government devices as concerns grow over the privacy and security of the Chinese-owned video-sharing app. Similar bans and restrictions have been introduced in Denmark, Taiwan and India, amongst others.
In addition, there are fears that China’s security services could manipulate TikTok’s recommendation algorithm to influence what users see.
TikTok, owned by ByteDance, insists it is independent and does not share data with China’s authoritarian government.
Safeguards
BCS CEO Rashik Parmar told the monthly BCS Policy Jam session that we should create the right ‘guardrails’ for social media, including TikTok: “In government and as a society, we have serious questions to ask about how to limit some of these online harms better than we are doing now.”
As for concerns around its ownership by a Chinese company, he said: “TikTok hasn’t done anything yet, whilst some of the existing big tech companies have, so should we think of those [US-based big tech firms] similarly?”
Rashik cited the Facebook-Cambridge Analytica scandal, where the data of millions of people was harvested without their consent.
‘Lurking’ on the platform
Casey Calista, a global strategist in both UK and international policy, said that when it came to young people, she believed TikTok had done a ‘huge amount’ of harm, and added: “Speaking as a stepmother, I’ve been able to see that happen. I’ve been a ‘lurker’ on TikTok for a while to better see what’s happening on the platform and fulfil my duty of safeguarding two young minds.
“My 15-year-old stepson knew who the [controversial influencer] Andrew Tate was before I did – and I’m very on top of this. So, the reality of that [toxic masculinity] culture and how it is fed to people is happening now, and it’s really scary.”
Nothing new
Dr Alexi Drew, an expert in international law and emerging technologies, said she, too, lurked on TikTok and other such platforms and agreed they could be harmful.
But she added that every time a new platform emerges, the older generation raises concerns about what the younger generation sees online: “What that tells us is not that we shouldn’t do something, but that it’s not a new issue.
“There are one-size-fits-all approaches to better governance, education, and strategy. Tools should be available for parents, mentors, and governments to ensure these harms are not transferred from generation to generation when a new platform appears.
“It’s a whack-a-mole approach. We should look for a more systemic way of building good governance and education structures that apply when these new platforms arrive.
“Otherwise it creates a lag, and then we’re trying to tell eleven- to eighteen-year-olds that they can’t have a platform that they’ve grown used to.”
Casey disagreed with Alexi’s view that these fears are nothing new, pointing to the speed and influence with which newer platforms are developing.
She added that she felt the world was increasingly moving towards ‘divergent systems of digital regulation’.
The global approach to regulation
Dr Drew was optimistic, however, that international agreements could be reached: “We’re not quite at the point of governance divergence; we’re at the point where we could see a joint direction of global governance around these technologies.
“We’re seeing a growing understanding of this need, both in this area and in others such as data protection. It’s an opportunity for us to come together and have a more standardised approach. I might be overly optimistic, but that’s my hope.”