Lauren Hemmings’s TikTok feed started like anyone else’s.
Short videos of families dancing, a comedy skit about the pandemic, baking cakes.
Like almost a billion people around the world, she downloaded the app for entertainment.
“It was more of an innocent hope of just getting a good laugh really,” the 19-year-old says.
As she scrolled through video after video, Lauren’s feed became darker.
The app would change the direction of her life and warp her perceptions of the world and herself.
TikTok has changed the internet. It’s a cultural phenomenon.
The app’s powerful algorithm is like nothing the world’s seen before.
A joint investigation by triple j Hack and Four Corners has found the TikTok algorithm is exposing Australians to dangerous content while controlling which people and political movements get users’ attention.
TikTok says its mission is to “inspire creativity and bring joy”. But it risks distorting the way much of a generation is seeing the world, and not always for the better.
On TikTok you don’t pick what to watch — the videos are served to you in an endless stream.
Each user’s feed is unique, and labelled “For You”.
What you see is picked by the algorithm — a set of computerised instructions that, in theory, work out what you want, and give you more of it.
As soon as you sign up, TikTok starts collecting data about your location, gender and age and, more contentiously, your facial data.
Every time you "like" a video, follow an account, or watch a video through to the end, the algorithm learns more about your interests.
“What really sets TikTok apart … is just how accurate and how up-to-the-minute this For You page recommender system seems to be,” QUT researcher Dr Bondy Kaye says.
“It is very hard to break that cycle, and it’s by design that you never really get to the end of the content.”
The more it keeps you scrolling, the more ads you see. That’s what’s catapulted TikTok’s Chinese parent company ByteDance to a value of more than $250 billion.
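The feedback loop described above can be sketched, very loosely, in code. This is a toy illustration only — the signal names, weights and structure here are invented for the sake of the example, and TikTok's actual system is vastly more complex and not public:

```python
# Toy illustration (not TikTok's actual code): an engagement-driven
# recommender scores each candidate video against the viewer's past signals.
from collections import defaultdict

# Hypothetical weights: watching a video to the end is assumed to say
# more about interest than a simple "like".
WEIGHTS = {"liked": 1.0, "followed_creator": 2.0, "watched_to_end": 3.0}

def interest_profile(history):
    """Aggregate a user's engagement signals into per-topic scores."""
    profile = defaultdict(float)
    for topic, signal in history:
        profile[topic] += WEIGHTS.get(signal, 0.0)
    return profile

def rank_feed(candidates, history):
    """Order candidate videos so the most-engaged topics come first."""
    profile = interest_profile(history)
    return sorted(candidates, key=lambda v: profile[v["topic"]], reverse=True)

# A user who finishes and likes fitness videos gets fitness served first.
history = [("fitness", "watched_to_end"), ("fitness", "liked"), ("comedy", "liked")]
candidates = [{"id": 1, "topic": "comedy"}, {"id": 2, "topic": "fitness"}]
print([v["id"] for v in rank_feed(candidates, history)])  # [2, 1]
```

The point of the sketch is the loop itself: every signal strengthens a topic's score, which surfaces more of that topic, which generates more signals — with no built-in stopping point.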
When Lauren Hemmings downloaded TikTok, the algorithm showed her a video by a popular fitness influencer, with a similar body shape to hers, who’d been tracking her food intake and losing significant amounts of weight.
“As I [followed] her, a lot of the same pages kept on showing up,” Lauren says.
“I had never really had that many negative thoughts about my body until I had someone saying, ‘I hated this body. I’d cry about this body every night.’
“I was no longer seeing funny dance videos or anything. It was just like this complete focus on that fitness and healthy lifestyle goal.”
The TikTok algorithm pushed Lauren toward the viral trend of meticulously tracking how many calories you eat in a day, something researchers warn promotes eating disorders.
The hashtag “What I Eat in a Day” has more than 7 billion views on TikTok.
It’s inundated with videos of people detailing how many calories they’ve eaten, often spliced between mirror shots of their skinny bodies.
“It turned into this obsession,” Lauren says.
“I felt that I could not eat anything without knowing how many calories it contained and without meeting my target number of calories throughout the day … There were a few months where I didn’t put anything into my mouth that I had not weighed.
“Before TikTok, calorie counting had never crossed my path.”
After four months on TikTok, Lauren was diagnosed with an eating disorder.
Researchers say there are many factors that contribute to eating disorders, but social media has increasingly become a focus of concern.
“I’d like to think that I wouldn’t have struggled with an eating disorder if I hadn’t downloaded TikTok … the algorithm is seeing vulnerable people and then playing on that vulnerability,” Lauren says.
Swinburne University’s Dr Suku Sukunesan advises TikTok on how to make the app safer.
He’s embedded himself in the app’s eating disorder communities.
“I was immediately given all this eating disorder content. After a couple of hours, TikTok suggested 30 different accounts to follow and they were all people living with eating disorder issues,” he says.
Dr Sukunesan says these TikToks effectively teach people how to have an eating disorder, and the algorithm can lead them to more severe videos, such as ones that promote self-harming.
“It’s almost like a pit with no end and you find that these kids would ultimately harm themselves more,” he says.
Claire Benstead was hesitant about going on TikTok. She’d heard it wasn’t a good space for people with eating disorders.
The 22-year-old has been in and out of hospital for more than five years, but was in recovery when she downloaded the app.
She very quickly started seeing videos which made her eating disorder worse.
“I wasn’t recovering at all. I was actively relapsing,” she says.
“As I got sicker and I got more obsessive, all I could do was just flick through my phone and look at this footage. I spent hours on it and just fixated on it.”
Claire’s psychologists suggested she clean up her toxic feed by reporting videos that promote eating disorders.
The company’s policies say TikTok bans “content depicting, promoting, normalising, or glorifying activities that could lead to suicide, self-harm, or eating disorders”.
But Claire has tried to report videos promoting eating disorders only to be told they don’t breach any of TikTok’s guidelines.
“It’s kind of invalidating. It’s got the highest mortality rate of any illness, you’re promoting those behaviours and it’s making it worse,” Claire says.
TikTok’s response to this problem has been to ban pro-eating disorder hashtags so users can’t search for those videos. If they try, a number for eating disorder support service The Butterfly Foundation pops up.
People living with eating disorders and researchers say that while this is a positive step, it won’t stop the app from exposing people to these videos, especially when users can easily get around hashtag bans.
“Our teams consult with NGOs and other partners to continuously update the list of keywords on which we intervene,” a TikTok spokeswoman says.
Another TikTok user told Hack and Four Corners that when she reported a viral video of a man taking his own life it was also found not to breach the app’s community guidelines.
The video originated on Facebook but was shared on all the major social media platforms.
TikTok was the last to take it down, and only after the video went viral on the platform.
“It’s extremely disturbing as a researcher, as a parent, to know that that kind of content doesn’t breach their guidelines,” Dr Sukunesan says.
It takes less than 30 seconds to find harmful content on TikTok, and a few hours for the algorithm to dominate someone’s feed with offensive videos, according to several researchers.
Tech advocacy organisation Reset Australia ran experiments and discovered it takes about four hours for the algorithm to learn that a 13-year-old is interested in racist content, and about seven hours for sexist videos to swamp someone’s feed.
The longer those users watch that kind of content, the more frequently such videos appear.
Reset Australia’s Rys Farthing says every piece of harmful content adds up.
“I couldn’t imagine what it’d be like to be a young person who’d fallen into that particular rabbit hole and was being served that feed. That would be pretty troubling.”
Shadow bans and bias
While TikTok is facing pressure to eradicate harmful videos, it’s also been accused of using the algorithm to censor and suppress posts for the wrong reasons.
Perth-based TikTok creator Unice Wani has gained almost 600,000 followers in just over a year, performing viral dances and lip-syncing to popular hip-hop sounds.
The 18-year-old uses the platform to raise awareness about issues facing her community.
“The more I go viral, the more I can basically show the younger generation and show more coloured girls or people out there I’m OK in my own skin,” she says.
But she’s noticed her videos are often hidden from the TikTok feed, meaning no-one sees them. It’s something TikTok users refer to as “shadow banning” and is a reflection, they say, of the algorithm’s biases.
“You tend to get a lot of shadow bans for speaking up about stuff such as racism … I guess they focus more on the white girls dancing and stuff like that.”
University of Melbourne AI researcher Dr Niels Wouters says that while TikTok’s feed is automated, the algorithm behind it, like any other, is created by humans.
“As humans, we all have biases. So, when we create an algorithm, we are absolutely at risk of embedding our own biases in these algorithms.”
In July, several Black influencers went on an indefinite strike, refusing to choreograph the viral dances TikTok relies on, and accusing the app of capitalising on their creativity without preferencing them in the algorithm.
“People say that in order for you to get views on TikTok, in order for you to get likes, you need to have talent. When we show these talents, no-one gives us credit for it,” Unice says.
“No matter how much we try, we’re just not going to get that.”
In March 2020, TikTok policy documents were leaked showing moderators were instructed to suppress posts by creators considered “ugly, poor, or disabled”.
The documents said videos including people who are “chubby or obese” with “ugly facial looks … like too many wrinkles … or facial deformities … and other disabilities” should be excluded.
“If the character’s appearance or the shooting environment is not good, the video will be much less attractive to be recommended to new users,” the documents said.
TikTok responded at the time saying most of those guidelines were no longer in use. But TikTok creators living with a disability have accused the company of continuing this practice.
One of those creators is Paniora Nukunuku.
Set on the streets of Sydney, Paniora’s TikToks are a mix of candid and comedic skits about living with a disability, race, and politics.
“I didn’t expect the content that I do to gather the amount of fans that I have right now, it warms my heart hearing young people saying, ‘I never thought I’ll see someone that looked like me,'” he says.
A week after the company leak, Paniora posted a video about a stranger who told him he shouldn’t have a disability permit.
“The video got taken down … I don’t know why,” he says.
Paniora appealed and the video was put back up, but he’s had other videos about his disability removed as well.
“I know that my content gives value to so many people who look like me, who live life like me, who are brown like me. If they don’t have any representation on the social media platform, it can be really challenging to relate to people.”
Experts like Bondy Kaye say the app needs to address these concerns.
“If TikTok doesn’t start devoting the kind of resources necessary to engage with some of these critiques, it’s going to become more of an issue and it’s likely going to lead to black, Indigenous, people of colour, leaving the app forever,” Dr Kaye says.
TikTok’s new normal
It’s not the only time Paniora has felt TikTok’s power.
A video he posted about Black Lives Matter saw his account banned for a week, while another he filmed at a rally in support of Palestinians was taken down just a few hours after he posted it.
“I was furious. I was like, why? There is nothing in these videos that will justify a removal, there really isn’t,” Paniora says.
“It definitely feels like TikTok has some preference on what content should be posted on the platform … It can be really disheartening.”
Last year, TikTok apologised for suppressing posts with the hashtags “Black Lives Matter” and “George Floyd” after thousands of creators took to the platform to protest about their videos being suppressed or accounts being banned.
The company says a glitch in the algorithm caused the issue, but minority creators say there’s a pattern emerging.
Several other creators who’ve posted pro-Palestinian content have had similar experiences.
The Australian Strategic Policy Institute (ASPI) did the first academic investigation into censorship on TikTok and found the company actively uses its algorithm to hide political speech it deems controversial.
The study — which was funded by the US State Department — found hashtags about the mass detention of Uyghurs, Hong Kong protests, LGBTQI and anti-Russian government videos were among those being suppressed.
“We see evidence of how content moderation that takes place in China, how that type of thinking is still applied to TikTok outside of China,” ASPI’s Fergus Ryan says.
He says TikTok is struggling with its identity.
“As it has expanded around the world, and particularly after it’s received a lot of scrutiny, the company has tried to, as much as possible, disconnect TikTok, the company, from its roots in China. But ultimately, those links can’t be fully severed.”
However, TikTok denies being involved in censorship, saying in a statement: “We have never removed content at the request of the Chinese government, nor have we been asked to.”
As TikTok expands its Australian operations, cementing itself locally in a bid to grow its user base, experts and the app’s users are urging the company to face up to its criticisms.
Niels Wouters is concerned it’s distorting the way people see the world.
“We’re really at risk of having generations of young people that … have formed identities in response to something that a technology platform prescribes to be normal or the new normal.”
For Lauren, keeping the app wasn’t worth sacrificing her mental health.
“I ended up cutting off TikTok after a few months, but even with that, it still left me with the eating disorder … it has taken a really, really long time to fix that,” she says.
“TikTok isn’t out here to help people. If they’re going to make money off something, then they will make money off something. I think they maybe need to realise the impact that is having on people.”
Story by: Avani Dias, Jeanavive McGregor and Lauren Day
Digital production and design: Nick Wiggins
Photographs: Mathew Marsic
Video production: Nick Wiggins and Harriet Tatham