When Lauren Hemings first started using TikTok, the overseas version of Douyin, the videos pushed to her looked much like anyone else's: families dancing, funny pandemic clips, cake baking.
“The initial idea was very simple, I just wanted to have fun,” said Lauren, now 19 years old.
But as she kept scrolling, Lauren found that the videos TikTok pushed to her grew darker and darker.
TikTok would change the direction of her life and distort the way she saw the world and herself.
TikTok has changed the internet and become a cultural phenomenon. The app's powerful algorithm is unlike anything the world has seen before.
A joint investigation by Hack, a program on the Australian Broadcasting Corporation's (ABC) triple j radio station, and Four Corners found that TikTok's algorithm exposes Australians to dangerous content and controls which people and political movements get users' attention.
TikTok declares its mission is to “inspire creativity and bring joy”. But the risk is that it may distort the way a sizeable share of a generation perceives the world, and that influence is not always positive.
On TikTok there is no need to choose what to watch: videos are pushed to you in a continuous stream.
Each user's feed is unique and labelled “For You”.
Every video a user sees is calculated and selected by algorithms (sets of computer instructions) that, in theory, find the videos you want and then push more like them.
As soon as a user registers, TikTok begins collecting data about their location, gender and age, as well as, more controversially, their facial data.
Each time a user “likes” a video, follows an account, or watches a video all the way through, the algorithm learns more about their interests.
Queensland University of Technology researcher Dr. Bondy Kaye said: “What really makes TikTok unique is... this ‘For You’ page. The recommendation system seems to be very timely and very accurate.”
“It's very difficult to break out of this cycle. It's designed to keep you watching.”
The longer you scroll, the more ads you see. That is why TikTok's Chinese parent company, ByteDance, is worth more than 250 billion Australian dollars.
When Lauren Hemings downloaded TikTok, the algorithm pushed her a video from a fitness influencer with a body shape similar to her own, who had restricted her food intake and lost a lot of weight.
“When I followed her, lots of pages just like hers kept popping up,” Lauren said.
“I never really had so many negative thoughts about my body until I heard someone say, ‘I hate this body. I cry every night because of this body shape.’”
“I stopped watching funny dance videos or anything else; I just focused completely on fitness and healthy-lifestyle goals like that.”
TikTok's algorithm drove Lauren to an extreme, obsessively tracking how many calories she ate in a day. Researchers warn this practice can lead to eating disorders.
The hashtag “What I eat in a day” has more than 7 billion views on TikTok.
It is full of videos detailing how many calories people ate, often interspersed with shots of their slim bodies.
“It became an addiction,” Lauren said.
“I felt that if I didn't know how many calories a food contained, or if eating it would push me over my daily calorie goal, then I couldn't eat it... For a few months, I always weighed my food before eating it.
“I never counted calories before using TikTok.”
After using TikTok for four months, Lauren was diagnosed with an eating disorder.
Researchers say that there are many factors that can lead to eating disorders, but social media has increasingly become the focus of the problem.
“I think if I hadn't downloaded TikTok, I wouldn't have an eating disorder... TikTok's algorithm finds people with a particular vulnerability and then exploits it,” Lauren said.
Dr. Suku Sukunesan of Swinburne University advises TikTok on how to improve application security.
He embedded himself in TikTok's eating disorder community.
“I immediately got all this eating disorder content. A few hours later, TikTok suggested I follow 30 different accounts, all belonging to people with eating disorders,” he said.
Dr. Sukunesan said these TikToks effectively teach people how to develop an eating disorder, and the algorithm can also lead them to more serious videos, such as those promoting self-harm.
“It’s almost like a bottomless abyss, these kids will eventually hurt themselves further,” he said.
Harmful pushes that do no good
Claire Benstead was hesitant to download TikTok. She had heard it was not good for people with eating disorders.
Claire, 22, has been hospitalised for treatment several times over the past five years. She was in recovery when she downloaded the app.
She soon began watching eating disorder videos, which made her condition worse.
“I hadn't fully recovered, and the relapse progressed very quickly,” she said.
“As the illness worsened, I became more obsessed. All I could do was turn on my phone and watch these videos. I would watch for hours and couldn't stop at all.”
Claire's psychiatrist suggested she clean up her feed by reporting videos that promote eating disorders.
TikTok's policy states that it prohibits content “depicting, promoting, normalising or glorifying activities that may lead to suicide, self-harm or eating disorders”.
Yet when Claire reported videos promoting eating disorders, she was told they did not violate any TikTok guidelines.
“It seemed futile. Eating disorders have the highest mortality rate of any mental illness, and you are promoting these behaviours and making things worse,” Claire said.
TikTok's response has been to ban hashtags that promote eating disorders so users cannot search for these videos. If someone tries, TikTok instead displays the number for the Butterfly Foundation, an eating disorder support service.
Patients and researchers say that while this is a positive step, it does not stop users from seeing the videos, especially since the hashtag search ban is easy to circumvent.
“Our team consults with NGOs and other partners to continuously update the list of keyword interventions,” said a TikTok spokeswoman.
Another TikTok user told Hack and Four Corners that when she saw and reported a video of a man taking his own life, the response she received was that the video did not violate TikTok's platform guidelines.
The video originated on Facebook but was shared across all the major social media platforms.
TikTok was the last platform to take the video down, and it had been reposted there at a frantic pace.
Dr. Sukunesan said: “As a researcher and as a parent, it is deeply disturbing to learn that this kind of content does not violate platform guidelines.”
According to several researchers, it takes less than 30 seconds to find harmful content on TikTok, and within a few hours the algorithm can fill a user's feed with offensive videos.
In a series of experiments, the technology advocacy group Reset Australia found the algorithm took about four hours to learn that a 13-year-old was interested in racist content, and about seven hours to fill a user's feed with sexist videos.
The longer users watch this type of content, the more often it appears.
Rys Farthing of Reset Australia said the impact of any harmful content compounds over time.
“I can’t imagine what it would be like if young people fell into this vortex and kept getting such pushes. This is very disturbing.”
Shadow bans and algorithmic bias
While TikTok faces pressure to root out harmful videos, it has also been accused of using the same algorithm to censor and suppress posts for unethical reasons.
TikTok influencer Unice Wani, who lives in Perth, has gained nearly 600,000 followers in just over a year by performing popular dances and lip-syncing to popular hip-hop.
The 18-year-old has used the platform to raise awareness of issues facing her community.
Unice said: “The more popular I become, the more I can show the younger generation, show more girls of colour and others, that I am confident in who I am.”
But Unice noticed her videos were often hidden from the For You page, meaning no one could see them. This is what TikTok users call a “shadow ban”, and they say it reflects algorithmic bias.
“You tend to get a lot of shadow bans for saying things like racism... I guess they pay more attention to white girls' dances and the like.”
Dr. Niels Wouters, an artificial intelligence researcher at the University of Melbourne, said that although TikTok's recommendations are automated, like any other algorithm, they are created by humans.
“As humans, we all have biases. Therefore, when we create an algorithm, it is absolutely possible to embed our own biases in these algorithms.”
In July, some Black creators went on an indefinite strike, refusing to choreograph the popular dances TikTok relies on and accusing the platform of using their creativity without giving them priority in the algorithm.
“People say that to get views on TikTok, to get people to like you, you have to have talent. But when we show that talent, nothing is given to us,” Unice said.
“No matter how hard we try, we just can't get it.”
In March 2020, TikTok policy documents were leaked, revealing guidelines that instructed moderators to suppress posts from creators deemed “ugly, poor, or disabled”.
The documents said videos should be removed if they featured people who were “fat or overweight”, whose faces were “too ugly... with too many wrinkles... or with facial deformities”, or who had “other disabilities”.
“If the character's appearance is not good, the video will be much less attractive and not worth recommending to new users,” the documents said.
TikTok responded at the time that most of these guidelines were no longer in use. But some TikTok creators with disabilities say the company's approach has not changed.
Paniora Nukunuku is one of those creators.
Paniora's TikToks, set on the streets of Sydney, are a frank and comedic series about disabled life, race and politics.
He said: “I never expected the content I make to have so many fans now. It was heartwarming to hear young people say, ‘I never thought I would see someone like me.’”
A week after the company's policy documents were leaked, Paniora posted a video about a stranger telling him he shouldn't have a disability permit.
“That video was taken down... I don't know why,” he said.
Paniora appealed and the video was restored, but some of his other videos about his disability were also deleted.
“I know my content provides value to a lot of people like me. They look like me, they live like me, and they have brown skin like me. If they have no representation on social media platforms, it's really hard for them to feel connected.”
Experts like Bondy Kaye say TikTok needs to address these problems.
“If TikTok doesn't invest the resources needed to confront some of these criticisms, it will create even bigger problems, and it may well drive Black, Indigenous and other creators of colour off the app for good,” Dr. Kaye said.
In a statement, TikTok said it “supports diversity” and denied discrimination against “any creator or community on our platform”.
TikTok’s new normal
This was not the only time Paniora felt the power of TikTok.
A video he posted about the Black Lives Matter movement got his account banned for a week, and another video he shot at a pro-Palestinian rally was deleted a few hours after it was posted.
“I was very angry. I thought, why? There is nothing in these videos to justify deleting them, really nothing,” Paniora said.
“It definitely feels like TikTok has preferences about what content should be on the platform... It can be really frustrating.”
Last year, TikTok apologised for suppressing posts with the hashtags #BlackLivesMatter and #GeorgeFloyd, after thousands of creators protested on the platform that their videos were being suppressed or their accounts banned.
The company said a glitch in the algorithm caused the problem, but some creators say a pattern is emerging.
Several other creators who publish pro-Palestinian content have similar experiences.
The Australian Strategic Policy Institute (ASPI) conducted the first academic investigation of TikTok's censorship system and found the company actively used its algorithm to hide political speech it considered controversial.
The research, funded by the US State Department, found that suppressed hashtags included those about the mass detention of Uyghurs, the Hong Kong protests, LGBTQI communities, and content critical of the Russian government.
The report's authors said TikTok is struggling with its identity.
“As it expands around the world, especially after coming under intense scrutiny, the company has tried as much as possible to decouple TikTok from its Chinese roots. But ultimately those ties cannot be completely severed.”
In a statement, TikTok denied engaging in censorship.
“We do not adjust or remove content on the basis of political sensitivities. We have never removed content at the request of the Chinese government, and we have never been asked to.”
As TikTok expands its business in Australia and consolidates its position to grow its user base, experts and users of the app are urging the company to face up to these criticisms.
Niels Wouters worries the app is distorting the way people see the world.
“We do face the risk that generations of young people... form their identity around what a technology platform defines as normal, or the new normal.”
For Lauren, keeping the app is not worth sacrificing her mental health.
She said: “I finally quit TikTok after a few months, but even then it left me with an eating disorder... It took me a very, very long time to work through it.”
“TikTok is not there to help people. If they can make money from something, they will make money from it. I think they need to realise the impact this has on people.”
Written by: Avani Dias, Jeanavive McGregor and Lauren Day
Production and design: Nick Wiggins
Photography: Mathew Marsic
Video: Nick Wiggins and Harriet Tatham