TikTok's algorithms are promoting videos about self-harm and eating disorders to vulnerable teens, according to a report released Wednesday that highlights growing concerns about social media and its impact on the mental health of young people.
Researchers from the Center for Countering Digital Hate created TikTok accounts for fictional teen personas in the US, UK, Canada and Australia. Researchers running the accounts then “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would react.
Within minutes, the wildly popular platform was recommending videos about weight loss and self-harm, including pictures of models and idealized body types, images of razor blades, and discussions of suicide.
When researchers created accounts with usernames that suggested a particular vulnerability to eating disorders—names that included the words “lose weight,” for example—the accounts were served even more harmful content.
“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the center’s CEO, Imran Ahmed, whose organization has offices in the United States and the United Kingdom.
“It literally sends the most dangerous possible messages to young people.”
Social media algorithms work by identifying topics and content of interest to a user, then serving them more of the same as a way to maximize their time on the site. But social media critics say the same algorithms that promote content about a particular sports team, hobby or dance craze can send users down a rabbit hole of harmful content.
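The dynamic critics describe can be sketched in a few lines of code. The toy Python recommender below is purely illustrative, not TikTok's actual system; the topic labels, the doubling rule and the sampling scheme are all assumptions, chosen only to show how a handful of "likes" can quickly concentrate a feed around a single topic.

```python
import random
from collections import defaultdict

# Illustrative only: a toy engagement-driven recommender, NOT TikTok's system.
# Every video carries a topic tag; a user's affinity for a topic grows each
# time they "like" a video from it.

VIDEOS = [
    {"id": i, "topic": topic}
    for i, topic in enumerate(["sports", "dance", "weight_loss", "cooking"] * 25)
]

affinity = defaultdict(lambda: 1.0)  # start with uniform interest in every topic

def recommend(n=5):
    """Sample videos with probability proportional to topic affinity."""
    weights = [affinity[v["topic"]] for v in VIDEOS]
    return random.choices(VIDEOS, weights=weights, k=n)

def like(video):
    """A 'like' sharply boosts that topic's weight, narrowing future feeds."""
    affinity[video["topic"]] *= 2.0

# Simulate a user who likes every weight-loss video they are shown: after a
# few rounds, that single topic dominates the feed (the "rabbit hole").
for _ in range(10):
    for video in recommend():
        if video["topic"] == "weight_loss":
            like(video)

print([v["topic"] for v in recommend(10)])
```

Because each "like" multiplies the topic's weight, the narrowing is exponential rather than gradual, which is consistent with the researchers' observation that the feed shifted within minutes.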
This is a particular problem for teens and children, who tend to spend more time online and are more vulnerable to bullying, peer pressure or negative content about eating disorders or suicide, according to Josh Golin, executive director of Fairplay, a nonprofit that advocates for greater online protections for children.
He added that TikTok is not the only platform that fails to protect young users from harmful content and aggressive data collection.
“All of these harms are linked to the business model,” Golin said. “It doesn’t make any difference what the social media platform is.”
In a statement from a company spokesperson, TikTok disputed the findings, noting that the researchers did not use the platform like typical users, and saying the results were skewed accordingly. The company also said a user’s account name shouldn’t affect the type of content the user receives.
TikTok prohibits users younger than 13, and its official rules prohibit videos that encourage eating disorders or suicide. In the United States, users who search for eating disorder content on TikTok receive a message offering mental health resources and contact information for the National Eating Disorder Association.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” said the statement from TikTok, which is owned by ByteDance Ltd., a Chinese company now based in Singapore.
Despite the platform’s efforts, researchers at the Center for Countering Digital Hate found that eating disorder content had been viewed on TikTok billions of times. In some cases, researchers found, young TikTok users were using coded language about eating disorders to evade TikTok’s content moderation.
The amount of harmful content being pushed to teens on TikTok shows that self-regulation has failed, Ahmed said, adding that federal rules are needed to force platforms to do more to protect children.
He noted that the version of TikTok offered to domestic Chinese audiences is designed to promote math and science content to younger users, and limits how long 13- and 14-year-olds can be on the site each day.
A proposal before Congress would impose new rules limiting the data social media platforms can collect about young users and create a new office within the Federal Trade Commission focused on protecting the privacy of young social media users.
One of the bill’s sponsors, Senator Edward Markey, said Wednesday he is optimistic that lawmakers from both parties can agree on the need for tighter regulation of how platforms access and use the information of young users.
“Data is the raw material that big tech uses to track, manipulate and traumatize young people in our country every day,” Markey said.