TikTok Algorithm Promotes Eating Disorder, Self-Harm-Related Posts: Report

TikTok’s algorithms are recommending videos about self-harm and eating disorders to vulnerable teens, news agency Associated Press (AP) reported, citing a report published this week that heightens concerns about social media’s impact on youth mental health.

Researchers at the charity Center for Countering Digital Hate created TikTok accounts for fictional teen personas in the United States, United Kingdom, Canada, and Australia. The researchers running the accounts “liked” videos about self-harm and eating disorders to observe how TikTok’s algorithm responded.

Within minutes, the immensely popular platform began recommending videos about weight loss and self-harm, including images of models and idealised body types, pictures of razor blades, and discussions of suicide.

When the researchers created identities with usernames suggesting a particular vulnerability to eating disorders, such as names containing the phrase “lose weight”, the accounts were served even more hazardous content, as per the report.

“It’s like being caught in a hall of distorted mirrors where you’re continuously told you’re ugly, you’re not good enough, maybe you should kill yourself,” said Imran Ahmed, CEO of the centre, which has offices in the United States and the United Kingdom. “It literally sends the most damaging messages to young people,” he added, according to the AP report.


Social media algorithms function by recognising subjects and information that a user is interested in, and then sending them more of the same to maximise their time on the site.
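As a rough sketch of that feedback loop, the toy Python example below weights recommendations by past engagement. It is a hypothetical illustration only, not TikTok’s actual recommender, which is proprietary; the names recommend, history, and catalog are invented here for demonstration.

from collections import Counter
import random

def recommend(history, catalog, k=5):
    """Return k video IDs, weighted toward topics the user has engaged with."""
    if not history:
        # Cold start: pick topics uniformly at random.
        topics = random.choices(list(catalog), k=k)
    else:
        # The more a topic has been "liked", the more often it is served.
        topics = random.choices(list(history), weights=list(history.values()), k=k)
    return [random.choice(catalog[t]) for t in topics]

catalog = {"sports": ["s1", "s2"], "dieting": ["d1", "d2"]}
history = Counter()
for step in range(10):
    feed = recommend(history, catalog)
    # Simulate a user who engages only with dieting content: each "like"
    # reinforces that topic, so the feed narrows with every iteration.
    for video in feed:
        if video in catalog["dieting"]:
            history["dieting"] += 1

In this simulation, a few early “likes” are enough to tilt the feed almost entirely toward one topic, which is the narrowing dynamic critics describe.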

However, critics of social media argue that the same algorithms that highlight information about a specific sports team, hobby, or dance craze can lead users down a rabbit hole of harmful content.

The problem is particularly acute for teens and children, who tend to spend more time online and are more vulnerable to bullying, peer pressure, and negative content about eating disorders or suicide, said Josh Golin, executive director of Fairplay, a nonprofit that advocates for stronger online protections for children, according to the report.

He went on to say that TikTok is not the only platform that fails to safeguard its young users from harmful content and invasive data collection.


“All of these negative effects are tied to the business model,” Golin explained. “It makes no difference which social media platform is used,” news agency AP quoted him as saying.

TikTok contested the findings in a statement from a company spokesperson, who said the researchers did not use the platform the way typical users do and that the results were therefore skewed. The company also stated that a user’s account name has no bearing on the type of content the user receives.

(With Inputs From Agencies)