US: TikTok Sued For Pushing ‘Blackout Challenge’ That Choked Two Girls To Death

A lawsuit filed against video-sharing platform TikTok in California accuses the platform of “intentionally and repeatedly” pushing the Blackout Challenge, which led to the deaths of an eight-year-old girl in Texas and a nine-year-old girl in Wisconsin last year. The children died while performing the acts promoted through the “Blackout Challenge”, which makes a sport of choking oneself until passing out, news agency AFP reported.

The lawsuit, filed in state court in Los Angeles last week, blamed the company for the deaths, saying, “TikTok needs to be held accountable for pushing deadly content to these two young girls.”


“TikTok has invested billions of dollars to intentionally design products that push dangerous content that it knows are dangerous and can result in the deaths of its users,” said Matthew Bergman, an attorney at the Social Media Victims Law Center.

Hitting out at the company, the lawsuit alleged that TikTok’s algorithm promoted the Blackout Challenge to each of the girls. One of the girls died from self-strangulation using a rope, while the other used a dog leash.

The plea also mentioned deaths of children in Italy, Australia and other countries linked to the Blackout Challenge. The video-sharing platform has featured and promoted several other challenges in which users record themselves performing themed acts that are sometimes dangerous.

Among the TikTok challenges mentioned in court documents is the “Skull Breaker Challenge”, in which two people kick an individual’s legs out from under them from both sides, causing the person to flip and hit their head on the ground.

The “Coronavirus Challenge” involved licking random items and surfaces in public during the pandemic, while the “Fire Challenge” involves dousing things with flammable liquid and setting them ablaze, court documents said.

The suit asks the judge to order TikTok to stop promoting dangerous content to children through its algorithm, and seeks unspecified cash damages.