Appeals court reopens lawsuit against TikTok over child’s death

  • A U.S. appeals court has reinstated a lawsuit filed by the mother of a 10-year-old girl who died while attempting a blackout challenge.
  • While federal law generally protects online publishers from liability for content posted by others, the court said TikTok could be liable for promoting content or using algorithms to direct it to children.

OUR TAKE
By choosing to promote content to specific users, TikTok is involved in those users’ viewing choices and should therefore be held responsible for the guidance it provides to minors. To provide a healthy online environment for minors, governments and social media platforms should be more vigilant in enforcing age-rating systems and overseeing rapidly growing social media networks.
— Iydia Ding, BTW reporter

What happened

A U.S. appeals court on Tuesday reinstated a lawsuit filed by the mother of a 10-year-old girl in Pennsylvania who allegedly died while attempting a viral “blackout challenge” seen on TikTok, which dares people to choke themselves until they lose consciousness. A district judge initially dismissed the lawsuit, citing Section 230 of the Communications Decency Act of 1996, which is commonly used to shield internet companies from liability for content posted on their websites. On Tuesday, a three-judge appeals court panel partially reversed that decision, sending the case back to the lower court for trial.

While federal law generally protects online publishers from liability for content posted by others, the court said TikTok could be liable for promoting content or using algorithms to direct it to children. “TikTok chooses to recommend and promote content to specific users and, in doing so, is engaging in its own first-party speech,” Judge Patty Shwartz of the U.S. Third Circuit Court of Appeals in Philadelphia wrote in an opinion issued Tuesday. Attorneys for TikTok’s parent company, ByteDance, did not immediately return calls and emails seeking comment.

Also read: Nepal lifts TikTok ban imposed for disrupting social harmony

Also read: TikTok calls itself a foreign-owned US news organisation

Why it’s important

“Nylah may not have known what she was doing or that following the image on the screen would kill her. But TikTok knew Nylah would be watching because the company’s customised algorithm placed the video on her ‘For You Page’,” Judge Paul Matey wrote in a partial concurrence to the opinion. Jeffrey Goodman, an attorney for the family, said more scrutiny of Section 230 by the courts is inevitable as technology permeates every aspect of our lives. “Today’s opinion is the clearest statement yet that Section 230 does not provide the kind of all-encompassing protection that social media companies have been claiming,” Goodman said.


Iydia Ding

Iydia Ding is an intern reporter at BTW Media covering products. She studies at Shanghai International Studies University. Send tips to i.ding@btw.media.
