The panel rejects an attempt to dismiss a 2021 lawsuit over the “blackout challenge.”
The Third Circuit Court of Appeals ruled that a lawsuit claiming TikTok’s algorithm led to the death of a 10-year-old may proceed. In 2021, TikTok users circulated videos of the “blackout challenge,” recording themselves choking themselves until they passed out and daring others to do the same. A 10-year-old girl attempted the challenge using a purse strap in her mother’s closet, leading to her death.
The girl’s mother sued TikTok for her daughter’s death, but lower courts dismissed the case after TikTok cited Section 230 of the Communications Decency Act. That section shields communications platforms from legal liability for information provided by third parties on their platforms.
However, U.S. Circuit Judge Patty Shwartz, writing for the three-judge panel, said that reasoning no longer held after Supreme Court rulings issued this past July. In those cases, the Supreme Court held that an algorithm created by an online social media platform reflects “editorial judgments” about “compiling the third-party speech it wants in the way it wants.”
“TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech,” she wrote. Under that logic, Judge Shwartz said, content curation using algorithms is speech by the company itself, which is not protected by Section 230.
As the Lord Leads, Pray with Us…
- For the judges of the Circuit Courts of Appeals as they review and reverse or uphold lower court decisions.
- For U.S. officials as they continue to evaluate the protections that Section 230 of the Communications Decency Act provides for platforms.
Sources: Reuters