INTERNET-TIKTOK-IMMUNITY

Anderson v. TikTok, Inc., 2024 U.S. App. LEXIS 21771 (3d Cir. Aug. 27, 2024) (Shwartz, J.)

SHWARTZ, Circuit Judge.

TikTok, Inc., via its algorithm, recommended and promoted videos posted by third parties to ten-year-old Nylah Anderson on her uniquely curated “For You Page.” One video depicted the “Blackout Challenge,” which encourages viewers to record themselves engaging in acts of self-asphyxiation. After watching the video, Nylah attempted the conduct depicted in the challenge and unintentionally hanged herself. Nylah’s mother, Tawainna Anderson, sued TikTok and its corporate relative ByteDance, Inc. (collectively, “TikTok”) for violations of state law. The District Court dismissed her complaint, holding that the Communications Decency Act (“CDA”), 47 U.S.C. § 230, immunizes TikTok. For the following reasons, we will reverse in part, vacate in part, and remand.

Here, as alleged, TikTok’s FYP algorithm “decides on the third-party speech that will be included in or excluded from a compilation—and then organizes and presents the included items” on users’ FYPs. NetChoice, 144 S. Ct. at 2402. Accordingly, TikTok’s algorithm, which recommended the Blackout Challenge to Nylah on her FYP, was TikTok’s own “expressive activity,” id., and thus its first-party speech.

Section 230 immunizes only information “provided by another,” 47 U.S.C. § 230(c)(1), and here, because the information that forms the basis of Anderson’s lawsuit—i.e., TikTok’s recommendations via its FYP algorithm—is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims.

We need not address in this case the publisher/distributor distinction our colleague describes, nor do we need to decide whether the word “publisher” as used in § 230 is limited to the act of allowing third-party content to be posted on a website an ICS hosts, as compared to third-party content an ICS promotes or distributes through some additional action, because, in this case, the only distribution at issue is that which occurred via TikTok’s algorithm, which, as explained herein, is not immunized by § 230 because the algorithm is TikTok’s own expressive activity.13

13 We recognize that this holding may be in tension with Green v. America Online (AOL), where we held that § 230 immunized a party from any liability for the platform’s failure to prevent certain users from “transmit[ting] harmful online messages” to other users. 318 F.3d 465, 468 (3d Cir. 2003). We reached this conclusion on the grounds that § 230 “bar[red] ‘lawsuits seeking to hold a service provider liable for . . . deciding whether to publish, withdraw, postpone, or alter content.’” Id. at 471 (quoting Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997)). Green, however, did not involve an ICS’s content recommendations via an algorithm and pre-dated NetChoice. Similarly, our holding may depart from the pre-NetChoice views of other circuits. See, e.g., Dyroff v. Ultimate Software Grp., 934 F.3d 1093, 1098 (9th Cir. 2019) (“[R]ecommendations and notifications . . . are not content in and of themselves.”); Force v. Facebook, Inc., 934 F.3d 53, 70 (2d Cir. 2019) (“Merely arranging and displaying others’ content to users . . . through [] algorithms—even if the content is not actively sought by those users—is not enough to hold [a defendant platform] responsible as the developer or creator of that content.” (internal quotation marks and citation omitted)); Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 21 (1st Cir. 2016) (concluding that § 230 immunity applied because the structure and operation of the website, notwithstanding that it effectively aided sex traffickers, reflected editorial choices related to traditional publisher functions); Jones v. Dirty World Ent. Recordings LLC, 755 F.3d 398, 407 (6th Cir. 2014) (adopting Zeran by noting that “traditional editorial functions” are immunized by § 230); Klayman v. Zuckerberg, 753 F.3d 1354, 1359 (D.C. Cir. 2014) (immunizing a platform’s “decision whether to print or retract a given piece of content”); Johnson v. Arden, 614 F.3d 785, 791–92 (8th Cir. 2010) (adopting Zeran); Doe v. MySpace, Inc., 528 F.3d 413, 420 (5th Cir. 2008) (rejecting an argument that § 230 immunity was defeated where the allegations went to the platform’s traditional editorial functions).

14 To the extent that Anderson still pursues any claims not premised upon TikTok’s algorithm, we leave to the District Court to determine, among other things, whether, consistent with this Opinion, those claims are barred by § 230.

Today, § 230 rides in to rescue corporations from virtually any claim loosely related to content posted by a third party, no matter the cause of action and whatever the provider’s actions. See, e.g., Gonzalez v. Google LLC, 2 F.4th 871, 892–98 (9th Cir. 2021), vacated, 598 U.S. 617 (2023); Force, 934 F.3d at 65–71. The result is a § 230 that immunizes platforms from the consequences of their own conduct and permits platforms to ignore the ordinary obligation that most businesses have to take reasonable steps to prevent their services from causing devastating harm. But this conception of § 230 immunity departs from the best ordinary meaning of the text and ignores the context of congressional action. Section 230 was passed to address an old problem arising in a then-unique context, not to “create a lawless no-man’s-land” of legal liability. Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1164 (9th Cir. 2008) (en banc).

Section 230(c)(1) allows suits to proceed if the allegedly wrongful conduct is not based on the mere hosting of third-party content, but on the acts or omissions of the provider of the interactive computer service. This is where Zeran went astray, wrongly reasoning that distributor liability “is merely a subset, or a species, of publisher liability.” 129 F.3d at 332. It is true that “sources sometimes use language that arguably blurs the distinction between publishers and distributors.” Malwarebytes, 141 S. Ct. at 15 (Thomas, J., statement respecting denial of certiorari). But understanding § 230(c)(1)’s use of “publisher” to subsume distributor liability conflicts with the context surrounding § 230’s enactment. Both CompuServe and Stratton Oakmont treated the two as distinct concepts. See CompuServe, 776 F. Supp. at 138–41; Stratton Oakmont, 1995 WL 323710, at *1–5. So did the common law of common carriers. It is implausible to conclude that Congress decided to silently jettison both past and present to coin a new meaning of “publisher” in § 230(c)(1). See Malwarebytes, 141 S. Ct. at 14–16 (Thomas, J., statement respecting denial of certiorari); Doe v. Am. Online, Inc., 783 So. 2d 1010, 1023–25 (Fla. 2001) (Lewis, J., dissenting).

Properly read, § 230(c)(1) says nothing about a provider’s own conduct beyond mere hosting.

What does all this mean for Anderson’s claims? Well, § 230(c)(1)’s preemption of traditional publisher liability precludes Anderson from holding TikTok liable for the Blackout Challenge videos’ mere presence on TikTok’s platform, a conclusion Anderson’s counsel all but concedes. But § 230(c)(1) does not preempt distributor liability, so Anderson’s claims seeking to hold TikTok liable for continuing to host the Blackout Challenge videos knowing they were causing the death of children can proceed. So too for her claims seeking to hold TikTok liable for its targeted recommendations of videos it knew were harmful. That is TikTok’s own conduct, a subject outside of § 230(c)(1). Whether that conduct is actionable under state law is another question. But § 230 does not preempt liability on those bases.