Justice Thomas Argues Again for Reading § 230 Immunity More Narrowly

From Justice Thomas's statement today respecting the denial of certiorari in Doe v. Facebook, Inc.:

In 2012, an adult male sexual predator used Facebook to lure 15-year-old Jane Doe to a meeting, shortly after which she was repeatedly raped, beaten, and sold for sex. Doe eventually escaped and sued Facebook in Texas state court, alleging that Facebook had violated Texas' anti-sex-trafficking statute and committed various common-law offenses. Facebook petitioned the Texas Supreme Court for a writ of mandamus that would have required dismissal of Doe's suit. The court held that a provision of the Communications Decency Act known as § 230 bars Doe's common-law claims but not her statutory sex-trafficking claim.

Section 230(c)(1) provides that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." The Texas Supreme Court observed that courts have uniformly treated internet platforms as "publisher[s]" under § 230(c)(1), and thus immune, whenever a plaintiff's claim "stem[s] from [the platform's] publication of information created by third parties."

On this broad understanding of publisher immunity, claims against an internet company for failing to warn consumers of product defects, or for failing to take reasonable precautions "to safeguard their users from malicious or objectionable behavior of other users," will be dismissed. The Texas Supreme Court acknowledged that it is "plausible" to read § 230(c)(1) more narrowly, as immunizing internet platforms only when plaintiffs seek to hold them "strictly liable" for transmitting third-party content, but the court ultimately felt compelled to adopt the consensus approach.

This decision exemplifies how courts have interpreted § 230 "to confer sweeping immunity on some of the largest companies in the world," particularly by employing a "capacious conception of what it means to treat a website operator as [a] publisher or speaker." Here, the Texas Supreme Court afforded publisher immunity even though Facebook allegedly "knows its system facilitates human traffickers in identifying and cultivating victims," but has nonetheless "failed to take any reasonable steps to mitigate the use of Facebook by human traffickers" because doing so would cost the company users, and the advertising revenue those users generate. See Plaintiff's Complaint; see also Reply Brief (collecting recent disclosures and investigations supporting these claims). It is hard to see why the protection § 230(c)(1) grants publishers against being held strictly liable for third parties' content should protect Facebook from liability for its own "acts and omissions."

At the very least, before we dismiss such serious allegations, we should be certain that the law requires us to do so. As I have explained, the arguments in favor of broad immunity under § 230 rest largely on "policy and purpose," not on the statute's plain text. The Texas Supreme Court recognized as much, observing that "[t]he United States Supreme Court—or better yet, Congress—may soon resolve the burgeoning debate about whether the federal courts have thus far correctly interpreted section 230." Assuming Congress does not step in to clarify § 230's scope, we should do so in an appropriate case.

Unfortunately, this is not such a case. Our jurisdiction extends only to reviewing "[f]inal judgments or decrees" rendered by state courts. Finality generally requires an "effective determination of the litigation, and not merely interim or interlocutory steps." Because the Texas Supreme Court allowed Doe's statutory claim to proceed, this litigation is not "final." Doe concedes as much but invokes an exception to the finality rule. That exception cannot apply here, however, because the Texas courts have not yet conclusively adjudicated a personal-jurisdiction defense that, if successful, would "effectively moot the federal-law question raised here."

I therefore concur in the Court's denial of certiorari. We should, however, address the proper scope of immunity under § 230 in an appropriate case.

Here is the Texas Supreme Court's summary of Plaintiffs' common-law claims, which it held were preempted:

The essence of Plaintiffs' negligence, gross-negligence, negligent-undertaking, and products-liability claims is that, because Plaintiffs were users of Facebook or Instagram, the company owed them a duty to warn them of, or otherwise protect them against, recruitment into sex trafficking by other users. Plaintiffs allege that Facebook breached this duty by failing to "implement any safeguards" to keep adults from contacting minors, to "report unusual messages," to "warn" users of "the dangers presented by sex traffickers," and to prevent the use of "it[s] platforms" by sex traffickers. In section 230's terms, these claims "treat" Facebook as the publisher or speaker of third-party communications on its platforms, because they fault its publication decisions regarding that content.

Plaintiffs resist this conclusion, arguing that their common-law claims do not treat Facebook as a "publisher" or "speaker" because they seek to hold [it] liable not for "exercising any kind of editorial function over its users' communications" but for "its own failure to implement any measures that protect them from the dangers posed" by its products. Although framed in terms of Facebook's omissions, this theory of liability would nevertheless hold Facebook liable for its passive role as an "intermediar[y] for other parties' … injurious messages."

Put another way: the duty that [Plaintiffs] allege [Facebook] violated derives from "[Facebook's] status … as a 'publisher or speaker'" of that content, because the claims fault Facebook merely for exposing Plaintiffs to harmful third-party material on its platforms. All the actions Plaintiffs allege Facebook should have taken to protect them (warnings, restrictions on eligibility for accounts, removal of postings, and the like) are actions courts have consistently viewed as those of a "publisher" for purposes of section 230. Whether styled as failure-to-warn or negligence claims, liability would turn on second-guessing Facebook's "decisions regarding the monitoring, screening, and deletion of [third-party] content from its network." …

The Texas Supreme Court then summarizes the state statutory claims, and explains why they are not preempted:

Plaintiffs also sued Facebook under a Texas statute that creates a civil cause of action against anyone "who intentionally or knowingly benefits from participating in a venture that traffics another person." Plaintiffs allege that Facebook violated this statute through "acts [and] omissions," such as knowingly facilitating the sex trafficking of [Plaintiffs] and "creat[ing] a breeding ground" for sex traffickers to stalk and entrap survivors. … Liability under [this statute] requires a showing that the defendant benefited from "participat[ing]" in a human-trafficking "venture." This meaning of "participation" requires more than passive acquiescence in trafficking conducted by others; "[t]o participate" is to take part in, or be actively involved in, something. …

To charge Facebook with "intentionally or knowingly benefit[ting] from participat[ing] in a [trafficking] venture" is to charge it with "some affirmative conduct," that is, "an overt act" beyond "mere negative acquiescence," "designed to aid in the success of the venture." Accordingly, a claim under Section 98.002 arises not from a website's communication of injurious messages created by others but from the website's affirmative acts encouraging such communications.

This distinction between passive acquiescence in the wrongdoing of others and affirmative acts encouraging the wrongdoing is evident in Plaintiffs' allegations, which we construe liberally at the [motion-to-dismiss] stage. While many of Plaintiffs' allegations complain of Facebook's failure to act, the section 98.002 claims also allege affirmative conduct by which Facebook encouraged sex trafficking on its platform. The petitions allege, for instance, that Facebook "creat[ed] a breeding ground for sex traffickers to stalk and entrap survivors"; that "Facebook … knowingly aided, facilitated and assisted sex traffickers, including the sex trafficker[s] who recruited [Plaintiffs]," and "knowingly benefitted" from rendering such assistance; that Facebook has assisted and facilitated the trafficking of [Plaintiffs] and other minors; and that Facebook "uses the information it gathers and purchases on its users to direct users to persons they likely want to meet" and, "[i]n doing so, … facilitates human trafficking by identifying potential targets, like [Plaintiffs]," and connecting traffickers with those individuals. Read liberally in Plaintiffs' favor, these statements may be taken as alleging affirmative acts by Facebook to encourage unlawful conduct on its platforms….

Allegations of this kind are not barred by the CDA. As the Ninth Circuit has explained, defendants are entitled to CDA immunity only when they act as "passive transmitter[s]" of information provided by others. An internet-platform defendant that is "directly involved in" or "contribute[s] to" the alleged illegality of third-party communications on its platform is not immunized. Plaintiffs' allegations that Facebook affirmatively encourages trafficking through the use of its platforms therefore state a statutory cause of action.

These allegations differ from Plaintiffs' common-law claims, under which Facebook is charged only with providing neutral tools that its users then employed for potentially unlawful communications. Claims based on [Facebook's] passive acquiescence in its users' misconduct would entitle the company to CDA immunity. Like the Ninth Circuit, however, we understand the CDA to stop short of immunizing a defendant for its "affirmative acts … contribut[ing]" to the alleged illegality of user-created content. Plaintiffs' claims that Facebook violated [the Texas statute] fall into this category. As alleged, these claims do not treat Facebook as a publisher responsible for the words or actions of third-party content providers. They instead treat Facebook like any other party that must answer for its own wrongdoing. Other courts have drawn a similar line….

These [statutory] claims may proceed to further litigation, although we express no opinion on their viability at any later stage of these cases….