In the past month, there have been some notable developments surrounding Section 230 of the Communications Decency Act (“CDA” or “Section 230”) beyond the ongoing debate in Congress over the potential for legislative reform. These include a novel application of the CDA in an FCRA online privacy case (Henderson v. The Source for Public Data, No. 20-294 (E.D. Va. May 19, 2021)), the denial of CDA immunity in another case involving an alleged design defect in a social media app (Lemmon v. Snap Inc., No. 20-55295 (9th Cir. May 4, 2021)), and the uncertainties surrounding a new Florida law that attempts to regulate content moderation decisions and user policies of large online platforms.
Despite continued scrutiny over the legal immunity online providers enjoy under Section 230 of the Communications Decency Act (CDA), online platforms continue to successfully invoke its protections. This is illustrated by three recent decisions in which courts dismissed claims that sought to impose liability on providers for hosting or restricting access to user content and for providing a much-discussed social media app filter.
In one case, a California district court dismissed a negligence claim against online real estate database Zillow over a fraudulent posting, holding that any alleged duty to monitor new users and prevent false listing information inherently derives from Zillow’s status as a publisher and is therefore barred by the CDA. (924 Bel Air Road LLC v. Zillow Group Inc., No. 19-01368 (C.D. Cal. Feb. 18, 2020)). In the second, the Ninth Circuit, in an important ruling, affirmed the dismissal of claims against YouTube for violations of the First Amendment and the Lanham Act over its decision to restrict access to the plaintiff’s uploaded videos. The Ninth Circuit found that despite YouTube’s ubiquity and its role as a public-facing platform, it is a private forum not subject to judicial scrutiny under the First Amendment. It also found that YouTube’s statements concerning its content moderation policies could not form a basis for false advertising liability. (Prager Univ. v. Google LLC, No. 18-15712 (9th Cir. Feb. 26, 2020)). And in a third case, the operator of the messaging app Snapchat was granted CDA immunity in a wrongful death suit brought on behalf of individuals killed in a high-speed automobile crash, where one of the boys in the car had sent a snap using the app’s Speed Filter, which had captured the speed of the car at 123 MPH, minutes before the fatal accident. (Lemmon v. Snap, Inc., No. 19-4504 (C.D. Cal. Feb. 25, 2020)).
In recent years, there have been a number of suits filed in federal courts seeking to hold social media platforms responsible for providing material support to terrorists by allowing members of such groups to use social media accounts and failing to effectively block their content and terminate such accounts. As we’ve previously written about, such suits have generally not been successful at either the district court or circuit court level and have been dismissed on the merits or on the basis of immunity under Section 230 of the Communications Decency Act (CDA).
This past month, in a lengthy, important 2-1 decision, the Second Circuit affirmed dismissal of federal Anti-Terrorism Act (ATA) claims against Facebook on CDA grounds for allegedly providing “material support” to Hamas. The court also declined to exercise supplemental jurisdiction over the plaintiffs’ foreign law claims. (Force v. Facebook, Inc., No. 18-397 (2d Cir. July 31, 2019)). Despite the plaintiffs’ creative pleadings, which sought to portray Facebook’s processing of third-party content as beyond the scope of CDA immunity, the court found that claims related to supplying a communication forum and failing to completely block or eliminate hateful terrorist content necessarily treated Facebook as the publisher of such content and were therefore barred under the CDA.
Three recent court decisions reaffirm the expansive immunity afforded under Section 230 of the Communications Decency Act (“CDA”), 47 U.S.C. § 230(c), to online providers that host third-party content: the California Superior Court’s decision in Murphy v. Twitter, Inc., No. CGC-19-573712 (Cal. Super. June 12, 2019), and the Northern District of California’s decisions in Brittain v. Twitter, Inc., No. 19-00114 (N.D. Cal. June 10, 2019), and Fyk v. Facebook, Inc., No. 18-05159 (N.D. Cal. June 18, 2019).
The District of Utah ruled in late May that Section 230 of the Communications Decency Act, 47 U.S.C. § 230 (“CDA”), shields The Tor Project, Inc. (“Tor”), the organization responsible for maintaining the Tor Browser, from claims for strict product liability, negligence, abnormally dangerous activity, and civil conspiracy.
The claims were asserted against Tor after a minor died from taking illegal narcotics purchased from a site on the “dark web” accessed via the Tor Network. (Seaver v. Estate of Cazes, No. 18-00712 (D. Utah May 20, 2019)). The parents of the child sued, among others, Tor as the service provider through which the teenager was able to order the drug on the dark web. Tor argued that the claims against it were barred by CDA immunity, and the district court agreed.