Since the passage of Section 230 of the Communications Decency Act (“CDA”), the majority of federal circuits have interpreted the CDA to establish broad federal immunity from causes of action that would treat service providers as publishers of content provided by third parties. The CDA was passed in the early days of e-commerce and was written broadly enough to cover not only the online bulletin boards and not-so-very interactive websites that were common then, but also more modern online services, web 2.0 offerings and today’s platforms that may use algorithms to organize, repackage or recommend user-generated content.
Over 25 years ago, in the landmark Zeran case — the first major circuit court-level decision interpreting Section 230 — the Fourth Circuit held that Section 230 bars lawsuits that, at their core, seek to hold a service provider liable for its exercise of a publisher’s “traditional editorial functions — such as deciding whether to publish, withdraw, postpone or alter content.” Courts have generally followed this reasoning ever since to determine whether an online provider is being treated as a “publisher” of third-party content and is thus entitled to immunity under the CDA. The scope of “traditional editorial functions” is at the heart of a case currently on the docket at the Supreme Court. On October 3, 2022, the Supreme Court granted certiorari in an appeal challenging whether a social media platform’s targeted algorithmic recommendations fall under the umbrella of “traditional editorial functions” protected by the CDA or whether such recommendations are not the actions of a “publisher” and thus fall outside of CDA immunity. (Gonzalez v. Google LLC, No. 21-1333 (U.S. cert. granted Oct. 3, 2022)).
In the past month, there have been some notable developments surrounding Section 230 of the Communications Decency Act (“CDA” or “Section 230”) beyond the ongoing debate in Congress over the potential for legislative reform. These include a novel application of the CDA in an FCRA online privacy case (Henderson v. The Source for Public Data, No. 20-294 (E.D. Va. May 19, 2021)), the denial of CDA immunity in another case involving an alleged design defect in a social media app (Lemmon v. Snap Inc., No. 20-55295 (9th Cir. May 4, 2021)), as well as the uncertainties surrounding a new Florida law that attempts to regulate content moderation decisions and user policies of large online platforms.
Despite continued scrutiny over the legal immunity online providers enjoy under Section 230 of the Communications Decency Act (CDA), online platforms continue to successfully invoke its protections. This is illustrated by three recent decisions in which courts dismissed claims that sought to impose liability on providers for hosting or restricting access to user content and for providing a much-discussed social media app filter.
In one case, a California district court dismissed a negligence claim against online real estate database Zillow over a fraudulent posting, holding that any alleged duty to monitor new users and prevent false listing information inherently derives from Zillow’s status as a publisher and is therefore barred by the CDA. (924 Bel Air Road LLC v. Zillow Group Inc., No. 19-01368 (C.D. Cal. Feb. 18, 2020)). In the second, the Ninth Circuit, in an important ruling, affirmed the dismissal of claims against YouTube for violations of the First Amendment and the Lanham Act over its decision to restrict access to the plaintiff’s uploaded videos. The Ninth Circuit found that, despite YouTube’s ubiquity and its role as a public-facing platform, it is a private forum not subject to judicial scrutiny under the First Amendment. It also found that YouTube’s statements concerning its content moderation policies could not form the basis of false advertising liability. (Prager Univ. v. Google LLC, No. 18-15712 (9th Cir. Feb. 26, 2020)). And in a third case, the operator of the messaging app Snapchat was granted CDA immunity in a wrongful death suit arising from a fatal high-speed automobile crash; minutes before the accident, one of the boys in the car had sent a snap using the app’s Speed Filter, which captured the speed of the car at 123 MPH. (Lemmon v. Snap, Inc., No. 19-4504 (C.D. Cal. Feb. 25, 2020)).