Section 230 of the Communications Decency Act (the “CDA” or “Section 230”), famously known as “the 26 words that created the internet,” remains the subject of ongoing controversy. As extensively reported on this blog, the world of social media, user-generated content, and e-commerce has been consistently…
Generative AI Providers Subject to Reduced CDA Immunity Under Proposed Legislation
One of the many legal questions swirling around in the world of generative AI (“GenAI”) is to what extent Section 230 of the Communications Decency Act (CDA) applies to the provision of GenAI. Can CDA immunity apply to GenAI-generated output and protect GenAI providers from potential third party liability?
On June 14, 2023, Senators Richard Blumenthal and Josh Hawley introduced the “No Section 230 Immunity for AI Act,” bipartisan legislation that would expressly remove most immunity under the CDA for a provider of an interactive computer service if the conduct underlying the claim or charge “involves the use or provision of generative artificial intelligence by the interactive computer service.” While the bill would eliminate “publisher” immunity under § 230(c)(1) for claims involving the use or provision of generative artificial intelligence by an interactive computer service, it would not affect immunity for so-called “Good Samaritan” blocking under § 230(c)(2)(A), which protects service providers and users from liability for claims arising out of good faith actions to screen out or restrict access to “objectionable” material on their services.
That Was Close! The Supreme Court Declines Opportunity to Address CDA Immunity in Social Media
Back in October 2022, the Supreme Court granted certiorari in Gonzalez v. Google, an appeal that challenged whether YouTube’s targeted algorithmic recommendations qualify as “traditional editorial functions” protected by the CDA — or, rather, whether such recommendations are not the actions of a “publisher” and thus fall outside of…
Important CDA Section 230 Case Lands in Supreme Court: Level of Protection Afforded Modern Online Platforms at Stake
Since the passage of Section 230 of the Communications Decency Act (“CDA”), the majority of federal circuits have interpreted the CDA to establish broad federal immunity to causes of action that would treat service providers as publishers of content provided by third parties. The CDA was passed in the early days of e-commerce and was written broadly enough to cover not only the online bulletin boards and not-so-very interactive websites that were common then, but also more modern online services, Web 2.0 offerings and today’s platforms that might use algorithms to organize, repackage or recommend user-generated content.
Over 25 years ago, the Fourth Circuit, in the landmark Zeran case, the first major circuit court-level decision interpreting Section 230, held that Section 230 bars lawsuits that, at their core, seek to hold a service provider liable for its exercise of a publisher’s “traditional editorial functions — such as deciding whether to publish, withdraw, postpone or alter content.” Courts have generally followed this reasoning ever since to determine whether an online provider is being treated as a “publisher” of third party content and thus entitled to immunity under the CDA. The scope of “traditional editorial functions” is at the heart of a case currently on the docket at the Supreme Court. On October 3, 2022, the Supreme Court granted certiorari in an appeal challenging whether a social media platform’s targeted algorithmic recommendations fall under the umbrella of “traditional editorial functions” protected by the CDA or whether such recommendations are not the actions of a “publisher” and thus fall outside of CDA immunity. (Gonzalez v. Google LLC, No. 21-1333 (U.S. cert. granted Oct. 3, 2022)).
App Store Protected by CDA Immunity (and Limitation of Liability) for Losses from Fraudulent Crypto Wallet App
In a recent ruling, a California district court held that Apple, as operator of the App Store, was protected from liability for losses resulting from a fraudulent crypto wallet app offered on its platform. (Diep v. Apple Inc., No. 21-10063 (N.D. Cal. Sept. 2, 2022)). This case is important in that, in…
Second Circuit Vacates CDA Decision and Reissues a Narrower Opinion Reaching Same Conclusion, Providing Some Practical CDA Lessons for the Future
Less than one week after issuing an order vacating its own March 2021 opinion in an important Communications Decency Act (“CDA”) case and granting a petition for rehearing, the Second Circuit issued a new opinion reaffirming “protection” under Section 230 of the CDA for video-sharing site Vimeo, Inc. (“Vimeo”) (Domen v. Vimeo, Inc., No. 20-616 (2d Cir. July 21, 2021) (amended opinion)).
It’s not entirely clear why the Second Circuit decided to grant a rehearing and amend its original opinion only to reach essentially the same holding. It is possible that, given the attention surrounding the CDA, the court thought it best to narrow the language of its original holding so as to insulate its ruling from possible Supreme Court review (recall that Justice Thomas previously issued a statement, following denial of certiorari in a prior CDA case, suggesting that “in an appropriate case” the Court should consider whether the text of the CDA “aligns with the current state of immunity enjoyed by Internet platforms”). The Second Circuit’s second decision arguably watered down some of the stronger statements in its earlier opinion enunciating broad CDA immunity (e.g., even swapping out the word “immunity” for “protection” when discussing the CDA). The court even mused in dicta near the end of the opinion about the types of claims that might fall outside of CDA protection, as if to intimate that CDA Section 230 immunity is broad, but not as broad as its detractors suggest.
Yet, despite the narrowing of its original opinion, the court reached the same result under the same reasoning. As in the original (now vacated) opinion from March 2021, the Second Circuit’s amended decision relied on Section 230(c)(2), the Good Samaritan provision, which allows online providers to self-regulate by moderating third party content in good faith without fear of liability. Unlike in the original opinion, in the second go-round the appeals court also knocked out the plaintiff’s claims on the merits, finding the allegations of discrimination, which rested on the presence of similar videos uploaded by other users that were left up on the site, to be “vanishingly thin” (thereby further reducing the chance of Supreme Court review).
The President Revokes Prior Administration’s Executive Order on CDA Section 230
On May 14, 2021, President Biden issued an executive order revoking, among other things, his predecessor’s action (Executive Order 13925 of May 28, 2020) that directed the executive branch to clarify certain provisions under Section 230 of the Communications Decency Act (“Section 230” or the “CDA”) and remedy what former President Trump had claimed was the social media platforms’ “selective censorship” of user content and the “flagging” of content that does not violate a provider’s terms of service. The now-revoked executive order had, among other things, directed the Commerce Department to file a petition for rulemaking with the FCC to clarify certain aspects of CDA immunity for online providers (the FCC invited public input on the topic, but did not ultimately move forward with a proposed rulemaking) and requested that the DOJ draft proposed legislation curtailing the protections under the CDA (the DOJ submitted a reform proposal to Congress last October).
Mobile App Platform Entitled to CDA Immunity over State Law Claims Related to In-App Purchases of Loot Boxes
Happy Silver Anniversary to Section 230 of the Communications Decency Act (“CDA” or “Section 230”), which was signed into law by President Bill Clinton in February 1996. At that time, Congress enacted CDA Section 230 in response to case law that raised the specter of liability for any online service provider that attempted to moderate its platform, thus discouraging the screening out and blocking of offensive material. As has been extensively reported on this blog, the world of social media and user-generated content is supported by protections afforded by Section 230. Now, 25 years later, the CDA is at a crossroads of sorts and its protections have stoked some controversy. Yet, as it stands, Section 230 continues to provide robust immunity for online providers.
In a recent case, Google LLC (“Google”) successfully argued for the application of Section 230, resulting in a California district court dismissing, with leave to amend, a putative class action alleging consumer protection law claims relating to the Google Play App Store. The claims concerned the offering for download of third party mobile video games that allow users to buy Loot Boxes, which are in-app purchases containing a randomized assortment of items that can improve a player’s chances of advancing in a video game. The plaintiffs claimed these offerings constituted illegal “slot machines or devices” under California law. (Coffee v. Google LLC, No. 20-03901 (N.D. Cal. Feb. 10, 2021)).
Group of Democratic Senators Release Latest CDA Reform Bill
With the change in administrations in Washington, there has been a drive to enact or amend legislation in a variety of areas. However, few initiatives have attracted the zeal found in the bipartisan interest in “reining in social media” and pursuing reforms to Section 230 of the Communications Decency Act (CDA). As we have documented, the parade of bills and approaches to curtail the scope of the immunities given to “interactive computer services” under CDA Section 230 has come from both sides of the aisle (even if the justifications for such reform differ along party lines). The latest came on February 5, 2021, when Senators Warner, Hirono and Klobuchar announced the SAFE TECH Act. The SAFE TECH Act would limit CDA immunity by enacting “targeted exceptions” to the law’s broad grant of immunity.
CDA “Reform” on the Horizon: Investors and Operators Take Note
The appetite for acquisitions and investment in online businesses has never been stronger, with many of the most attractive online opportunities being businesses that host, manage and leverage user-generated content. These businesses often rely on the immunities offered by Section 230 of the Communications Decency Act, 47 U.S.C. § 230 (“Section 230” or the “CDA”), to protect them from liability associated with receiving, editing and posting or removing such content. And investors have relied on the existence of robust immunity under the CDA in evaluating the risk associated with investments in such businesses. This seemed reasonable: following the legacy of the landmark 1997 Zeran decision, courts have for more than two decades fairly consistently interpreted Section 230 to provide broad immunity for online providers of all types.
However, in the last five years or so, the bipartisan critics of the CDA have gotten more full-throated in decrying the presence of hate speech, revenge porn, defamation, disinformation and other objectionable content on online platforms. This issue has been building throughout the past year, and has reached a fever pitch in the weeks leading up to the election. The government’s zeal for “reining in social media” and pursuing reforms to Section 230, again on a bipartisan basis, has come through loud and clear (even if the justifications for such reform differ along party lines). While we cannot predict exactly what structure these reforms will take, the political winds suggest that, regardless of the administration in charge, change is afoot for the CDA in late 2020 or 2021. Operating businesses must take note, and investors should keep this in mind when conducting diligence reviews concerning potential investments.