In the past month, there have been some notable developments surrounding Section 230 of the Communications Decency Act (“CDA” or “Section 230”) beyond the ongoing debate in Congress over the potential for legislative reform. These include a novel application of the CDA in an FCRA online privacy case (Henderson v. The Source for Public Data, No. 20-294 (E.D. Va. May 19, 2021)), the denial of CDA immunity in another case involving an alleged design defect in a social media app (Lemmon v. Snap Inc., No. 20-55295 (9th Cir. May 4, 2021)), and the uncertainties surrounding a new Florida law that attempts to regulate content moderation decisions and user policies of large online platforms.

On May 14, 2021, President Biden issued an executive order revoking, among other things, his predecessor’s action (Executive Order 13925 of May 28, 2020) that directed the executive branch to clarify certain provisions under Section 230 of the Communications Decency Act (“Section 230” or the “CDA”) and remedy what former President Trump had claimed was the social media platforms’ “selective censorship” of user content and the “flagging” of content that does not violate a provider’s terms of service. The now-revoked executive order had, among other things, directed the Commerce Department to petition the FCC for rulemaking to clarify certain aspects of CDA immunity for online providers (the FCC invited public input on the topic, but ultimately did not move forward with a proposed rulemaking) and requested that the DOJ draft proposed legislation curtailing the protections under the CDA (the DOJ submitted a reform proposal to Congress last October).

Happy Silver Anniversary to Section 230 of the Communications Decency Act (“CDA” or “Section 230”), which was signed into law by President Bill Clinton in February 1996. At that time, Congress enacted CDA Section 230 in response to case law that raised the specter of liability for any online service provider that attempted to moderate its platform, thus discouraging the screening out and blocking of offensive material. As has been extensively reported on this blog, the world of social media and user-generated content is supported by the protections afforded by Section 230. Now, 25 years later, the CDA is at a crossroads of sorts and its protections have stoked some controversy. Yet, as it stands, Section 230 continues to provide robust immunity for online providers.

In a recent case, Google LLC (“Google”) successfully argued for the application of Section 230, resulting in a California district court dismissing, with leave to amend, a putative class action alleging consumer protection law claims against the Google Play App Store. The claims concerned the offering for download of third party mobile video games that allow users to buy Loot Boxes, which are in-app purchases containing a randomized assortment of items that can improve a player’s chances of advancing in a video game. The plaintiffs claimed these offerings constituted illegal “slot machines or devices” under California law. (Coffee v. Google LLC, No. 20-03901 (N.D. Cal. Feb. 10, 2021)).

With the change in administrations in Washington, there has been a drive to enact or amend legislation in a variety of areas. Few initiatives, however, match the zeal behind the bipartisan interest in “reining in social media” and pursuing reforms to Section 230 of the Communications Decency Act (CDA). As we have documented, the parade of bills and approaches to curtail the scope of the immunities given to “interactive computer services” under CDA Section 230 has come from both sides of the aisle (even if the justifications for such reform differ along party lines). The latest came on February 5, 2021, when Senators Warner, Hirono and Klobuchar announced the SAFE TECH Act. The SAFE TECH Act would limit CDA immunity by enacting “targeted exceptions” to the law’s broad grant of immunity.

The appetite for acquisitions and investment in online businesses has never been stronger, with many of the most attractive online opportunities being businesses that host, manage and leverage user-generated content. These businesses often rely on the immunities offered by Section 230 of the Communications Decency Act, 47 U.S.C. §230 (“Section 230” or the “CDA”), to protect them from liability associated with receiving, editing, posting or removing such content. And investors have relied on the existence of robust immunity under the CDA in evaluating the risk associated with investments in such businesses. This reliance has seemed reasonable: following the legacy of the landmark 1997 Zeran decision, for more than two decades courts have fairly consistently interpreted Section 230 to provide broad immunity for online providers of all types.

However, in the last five years or so, bipartisan critics of the CDA have become more full-throated in decrying the presence of hate speech, revenge porn, defamation, disinformation and other objectionable content on online platforms. The issue has been building throughout the past year and reached a fever pitch in the weeks leading up to the election. The government’s zeal for “reining in social media” and pursuing reforms to Section 230 has, again on a bipartisan basis, come through loud and clear (even if the justifications for such reform differ along party lines). While we cannot predict exactly what form these reforms will take, the political winds suggest that, regardless of the administration in charge, change is afoot for the CDA in late 2020 or 2021. Operating businesses must take note, and investors should keep this in mind when conducting diligence reviews of potential investments.

In continuing its efforts to enforce its terms and policies against developers that engage in unauthorized scraping of user data, this week Facebook brought suit against two marketing analytics firms, BrandTotal Ltd (“BrandTotal”) and Unimania, Inc. (“Unimania”) (collectively, the “Defendants”) (Facebook, Inc. v. BrandTotal Ltd., No. 20Civ04246).

Section 230 of the Communications Decency Act, 47 U.S.C. §230 (“Section 230” or the “CDA”), enacted in 1996, is generally viewed as the most important statute supporting the growth of Internet commerce. The key provision of the CDA, Section 230(c)(1), only 26 words long, simply states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This one sentence has been the source of bedrock service provider immunity for third party content made available through a provider’s infrastructure, thus enabling the growth of a vigorous online ecosystem. Without such immunity, providers would have to face what the Ninth Circuit once termed “death by ten thousand duck-bites” in fending off claims that they promoted or tacitly assented to objectionable third party content.

The brevity of this provision of the CDA is deceptive, however. The CDA – and the immunity it conveys – is controversial, and those 26 words have been the subject of hundreds, if not thousands, of lawsuits. Critics of the CDA point to the proliferation of hate speech, revenge porn, defamatory material, disinformation and other objectionable content – in many cases, the sites hosting such third party content (knowingly or unknowingly) are protected by the broad scope of the CDA. Other objections stem merely from unhappiness with the content of the speech, even where that speech is true, such as comments critical of individuals, their businesses or their interests. Litigants upset about such content have sought various CDA workarounds over the past two decades in mostly unsuccessful attempts to bypass the immunity and reach the underlying service providers.

The back-and-forth debate around the scope and effects of the CDA and the broad discretion afforded online providers regarding content hosting and moderation decisions is not new. However, it was brought into new focus when the President, vexed at the way some of his more controversial posts were being treated by certain social media platforms, issued a May 28, 2020 Executive Order for the purpose of curtailing legal protections for online providers. The goal was to remedy what the White House believed was the online platforms’ “deceptive or pretextual actions stifling free and open debate by censoring certain viewpoints.”

The Executive Order – which is currently being challenged in court as unconstitutional – directed several federal agencies to undertake certain legislative and regulatory efforts toward CDA reform. Consequently, in June 2020 the DOJ stated that “the time is ripe to realign the scope of Section 230 with the realities of the modern internet” and released a 28-page document with its preliminary recommendations for reform of Section 230. A month later, the Commerce Department submitted a petition requesting that the FCC write rules to limit the scope of CDA immunity and potentially place additional compliance requirements on many providers that host third party content. Then, on September 23, 2020, the DOJ announced that it had sent its legislative proposal for amending the CDA to Congress. The DOJ, in its cover letter to Congress, summed up the need for reform: “The proposed legislation accordingly seeks to align the scope of Section 230 immunities with the realities of the modern internet while ensuring that the internet remains a place for free and vibrant discussion.”