The currents around the Communications Decency Act just got a little more turbulent as the White House and executive branch try to reel in the big fish of CDA reform.

On July 27, 2020, the Commerce Department submitted a petition requesting that the FCC initiate a rulemaking to clarify the provisions of Section 230 of the Communications Decency Act (CDA). Unrelated, but part of the same fervor in Washington to “rein in social media,” the leaders of the major technology companies appeared before the House Judiciary Antitrust Subcommittee at a hearing on July 29, 2020, to discuss the Committee’s ongoing investigation of competition in the digital marketplace, where some members inquired about content moderation practices. Moreover, in June 2020, a pair of Senators introduced the PACT Act, a targeted (but still substantial) update to the CDA (and other CDA reform bills are also being debated, including a bill to carve out sexually exploitative material involving children from the CDA’s reach).

Section 230 of the Communications Decency Act (“CDA”), 47 U.S.C. §230, enacted in 1996, is often cited as the most important law supporting the Internet, e-commerce and the online economy. Yet, it continues to be subject to intense criticism, including from politicians on both sides of the aisle. Many argue that the CDA has been applied in situations far beyond the original intent of Congress when the statute was enacted. Critics point to the role the CDA has played in protecting purveyors of hate speech, revenge porn, defamation, disinformation and other objectionable content.

Critics of the CDA raise valid concerns.  But what is the right way to address them? One must remember that for organizations that operate websites, mobile apps, social media networks, corporate networks and other online services, the CDA’s protections are extremely important.  Many of those businesses could be impaired if they were subject to liability (or the threat of liability) for objectionable third party content residing on their systems.

The criticism surrounding the CDA reached a fever pitch on May 28, 2020, when the President weighed in on the issue by signing an Executive Order attempting to curtail legal protections under Section 230. While the Executive Order was roundly labeled political theater – and is currently being challenged in court as unconstitutional – it notably directed the Justice Department to submit draft proposed legislation (i.e., a CDA reform bill) to accomplish the policy objectives of the Order. Then, on June 17, 2020, the DOJ announced “that the time is ripe to realign the scope of Section 230 with the realities of the modern internet” and released a document with its recommendations for legislative reform of Section 230. This comes on the heels of a recent initiative by several GOP lawmakers to introduce their own version of a reform bill.

The Executive Order signed by President Trump on May 28, 2020 attempts to curtail legal protections under Section 230 of the Communications Decency Act (“Section 230” or the “CDA”). It strives to clarify that Section 230 immunity “should not extend beyond its text and purpose to provide protection for those who purport to provide users a forum for free and open speech, but in reality use their power over a vital means of communication to engage in deceptive or pretextual actions stifling free and open debate by censoring certain viewpoints.”

Section 230 protects online providers in many respects concerning the hosting of user-generated content and bars the imposition of distributor or publisher liability against a provider for the exercise of its editorial and self-regulatory functions with respect to such user content. In response to certain moderation efforts directed at the President’s own social media posts that week, the Executive Order seeks to remedy what the President claims is the social media platforms’ “selective censorship” of user content and the “flagging” of content that does not violate a provider’s terms of service.

The Executive Order does a number of things. It first directs:

  • The Commerce Department to file a petition for rulemaking with the FCC to clarify certain aspects of CDA immunity for online providers, namely Good Samaritan immunity under 47 U.S.C. §230(c)(2). Good Samaritan immunity provides that an interactive computer service provider may not be held liable “on account of” its decision in “good faith” to restrict access to content that it considers to be “obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable.”
    • The provision essentially gives providers leeway to screen out or remove objectionable content in good faith from their platforms without fear of liability. Courts have generally held that the section does not require that the material actually be objectionable; rather, the CDA affords protection for blocking material that the provider or user considers to be “objectionable.” However, Section 230(c)(2)(A) requires that providers act in “good faith” in screening objectionable content.
    • The Executive Order states that this provision should not be “distorted” to protect providers that engage in “deceptive or pretextual actions (often contrary to their stated terms of service) to stifle viewpoints with which they disagree.” The Order asks the FCC to clarify “the conditions under which an action restricting access to or availability of material is not ‘taken in good faith’” within the meaning of the CDA, specifically when such decisions are inconsistent with a provider’s terms of service or taken without adequate notice to the user.
    • Interestingly, the Order also directs the FCC to clarify whether there are any circumstances in which a provider that screens out content under the CDA’s Good Samaritan protection but fails to meet the statutory requirements should still be able to claim “publisher” immunity under CDA Section 230(c)(1) (as, according to the Order, such decisions would be the provider’s own “editorial” decisions).
    • To be sure, courts in recent years have dismissed claims against services for terminating user accounts or screening out content that violates content policies using the more familiar Section 230(c)(1) publisher immunity, stating repeatedly that decisions to terminate an account (or not publish user content) are publisher decisions, and are protected under the CDA. It appears that the Executive Order is suggesting that years of federal court decisions – from the landmark 1997 Zeran case until today – that have espoused broad immunity under the CDA for providers’ “traditional editorial functions” regarding third party content (which include the decision whether to publish, withdraw, postpone or alter content provided by another) were perhaps decided in error.

In the swirl of scrutiny surrounding the big Silicon Valley tech companies and with some in Congress declaring that Section 230 of the Communications Decency Act (CDA) should be curtailed, 2019 has quietly been an important year for CDA jurisprudence, with a number of opinions enunciating robust immunity under CDA Section 230. In particular, there has been a trio of noteworthy circuit court-level opinions rejecting plaintiffs’ attempts to make an “end run” around the CDA based on the assertion that online service providers lose immunity if they algorithmically recommend or recast content in another form to other users of the site.

In August 2019, in a case with an unsettling fact pattern, the Ninth Circuit made it a quartet – ruling that a now-shuttered social networking site was immune from liability under the CDA for connecting a user with a dealer who sold him narcotics, a transaction that culminated in an overdose. The court found such immunity because the site’s functions, which included content recommendations and notifications to members of discussion groups, were “content-neutral tools” used to facilitate communications. (Dyroff v. The Ultimate Software Group, Inc., No. 18-15175 (9th Cir. Aug. 20, 2019)).

In the past few months, there have been a number of notable decisions affirming broad immunity under the Communications Decency Act (CDA), 47 U.S.C. §230(c), for online providers that host third party content. The beat goes on, as in late May 2019, a Utah district court ruled that the Tor Browser, which allows for anonymous communications and transactions on the internet, was protected by CDA Section 230 with respect to a website’s sale of illegal substances to a minor on the dark web via the browser.

More recently, the D.C. Circuit affirmed the dismissal of claims brought by multiple locksmith companies (the “plaintiffs”) against the operators of the major search engines (the “defendants” or “providers”) for allegedly publishing the content of fraudulent locksmiths’ websites and translating street-address and area-code information on those websites into map pinpoints that were displayed in response to user search requests. (Marshall’s Locksmith Service v. Google LLC, No. 18-7018 (D.C. Cir. June 7, 2019)). According to the plaintiffs, by burying legitimate locksmiths’ listings (with actual, local physical locations) beneath so-called “scam” listings from locksmith call centers that act as lead generators for subcontractors, who may or may not be fully trained, plaintiffs’ legitimate businesses suffered market harm and were forced to pay for additional advertising. (Beyond this case, the issue of false local business listings appearing in Google Maps remains an ongoing concern, according to a recent report from the Wall Street Journal).

UPDATE: On December 31, 2019, the Ninth Circuit released an amended opinion in Enigma Software Group USA, LLC v. Malwarebytes, Inc., No. 17-17351 (9th Cir. Dec. 31, 2019). The case involves competing providers of filtering software and issues concerning the scope of CDA §230(c)(2). In reversing the lower court’s dismissal of claims under the CDA, the Ninth Circuit held that “the phrase ‘otherwise objectionable’ does not include software that the provider finds objectionable for anticompetitive reasons.”

Three recent court decisions affirmed the robust immunity under the Communications Decency Act (CDA), 47 U.S.C. §230(c), for online providers that host third-party content: the Second Circuit’s decision in Herrick v. Grindr LLC, No. 18-396 (2d Cir. Mar. 27, 2019) (summary order), the Wisconsin Supreme Court’s opinion in Daniel v. Armslist, LLC, No. 2017AP344, 2019 WI 47 (Wis. Apr. 30, 2019),  and the Northern District of California’s decision in P.C. Drivers Headquarters, LP v. Malwarebytes Inc., No. 18-05409 (N.D. Cal. Mar. 6, 2019).

In what is one of the most recent attempts to circumvent the immunity provided in Section 230 of the Communications Decency Act (“CDA” or “CDA Section 230”), the United States District Court for the District of Massachusetts made it clear that claims brought under the Defend Trade Secrets Act (18 U.S.C. §§ 1836, et seq.) (“DTSA”) are not exempt from the scope of CDA immunity. In Craft Beer Stellar, LLC v. Glassdoor, Inc., No. 18-10510, 2018 U.S. Dist. LEXIS 178960 (D. Mass. Oct. 17, 2018), the district court found that, as stated in the DTSA itself, the DTSA is not an “intellectual property” law, and is therefore not excluded from the scope of the immunity provisions that protect online service providers from being treated as a publisher or distributor of third-party content. The ruling is a victory for online providers, affirming a robust interpretation of CDA immunity and representing what is likely the first judicial view on how federal trade secret claims should be treated under CDA Section 230.

UPDATE: On January 22, 2019, the U.S. Supreme Court denied review of the California Supreme Court’s decision in Hassell v. Bird, discussed below.

In a closely followed dispute, the California Supreme Court vacated a lower court order, based upon a default judgment in a defamation action, which had directed Yelp, Inc. (“Yelp”), a non-party to the original suit, to take down certain consumer reviews posted on its site. (Hassell v. Bird, No. S235968, 2018 WL 3213933 (Cal. July 2, 2018)). If the plaintiffs had included Yelp as a defendant in the original suit, such a suit would likely have been barred by Section 230 of the Communications Decency Act (“CDA” or “CDA Section 230”); instead, the plaintiffs adopted a litigation strategy to bypass such legal immunities. By refusing to allow the plaintiffs’ “creative pleading” to avoid the CDA, the court handed a win to online companies and platforms that host user-generated content (“A Case for the Internet,” declared Yelp).