In the swirl of scrutiny surrounding the big Silicon Valley tech companies, and with some in Congress declaiming that Section 230 of the Communications Decency Act (CDA) should be curtailed, 2019 has quietly been an important year for CDA jurisprudence, with a number of opinions enunciating robust immunity under CDA Section 230. In particular, a trio of noteworthy circuit court-level opinions rejected plaintiffs’ attempts to make an “end run” around the CDA based on the assertion that online service providers lose immunity if they algorithmically recommend content, or recast it in another form, to other users of the site.

This week, in a case with an unsettling fact pattern, the Ninth Circuit made it a quartet, ruling that a now-shuttered social networking site was immune from liability under the CDA for connecting a user with a dealer who sold him narcotics, a transaction that culminated in an overdose. The court found immunity because the site’s functions, which included content recommendations and notifications to members of discussion groups, were “content-neutral tools” used to facilitate communications. (Dyroff v. The Ultimate Software Group, Inc., No. 18-15175 (9th Cir. Aug. 20, 2019)).

UPDATE: In September 2020, the parties settled the matter.

UPDATE: On August 23, 2019, the Third Circuit granted Amazon’s petition for rehearing en banc in the Oberdorf case. Per the order, the opinion dated July 3, 2019 is vacated.

In early July, an appeals court ruled that Amazon should be considered a “seller” of goods under Pennsylvania products liability law and subject to strict liability for consumer injuries caused by defective goods sold on its site by third-party vendors. (Oberdorf v. Amazon.com, No. 18-1041 (3d Cir. July 3, 2019)). While the decision involved interpretation of Pennsylvania law – and Amazon has prevailed on the “seller” issue in various courts around the country in recent years – the ruling is still noteworthy because it was based upon § 402A of the Restatement (Second) of Torts (which other states may follow), and it may signal a willingness to reinterpret the definition of “seller” in the modern era of online platforms. The decision also highlights the limits of immunity under Section 230 of the Communications Decency Act (CDA) for online marketplaces when it comes to products liability claims based on a site’s sales activity, as opposed to editorial decisions related to third-party product listings.

In recent years, there have been a number of suits filed in federal courts seeking to hold social media platforms responsible for providing material support to terrorists by allowing members of such groups to use social media accounts and failing to effectively block their content and terminate such accounts. As we’ve previously written about, such suits have generally not been successful at either the district court or circuit court level and have been dismissed on the merits or on the basis of immunity under Section 230 of the Communications Decency Act (CDA).

This past month, in a lengthy, important 2-1 decision, the Second Circuit affirmed dismissal of federal Anti-Terrorism Act (ATA) claims against Facebook on CDA grounds for allegedly providing “material support” to Hamas. The court also declined to exercise supplemental jurisdiction over the plaintiffs’ foreign law claims. (Force v. Facebook, Inc., No. 18-397 (2d Cir. July 31, 2019)). Despite the plaintiffs’ creative pleadings, which sought to portray Facebook’s processing of third-party content as beyond the scope of CDA immunity, the court found that claims related to supplying a communication forum and failing to completely block or eliminate hateful terrorist content necessarily treated Facebook as the publisher of such content and were therefore barred under the CDA.

Two recent web scraping disputes highlight some important issues regarding whether a website owner may successfully allege a breach of contract action against a commercial party that has scraped website content contrary to “clickwrap” and “browsewrap” website terms of use.

In Southwest Airlines Co. v. Roundpipe, LLC, No. 18-0033 (N.D. Tex. Mar. 22, 2019), a Texas district court declined to dismiss Southwest Airlines Co.’s (“Southwest”) breach of contract claim against an entity that scraped airfare data from Southwest’s site in violation of the website terms of use. Southwest brought multiple claims against Roundpipe, LLC (“Roundpipe”) after discovering that Roundpipe had created a website, SWMonkey.com, that used scraping to send consumers notifications if their Southwest ticket prices decreased after purchase (which would presumably allow them to exchange the original ticket for a lower-priced one).

Southwest’s website terms prohibited scraping or the use of any automated tools to access its fares or other content. Soon after the launch of SWMonkey, Southwest sent a cease-and-desist letter asserting, among other things, that Roundpipe was obtaining Southwest’s data in violation of the website terms, and demanding that the site be taken down. After negotiations and additional correspondence from Southwest, Roundpipe shut down the website and disabled its scraping and fare-tracking functionality.

In the past few months, there have been a number of notable decisions affirming broad immunity under the Communications Decency Act (CDA), 47 U.S.C. §230(c), for online providers that host third-party content. The beat goes on: in late May, a Utah district court ruled that The Tor Project, the organization behind the Tor Browser, which allows for anonymous communications and transactions on the internet, was protected by CDA Section 230 against claims arising from a website’s sale of illegal substances to a minor on the dark web via the Tor Browser.

More recently, the D.C. Circuit affirmed the dismissal of claims brought by multiple locksmith companies (the “plaintiffs”) against the operators of the major search engines (the “defendants” or “providers”) for allegedly publishing the content of fraudulent locksmiths’ websites and translating street-address and area-code information on those websites into map pinpoints that were displayed in response to user search requests. (Marshall’s Locksmith Service v. Google LLC, No. 18-7018 (D.C. Cir. June 7, 2019)). According to the plaintiffs, the providers buried legitimate locksmiths’ listings (with actual, local physical locations) beneath so-called “scam” listings from locksmith call centers that act as lead generators for subcontractors, who may or may not be fully trained; as a result, the plaintiffs’ legitimate businesses suffered market harm and were forced to pay for additional advertising. (Beyond this case, the issue of false local business listings appearing in Google Maps remains an ongoing concern, according to a report from the Wall Street Journal yesterday).

The District of Utah ruled in late May that Section 230 of the Communications Decency Act, 47 U.S.C. §230 (“CDA”) shields The Tor Project, Inc. (“Tor”), the organization responsible for maintaining the Tor Browser, from claims for strict product liability, negligence, abnormally dangerous activity, and civil conspiracy.

The claims were asserted against Tor following an incident where a minor died after taking illegal narcotics purchased from a site on the “dark web” on the Tor Network. (Seaver v. Estate of Cazes, No. 18-00712 (D. Utah May 20, 2019)). The parents of the child sued, among others, Tor as the service provider through which the teenager was able to order the drug on the dark web. Tor argued that the claims against it should be barred by CDA immunity and the district court agreed.

During the 2016 election, certain Russian operatives used fake social media profiles to influence voters and also created bot accounts to add likes to, and share, posts across the internet. More recently, in January 2019, the New York Attorney General and the Office of the Florida Attorney General announced settlements with certain entities that sold fake social media engagement, such as followers, likes and views. Moreover, many of the social media platforms have recently purged millions of fake accounts. Thus, it’s clear that bots and automated activity on social media platforms have been on everyone’s radar, including state legislators’.

Indeed, California passed a chatbot disclosure law (SB-1001) last September that makes it unlawful for persons to mislead users about their artificial bot identity in certain circumstances, and it comes into effect on July 1st. In essence, the purpose of the law is to inform users when they are interacting with a chatbot, virtual assistant or automated social media account, so that they can adjust their behavior or expectations accordingly. Entities that interact with customers regarding commercial transactions via a chatbot on their own website or mobile application, or via an automated account on another platform, should certainly take note of the new California law’s disclosure requirements.

UPDATE: On December 31, 2019, the Ninth Circuit released an amended opinion in Enigma Software Group USA, LLC v. Malwarebytes, Inc., No. 17-17351 (9th Cir. Dec. 31, 2019). The case also involves competing providers of filtering software and issues concerning the scope of CDA §230(c)(2). In reversing the lower court’s dismissal of claims under the CDA, the Ninth Circuit held that “the phrase ‘otherwise objectionable’ does not include software that the provider finds objectionable for anticompetitive reasons.”

Three recent court decisions affirmed the robust immunity under the Communications Decency Act (CDA), 47 U.S.C. §230(c), for online providers that host third-party content: the Second Circuit’s decision in Herrick v. Grindr LLC, No. 18-396 (2d Cir. Mar. 27, 2019) (summary order), the Wisconsin Supreme Court’s opinion in Daniel v. Armslist, LLC, No. 2017AP344, 2019 WI 47 (Wis. Apr. 30, 2019),  and the Northern District of California’s decision in P.C. Drivers Headquarters, LP v. Malwarebytes Inc., No. 18-05409 (N.D. Cal. Mar. 6, 2019).

UPDATE: On January 22, 2019, the Supreme Court denied review of the California Supreme Court decision.

In a closely followed dispute, the California Supreme Court vacated a lower court order, based upon a default judgment in a defamation action, which had directed Yelp, Inc. (“Yelp”), a non-party to the original suit, to take down certain consumer reviews posted on its site. (Hassell v. Bird, No. S235968, 2018 WL 3213933 (Cal. July 2, 2018)). If the plaintiffs had included Yelp as a defendant in the original suit, the claim would likely have been barred by Section 230 of the Communications Decency Act (“CDA” or “CDA Section 230”); instead, the plaintiffs adopted a litigation strategy designed to bypass that immunity. By refusing to allow the plaintiffs’ “creative pleading” to avoid the CDA, the court delivered a win for online companies and platforms that host user-generated content (“A Case for the Internet,” declared Yelp).