In the swirl of scrutiny surrounding the big Silicon Valley tech companies, and with some in Congress arguing that Section 230 of the Communications Decency Act (CDA) should be curtailed, 2019 has quietly been an important year for CDA jurisprudence, with a number of opinions enunciating robust immunity under Section 230. In particular, a trio of noteworthy circuit court opinions has rejected plaintiffs’ attempts to make an “end run” around the CDA based on the assertion that online service providers lose immunity if they algorithmically recommend content or recast it in another form for other users of the site.

This week, in a case with an unsettling fact pattern, the Ninth Circuit made it a quartet, ruling that a now-shuttered social networking site was immune from liability under the CDA for connecting a user with a dealer who sold him narcotics that led to an overdose. The court found such immunity because the site’s functions, which included content recommendations and notifications to members of discussion groups, were “content-neutral tools” used to facilitate communications. (Dyroff v. The Ultimate Software Group, Inc., No. 18-15175 (9th Cir. Aug. 20, 2019)).

Facebook recently announced that it would make changes to its news feed to prioritize content that users share and discuss and material from “reputable publishers.”  These changes are part of what Mark Zuckerberg says is a refocusing of Facebook from “helping [users] find relevant content to helping [users] have

UPDATE: In late October 2016, the parties notified the court that they were in discussions to settle the matter and would jointly stipulate to a dismissal of the action without prejudice.  On November 2nd, the court dismissed the action.

Title V of the Telecommunications Act of 1996, also known as the “Communications Decency Act of 1996” or “CDA,” was signed into law in February 1996.  The goal of the CDA was to control the exposure of minors to indecent material, but the law’s passage provoked legal challenges, and pertinent sections of the Act were subsequently struck down by the Supreme Court as unconstitutional limitations on free speech. Yet one section of the CDA, §230, remained intact and has proven instrumental in encouraging the growth of web-based, interactive services.

Over the last few years, website operators, search engines and other interactive services have enjoyed a relatively stable period of immunity under Section 230 of the Communications Decency Act (CDA) from liability associated with user-generated content.  Despite a few outliers, courts have generally interpreted Section 230 to protect website operators and other “interactive computer services” against claims arising out of third-party content.

However, a recent dispute involving a Snapchat feature known as “Discover” raises new questions under the CDA.  The feature showcases certain interactive “channels” from selected partners who curate content daily.  Last month, the parent of a 14-year-old filed a putative class action against Snapchat, claiming that her son was exposed to inappropriately racy content, particularly since, as plaintiff alleges, Snapchat does not tailor its feeds for adult and younger users.  (Doe v. Snapchat, Inc., No. 16-04955 (C.D. Cal. filed July 7, 2016)).  The complaint asserts that while Snapchat’s terms of service prohibit users under 13 from signing up for the service, they do not include any warnings about possibly “offensive” content on Snapchat for those under 18, beyond stating some “Community Guidelines” about what types of material users should not post in “Stories” or “Snaps.”

The U.S. Securities and Exchange Commission gave disclosures made through social media platforms such as Facebook and Twitter a conditional “thumbs up” in a Report of Investigation it released on April 2, 2013.  Issuers of securities, the SEC stated, can use social media to disseminate material, nonpublic information without having