UPDATE: In late October 2016, the parties notified the court that they were in discussions to settle the matter and would jointly stipulate to a dismissal of the action without prejudice.  On November 2nd, the court dismissed the action.

Title V of the Telecommunications Act of 1996, also known as the “Communications Decency Act of 1996” or “CDA,” was signed into law in February 1996.  The goal of the CDA was to control the exposure of minors to indecent material, but the law’s passage provoked legal challenges, and the pertinent sections of the Act were subsequently struck down by the Supreme Court as unconstitutional limitations on free speech. Yet one section of the CDA, §230, remained intact and has proven instrumental in encouraging the growth of web-based, interactive services.

Over the last few years, website operators, search engines and other interactive services have enjoyed a relatively stable period of immunity under Section 230 of the Communications Decency Act (CDA) from liability associated with user-generated content.  Despite a few outliers, Section 230 has generally been interpreted by most courts to protect website operators and other “interactive computer services” against claims arising out of third-party content.

However, a recent dispute involving a Snapchat feature known as “Discover” raises new questions under the CDA.  The feature showcases certain interactive “channels” from selected partners who curate content daily.  Last month, the parent of a 14-year-old filed a putative class action against Snapchat claiming that her son was exposed to inappropriately racy content, particularly since, as plaintiff alleges, Snapchat does not tailor its feeds for adult and younger users.  (Doe v. Snapchat, Inc., No. 16-04955 (C.D. Cal. filed July 7, 2016)).  The complaint asserts that while Snapchat’s terms of service prohibit users under 13 from signing up for the service, they do not include any warnings about possible “offensive” content on Snapchat for those under 18, beyond stating some “Community Guidelines” about what types of material users should not post in “Stories” or “Snaps.”

What’s interesting about the suit is that the plaintiff’s claim is based on the rarely cited CDA Section 230(d), a provision which, to our knowledge, has not been litigated.  The text of Section 230(d) states that interactive services, when entering into an agreement with users, must notify such users of the availability of parental control protections.

(d) Obligations of interactive computer service

A provider of interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors. Such notice shall identify, or provide the customer with access to information identifying, current providers of such protections.

The plaintiff claims that Snapchat is liable under §230(d) for failing to provide notice of parental filters and otherwise “failing to warn minors or parents in its Terms of Service or User Agreement about the harmful and offensive content Snapchat, Inc., knowingly makes available to minors on Snapchat Discover.”  The plaintiff is seeking consequential and statutory damages that she alleges are available under the statute, as well as injunctive relief that would compel Snapchat to warn users about any offensive content and provide parental filters to block certain content for minor users.

The outcome of the suit remains undetermined, as Snapchat has not yet filed an Answer or moved for dismissal, but the complaint raises several questions:

  • Can a violation of CDA Section 230(d) form the basis of a private civil action?  Does the statutory scheme create any applicable statutory penalties for violations of Section 230?
  • If CDA Section 230(d) can form the basis of a cause of action, would a provider’s mere violation (i.e., no notice in a user agreement about available parental controls) be a “concrete and particularized” injury and give a plaintiff standing to sue under Spokeo?
  • What would proper compliance with CDA Section 230(d) look like (e.g., notice in a site’s terms of service)?
  • Given that the crux of the suit involves the display of third-party content, is Snapchat’s presentation of its media partners’ stories protected by CDA Section 230(c)(1) immunity?
  • Could Snapchat invoke CDA immunity even though plaintiff’s complaint alleges that Snapchat is a developer of the content (“Snapchat controls and curates and in many cases helps create the content it posts with its media partners…”)?

We will be watching this case closely. If the plaintiff’s claim succeeds, it would likely require changes to the terms of service of most interactive computer services.