New York has enacted a new law, effective February 9, 2021, regulating automatic renewal and some “free trial” type agreements. While some organizations may have already taken steps to comply with industry requirements, the federal Restore Online Shoppers’ Confidence Act (ROSCA), and similar auto-renewal laws in place

On November 30, 2020, the Supreme Court heard oral argument in its first case interpreting the “unauthorized access” provision of the Computer Fraud and Abuse Act (CFAA). The CFAA in part prohibits knowingly accessing a computer “without authorization” or “exceeding authorized access” to a computer and thereby obtaining information and causing a “loss” under the statute. The case is an appeal of an Eleventh Circuit decision affirming the conviction of a police officer under the CFAA for accessing a police license plate database he was authorized to use, but doing so for non-law enforcement purposes. (See U.S. v. Van Buren, 940 F.3d 1192 (11th Cir. 2019), pet. for cert. granted, Van Buren v. U.S., No. 19-783 (Apr. 20, 2020)). The issue presented is: “Whether a person who is authorized to access information on a computer for certain purposes violates Section 1030(a)(2) of the Computer Fraud and Abuse Act if he accesses the same information for an improper purpose.”

The defendant, Van Buren, argued that he is innocent because he accessed only databases that he was authorized to use, even if he did so for an inappropriate reason. He contended that the CFAA was being interpreted too broadly and that such a precedent could subject individuals to criminal liability merely for violating corporate computer use policies. During oral argument, Van Buren’s counsel suggested that such a wide interpretation of the CFAA would turn the statute into a “sweeping Internet police mandate” and that the Court should not construe a statute “simply on the assumption the government will use it responsibly.” In response, the Government countered that Van Buren’s misuse of his access for personal gain fell within the “serious breaches of trust by insiders” that the statutory language is designed to cover.

The appetite for acquisitions of, and investment in, online businesses has never been stronger, and many of the most attractive opportunities involve businesses that host, manage and leverage user-generated content. These businesses often rely on the immunities offered by Section 230 of the Communications Decency Act, 47 U.S.C. §230 (“Section 230” or the “CDA”), to protect them from liability associated with receiving, editing, posting or removing such content. Investors, in turn, have relied on the existence of robust CDA immunity in evaluating the risk associated with investments in such businesses. That reliance has seemed reasonable: since the landmark 1997 Zeran decision, courts have, for more than two decades, fairly consistently interpreted Section 230 to provide broad immunity for online providers of all types.

In the last five years or so, however, critics of the CDA on both sides of the aisle have become more full-throated in decrying the presence of hate speech, revenge porn, defamation, disinformation and other objectionable content on online platforms. The issue has been building throughout the past year and reached a fever pitch in the weeks leading up to the election. The government’s zeal for “reining in social media” and pursuing reforms to Section 230 (again, on a bipartisan basis) has come through loud and clear, even if the justifications for reform differ along party lines. While we cannot predict exactly what form these reforms will take, the political winds suggest that, regardless of the administration in charge, change is afoot for the CDA in late 2020 or 2021. Operating businesses must take note, and investors should keep this in mind when conducting diligence reviews of potential investments.

Continuing its efforts to enforce its terms and policies against developers that engage in unauthorized scraping of user data, Facebook this week brought suit against two marketing analytics firms, BrandTotal Ltd. (“BrandTotal”) and Unimania, Inc. (“Unimania”) (collectively, the “Defendants”) (Facebook, Inc. v. BrandTotal Ltd., No. 20Civ04246

Section 230 of the Communications Decency Act, 47 U.S.C. §230 (“Section 230” or the “CDA”), enacted in 1996, is generally viewed as the most important statute supporting the growth of Internet commerce. The key provision of the CDA, Section 230(c)(1), only 26 words long, simply states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This one sentence has been the source of bedrock service provider immunity for third party content made available through a provider’s infrastructure, thus enabling the growth of a vigorous online ecosystem. Without such immunity, providers would have to face what the Ninth Circuit once termed “death by ten thousand duck-bites” in fending off claims that they promoted or tacitly assented to objectionable third party content.

The brevity of this provision is deceptive, however. The CDA – and the immunity it conveys – is controversial, and those 26 words have been the subject of hundreds, if not thousands, of litigations. Critics of the CDA point to the proliferation of hate speech, revenge porn, defamatory material, disinformation and other objectionable content – in many cases, the sites hosting such third party content (knowingly or unknowingly) are protected by the broad scope of the CDA. Other objections stem merely from unhappiness with the content of the speech, even where that speech is true, such as comments critical of individuals, their businesses or their interests. Litigants upset about such content have sought various CDA workarounds over the past two decades in mostly unsuccessful attempts to bypass the immunity and reach the underlying service providers.

The back-and-forth debate around the scope and effects of the CDA and the broad discretion afforded online providers regarding content hosting and moderation decisions is not new. However, it was brought into new focus when the President, vexed at the way some of his more controversial posts were being treated by certain social media platforms, issued a May 28, 2020 Executive Order aimed at curtailing legal protections for online providers. The goal was to remedy what the White House viewed as online platforms’ “deceptive or pretextual actions stifling free and open debate by censoring certain viewpoints.”

The Executive Order – which is currently being challenged in court as unconstitutional – directed several federal agencies to undertake certain legislative and regulatory efforts toward CDA reform. Consequently, in June 2020 the DOJ stated that “the time is ripe to realign the scope of Section 230 with the realities of the modern internet” and released a 28-page document with its preliminary recommendations for reform of Section 230. A month later, the Commerce Department submitted a petition requesting that the FCC write rules to limit the scope of CDA immunity and potentially place additional compliance requirements on many providers that host third party content. Then, on September 23, 2020, the DOJ announced that it had sent its legislative proposal for amending the CDA to Congress. In its cover letter to Congress, the DOJ summed up the need for reform: “The proposed legislation accordingly seeks to align the scope of Section 230 immunities with the realities of the modern internet while ensuring that the internet remains a place for free and vibrant discussion.”

Many online services feature comprehensive terms of use intended to protect their businesses from various types of risk. While a great deal of thought often goes into drafting those terms, frequently less attention is paid to how the terms are actually presented to users of the service. As case law continues to demonstrate, certain mobile and website presentations will be held enforceable, while others will not. Courts continue to indicate that the enforceability of terms accessible by hyperlink depends on the totality of the circumstances, namely the clarity and conspicuousness of the relevant interface (both web and mobile) presenting the terms to the user. In a prior post about electronic contracting this year, we outlined, among other things, the danger of a cluttered registration screen. In this post, we spotlight five recent rulings from the past few months in which courts blessed the mobile contracting processes of e-commerce companies, as well as one case that demonstrates the danger of using a pre-checked box to indicate assent to online terms.

The moral of these stories is clear – the presentation of online terms is essential to maximizing the likelihood that they will be enforced if the need arises. Thus, the design of the registration or sign-up page is not just an issue for the marketing, design and technical teams – the legal team must also focus on how a court would likely view the registration interface, including pointing out the little things that can make a big difference in enforceability. A failure to present the terms properly could result in the most carefully drafted terms of service ultimately having no impact on the business at all.
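To make the point concrete, below is a minimal sketch of a sign-up screen built with these rulings in mind. It is purely illustrative and not drawn from any of the cases discussed: the component name, the /terms and /privacy routes, and the framework (React/TypeScript) are hypothetical choices. The design simply reflects the themes courts have emphasized: an uncluttered screen, conspicuous hyperlinks to the terms placed next to the action button, and an affirmative, unchecked-by-default checkbox rather than a pre-checked one.

```tsx
// Hypothetical sign-up form sketch (illustrative only; not from any case discussed).
// Key presentation choices: the checkbox is never pre-checked, the hyperlinks to the
// terms sit adjacent to the action button, and submission is disabled until the user
// affirmatively indicates assent.
import React, { useState } from "react";

export function SignUpTermsNotice() {
  const [agreed, setAgreed] = useState(false); // starts unchecked; user must act

  return (
    <form>
      {/* ...name, email and password fields would appear above... */}
      <label>
        <input
          type="checkbox"
          checked={agreed}
          onChange={(e) => setAgreed(e.target.checked)}
        />{" "}
        I agree to the{" "}
        <a href="/terms" target="_blank" rel="noopener noreferrer">
          Terms of Service
        </a>{" "}
        and{" "}
        <a href="/privacy" target="_blank" rel="noopener noreferrer">
          Privacy Policy
        </a>
      </label>
      <button type="submit" disabled={!agreed}>
        Create account
      </button>
    </form>
  );
}
```

However the screen is implemented, the legal point remains the same: the user should take an affirmative step to accept terms that are clearly identified and reasonably conspicuous at the moment of assent.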

Last week, a putative privacy-related class action was filed in California district court against financial analytics firm Envestnet, Inc. (“Envestnet”), which operates Yodlee, Inc. (“Yodlee”). (Wesch v. Yodlee Inc., No. 20-05991 (N.D. Cal. filed Aug. 25, 2020)). According to the complaint, Yodlee is one of the largest financial data aggregators in the world and through its software platforms, which are built into various fintech products offered by financial institutions, it aggregates financial data such as bank balances and credit card transaction histories from individuals in the United States. The crux of the suit is that Yodlee collects and then sells access to such anonymized financial data without meaningful notice to consumers, and stores or transmits such data without adequate security, all in violation of California and federal privacy laws.

The timing of this case is interesting, as it comes on the heels of the recent settlement of the litigation between the City Attorney of Los Angeles and the operator of a weather app over claims that locational information collected through the weather app was being sold to third parties without adequate permission from the user of the app.

This past week, the operator of the popular Weather Channel (“TWC”) mobile phone app entered into a Stipulation of Settlement with the Los Angeles City Attorney, Mike Feuer (“City Attorney”), closing the books on one of the first litigations to focus on the collection of locational data through mobile phones. (People v. TWC Product and Technology, LLC, No. 19STCV00605 (Cal. Super., L.A. Cty, Stipulation Aug. 14, 2020)). While the settlement appears to allow TWC to continue to use locational information for app-related services and to serve advertising (as long as the app includes some agreed-upon notices and screen prompts to consumers), what is glaringly absent from the settlement is a discussion of sharing locational information with third parties for purposes other than serving advertising or performing services in the app. Because applicable law, industry practice and the policies of Apple and Google themselves have narrowed the ability to share locational information for such purposes, the allegations of the case were, in a sense, subsumed in the tsunami of attention that locational information sharing has attracted. While some are viewing this settlement as a roadmap for locational information collection and sharing, in fact the settlement is quite narrow.

The currents around the Communications Decency Act just got a little more turbulent as the White House and executive branch try to reel in the big fish of CDA reform.

On July 27, 2020, the Commerce Department submitted a petition requesting that the FCC initiate a rulemaking to clarify the provisions of Section 230 of the Communications Decency Act (CDA). Separately, but as part of the same fervor in Washington to “rein in social media,” the leaders of the major technology companies appeared before the House Judiciary Antitrust Subcommittee at a hearing yesterday, July 29, 2020, to discuss the Committee’s ongoing investigation of competition in the digital marketplace, and some members inquired about content moderation practices. Moreover, last month a pair of Senators introduced the PACT Act, a targeted (but still substantial) update to the CDA, and other CDA reform bills are also being debated, including a bill that would carve sexually exploitative material involving children out of the CDA’s reach.