On October 24, 2022, a Delaware district court held that certain claims under the Computer Fraud and Abuse Act (CFAA) relating to the controversial practice of web scraping were sufficient to survive the defendant’s motion to dismiss. (Ryanair DAC v. Booking Holdings Inc., No. 20-01191 (D. Del. Oct. 24, 2022)). The opinion potentially breathes life into the use of the CFAA to combat unwanted scraping.

In the case, Ryanair DAC (“Ryanair”), a European low-fare airline, brought various claims against Booking Holdings Inc. and its well-known suite of online travel and hotel booking websites (collectively, “Defendants”) for allegedly scraping the ticketing portion of the Ryanair website. Ryanair asserted that this portion of the site is accessible only to logged-in users and that the data there is therefore not public data.

The decision is important as it offers answers (at least from one district court) to several unsettled legal issues about the scope of CFAA liability related to screen scraping. In particular, the decision addresses:

  • the potential for vicarious liability under the CFAA (which is important as many entities retain third party service providers to perform scraping)
  • how a data scraper’s use of evasive measures (e.g., spoofed email addresses, rotating IP addresses) may be considered under a CFAA claim centered on an “intent to defraud”
  • clarification as to the potential role of technical website-access limitations in analyzing CFAA “unauthorized access” liability

To find answers to these questions, the court’s opinion distills the holdings of two important recent CFAA rulings: the Supreme Court’s holding in Van Buren, which adopted a narrow interpretation of “exceeds authorized access” under the CFAA, and the Ninth Circuit’s holding in the hiQ screen scraping case, where that court found that the concept of “without authorization” under the CFAA does not apply to “public” websites.

Roughly two weeks apart, on July 21, 2022 and August 5, 2022, respectively, Amazon made headlines for agreeing to acquire One Medical, “a human-centered and technology-powered primary care organization,” for approximately $3.9 billion, and iRobot, a global consumer robot company known for its creation of the Roomba vacuum, for approximately $1.7 billion.

Since the passage of Section 230 of the Communications Decency Act (“CDA”), the majority of federal circuits have interpreted the CDA to establish broad federal immunity to causes of action that would treat service providers as publishers of content provided by third parties.  The CDA was passed in the early days of e-commerce and was written broadly enough to cover not only the online bulletin boards and not-so-very interactive websites that were common then, but also more modern online services, web 2.0 offerings and today’s platforms that might use algorithms to organize, repackage or recommend user-generated content.

Over 25 years ago, the Fourth Circuit, in the landmark Zeran case, the first major circuit court-level decision interpreting Section 230, held that Section 230 bars lawsuits, which, at their core, seek to hold a service provider liable for its exercise of a publisher’s “traditional editorial functions — such as deciding whether to publish, withdraw, postpone or alter content.” Courts have generally followed this reasoning ever since to determine whether an online provider is being treated as a “publisher” of third party content and thus entitled to immunity under the CDA.  The scope of “traditional editorial functions” is at the heart of a case currently on the docket at the Supreme Court. On October 3, 2022, the Supreme Court granted certiorari in an appeal that is challenging whether a social media platform’s targeted algorithmic recommendations fall under the umbrella of “traditional editorial functions” protected by the CDA or whether such recommendations are not the actions of a “publisher” and thus fall outside of CDA immunity. (Gonzalez v. Google LLC, No. 21-1333 (U.S. cert. granted Oct. 3, 2022)).

In a recent ruling, a California district court held that Apple, as operator of the App Store, was protected from liability for losses resulting from a fraudulent third-party app offered on its platform. (Diep v. Apple Inc., No. 21-10063 (N.D. Cal. Sept. 2, 2022)). This case is important in that it illustrates how broadly courts may apply immunity to shield app store operators from claims arising out of third-party content distributed through their marketplaces.

On August 29, 2022, the Federal Trade Commission (FTC) announced that it had filed a complaint against Kochava, Inc. (“Kochava”), a digital marketing and analytics firm, seeking an order halting Kochava’s alleged acquisition and downstream sale of “massive amounts” of precise geolocation data collected from consumers’ mobile devices.

The complaint alleges that the data is collected in a format that would allow third parties to track consumers’ movements to and from sensitive locations, including those related to reproductive health, places of worship, and private residences, among others.  The FTC alleged that “consumers have no insight into how this data is used” and that they do not typically know that inferences about them and their behaviors will be drawn from this information.  The FTC claimed that the sale or license of this sensitive data, which could present an “unwarranted intrusion” into personal privacy, was an unfair business practice under Section 5 of the FTC Act.

On August 11, 2022, the Federal Trade Commission (FTC) issued an Advance Notice of Proposed Rulemaking (ANPR) and announced it was exploring a rulemaking process to “crack down on harmful commercial surveillance” and lax data security.  The agency defines commercial surveillance as “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information.”

The FTC View

The FTC has not released any proposed rules but seeks public comment on the harms stemming from commercial surveillance and whether new rules are needed to protect consumer data privacy. As part of the ANPR, and before setting out a host of questions for public comment, the FTC offers its take on the opaque ecosystem surrounding the collection of mobile data and personal information (which the FTC asserts is often done without consumers’ full understanding). The FTC discusses the subsequent sharing and sale of information to data aggregators and brokers that then sell data access or data analysis products to marketers, researchers, or other businesses interested in gaining insights from alternative data sources. The agency argues that based on news reporting, published research and its own enforcement actions, the benefits of the current consumer data marketplace may be outweighed by “harmful commercial surveillance and lax data security practices,” thus potentially requiring rules to protect consumers and to offer more regulatory clarity to companies beyond the FTC’s case-by-case enforcement. As FTC Chair Lina Khan said in her statement accompanying the ANPR: “[T]he growing digitization of our economy—coupled with business models that can incentivize endless hoovering up of sensitive user data and a vast expansion of how this data is used —means that potentially unlawful practices may be prevalent, with case-by-case enforcement failing to adequately deter lawbreaking or remedy the resulting harms.”

FTC Invitation for Comment

After describing its view of the issues, the Commission invites public comment on whether it should implement new trade regulation rules or other regulatory alternatives concerning the ways companies (1) collect, aggregate, protect, use, analyze, and retain consumer data, as well as (2) transfer, share, sell, or otherwise monetize that data in ways that are unfair or deceptive.  The ANPR poses a myriad of questions (too numerous to list here; the FTC’s accompanying fact sheet and press release offer a breakdown). Perhaps, though, the multimillion-dollar questions asked by the agency are these: Which kinds of data should be subject to a potential privacy rule?  To what extent, if at all, should a new regulation impose limitations on companies’ collection, use, and retention of consumer data?

On July 11, 2022, the Federal Trade Commission (FTC) published “Location, health, and other sensitive information: FTC committed to fully enforcing the law against illegal use and sharing of highly sensitive data,” on its Business Blog.  The blog post is likely related to an Executive Order (the “EO”) signed by President Biden in the wake of the Supreme Court’s Dobbs decision. Among other things, the EO directed the FTC to consider taking steps to protect consumers’ privacy when seeking information about and related to the provision of reproductive health care services.

While the latest drumbeat on this issue came from the FTC, we expect to see attention to it from other regulators as well, including, perhaps, the Department of Justice and state attorneys general.

Although the FTC post centers on location data and reproductive health services, it is likely that there will be more scrutiny of the collection and use of location data in general. This renewed focus will potentially subject a wide group of digital ecosystem participants to increased attention.  The spotlight will likely fall on interactive platforms, app publishers, software development kit (SDK) developers, data brokers and data analytics firms – over practices concerning the collection, sharing and perceived misuse of data generally.

Can internet service providers always be compelled to unmask anonymous copyright infringers? In an opinion touching on Digital Millennium Copyright Act (DMCA) subpoenas, First Amendment concerns, and fair use, the Northern District of California said, in this one particular instance, no, granting Twitter’s motion to quash a subpoena seeking to reveal the identity of an anonymous poster. (In re DMCA § 512(h) Subpoena to Twitter, Inc., No. 20-80214 (N.D. Cal. June 21, 2022)). The anonymous figure at the center of the dispute is @CallMeMoneyBags, an anonymous Twitter user who posts criticisms of wealthy people, particularly those working in tech, finance, and politics. Some such criticism lies at the heart of this dispute.

On June 15, 2022, Senator Elizabeth Warren introduced a bill, cosponsored by a host of other Democratic and independent Senators, the “Health and Location Data Protection Act of 2022,” which, subject to a few exceptions, would, among other things, prohibit the selling, sharing or transferring of location data and health data. The bill gives the Federal Trade Commission (FTC) rulemaking and enforcement authority for violations of the law and also grants state attorneys general the right to bring actions; notably, the bill would also give a private right of action to persons adversely affected by a violation of the proposed law.

Web 3.0 and the promise of the metaverse have generated excitement about new markets for businesses large and small. But as with any technological frontier, legal uncertainties cause new risks to emerge alongside the opportunities. One area currently full of legal questions is trademark law. We will examine what we know, and what remains unsettled, about trademark protection and enforcement in these new virtual environments.