New Media and Technology Law Blog

Important CDA Section 230 Case Lands in Supreme Court: Level of Protection Afforded Modern Online Platforms at Stake

Since the passage of Section 230 of the Communications Decency Act (“CDA”), the majority of federal circuits have interpreted the CDA to establish broad federal immunity from causes of action that would treat service providers as publishers of content provided by third parties.  The CDA was passed in the early days of e-commerce and was written broadly enough to cover not only the online bulletin boards and not-so-very interactive websites that were common then, but also more modern online services, web 2.0 offerings and today’s platforms that might use algorithms to organize, repackage or recommend user-generated content.

Over 25 years ago, the Fourth Circuit, in the landmark Zeran case, the first major circuit court-level decision interpreting Section 230, held that Section 230 bars lawsuits that, at their core, seek to hold a service provider liable for its exercise of a publisher’s “traditional editorial functions — such as deciding whether to publish, withdraw, postpone or alter content.” Courts have generally followed this reasoning ever since to determine whether an online provider is being treated as a “publisher” of third party content and thus entitled to immunity under the CDA.  The scope of “traditional editorial functions” is at the heart of a case currently on the docket at the Supreme Court. On October 3, 2022, the Supreme Court granted certiorari in an appeal challenging whether a social media platform’s targeted algorithmic recommendations fall under the umbrella of “traditional editorial functions” protected by the CDA or whether such recommendations are not the actions of a “publisher” and thus fall outside of CDA immunity. (Gonzalez v. Google LLC, No. 21-1333 (U.S. cert. granted Oct. 3, 2022)).

App Store Protected by CDA Immunity (and Limitation of Liability) for Losses from Fraudulent Crypto Wallet App

In a recent ruling, a California district court held that Apple, as operator of the App Store, was protected from liability for losses resulting from a fraudulent crypto wallet app downloaded from its platform. (Diep v. Apple Inc., No. 21-10063 (N.D. Cal. Sept. 2, 2022)). This case is important in that, on a motion to dismiss, a platform provider was able to use both statutory and contractual protections to avoid liability for the acts of third party cyber criminals.

Read the full post on our Blockchain and the Law blog.

FTC Sues Data Provider over the Collection and Sale of Geolocation Data

On August 29, 2022, the Federal Trade Commission (FTC) announced that it had filed a complaint against Kochava, Inc. (“Kochava”), a digital marketing and analytics firm, seeking an order halting Kochava’s alleged acquisition and downstream sale of “massive amounts” of precise geolocation data collected from consumers’ mobile devices.

The complaint alleges that the data is collected in a format that would allow third parties to track consumers’ movements to and from sensitive locations, including those related to reproductive health, places of worship, and their private residences, among others.  The FTC alleged that “consumers have no insight into how this data is used” and that they do not typically know that inferences about them and their behaviors will be drawn from this information.  The FTC claimed that the sale or license of this sensitive data, which could present an “unwarranted intrusion” into personal privacy, was an unfair business practice under Section 5 of the FTC Act.

Businesses That Use Consumer Data or Data Products (Everyone?) Take Heed: FTC Moves Ahead with Rulemaking Process on “Commercial Surveillance” Practices

On August 11, 2022, the Federal Trade Commission (FTC) issued an Advance Notice of Proposed Rulemaking (ANPR) and announced it was exploring a rulemaking process to “crack down on harmful commercial surveillance” and lax data security.  The agency defines commercial surveillance as “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information.”

The FTC View

The FTC has not released any proposed rules but seeks public comment on the harms stemming from commercial surveillance and whether new rules are needed to protect consumer data privacy. As part of the ANPR, and before setting out a host of questions for public comment, the FTC offers its take on the opaque ecosystem surrounding the collection of mobile data and personal information (which the FTC asserts is often done without consumers’ full understanding). The FTC discusses the subsequent sharing and sale of information to data aggregators and brokers that then sell data access or data analysis products to marketers, researchers, or other businesses interested in gaining insights from alternative data sources. The agency argues that based on news reporting, published research and its own enforcement actions, the benefits of the current consumer data marketplace may be outweighed by “harmful commercial surveillance and lax data security practices,” thus potentially requiring rules to protect consumers and to offer more regulatory clarity to companies beyond the FTC’s case-by-case enforcement. As FTC Chair Lina Khan said in her statement accompanying the ANPR: “[T]he growing digitization of our economy—coupled with business models that can incentivize endless hoovering up of sensitive user data and a vast expansion of how this data is used —means that potentially unlawful practices may be prevalent, with case-by-case enforcement failing to adequately deter lawbreaking or remedy the resulting harms.”

FTC Invitation for Comment

After describing the FTC view on the issues, the Commission invites public comment on whether it should implement new trade regulation rules or other regulatory alternatives concerning the ways companies (1) collect, aggregate, protect, use, analyze, and retain consumer data, as well as (2) transfer, share, sell, or otherwise monetize that data in ways that are unfair or deceptive.  Within the ANPR are a myriad of questions (too numerous to list here; a fact sheet is available here and the press release also offers a breakdown). Perhaps, though, the multimillion-dollar questions asked by the agency are these: Which kinds of data should be subject to a potential privacy rule?  To what extent, if at all, should a new regulation impose limitations on companies’ collection, use, and retention of consumer data?

FTC Blog Post Highlights Regulatory Focus on Collection of Location and Health Data

On July 11, 2022, the Federal Trade Commission (FTC) published “Location, health, and other sensitive information: FTC committed to fully enforcing the law against illegal use and sharing of highly sensitive data,” on its Business Blog.  The blog post is likely related to an Executive Order (the “EO”) signed by President Biden in the wake of the Supreme Court’s Dobbs decision. Among other things, the EO directed the FTC to consider taking steps to protect consumers’ privacy when seeking information about and related to the provision of reproductive health care services.

While the latest drumbeat on this issue came from the FTC, we expect to see attention from other regulators as well, including, perhaps, the Department of Justice and state attorneys general.

Although the FTC post centers on location data and reproductive health services, it is likely that there will be more scrutiny of the collection and use of location data in general. This renewed focus will potentially subject a wide group of digital ecosystem participants to increased attention.  The spotlight will likely fall on interactive platforms, app publishers, software development kit (SDK) developers, data brokers and data analytics firms – over practices concerning the collection, sharing and perceived misuse of data generally.

Unmasking Anonymous Copyright Infringers: Where the DMCA, First Amendment, and Fair Use Meet

Can internet service providers always be compelled to unmask anonymous copyright infringers? In an opinion touching on Digital Millennium Copyright Act (DMCA) subpoenas, First Amendment concerns, and fair use, the Northern District of California said, in this one particular instance, no, granting Twitter’s motion to quash a subpoena seeking to reveal information behind an anonymous poster. (In re DMCA § 512(h) Subpoena to Twitter, Inc., No. 20-80214 (N.D. Cal. June 21, 2022)). The anonymous figure at the center of the dispute is @CallMeMoneyBags, an anonymous Twitter user who posts criticisms of wealthy people—particularly those working in tech, finance, and politics. Some such criticism lies at the heart of this dispute.

Senator Warren Introduces Bill to Ban the Sale of Location and Health Data

On June 15, 2022, Senator Elizabeth Warren introduced a bill, cosponsored by a host of other Democratic and independent Senators, the “Health and Location Data Protection Act of 2022,” which, subject to a few exceptions, would, among other things, prohibit the selling, sharing or transferring of location data and health data. The bill gives the Federal Trade Commission (FTC) rulemaking and enforcement authority for violations of the law and also grants state attorneys general the right to bring actions; notably, the law would also give a private right of action to persons adversely affected by a violation of the proposed law.

Three Questions Brands Must Ask about Trademarks and the Metaverse

Web 3.0 and the promise of the metaverse have generated excitement about new markets for businesses large and small. But as with any technological frontier, legal uncertainties cause new risks to emerge alongside the opportunities. One area currently full of legal questions is trademark law. We will examine what we see as three of the biggest open questions that should make anyone entering the metaverse tread carefully—issues that should prompt even organizations staying IRL to begin considering how to protect their brands from unexpected challenges in the virtual world.

Read the full article at The Drum.

DOJ Revises Policy for CFAA Prosecution to Reflect Developments in Web Scraping and Other Matters

On May 19, 2022, the Department of Justice (DOJ) announced that it had revised its policy regarding prosecution under the federal anti-hacking statute, the Computer Fraud and Abuse Act (CFAA). Since the DOJ last made changes to its CFAA policy in 2014, there have been a number of relevant developments in technology and business practices, most notably related to web scraping.  Among other things, the revised policy reflects aspects of the evolving views of this sometimes-controversial statute and the outcome of two major CFAA court decisions in the last year (the Ninth Circuit hiQ decision and the Supreme Court’s Van Buren decision), both of which adopted a narrow interpretation of the CFAA in situations beyond a traditional outside computer hacker scenario.

While the DOJ’s revised CFAA policy is only binding on federal CFAA criminal prosecution decisions (and could be amended by subsequent Administrations) and does not directly affect state prosecutions (including under the many state versions of the CFAA) or civil litigation in the area, it is likely to be relevant and influential in those situations as well, and in particular, with respect to web scraping. It seems that even the DOJ has conceded that the big hiQ and Van Buren court decisions have mostly (but not entirely) eliminated the threat of criminal prosecution under the CFAA when it comes to the scraping of “public” data. Still, as described below, the DOJ’s revisions to its policy, as written, are not entirely consistent with the hiQ decision.
