The Federal Trade Commission (FTC) has long expressed concern about the potential misuse of location data. For example, in a 2022 blog post, “Location, health, and other sensitive information: FTC committed to fully enforcing the law against illegal use and sharing of highly sensitive data,” the agency termed the entire location data ecosystem “opaque,” and it has investigated the practices of, and brought enforcement actions against, mobile app operators and data brokers with respect to sensitive data.

One such FTC enforcement action began with an August 2022 complaint against Kochava, Inc. (“Kochava”), a digital marketing and analytics firm, seeking an order halting Kochava’s alleged acquisition and downstream sale of “massive amounts” of precise geolocation data collected from consumers’ mobile devices. In that complaint, the FTC alleged that Kochava, in its role as a data broker, collects a wealth of information about consumers and their mobile devices by, among other means, purchasing data from outside entities to sell to its own customers. The FTC further alleged that the location data Kochava provided to its customers was not anonymized and that it was possible, using third-party services, to combine the geolocation data with other information to identify a mobile device user or owner.

In May 2023, an Idaho district court granted Kochava’s motion to dismiss the FTC’s complaint, with leave to amend. The FTC then filed an amended complaint, and Kochava requested that the court keep the amended complaint under seal, which the court did temporarily, pending a ruling on the merits of the parties’ arguments.

On November 3, 2023, the court granted the FTC’s motion to unseal the amended complaint, finding no “compelling reason” to keep the amended complaint under seal and rejecting Kochava’s arguments that the amended complaint’s allegations were “knowingly false” or “misleading.” (FTC v. Kochava Inc., No. 22-00377 (D. Idaho Nov. 3, 2023)). As a result, the FTC’s amended complaint has been unsealed to the public.

On October 30, 2023, President Biden issued an “Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence” (Fact Sheet, here) designed to spur new AI safety and security standards, encourage the development of privacy-preserving technologies in conjunction with AI training, address certain instances of algorithmic discrimination, advance the responsible use of AI in healthcare, study the impacts of AI on the labor market, support AI research and a competitive environment in the industry, and issue guidance on the use of AI by federal agencies. This latest move builds on the White House’s previously released “Blueprint for an AI Bill of Rights” and its announcement this past summer that it had secured voluntary commitments from major AI companies focusing on what the White House termed the “three principles that must be fundamental to the future of AI – safety, security, and trust.”

Roughly two weeks apart, on July 21, 2022 and August 5, 2022, respectively, Amazon made headlines for agreeing to acquire One Medical, “a human-centered and technology-powered primary care organization,” for approximately $3.9 billion, and iRobot, a global consumer robot company known for its creation of the Roomba vacuum.

On August 29, 2022, the Federal Trade Commission (FTC) announced that it had filed a complaint against Kochava, Inc. (“Kochava”), a digital marketing and analytics firm, seeking an order halting Kochava’s alleged acquisition and downstream sale of “massive amounts” of precise geolocation data collected from consumers’ mobile devices.

The complaint alleges that the data is collected in a format that would allow third parties to track consumers’ movements to and from sensitive locations, including those related to reproductive health, places of worship, and their private residences, among others.  The FTC alleged that “consumers have no insight into how this data is used” and that they do not typically know that inferences about them and their behaviors will be drawn from this information.  The FTC claimed that the sale or license of this sensitive data, which could present an “unwarranted intrusion” into personal privacy, was an unfair business practice under Section 5 of the FTC Act.

On August 11, 2022, the Federal Trade Commission (FTC) issued an Advance Notice of Proposed Rulemaking (ANPR) and announced it was exploring a rulemaking process to “crack down on harmful commercial surveillance” and lax data security.  The agency defines commercial surveillance as “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information.”

The FTC View

The FTC has not released any proposed rules but seeks public comment on the harms stemming from commercial surveillance and on whether new rules are needed to protect consumer data privacy. As part of the ANPR, and before setting out a host of questions for public comment, the FTC offers its take on the opaque ecosystem surrounding the collection of mobile data and personal information (which the FTC asserts is often done without consumers’ full understanding). The FTC discusses the subsequent sharing and sale of information to data aggregators and brokers that then sell data access or data analysis products to marketers, researchers, or other businesses interested in gaining insights from alternative data sources. The agency argues that, based on news reporting, published research, and its own enforcement actions, the benefits of the current consumer data marketplace may be outweighed by “harmful commercial surveillance and lax data security practices,” thus potentially requiring rules to protect consumers and to offer more regulatory clarity to companies beyond the FTC’s case-by-case enforcement. As FTC Chair Lina Khan said in her statement accompanying the ANPR: “[T]he growing digitization of our economy—coupled with business models that can incentivize endless hoovering up of sensitive user data and a vast expansion of how this data is used —means that potentially unlawful practices may be prevalent, with case-by-case enforcement failing to adequately deter lawbreaking or remedy the resulting harms.”

FTC Invitation for Comment

After describing its view of the issues, the Commission invites public comment on whether it should implement new trade regulation rules or other regulatory alternatives concerning the ways companies (1) collect, aggregate, protect, use, analyze, and retain consumer data, and (2) transfer, share, sell, or otherwise monetize that data in ways that are unfair or deceptive. The ANPR poses myriad questions (too numerous to list here; a fact sheet is available here, and the press release also offers a breakdown). Still, perhaps the multimillion-dollar questions asked by the agency are these: Which kinds of data should be subject to a potential privacy rule? To what extent, if at all, should a new regulation impose limitations on companies’ collection, use, and retention of consumer data?

On July 11, 2022, the Federal Trade Commission (FTC) published “Location, health, and other sensitive information: FTC committed to fully enforcing the law against illegal use and sharing of highly sensitive data,” on its Business Blog.  The blog post is likely related to an Executive Order (the “EO”) signed by President Biden in the wake of the Supreme Court’s Dobbs decision. Among other things, the EO directed the FTC to consider taking steps to protect consumers’ privacy when seeking information about and related to the provision of reproductive health care services.

While this latest drumbeat came from the FTC, we expect other regulators to turn their attention to the issue as well, including, perhaps, the Department of Justice and state attorneys general.

Although the FTC post centers on location data and reproductive health services, it is likely that there will be more scrutiny of the collection and use of location data in general. This renewed focus will potentially subject a wide group of digital ecosystem participants – interactive platforms, app publishers, software development kit (SDK) developers, data brokers and data analytics firms – to increased attention over practices concerning the collection, sharing and perceived misuse of data generally.

On June 15, 2022, Senator Elizabeth Warren introduced a bill, cosponsored by a host of other Democratic and independent Senators, the “Health and Location Data Protection Act of 2022,” which, subject to a few exceptions, would, among other things, prohibit the selling, sharing or transferring of location data and health data. The bill gives the Federal Trade Commission (FTC) rulemaking and enforcement authority for violations of the law and also grants state attorneys general the right to bring actions; notably, the bill would also give a private right of action to persons adversely affected by a violation of the proposed law.

The concept of the “metaverse” has garnered much press coverage of late, addressing such topics as the new appetite for metaverse investment opportunities, a recent virtual land boom, or just the promise of it all, where “crypto, gaming and capitalism collide.”  The term “metaverse,” which comes from Neal Stephenson’s 1992 science fiction novel “Snow Crash,” is generally used to refer to the development of virtual reality (VR) and augmented reality (AR) technologies, featuring a mashup of massive multiplayer gaming, virtual worlds, virtual workspaces, and remote education to create a decentralized wonderland and collaborative space. The grand concept is that the metaverse will be the next iteration of the mobile internet and a major part of both digital and real life.

Don’t feel like going out tonight in the real world? Why not stay “in” and catch a show or meet people/avatars/smart bots in the metaverse?

As currently conceived, the metaverse, “Web 3.0,” would feature a synchronous environment giving users a seamless experience across different realms, even if such discrete areas of the virtual world are operated by different developers. It would boast its own economy where users and their avatars interact socially and use digital assets based in both virtual and actual reality, a place where commerce would presumably be heavily based in decentralized finance (DeFi). No single company or platform would operate the metaverse; rather, it would be administered by many entities in a decentralized manner (presumably on some open source metaverse OS) and work across multiple computing platforms. At the outset, the metaverse would look like a virtual world featuring enhanced experiences interfaced via VR headsets, mobile devices, gaming consoles and haptic gear that makes you “feel” virtual things. Later, the contours of the metaverse would be shaped by user preferences, monetary opportunities and incremental innovations by developers building on what came before.

In short, the vision is that multiple companies, developers and creators will come together to create one metaverse (as opposed to proprietary, closed platforms) and have it evolve into an embodied mobile internet, one that is open and interoperable and would include many facets of life (i.e., work, social interactions, entertainment) in one hybrid space.

For the metaverse to become a reality – that is, to successfully link current gaming and communications platforms with other new technologies into a massive new online destination – many obstacles will have to be overcome, even beyond the hardware, software and integration issues. The legal issues stand out, front and center. Indeed, the concept of the metaverse presents a law school final exam’s worth of legal questions to sort out. Meanwhile, we are still trying to resolve the myriad legal issues presented by “Web 2.0,” the Internet as we know it today. Adding the metaverse to the picture will certainly make things even more complicated.