In Cody v. Jill Acquisition LLC, No. 25-937 (S.D. Cal. June 30, 2025), the Southern District of California declined to enforce a retail site’s terms of use and compel arbitration, holding that the plaintiff, who used guest checkout to place an online order at the retail clothing site, did not have adequate notice of the terms and the arbitration clause. This case should serve as a wake-up call for online entities to reexamine their electronic contracting processes. It shows that even where a website’s visual design and placement of the hyperlinked Terms of Use during checkout are comparable to presentations that courts have deemed enforceable, a court may still decline to enforce online terms when the transaction departs from the typical e-commerce transaction between a registered customer and a retail site. Here, the court found that a user checking out as a guest, without creating an account, was less likely to expect a continuing relationship; as a result, the site’s notice and presentation of the terms below the “Place Order” button were not conspicuous enough to bind the plaintiff.

On June 3, 2025, Oregon Governor Tina Kotek signed HB 2008 into law to amend the Oregon Consumer Privacy Act,[1] the state’s comprehensive data privacy law. Among other changes, effective January 1, 2026, the “sale” of two categories of personal data will be prohibited:

  • Precise geolocation information that can pinpoint an individual or device within a 1,750-foot radius, absent some specific communications or utility-related exceptions
  • Personal data of anyone under sixteen years of age, provided that the data controller “has actual knowledge that, or willfully disregards whether, the consumer is under 16 years of age”[2]

The location data provision echoes a similar prohibition that was passed in Maryland last year.[3] 

Location data is considered “sensitive” because it can be readily collected from mobile devices or web browsing activities and can reveal a great deal about an individual’s habits, interests and movements. Beyond targeted advertising, anonymized location data can be a valuable source of alternative data for businesses gathering insights on competitors, consumer foot traffic, migration patterns and population growth.

As a result, the Oregon law – and the possibility of similar state enactments restricting the sale of precise location data – represents an important development for data brokers and for entities that use such data for location-based advertising and profiling or to create other data products and insights. HB 2008’s definition of “sale” may potentially reach not just direct sales of precise location data but also bundling and other licensing arrangements, subject to certain exceptions and permitted uses. The new law will also add to the due diligence customers must perform when examining their data vendors’ collection practices.

  • Law establishes national prohibition against nonconsensual online publication of intimate images of individuals, both authentic and computer-generated.
  • First federal law regulating AI-generated content.
  • Creates requirement that covered platforms promptly remove depictions upon receiving notice of their existence and a valid takedown request.
  • For many online service providers, complying with the Take It Down Act’s notice-and-takedown requirement may warrant revising their existing DMCA takedown notice provisions and processes.
  • Another carve-out to CDA immunity? More like a dichotomy of sorts…

On May 19, 2025, President Trump signed the bipartisan-supported Take It Down Act into law. The law prohibits any person from using an “interactive computer service” to publish, or threaten to publish, nonconsensual intimate imagery (NCII), including AI-generated NCII (colloquially known as revenge pornography or deepfake revenge pornography). Additionally, the law requires that, within one year of enactment, social media companies and other covered platforms implement a notice-and-takedown mechanism that allows victims to report NCII. Platforms must then remove properly reported imagery (and any known identical copies) within 48 hours of receiving a compliant request.

In May 2024, we released Part I of this series, in which we discussed agentic AI as an emerging technology enabling a new generation of AI-based hardware devices and software tools that can take actions on behalf of users. It turned out we were early – very early – to the discussion, with several months elapsing before agentic AI became as widely known and discussed as it is today. In this Part II, we return to the topic to explore legal issues concerning user liability for agentic AI-assisted transactions and open questions about existing legal frameworks’ applicability to the new generation of AI-assisted transactions.

Background: Snapshot of the Current State of “Agents”[1]

“Intelligent” electronic assistants are not new—the original generation, such as Amazon’s Alexa, has been offering narrow capabilities for specific tasks for more than a decade. However, as OpenAI’s CEO Sam Altman commented in May 2024, an advanced AI assistant or “super-competent colleague” could be the killer app of the future. Later, Altman noted during a Reddit AMA session: “We will have better and better models. But I think the thing that will feel like the next giant breakthrough will be agents.” A McKinsey report on AI agents echoes this sentiment: “The technology is moving from thought to action.” Agentic AI represents not only a technological evolution, but also a potential means to further spread (and monetize) AI technology beyond its current uses by consumers and businesses. Major AI developers and others have already embraced this shift, announcing initiatives in the agentic AI space. For example:

  • Anthropic announced an updated frontier AI model in public beta capable of interacting with and using computers like human users;
  • Google unveiled Gemini 2.0, its new AI model for the agentic era, alongside Project Mariner, a prototype leveraging Gemini 2.0 to perform tasks via an experimental Chrome browser extension (while keeping a “human in the loop”);
  • OpenAI launched a “research preview” of Operator, an AI tool that can interface with computers on users’ behalf, and launched the beta feature “Tasks” in ChatGPT to facilitate ongoing or future task management beyond merely responding to real-time prompts;
  • LexisNexis announced the availability of “Protégé,” a personalized AI assistant with agentic AI capabilities;
  • Perplexity recently rolled out “Shop Like a Pro,” an AI-powered shopping recommendation and buying feature that allows Perplexity Pro users to research products and, for those merchants whose sites are integrated with the tool, purchase items directly on Perplexity; and
  • Amazon announced Alexa+, a new generation of Alexa that has agentic capabilities, including enabling Alexa to navigate the internet and execute tasks, as well as Amazon Nova Act, an AI model designed to perform actions within a web browser.

Beyond these examples, other startups and established tech companies are also developing AI “agents” in this country and overseas (including the invite-only release of Manus AI by Butterfly Effect, an AI developer in China). As a recent Microsoft piece speculates, the generative AI future may involve a “new ecosystem or marketplace of agents,” akin to the current smartphone app ecosystem.  Although early agentic AI device releases have received mixed reviews and seem to still have much unrealized potential, they demonstrate the capability of such devices to execute multistep actions in response to natural language instructions.

Like prior technological revolutions—personal computers in the 1980s, e-commerce in the 1990s and smartphones in the 2000s—the emergence of agentic AI technology challenges existing legal frameworks. Let’s take a look at some of those issues – starting with basic questions about contract law.

UPDATE (April 17, 2025): The below reflects a development occurring after our publication of the original post.

On April 11, 2025, the National Security Division (the “NSD”) released several documents setting out initial guidance on how to comply with the Rule, which the NSD refers to as the Data Security

  • Uses of Information Limited to “What is Reasonably Necessary”
  • Use of Deidentified Data Not Within Scope
  • Screen Scraping Survives

After a yearslong lead-up, the Consumer Financial Protection Bureau (CFPB) published its final “open banking” rule in October. The rule effectuates the section of the Consumer Financial Protection Act, which charged

After several weeks of handwringing about the fate of SB 1047 – the controversial AI safety bill that would have required developers of powerful AI models and entities providing the computing resources to train such models to put appropriate safeguards and policies into place to prevent critical harms – California

On September 17, 2024, Governor Gavin Newsom signed AB 2602 into California law (to be codified at Cal. Lab. Code §927).  The law addresses the use of “digital replicas” of performers.  As defined in the law, a digital replica is:

a computer-generated, highly realistic electronic representation that is readily identifiable

In an ongoing dispute commenced in 2016, the Eleventh Circuit, for the second time in the litigation’s lifetime, considered trade secret misappropriation and related copyright claims in a scraping case between direct competitors.

The case involved plaintiff Compulife Software, Inc. (“Plaintiff” or “Compulife”) – in the business of