On April 13, 2026, Virginia Governor Abigail Spanberger signed the bipartisan bill SB 338, amending the Virginia Consumer Data Protection Act (VCDPA) to prohibit data controllers from selling, or offering for sale, a consumer’s precise geolocation data. SB 338 replaces the VCDPA’s prior consent-based treatment of precise geolocation data…
White House AI Framework Stakes Out National Policy Position – Developers and Deployers Now Watch and Wait
On March 20, 2026, the White House announced a comprehensive national legislative framework (the “Framework”) that tracks its December 2025 AI Preemption Executive Order and its July 2025 AI Action Plan. The Framework takes aim at hot-button AI policy topics, including child safety and privacy, AI training and copyright, liability protections, and preemption of state laws, measures the Trump administration believes are necessary to maintain America’s leadership in AI innovation. The Administration set an ambitious deadline, calling on Congress to turn the policy blueprint into law by year’s end; some administration advisors are cautiously optimistic that a bipartisan solution based on the Framework is within reach. At the same time, Senator Marsha Blackburn released a 291-page discussion draft for national AI legislation, or “one federal rulebook for AI,” that would generally codify White House policy but also diverges from it in several important ways.
At this juncture – with doubts about whether Congress can form a consensus around AI regulation even as state legislatures step up to fill the void – developers and deployers are left to watch and wait. For now, a patchwork of state AI laws remains, covering everything from child protection and health and safety to transparency measures and automated decisionmaking.
Your Data, Your Price – New York Rolls Out Personalized Algorithmic Pricing Law: Ecommerce Compliance Challenges Ahead
Have you noticed a message on your food delivery app that reads: “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA”?
If so, the reason may be New York’s new personalized algorithmic pricing law (General Business Law § 349-a). Enacted in May 2025 and effective as…
Oregon Strengthens Geolocation Data Privacy and Children’s Personal Data Protections, Adding to Compliance for Data Brokers and Others
On June 3, 2025, Oregon Governor Tina Kotek signed HB 2008 into law, amending the Oregon Consumer Privacy Act,[1] the state’s comprehensive data privacy law. Among other changes, effective January 1, 2026, the law prohibits the “sale” of two categories of personal data:
- Precise geolocation information that can pinpoint an individual or device within a 1,750-foot radius, absent certain communications- or utility-related exceptions
- Personal data of anyone under sixteen years of age, provided that the data controller “has actual knowledge that, or willfully disregards whether, the consumer is under 16 years of age”[2]
The location data provision echoes a similar prohibition that was passed in Maryland last year.[3]
Location data is considered “sensitive” because it can be readily collected from mobile devices or web browsing activities and can reveal a great deal about an individual’s habits, interests and movements. Beyond targeted advertising, anonymized location data can be a valuable source of alternative data for businesses gathering insights on competitors, consumer foot traffic, migration patterns, and population growth.
As a result, the Oregon law – and the possibility of similar enactments in other states restricting the sale of precise location data – represents an important development for data brokers and for entities that use such data for location-based advertising and profiling or to create other data products and insights. HB 2008’s definition of “sale” may affect not just direct sales of precise location data but also bundling and other licensing arrangements, subject to certain exceptions and permitted uses. The new law will also add to the due diligence customers perform when examining their data vendors’ collection practices.
Take It Down Act Signed into Law, Offering Tools to Fight Non-Consensual Intimate Images and Creating a New Image Takedown Mechanism
- Law establishes national prohibition against nonconsensual online publication of intimate images of individuals, both authentic and computer-generated.
- First federal law regulating AI-generated content.
- Creates requirement that covered platforms promptly remove depictions upon receiving notice of their existence and a valid takedown request.
- For many online service providers, complying with the Take It Down Act’s notice-and-takedown requirement may warrant revising their existing DMCA takedown notice provisions and processes.
- Another carve-out to CDA immunity? More like a dichotomy of sorts…
On May 19, 2025, President Trump signed the bipartisan-supported Take It Down Act into law. The law prohibits any person from using an “interactive computer service” to publish, or threaten to publish, nonconsensual intimate imagery (NCII), including AI-generated NCII (colloquially known as revenge pornography or deepfake revenge pornography). Additionally, the law requires that, within one year of enactment, social media companies and other covered platforms implement a notice-and-takedown mechanism that allows victims to report NCII. Platforms must then remove properly reported imagery (and any known identical copies) within 48 hours of receiving a compliant request.
California Enacts Additional Generative AI Bills Touching on Training Data
After several weeks of handwringing about the fate of SB 1047 – the controversial AI safety bill that would have required developers of powerful AI models and entities providing the computing resources to train such models to put appropriate safeguards and policies into place to prevent critical harms – California…
California Enacts Generative AI Law Addressing “Digital Replicas” of Performers
On September 17, 2024, Governor Gavin Newsom signed AB 2602 into California law (to be codified at Cal. Lab. Code §927). The law addresses the use of “digital replicas” of performers. As defined in the law, a digital replica is:
a computer-generated, highly realistic electronic representation that is readily identifiable…
Colorado Expands “Right-to-Repair” Law
On May 28, 2024, Colorado Governor Jared Polis signed into law the “Consumer Right to Repair Digital Electronic Equipment” bill (HB24-1121). The legislation expands the state’s 2023 right to repair law, which currently applies to agricultural equipment and powered wheelchairs. Generally, the law will broaden the ability of…
The King is Back (in the Digital Era) | The ELVIS Act, Generative AI and Right of Publicity
On March 21, 2024, in a bold regulatory move, Tennessee Governor Bill Lee signed the Ensuring Likeness Voice and Image Security (“ELVIS”) Act (Tenn. Code Ann. §47-25-1101 et seq.) – a law which, as Gov. Lee stated, covers “new, personalized generative AI cloning models and services that enable human…
Generative AI Providers Subject to Reduced CDA Immunity Under Proposed Legislation
One of the many legal questions swirling around in the world of generative AI (“GenAI”) is to what extent Section 230 of the Communications Decency Act (CDA) applies to the provision of GenAI. Can CDA immunity apply to GenAI-generated output and protect GenAI providers from potential third party liability?
On June 14, 2023, Senators Richard Blumenthal and Josh Hawley introduced the “No Section 230 Immunity for AI Act,” bipartisan legislation that would expressly remove most CDA immunity for a provider of an interactive computer service if the conduct underlying the claim or charge “involves the use or provision of generative artificial intelligence by the interactive computer service.” Specifically, the bill would eliminate “publisher” immunity under § 230(c)(1) for such claims. It would not, however, affect immunity for so-called “Good Samaritan” blocking under § 230(c)(2)(A), which protects service providers and users from liability for good faith actions to screen or restrict access to “objectionable” material on their services.