One of the many legal questions swirling around in the world of generative AI (“GenAI”) is to what extent Section 230 of the Communications Decency Act (CDA) applies to the provision of GenAI.  Can CDA immunity apply to GenAI-generated output and protect GenAI providers from potential third-party liability?

On June 14, 2023, Senators Richard Blumenthal and Josh Hawley introduced the “No Section 230 Immunity for AI Act,” bipartisan legislation that would expressly remove most immunity under the CDA for a provider of an interactive computer service if the conduct underlying the claim or charge “involves the use or provision of generative artificial intelligence by the interactive computer service.” The bill would eliminate “publisher” immunity under § 230(c)(1) for claims involving the use or provision of generative artificial intelligence by an interactive computer service. It would not affect immunity for so-called “Good Samaritan” blocking under § 230(c)(2)(A), which protects service providers and users from liability for claims arising out of good-faith actions to screen or restrict access to “objectionable” material on their services.

The bill defines generative AI very broadly as “an artificial intelligence system that is capable of generating novel text, video, images, audio, and other media based on prompts or other forms of data provided by a person.”

Because the bill’s CDA carveout extends to any interactive computer service where the underlying conduct relates to the use of generative AI, it seems as if it could extend to computer-based services that have generative AI built into them. This may be beyond the intent of the drafters of the bill.  In fact, the press release accompanying the bill states that the bill was written to help build a framework for “AI platform accountability” and that the bill would strip immunity from “AI companies” in “civil claims or criminal prosecutions involving the use or provision of generative AI.”  Thus, it seems that the drafters may have written the proposed law to remove CDA immunity for “AI companies” (whatever that means) and not for every service that builds some generative AI functionality into its offerings.

The bill raises a key question: Would a court deem a generative AI platform, or a computer service with GenAI functionality, to be the information content provider of its output, responsible in whole or in part for the creation or development of the information provided (no CDA immunity), or merely the publisher of third-party content based on third-party training data (potential immunity under CDA Section 230)?

One might contend that generative AI tools, as the very term suggests, “generate” content, and thus the service would not enjoy CDA immunity for claims arising from the outputted content (as opposed to a social media platform that hosts user-generated content). On the other hand, one might argue that a generative AI tool is not a person or entity creating independent content, but rather an algorithm that arranges third-party training data in some useful form in response to a user prompt, and thus should be protected by CDA immunity for its output.  The Supreme Court recently declined the opportunity to comment on CDA immunity as it pertains to a social media platform’s algorithmic organization or presentation of content. With the introduction of a bill that would carve out generative AI providers from CDA publisher immunity, it may be that the senators, knowingly or not, have waded into this thorny legal issue.

Will this bill progress? Who knows. Congress does not have a successful track record in passing CDA-related legislation (see the travails of CDA reform), but GenAI seems to be a matter of bipartisan concern.  We will watch the bill’s progress carefully to see if it has a chance of passage through a divided Congress.

Jeffrey Neuburger

Jeffrey Neuburger is co-head of Proskauer’s Technology, Media & Telecommunications Group, head of the Firm’s Blockchain Group and a member of the Firm’s Privacy & Cybersecurity Group.

Jeff’s practice focuses on technology, media and intellectual property-related transactions, counseling and dispute resolution. That expertise, combined with his professional experience at General Electric and academic experience in computer science, makes him a leader in the field.

As one of the architects of the technology law discipline, Jeff continues to lead on a range of business-critical transactions involving the use of emerging technology and distribution methods. For example, Jeff has become one of the foremost private practice lawyers in the country for the implementation of blockchain-based technology solutions, helping clients in a wide variety of industries capture the business opportunities presented by the rapid evolution of blockchain. He is a member of the New York State Bar Association’s Task Force on Emerging Digital Finance and Currency.

Jeff counsels on a variety of e-commerce, social media and advertising matters; represents many organizations in large infrastructure-related projects, such as outsourcing, technology acquisitions, cloud computing initiatives and related services agreements; advises on the implementation of biometric technology; and represents clients on a wide range of data aggregation, privacy and data security matters. In addition, Jeff assists clients on a wide range of issues related to intellectual property and publishing matters in the context of both technology-based applications and traditional media.

Jonathan Mollod

Jonathan P. Mollod is an attorney and content editor and a part of the firm’s Technology, Media and Telecommunications (TMT) Group. Jonathan earned his J.D. from Vanderbilt Law School. He focuses on issues involving technology, media, intellectual property and licensing issues and general online/tech law issues of the day.