
Jeffrey Neuburger is co-head of Proskauer’s Technology, Media & Telecommunications Group, head of the Firm’s Blockchain Group and a member of the Firm’s Privacy & Cybersecurity Group.

Jeff’s practice focuses on technology, media and intellectual property-related transactions, counseling and dispute resolution. That expertise, combined with his professional experience at General Electric and academic experience in computer science, makes him a leader in the field.

As one of the architects of the technology law discipline, Jeff continues to lead on a range of business-critical transactions involving the use of emerging technology and distribution methods. For example, Jeff has become one of the foremost private practice lawyers in the country for the implementation of blockchain-based technology solutions, helping clients in a wide variety of industries capture the business opportunities presented by the rapid evolution of blockchain. He is a member of the New York State Bar Association’s Task Force on Emerging Digital Finance and Currency.


Jeff counsels on a variety of e-commerce, social media and advertising matters; represents many organizations in large infrastructure-related projects, such as outsourcing, technology acquisitions, cloud computing initiatives and related services agreements; advises on the implementation of biometric technology; and represents clients on a wide range of data aggregation, privacy and data security matters. In addition, Jeff assists clients on a wide range of issues related to intellectual property and publishing matters in the context of both technology-based applications and traditional media.

In the closing days of August, two federal appeals courts issued noteworthy decisions at the intersection of workplace conduct, computer law and online platforms. The two opinions were released this past summer amid the continuing flurry of AI-related case developments and perhaps did not receive wide media attention, but they may prove to be important cases in the future.

  • Second Circuit – CDA Section 230: The court ruled that a software platform was not entitled to CDA Section 230 immunity – at least at this early stage of the case – based on allegations that it actively contributed to the unlawful software content at issue by manufacturing and distributing emissions-control “defeat devices.” (U.S. v. EZ Lynk, SEZC, No. 24-2386 (2d Cir. Aug. 20, 2025)). The opinion’s discussion of what it means to be a “developer” of content has implications for future litigation involving generative AI, app stores, marketplaces and IoT ecosystems, where certain fact patterns could blur the line between passive hosting and active co-development.
  • Third Circuit – CFAA and Trade Secrets: Days later, the Third Circuit issued an important decision (subsequently amended, with minor changes that did not alter the holding) that further develops CFAA case law post-Van Buren. The court held that the CFAA, an anti-hacking statute, does not extend liability to workplace computer-use violations. (NRA Group, LLC v. Durenleau, No. 24-1123 (3d Cir. Aug. 26, 2025) (vacated by Oct. 7, 2025 amended opinion), reh’g en banc denied (Oct. 7, 2025)). The court also rejected a novel claim of trade secret misappropriation based on access to account passwords.

Together, the cases show how courts continue to interpret the reach of technology-related statutes in contexts never contemplated when those laws were first enacted.

  • Law establishes national prohibition against nonconsensual online publication of intimate images of individuals, both authentic and computer-generated.
  • First federal law regulating AI-generated content.
  • Creates requirement that covered platforms promptly remove depictions upon receiving notice of their existence and a valid takedown request.
  • For many online service providers, complying with the Take It Down Act’s notice-and-takedown requirement may warrant revising their existing DMCA takedown notice provisions and processes.
  • Another carve-out to CDA immunity? More like a dichotomy of sorts…. 

On May 19, 2025, President Trump signed the bipartisan Take It Down Act into law. The law prohibits any person from using an “interactive computer service” to publish, or threaten to publish, nonconsensual intimate imagery (NCII), including AI-generated NCII (colloquially known as revenge pornography or deepfake revenge pornography). Additionally, the law requires that, within one year of enactment, social media companies and other covered platforms implement a notice-and-takedown mechanism that allows victims to report NCII. Platforms must then remove properly reported imagery (and any known identical copies) within 48 hours of receiving a compliant request.
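
For illustration only, the sketch below (in Python, using hypothetical names) shows one way a platform might model the notice-and-takedown workflow described above: recording a compliant request, computing the 48-hour removal deadline, and flagging known identical copies by content hash. The statute does not prescribe any particular implementation; hash-based matching and every field name here are assumptions made for the example.

```python
# Hypothetical sketch of a Take It Down Act-style notice-and-takedown workflow.
# Field names and the hash-based "identical copy" matching are illustrative
# assumptions, not requirements drawn from the statute.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

REMOVAL_WINDOW = timedelta(hours=48)  # removal deadline after a compliant request


@dataclass
class TakedownRequest:
    requester_id: str   # victim or authorized representative
    content_url: str    # reported depiction
    content_hash: str   # hash used to locate known identical copies (assumption)
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @property
    def removal_deadline(self) -> datetime:
        return self.received_at + REMOVAL_WINDOW


class TakedownQueue:
    """Tracks compliant requests and the hosted items slated for removal."""

    def __init__(self, hosted_items: dict[str, str]):
        # hosted_items maps item URL -> content hash for content the platform hosts
        self.hosted_items = hosted_items
        self.pending: list[TakedownRequest] = []

    def submit(self, request: TakedownRequest) -> list[str]:
        """Record a compliant request and return every URL to remove,
        including known identical copies identified by matching content hash."""
        self.pending.append(request)
        return [url for url, h in self.hosted_items.items() if h == request.content_hash]


if __name__ == "__main__":
    # Example: one request matches the reported URL plus one identical copy.
    queue = TakedownQueue({"https://example.com/a": "abc123",
                           "https://example.com/b": "abc123"})
    req = TakedownRequest("victim-1", "https://example.com/a", "abc123")
    print(queue.submit(req), "remove by", req.removal_deadline.isoformat())
```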

UPDATE (April 17, 2025): The following reflects a development that occurred after publication of the original post.

On April 11, 2025, the National Security Division (the “NSD”) released several documents setting out initial guidance on how to comply with the Rule, which the NSD refers to as the Data Security…

  • Uses of Information Limited to “What is Reasonably Necessary”
  • Use of Deidentified Data Not Within Scope
  • Screen Scraping Survives

After a yearslong lead-up, the Consumer Financial Protection Bureau (CFPB) published its final “open banking” rule in October. The rule effectuates the section of the Consumer Financial Protection Act, which charged…

After several weeks of handwringing about the fate of SB 1047 – the controversial AI safety bill that would have required developers of powerful AI models and entities providing the computing resources to train such models to put appropriate safeguards and policies into place to prevent critical harms – California…

On September 17, 2024, Governor Gavin Newsom signed AB 2602 into California law (to be codified at Cal. Lab. Code §927).  The law addresses the use of “digital replicas” of performers.  As defined in the law, a digital replica is:

a computer-generated, highly realistic electronic representation that is readily identifiable…

In an ongoing dispute commenced in 2016, the Eleventh Circuit considered, for the second time in the life of the litigation, trade secret misappropriation and related copyright claims in a scraping case between direct competitors.

The case involved plaintiff Compulife Software, Inc. (“Plaintiff” or “Compulife”) – in the business of…

On May 9, 2024, a California district court dismissed, with leave to amend, the complaint brought by social media platform X Corp. (formerly Twitter) against data provider Bright Data Ltd. (“Bright Data”) over Bright Data’s alleged scraping of publicly available data from X for use in data products sold…