Jeffrey Neuburger is co-head of Proskauer’s Technology, Media & Telecommunications Group, head of the Firm’s Blockchain Group and a member of the Firm’s Privacy & Cybersecurity Group.

Jeff’s practice focuses on technology, media and intellectual property-related transactions, counseling and dispute resolution. That expertise, combined with his professional experience at General Electric and academic experience in computer science, makes him a leader in the field.

As one of the architects of the technology law discipline, Jeff continues to lead on a range of business-critical transactions involving the use of emerging technology and distribution methods. For example, Jeff has become one of the foremost private practice lawyers in the country for the implementation of blockchain-based technology solutions, helping clients in a wide variety of industries capture the business opportunities presented by the rapid evolution of blockchain. He is a member of the New York State Bar Association’s Task Force on Emerging Digital Finance and Currency.

Jeff counsels on a variety of e-commerce, social media and advertising matters; represents many organizations in large infrastructure-related projects, such as outsourcing, technology acquisitions, cloud computing initiatives and related services agreements; advises on the implementation of biometric technology; and represents clients on a wide range of data aggregation, privacy and data security matters. In addition, Jeff assists clients on a wide range of issues related to intellectual property and publishing matters in the context of both technology-based applications and traditional media.

On May 9, 2024, a California district court dismissed, with leave to amend, the complaint brought by social media platform X Corp. (formerly Twitter) against data provider Bright Data Ltd. (“Bright Data”) over Bright Data’s alleged scraping of publicly available data from X for use in data products sold …

Late last year, Chegg Inc. (“Chegg”), an online learning platform, obtained a preliminary injunction based on allegations that the various operators of the Homeworkify website (“Defendants”) – which allows users to view Chegg’s paywalled solutions without creating an account – violated the Computer Fraud and Abuse Act (CFAA). (Chegg …

On January 23, 2024, a California district court released its opinion in a closely watched scraping dispute between the social media platform Meta and data provider Bright Data Ltd. (“Bright Data”) over Bright Data’s alleged scraping of publicly available data from Facebook and Instagram for use in data products sold to third …

  • Flight and travel data has always been valuable for data aggregators and online travel services and has prompted litigation over the years.
  • The latest suit, brought by Air Canada against a rewards travel search site, raises some interesting liability issues under the CFAA.
  • The implications of this case, if the plaintiffs are successful, could impact the legal analysis of web scraping in a variety of circumstances, including for the training of generative AI models.

In a recent post, we recounted the myriad issues raised by recently filed data scraping suits involving job listings, company reviews and employment data. Soon after, another interesting scraping suit was filed, this time by a major airline against an award travel search site that aggregates fare and award travel data. Air Canada alleges that Defendant Localhost LLC (“Localhost” or “Defendant”), operator of the Seats.aero website, unlawfully bypassed technical measures and violated Air Canada’s website terms when it scraped “vast amounts” of flight data without permission and purportedly caused slowdowns to Air Canada’s site and other problems. (Air Canada v. Localhost LLC, No. 23-01177 (D. Del. filed Oct. 19, 2023)).[1]

The complaint alleges that Localhost harvested data from Air Canada’s site and systems to populate the seats.aero site, which claims to be “the fastest search engine for award travel.” 

The complaint also alleges that, in addition to scraping the Air Canada website, Localhost engaged in “API scraping” by impersonating authorized requests to Air Canada’s application programming interface.

The Federal Trade Commission (FTC) has long expressed concern about the potential misuse of location data. For example, in a 2022 blog post, “Location, health, and other sensitive information: FTC committed to fully enforcing the law against illegal use and sharing of highly sensitive data,” the agency termed the entire location data ecosystem “opaque,” and it has investigated the practices of, and brought enforcement actions against, mobile app operators and data brokers with respect to sensitive data.

One such FTC enforcement action began with an August 2022 complaint against Kochava, Inc. (“Kochava”), a digital marketing and analytics firm, seeking an order halting Kochava’s alleged acquisition and downstream sale of “massive amounts” of precise geolocation data collected from consumers’ mobile devices. In that complaint, the FTC alleged that Kochava, in its role as a data broker, collects a wealth of information about consumers and their mobile devices by, among other means, purchasing data from outside entities to sell to its own customers. The FTC further alleged that the location data Kochava provided to its customers was not anonymized and that it was possible, using third-party services, to combine the geolocation data with other information to identify a mobile device’s user or owner.

In May 2023, an Idaho district court granted Kochava’s motion to dismiss the FTC’s complaint, with leave to amend. Subsequently, the FTC filed an amended complaint, and Kochava requested that the court keep the amended complaint under seal, which it did until it could rule on the merits of the parties’ arguments.

On November 3, 2023, the court granted the FTC’s motion to unseal the amended complaint, finding no “compelling reason” to keep the amended complaint under seal and rejecting Kochava’s arguments that the amended complaint’s allegations were “knowingly false” or “misleading.” (FTC v. Kochava Inc., No. 22-00377 (D. Idaho Nov. 3, 2023)). As a result, the FTC’s amended complaint has been unsealed to the public.

On October 30, 2023, President Biden issued an “Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence” (Fact Sheet, here) designed to spur new AI safety and security standards, encourage the development of privacy-preserving technologies in conjunction with AI training, address certain instances of algorithmic discrimination, advance the responsible use of AI in healthcare, study the impacts of AI on the labor market, support AI research and a competitive environment in the industry, and issue guidance on the use of AI by federal agencies. This latest move builds on the White House’s previously released “Blueprint for an AI Bill of Rights” and its announcement this past summer that it had secured voluntary commitments from major AI companies focusing on what the White House termed “three principles that must be fundamental to the future of AI – safety, security, and trust.”

UPDATE: On February 5, 2024, the California district court granted the defendant Aspen Technology Labs, Inc.’s motion to dismiss Jobiak LLC’s web scraping complaint for lack of personal jurisdiction, with leave to amend. The court found that Jobiak had not adequately alleged that its copyright and tort-related claims arose out of the defendant’s forum-related activities and that there were no allegations that Jobiak’s database or website was hosted on servers in the California forum.  On March 8, 2024, the court dismissed the action with prejudice, as Jobiak did not submit an amended complaint within the time allowed by the court.  

In recent years there has been great demand for information about job listings, company reviews and employment data. Recruiters, consultants, analysts and employment-related service providers, among others, are aggressively scraping job-posting sites to extract that type of information. Recall, for example, the long-running, landmark hiQ scraping litigation over the scraping of public LinkedIn data.

The two most recent disputes regarding scraping of employment and job-related data were brought by Jobiak LLC (“Jobiak”), an AI-based recruitment platform. Jobiak filed two nearly identical scraping suits in California district court alleging that competitors unlawfully scraped its database and copied its optimized job listings without authorization. (Jobiak LLC v. Botmakers LLC, No. 23-08604 (C.D. Cal. filed Oct. 12, 2023); Jobiak LLC v. Aspen Technology Labs, Inc., No. 23-08728 (C.D. Cal. filed Oct. 17, 2023)).

In the rapidly evolving AI space, the last few days of this week saw significant AI developments arrive even faster than usual. For example, seven AI companies agreed to voluntary guidelines covering AI safety and security, and ChatGPT rolled out a custom preferences tool to streamline usage. In addition, Microsoft issued a transparency note for the Azure OpenAI service. On top of that, this week saw announcements of a number of generative AI commercial ventures, which are beyond the scope of this particular post.

One of the many legal questions swirling around in the world of generative AI (“GenAI”) is to what extent Section 230 of the Communications Decency Act (CDA) applies to the provision of GenAI. Can CDA immunity apply to GenAI-generated output and protect GenAI providers from potential third-party liability?

On June 14, 2023, Senators Richard Blumenthal and Josh Hawley introduced the “No Section 230 Immunity for AI Act,” bipartisan legislation that would expressly remove most immunity under the CDA for a provider of an interactive computer service if the conduct underlying the claim or charge “involves the use or provision of generative artificial intelligence by the interactive computer service.” The bill would eliminate “publisher” immunity under §230(c)(1) for claims involving the use or provision of generative artificial intelligence by an interactive computer service. It would not, however, affect immunity for so-called “Good Samaritan” blocking under §230(c)(2)(A), which protects service providers and users from liability for claims arising out of good faith actions to screen or restrict access to “objectionable” material on their services.