Jeffrey Neuburger is co-head of Proskauer’s Technology, Media & Telecommunications Group, head of the Firm’s Blockchain Group and a member of the Firm’s Privacy & Cybersecurity Group.

Jeff’s practice focuses on technology, media and intellectual property-related transactions, counseling and dispute resolution. That expertise, combined with his professional experience at General Electric and academic experience in computer science, makes him a leader in the field.

As one of the architects of the technology law discipline, Jeff continues to lead on a range of business-critical transactions involving the use of emerging technology and distribution methods. For example, Jeff has become one of the foremost private practice lawyers in the country for the implementation of blockchain-based technology solutions, helping clients in a wide variety of industries capture the business opportunities presented by the rapid evolution of blockchain. He is a member of the New York State Bar Association’s Task Force on Emerging Digital Finance and Currency.

Jeff counsels on a variety of e-commerce, social media and advertising matters; represents many organizations in large infrastructure-related projects, such as outsourcing, technology acquisitions, cloud computing initiatives and related services agreements; advises on the implementation of biometric technology; and represents clients on a wide range of data aggregation, privacy and data security matters. In addition, Jeff assists clients on a wide range of issues related to intellectual property and publishing matters in the context of both technology-based applications and traditional media.

On January 7, 2020, the Securities and Exchange Commission’s Office of Compliance Inspections and Examinations (OCIE) announced its 2020 examination priorities. In doing so, OCIE identified certain areas of technology-related concern, in particular the issues of alternative data and cybersecurity. [For a more detailed review of OCIE's

On January 7, 2020, the federal Office of Management and Budget (OMB) released a draft of a memorandum setting forth guidance to assist federal agencies in developing regulatory and non-regulatory approaches to artificial intelligence (AI). The draft guidance will be available for public comment for sixty days, after which it will be finalized and issued to federal agencies.

According to the draft, the guidance is intended to reduce barriers to innovation while balancing privacy and security concerns and respect for intellectual property. The proposed guidance features ten principles to guide regulatory approaches to AI applications. In addition, in what may be a boon for private-sector developers of AI infrastructure, OMB reinforces the objective of making federal data and models generally available to the private sector for non-federal use in developing AI systems.

Initial responses to the proposed guidance have been mixed, and it remains to be seen how the principles in the guidance (once finalized) will be put into practice. Notably, however, those who intend to invest significant resources in AI-based infrastructure should be aware of what may prove to be the emerging blueprint for AI regulation in the near future.

It is that time of year when we look back to see what tech-law issues took up most of our time this year and look ahead to see what the emerging issues are for 2020.

Data: The Issues of the Year

Data presented a wide variety of challenging legal issues in 2019. Data is solidly entrenched as a key asset in our economy, and as a result, the issues around it demanded a significant level of attention.

  • Clearly, privacy and data security issues were dominant in 2019. The GDPR, CCPA and other privacy regulations demanded considerable attention and resources, and with GDPR enforcement ongoing and CCPA enforcement right around the corner, the coming year will be an important one to watch. As data generation and collection technologies continued to evolve, privacy issues evolved as well. In 2019, we saw many novel issues involving mobile devices, biometrics and connected cars. Facial recognition technology generated a fair amount of litigation and presented concerns about the possibility of intrusive governmental surveillance (prompting some municipalities, such as San Francisco, to ban its use by government agencies).
  • Because data has proven to be so valuable, innovators continue to develop new and sometimes controversial technological approaches to collecting it, and the legal issues abound. For example, in the past year, we have been advising on the implications of an ongoing dispute between the City Attorney of Los Angeles and an app operator over geolocation data collection, as well as a settlement between the FTC and a personal email management service over access to “e-receipt” data. We have fielded multiple questions from clients about the unsettled legal terrain surrounding web scraping and have been closely following developments in this area, including the blockbuster hiQ Ninth Circuit ruling from earlier this year. As usual, technological innovation has outpaced the law’s ability to keep up.
  • Data security is now regularly a boardroom and courtroom issue, with data breaches, phishing, ransomware attacks and identity theft (and cyberinsurance) the norm. Meanwhile, consumers are experiencing deeper and deeper “breach fatigue” with every breach notice they receive. While the U.S. government has not yet been able to put into place general national data security legislation, states and certain regulators are acting to compel data collectors to take reasonable measures to protect consumer information (e.g., New York’s newly enacted SHIELD Act) and to require IoT device manufacturers to equip connected devices with security features appropriate to the nature and function of the devices (e.g., California’s IoT security law, which becomes effective January 1, 2020). Class actions over data breaches and security lapses are filed regularly, with mixed results.
  • Many organizations have focused on the opportunities (and related issues) associated with new and emerging sources of data. They seek to use “big data” – either sourced externally or generated internally – to advance their operations. They are focused on understanding the sources of the data and their lawful rights to use such data. They are examining new revenue opportunities offered by the data, including the expansion of existing business lines, the identification of customer trends or the creation of new businesses (including licensing anonymized data to others).
  • Moreover, data was a key asset in many corporate transactions in 2019. Across the board in M&A, private equity, capital markets, finance and some real estate transactions, data was the subject of key deal points, sometimes intensive diligence, and often difficult negotiations. Consumer data has even become a national security issue, as the Committee on Foreign Investment in the United States (CFIUS), expanded under a 2018 law, began to scrutinize more and more technology deals involving foreign investment, including those involving sensitive personal data.
  • For more information about developments over the past year on data-related issues, and to keep abreast of new developments in the future, you may want to subscribe to Proskauer’s privacy blog, privacylaw.proskauer.com. You may also want to review our Practical Law article “Trends in Privacy and Data Security: 2018” and look for our update, which will be published in winter 2020.

I am not going out on a limb in saying that 2020 and beyond promise many interesting developments in “big data,” privacy and data security.

Recently, the Ninth Circuit reinstated a $460,000 jury verdict against print-on-demand site Zazzle, Inc. (“Zazzle”) for willful copyright infringement, putting a final stamp (perhaps) on a long-running dispute that explored important DMCA safe harbor issues for online print-on-demand services. (Greg Young Publishing, Inc. v. Zazzle, Inc., No. 18-55522 (9th Cir. Nov. 20, 2019) (unpublished)). The appeals court found that Zazzle’s anti-infringement oversight mechanisms were insufficient during the period of infringement, when a number of the plaintiff Greg Young Publishing, Inc.’s (“GYPI”) visual artworks were uploaded by users to Zazzle’s site without authorization.

With the online shopping season in full swing, the FTC decided that online retailers might benefit from a reminder as to the dos and don’ts for social media influencers. Thus, the FTC released a new guide, “Disclosures 101 for Social Media Influencers,” that reiterates its position about the responsibility of “influencers” to disclose “material” connections with brands when endorsing products in online posts. Beyond this new guide, which is written in an easy-to-read brochure format (with headings such as “How to Disclose” and “When to Disclose”), the FTC released related videos to convey the message that influencers should “stay on the right side of the law” and, when appropriate, disclose their relationships with the brands they endorse. This latest reminder to influencers comes on the heels of the FTC sending 90 letters to influencers in April 2017 notifying them of their responsibilities under the FTC’s Endorsement Guides, and the prior publishing of an Endorsement Guides FAQ. With the release of fresh guidance, now is a good time for brands with influencer relationships to ensure endorsements are not deceptive and remain on the right side of the law. Indeed, advertisers should have reasonable programs in place to train and monitor members of their influencer networks, and influencers themselves should remain aware of their responsibilities under the Endorsement Guides.

The Georgia Supreme Court ruled that the warrantless retrieval of automobile data from an electronic data recording device (e.g., an airbag control module) at the scene of a fatal collision was a search and seizure that implicates the Fourth Amendment, regardless of any reasonable expectation of privacy.

On October 11, 2019, LinkedIn Corp. (“LinkedIn”) filed a petition for rehearing en banc of the Ninth Circuit’s blockbuster decision in hiQ Labs, Inc. v. LinkedIn Corp., No. 17-16783 (9th Cir. Sept. 9, 2019). The crucial question before the original panel concerned the scope of the Computer Fraud and Abuse Act (CFAA).

UPDATE: On October 14, 2019, the parties entered into a Joint Stipulation dismissing the case, with prejudice.  It appears from some reports that Stackla’s access to Facebook has been reinstated as part of the settlement.
UPDATE: On September 27, 2019, the California district court issued its written order denying Stackla’s request for a TRO.  In short, the court found that, at this early stage, Stackla only demonstrated “speculative harm” and its “vague statements” did not sufficiently show that restoration of access to Facebook’s API would cure the alleged impending reality of Stackla losing customers and being driven out of business (“The extraordinary relief of a pre-adjudicatory injunction demands more precision with respect to when irreparable harm will occur than ‘soon.’”).  As for weighing whether a TRO would be in the public interest, the court, while understanding Stackla’s predicament, found that issuing a TRO could hamper Facebook’s ability to “decisively police its social-media platforms” and that there was a public interest in allowing a company to police the integrity of its platforms (“Facebook’s enforcement activities would be compromised if judicial review were expected to precede rather than follow its enforcement actions”). [emphasis in original]. This ruling leaves the issue for another day, perhaps during a preliminary injunction hearing, after some additional briefing of the issues.

The ink is barely dry on the landmark Ninth Circuit hiQ Labs decision, yet a new dispute has already cropped up testing the bounds of the CFAA and the ability of a platform to enforce terms restricting unauthorized scraping of social media content. (See Stackla, Inc. v. Facebook, Inc., No. 19-5849 (N.D. Cal. filed Sept. 19, 2019)). This dispute involves Facebook and a social media sentiment tracking company, Stackla, Inc., which, as part of its business, accesses Facebook and Instagram content. This past Wednesday, September 25th, the judge in the case denied Stackla, Inc.’s request for emergency relief restoring its access to Facebook’s platform. While the judge has yet to issue a written ruling, the initial pleadings and memoranda filed in the case are noteworthy and raise important questions surrounding the hot-button issue of scraping.

The Stackla dispute has echoes of hiQ v. LinkedIn. Both involve the open nature of “public” websites (although the “public” nature of the content at issue appears to be in dispute). Both disputes address whether the Computer Fraud and Abuse Act (the “CFAA”) can be used as a tool to prevent the scraping of such sites. Both disputes address how a platform may use its terms of use to prohibit automated scraping or data collection beyond the scope of such terms, although the discussion in hiQ was extremely brief. And like hiQ, Stackla asserts that without the ability to use Facebook and Instagram data, it would be out of business. Thus, both disputes address whether a court’s equitable powers should come into play if a platform’s termination of access will result in a particular company’s insolvency. Given the Ninth Circuit’s opinion in favor of hiQ, Stackla’s lawyers likely believed that decision was their golden ticket in this case. The judge’s ruling on the request for emergency relief suggests they may be disappointed.

In a ruling that is being hailed as a victory for web scrapers and the open nature of publicly available website data, the Ninth Circuit today issued its long-awaited opinion in hiQ Labs, Inc. v. LinkedIn Corp., No. 17-16783 (9th Cir. Sept. 9, 2019). The crucial question before the court was whether once hiQ Labs, Inc. (“hiQ”) received LinkedIn Corp.’s (“LinkedIn”) cease-and-desist letter demanding it stop scraping public LinkedIn profiles, any further scraping of such data was “without authorization” within the meaning of the federal Computer Fraud and Abuse Act (CFAA). The appeals court affirmed the lower court’s order granting a preliminary injunction barring the professional networking platform LinkedIn from blocking hiQ, a data analytics company, from accessing and scraping publicly available LinkedIn member profiles to create competing business analytic products. Most notably, the Ninth Circuit held that hiQ had shown a likelihood of success on the merits in its claim that when a computer network generally permits public access to its data, a user’s accessing that publicly available data will not constitute access “without authorization” under the CFAA.

In light of this ruling, data scrapers, content aggregators and advocates of a more open internet will certainly be emboldened, but we reiterate something we advised back in our 2017 Client Alert about the lower court’s hiQ decision: while the Ninth Circuit’s decision suggests that the CFAA is not an available remedy to protect against unwanted scraping of public website data that is “presumptively open to all,” entities engaged in scraping should remain careful. The road ahead, while perhaps less bumpy than before, still contains rough patches. Indeed, the Ninth Circuit cautioned that its opinion was issued only at the preliminary injunction stage and that the court did not “resolve the companies’ legal dispute definitively, nor do we address all the claims and defenses they have pleaded in the district court.”