We have written before about the issues presented by the Illinois Biometric Information Privacy Act, 740 Ill. Comp. Stat. 14/1 (“BIPA”). BIPA remains the only state biometric privacy statute with a private right of action. It has garnered national attention and become the epicenter of biometrics-based litigation, with dozens of pending cases alleging violations of the statute (defendants include employers of all types, social media platforms, service providers, and many other businesses that interact with Illinois residents). As the privacy concerns surrounding the collection and storage of biometric data have come into sharper focus, with more and more companies employing such technologies for digital authentication, security and other uses, the litigation surrounding BIPA has generated considerable controversy, and the legislature has previously been called upon to amend the statute to limit its reach. The Illinois legislature is now considering a bill (SB3053) that would fundamentally alter the privacy protections under BIPA.
Today, the President signed H.R. 1865, the “Allow States and Victims to Fight Online Sex Trafficking Act of 2017” (commonly known as “FOSTA”). The law is intended to limit the immunity provided under Section 230 of the Communications Decency Act (“CDA Section 230”) for online services that knowingly host third-party content that promotes or facilitates sex trafficking. As drafted, the law has retroactive effect and applies even with respect to activities occurring prior to its enactment.
This week, the District Court for the Northern District of California dismissed the Gullen putative class action asserting Illinois biometric privacy claims brought by “non-users,” based on evidence that the social media site did not use its facial recognition technology on business or organizational accounts (as opposed to personal social media pages). (Gullen v. Facebook, Inc., No. 16-00937 (N.D. Cal. Apr. 3, 2018)). This ruling on the merits follows a decision last month in which the court refused to dismiss the action for lack of standing. In Gullen, the plaintiff alleged that Facebook violated the Illinois Biometric Information Privacy Act, 740 Ill. Comp. Stat. 14/1 et seq. (“BIPA”), by collecting his biometric identifiers without notice or consent via Tag Suggestions, its facial recognition-based system of photo tagging. The plaintiff’s claim was based upon a single photograph uploaded to an organizational page. A declaration from a Facebook software engineer confirmed that not all photos uploaded to Facebook undergo facial recognition and that the plaintiff’s photo was not scanned; because the plaintiff failed to rebut that evidence, the court granted summary judgment in the site’s favor.
While the Gullen action was dismissed on factual grounds, the companion In re Facebook Biometric Privacy Litig. action involving Facebook users remains ongoing and raises important legal issues surrounding BIPA, including the scope of the statute as it relates to uploaded photographs and the sufficiency of Facebook’s notice and consent procedures, as well as constitutional issues regarding the extraterritorial reach of BIPA to activities and cloud-based transactions that allegedly occurred outside of Illinois.
In this long-running dispute, previously dubbed “The World Series of IP cases” by the presiding judge, Oracle America Inc. (“Oracle”) accuses Google Inc. (“Google”) of unauthorized use of certain of its Java-related copyrights in Google’s Android software platform. Specifically, Oracle alleges that Google infringed the declaring code of certain Java API packages for use in Android, including by copying the elaborate taxonomy covering 37 packages, with their multiple classes and methods. Google had declined either to obtain a license from Oracle to use the Java APIs in its platform or to take the APIs under Oracle’s open source GPL license; instead, it copied the declaring code from the 37 Java API packages (over 11,000 lines of code) but wrote its own implementing code. Google designed Android this way in the belief that Java application programmers would want to find the same 37 sets of functionalities in the new system, callable by the same names used in Java.
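For readers unfamiliar with the distinction at the heart of the case, a minimal hypothetical sketch may help (this is invented illustration, not the actual Java platform or Android source; the class and method names are made up). The “declaring code” is the exact name, parameters, and return type that application programmers call; the “implementing code” is the body beneath it, which a reimplementer can write independently while keeping the declaration identical so existing programmers’ calls still work.

```java
// Hypothetical illustration of declaring vs. implementing code.
// "MathUtils" and "max" are invented names, not part of the Java APIs at issue.
public class MathUtils {

    // Declaring code: the signature programmers rely on and call by name,
    // e.g. MathUtils.max(3, 7). Copying this line verbatim preserves
    // compatibility with existing callers.
    public static int max(int a, int b) {
        // Implementing code: the body that does the work. A clean-room
        // reimplementation may differ here while the declaration above
        // remains identical.
        return (a >= b) ? a : b;
    }

    public static void main(String[] args) {
        System.out.println(MathUtils.max(3, 7)); // prints 7
    }
}
```

On this (hypothetical) model, copying only the declaration while rewriting the body is precisely what Google is alleged to have done across the 37 packages, at the scale of over 11,000 lines of declaring code.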
In the flurry of deal-making that resulted in a 2,232-page funding bill released Wednesday, lawmakers negotiated the inclusion of “The Clarifying Lawful Overseas Use of Data Act” (often referred to as the “CLOUD Act”) (see page 2,201 of the bill text). The CLOUD Act provides a procedural structure for law enforcement to pursue the preservation or production of data and other information residing on servers located overseas that is within the possession, custody or control of the provider.
In this age of cloud computing, data can rest overseas or in multiple locations. As we’ve previously discussed, it is increasingly common to see extraterritorial legal disputes arise when parties attempt to apply laws passed before the digital age to our current landscape.
UPDATE: On March 19, 2018, the district court granted the defendant’s motion for certification of the court’s February 15th partial summary judgment decision for interlocutory appeal to the Second Circuit. In allowing immediate appeal, the court agreed that its prior order “has created tremendous uncertainty for online publishers” and “given the frequency with which embedded images are ‘retweeted,’ the resolution of this legal question has an impact beyond this case.” We will closely follow the appeal of this important copyright issue.
A New York district court recently held that a host of online news publishers and media websites that embedded certain tweets (containing unauthorized uploads of plaintiff’s copyrighted photo) on their websites violated the plaintiff’s exclusive display right, despite the fact that the image at issue was hosted on a server owned and operated by an unrelated third party (i.e., Twitter). (See Goldman v. Breitbart News Network, LLC, No. 17-03144 (S.D.N.Y. Feb. 15, 2018)). In doing so, the court declined to adopt the Ninth Circuit’s so-called “server test” first espoused in the 2007 Perfect 10 decision, which held that the infringement of the public display right in a photographic image depends, in part, on where the image was hosted.
Under the “server test,” only a server that actually stored the photographs and “serves that electronic information directly to the user (i.e., physically sending ones and zeroes over the Internet to the user’s browser)” could infringe the copyright holder’s rights. In its ruling, the Goldman court granted the plaintiff’s motion for partial summary judgment, determining that the reasoning of the Perfect 10 decision, which involved a search engine’s image search function and its display of full-size images hosted on third-party servers, was not applicable to the embedding practices in which the media sites engaged.
UPDATE: On March 2, 2018, in a related biometric privacy litigation surrounding Tag Suggestions brought by non-users of Facebook, a California district court in a brief order declined to dismiss the action for lack of standing, citing its reasoning in the Patel opinion. (Gullen v. Facebook, Inc., No. 16-00937 (N.D. Cal. Mar. 2, 2018)). While Facebook offered evidence that it does not store faceprint data on non-users, but only analyzes it to see if there is a match, the court stated such substantive arguments are best left for summary judgment or trial. Note: the Gullen case is related to the consolidated Facebook biometric privacy litigation and as such, is being heard before the same judge. The difference between the two actions is that Gullen involves non-Facebook users, whereas the plaintiffs in In re Facebook are registered users.
This past week, a California district court again denied Facebook’s motion to dismiss an ongoing litigation involving claims under the Illinois Biometric Information Privacy Act, 740 Ill. Comp. Stat. 14/1 (“BIPA”), surrounding Tag Suggestions, its facial recognition-based system of photo tagging. In 2016, the court declined to dismiss the action based upon, among other things, Facebook’s contention that BIPA categorically excludes digital photographs from its scope. This time around, the court rejected Facebook’s argument that the plaintiffs’ complaint should be dismissed for lack of standing under the Supreme Court’s 2016 Spokeo decision because the plaintiffs had failed to allege a concrete injury in fact. (Patel v. Facebook, Inc., No. 15-03747 (N.D. Cal. Feb. 26, 2018) (cases consolidated at In re Facebook Biometric Information Privacy Litig., No. 15-03747 (N.D. Cal.))). As a result, Facebook will be forced to continue to litigate this action.
This dispute is being closely watched, as a number of similar BIPA suits involving facial recognition and other biometric technologies are pending, and other defendants are assessing which of Facebook’s defenses might hold sway with a court.
UPDATE: On February 22, 2018, the district court granted 3taps’s motion to relate its action to the ongoing hiQ v. LinkedIn litigation. This motion was based upon a local Northern District of California rule that holds that cases should be related when the actions concern substantially the same parties, transaction or event, and there would be an “unduly burdensome duplication of labor…or conflicting results” if the cases were heard before different judges. As a result, the 3taps case, over the opposition of LinkedIn, was reassigned to Judge Edward Chen, who also presided over the lower court proceedings in the hiQ v. LinkedIn litigation.
In the latest development in the legal controversy over scraping, 3taps, Inc. (“3taps”), a data aggregator and “exchange platform” for developers, filed suit against LinkedIn seeking a declaratory judgment that 3taps would not be in violation of the Computer Fraud and Abuse Act (CFAA) if it accesses and collects publicly available data from LinkedIn’s website. (3Taps Inc. v. LinkedIn Corp., No. 18-00855 (N.D. Cal. filed Feb. 8, 2018)). The basis of 3taps’s complaint is last year’s hotly debated California district court ruling (hiQ Labs, Inc. v. LinkedIn Corp., 2017 WL 3473663 (N.D. Cal. Aug. 14, 2017)), in which the court granted a preliminary injunction compelling LinkedIn to disable any technical measures it had employed to block a data analytics company from scraping the publicly available data on LinkedIn’s website. The hiQ ruling essentially limited the applicability of the CFAA as a tool against the scraping of publicly available website data. [For an analysis of the hiQ lower court decision, please read the Client Alert on our website.]
A California district court issued an important opinion in a dispute between a ticket sales platform and a ticket broker that employed automated bots to purchase tickets in bulk. (Ticketmaster L.L.C. v. Prestige Entertainment, Inc., No. 17-07232 (C.D. Cal. Jan. 31, 2018)). The case is noteworthy for those following the evolution of the law around automated web scraping, as the decision interprets several of the major Ninth Circuit rulings of recent memory on liability for such practices. Indeed, two weeks ago we wrote about a case in which the Ninth Circuit interpreted certain automated downloading practices under the CFAA and CDAFA, and we are awaiting the Ninth Circuit’s decision in the hiQ v. LinkedIn appeal, which we have also covered. Prior posts on the topic include a discussion of a noteworthy appeals court opinion that examined scraping activity under copyright law and the scope of liability under the DMCA anticircumvention provisions. These seminal decisions and the issues they raise were addressed, expressly or implicitly, in the instant case. While we briefly review some of the highlights of the decision below, the case is a must-read for website operators and entities that engage in web scraping activities.
Facebook recently announced that it would make changes to its news feed to prioritize content that users share and discuss and material from “reputable publishers.” These changes are part of what Mark Zuckerberg says is a refocusing of Facebook from “helping [users] find relevant content to helping [users] have more meaningful social interactions.” This refocus highlights the tensions between Facebook’s conflicting roles as a social media platform on one hand, and, in effect, a distributor of third party content on the other. We have discussed this issue in previous posts.
As Facebook implements these newly announced changes in the way third party content will be presented — focusing on “trusted content” — the operational models powering Facebook’s use of third party content (user generated and otherwise) will also evolve. Lawyers should keep an eye on what the changes might mean for Facebook from a liability perspective. For example, will Facebook’s direct or indirect control of third party content affect its immunity from publisher and distributor liability under Section 230 of the Communications Decency Act? Or rather, will changes to its algorithm to prioritize trusted content still be deemed quintessential publisher conduct, and therefore within the scope of Section 230? Also, to the extent that Facebook directly or indirectly curates third party content, could it lose the benefits afforded by the safe harbors of the Digital Millennium Copyright Act?
Given that the immunities and safe harbors for online service providers are so crucial to the business model of social media platforms such as Facebook, one can be sure that counsel to Facebook will flag any changes that may call into question the availability of those immunities and safe harbors. All the same, creative plaintiffs’ lawyers may attempt to use any change that leaves Facebook more engaged in the curation, display or distribution of third party content to pierce the CDA/DMCA armor and hold Facebook responsible for allegedly problematic third party content.