New Media and Technology Law Blog

Warrantless Retrieval of Electronic Automobile Data Held to Be Unreasonable Search – Ruling Points to Private Nature of Digital Data Collected in Today’s World

The Georgia Supreme Court ruled that the retrieval of electronic automobile data from an electronic data recording device (e.g., an airbag control module) without a warrant at the scene of a fatal collision was a search and seizure implicating the Fourth Amendment, regardless of any reasonable expectation of privacy. (Mobley v. State, No. S18G1546 (Ga. Oct. 21, 2019)). The Court went on to hold that such retrieval of data was an unreasonable search and seizure forbidden by the Fourth Amendment and that, because the State failed to identify any recognized exception to the warrant requirement applicable to the facts, the trial court should have granted the motion to suppress. As such, the judgment of the Court of Appeals affirming the defendant’s conviction for vehicular homicide was reversed.

As described in an earlier post, the defendant was convicted of vehicular homicide based on electronic data retrieved from his vehicle showing that he was traveling at a high rate of speed prior to the accident. The defendant appealed the trial court’s decision (which was affirmed by the appellate court) denying his motion to suppress the data that law enforcement officers retrieved without a warrant from an electronic data recording device on his vehicle (note: a search warrant for the physical device was obtained the next day).

Putting aside the state criminal procedure issues and the sufficiency of the evidence against the particular defendant in this case, the decision is an important follow-up to the Supreme Court’s guidance on digital privacy set out in recent years in the Riley and Carpenter decisions. With cars becoming more like computers and sensors on four wheels, automated automobile data may come to be viewed as being as sensitive as certain types of private data collected by mobile devices. With the advent of autonomous cars, the Mobley court recognized how one’s private sphere can extend beyond the home and, depending on the factual circumstances and the nature of the search, may include automated data collected by one’s devices (both small, like a mobile phone, and large, like an automobile). With new technologies such as digital personal assistants, and with the coming of 5G and the supposed Internet of Things (IoT) revolution of connectedness, we imagine these issues will come up more and more in the years ahead.

LinkedIn Petitions Circuit Court for En Banc Review of hiQ Scraping Decision

On October 11, 2019, LinkedIn Corp. (“LinkedIn”) filed a petition for rehearing en banc of the Ninth Circuit’s blockbuster decision in hiQ Labs, Inc. v. LinkedIn Corp., No. 17-16783 (9th Cir. Sept. 9, 2019). The crucial question before the original panel concerned the scope of Computer Fraud and Abuse Act (CFAA) liability for unwanted web scraping of publicly available social media profile data: whether, once hiQ Labs, Inc. (“hiQ”), a data analytics firm, received LinkedIn’s cease-and-desist letter demanding it stop scraping public profiles, any further scraping of such data was “without authorization” within the meaning of the CFAA. The appeals court affirmed the lower court’s order granting a preliminary injunction barring LinkedIn from blocking hiQ from accessing and scraping publicly available LinkedIn member profiles to create competing business analytic products. Most notably, the Ninth Circuit held that hiQ had shown a likelihood of success on the merits in its claim that when a computer network generally permits public access to its data, a user’s accessing that publicly available data will not constitute access “without authorization” under the CFAA.

In its petition for en banc rehearing, LinkedIn advanced several arguments, including:

  • The hiQ decision conflicts with the Ninth Circuit’s Power Ventures precedent, where the appeals court held that a commercial entity that accesses a website after permission has been explicitly revoked can, under certain circumstances, be civilly liable under the CFAA. Power Ventures involved password-protected Facebook user data (which users had initially given a data aggregator permission to access). LinkedIn argued that the hiQ court’s logic in distinguishing Power Ventures was flawed and that the manner in which a user classifies his or her profile data should have no bearing on a website owner’s right to protect its physical servers from trespass.

“Power Ventures thus holds that computer owners can deny authorization to access their physical servers within the meaning of the CFAA, even when users have authorized access to data stored on the owner’s servers.”

“The panel sought to distinguish Power Ventures on the basis that while Power ‘was gathering user data that was protected by Facebook’s username and password authentication system, the data hiQ was scraping was available to anyone with a web browser.’ But Power Ventures did not turn on how a data owner decided to make her data available to third parties—by expressly sharing a password (in Power Ventures) or by making it viewable to members of the public (here). Nothing about a data owner’s decision to place her data on a website changes LinkedIn’s independent right to regulate who can access its website servers. To the contrary, Power Ventures acknowledged that ‘permission’ to access websites that are ‘presumptively open to all comers’ could be ‘revoked expressly’ by the website owner.”

  • The language of the CFAA should not be read to allow for “authorization” to be assumed (and unable to be revoked) for publicly available website data, either under Ninth Circuit precedent or under the CFAA-related case law of other circuits.

“Nothing in the CFAA’s text or the definition of ‘authorization’ that the panel employed—‘[o]fficial permission to do something; sanction or warrant’—suggests that enabling websites to be publicly viewable is not ‘authorization’ that can be revoked.”

  • The privacy interests enunciated by LinkedIn on behalf of its users are “of exceptional importance,” and the court discounted the fact that hiQ is “unaccountable” and has no contractual relationship with LinkedIn users, such that hiQ could conceivably share the scraped data or aggregate it with other data.

“Instead of recognizing that LinkedIn members share their information on LinkedIn with the expectation that it will be viewed by a particular audience (human beings) in a particular way (by visiting their pages)—and that it will be subject to LinkedIn’s sophisticated technical measures designed to block automated requests—the panel assumed that LinkedIn members expect that their data will be ‘accessed by others, including for commercial purposes,’ even purposes antithetical to their privacy setting selections. That conclusion is fundamentally wrong.”
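
The petition’s reference to “technical measures designed to block automated requests” points to the server-side defenses, such as rate limits and bot-detection heuristics, that platforms commonly deploy against scrapers. The sketch below is a minimal, hypothetical illustration of such a measure; the thresholds, user-agent tokens, and logic are our own assumptions for illustration only and are not a description of LinkedIn’s actual systems.

```python
# Hypothetical sketch of a server-side check a site might use to turn away
# automated traffic. Thresholds and heuristics are invented for illustration.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60            # assumed sliding window
MAX_REQUESTS_PER_WINDOW = 30   # assumed per-IP request budget
BLOCKED_USER_AGENT_TOKENS = ("bot", "crawler", "scraper")  # naive heuristic

_recent_requests = defaultdict(deque)  # ip -> timestamps of recent requests


def allow_request(ip, user_agent, now=None):
    """Return True if the request should be served, False if it looks automated."""
    now = time.time() if now is None else now

    # Crude user-agent screening.
    if any(token in user_agent.lower() for token in BLOCKED_USER_AGENT_TOKENS):
        return False

    # Sliding-window rate limit per source IP.
    window = _recent_requests[ip]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS_PER_WINDOW:
        return False
    window.append(now)
    return True


if __name__ == "__main__":
    # A burst of 35 requests from one IP: the first 30 pass, the rest are blocked.
    results = [allow_request("203.0.113.7", "Mozilla/5.0") for _ in range(35)]
    print(results.count(True), "allowed,", results.count(False), "blocked")
```

Real platforms presumably layer many such signals (IP reputation, behavioral analysis, CAPTCHAs) rather than relying on any single check like this one.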

Both website operators and open internet advocates will be watching closely to see if the full Ninth Circuit decides to rehear the appeal, given the importance of the CFAA issue and the prevalence of data scraping of publicly available website content. We will continue to monitor developments.

hiQ v. LinkedIn Redux? Ninth Circuit Decision Tested in New Case

UPDATE: On October 14, 2019, the parties entered into a Joint Stipulation dismissing the case, with prejudice.  It appears from some reports that Stackla’s access to Facebook has been reinstated as part of the settlement.

 

UPDATE: On September 27, 2019, the California district court issued its written order denying Stackla’s request for a TRO. In short, the court found that, at this early stage, Stackla had demonstrated only “speculative harm,” and that its “vague statements” did not sufficiently show that restoration of access to Facebook’s API would cure the allegedly imminent harm of Stackla losing customers and being driven out of business (“The extraordinary relief of a pre-adjudicatory injunction demands more precision with respect to when irreparable harm will occur than ‘soon.’”). As for weighing whether a TRO would be in the public interest, the court, while understanding Stackla’s predicament, found that issuing a TRO could hamper Facebook’s ability to “decisively police its social-media platforms” and that there was a public interest in allowing a company to police the integrity of its platforms (“Facebook’s enforcement activities would be compromised if judicial review were expected to precede rather than follow its enforcement actions”). [emphasis in original]. This ruling leaves the issue for another day, perhaps for a preliminary injunction hearing after additional briefing.

The ink is barely dry on the landmark Ninth Circuit hiQ Labs decision, yet a new dispute has already cropped up testing the bounds of the CFAA and the ability of a platform to enforce terms restricting unauthorized scraping of social media content. (See Stackla, Inc. v. Facebook, Inc., No. 19-5849 (N.D. Cal. filed Sept. 19, 2019)). This dispute involves Facebook and a social media sentiment tracking company, Stackla, Inc., which, as part of its business, accesses Facebook and Instagram content. This past Wednesday, September 25th, the judge in the case denied Stackla’s request for emergency relief restoring its access to Facebook’s platform. While the judge has yet to issue a written ruling, the initial pleadings and memoranda filed in the case are noteworthy and raise important questions surrounding the hot-button issue of scraping.

The Stackla dispute has echoes of hiQ v. LinkedIn. Both involve the open nature of “public” websites (although the “public” nature of the content at issue here appears to be in dispute). Both address whether the Computer Fraud and Abuse Act (the “CFAA”) can be used as a tool to prevent the scraping of such sites. Both address how a platform may use its terms of use to prohibit automated scraping or data collection beyond the scope of such terms, although the discussion in hiQ was extremely brief. And like hiQ, Stackla asserts that without the ability to use Facebook and Instagram data it would be out of business; thus, both disputes raise the question of whether a court’s equitable powers should come into play when a platform’s termination of access will result in a particular company’s insolvency. Given the Ninth Circuit’s opinion in favor of hiQ, it is highly likely that Stackla’s lawyers believed that decision was their golden ticket in this case. The judge’s ruling on the request for emergency relief suggests they may be disappointed.
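
At the center of the Stackla dispute is the platform’s revocation of the company’s access to its APIs. As a purely hypothetical illustration of what such a cutoff looks like mechanically from the client side, the sketch below shows a generic API call failing with an authorization error once the platform revokes the app’s credentials; the endpoint, token, and error handling are invented placeholders and do not describe Facebook’s actual API.

```python
# Hypothetical sketch of what revoked platform access looks like to a client.
# The endpoint and token are placeholders; requests is a third-party HTTP
# library (pip install requests).
import requests

API_ENDPOINT = "https://api.example-platform.com/v1/media"  # placeholder URL
ACCESS_TOKEN = "EXAMPLE-TOKEN"                               # placeholder token


def fetch_content():
    resp = requests.get(
        API_ENDPOINT,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    if resp.status_code in (401, 403):
        # Once the platform revokes the app's credentials, every call fails
        # here: the practical "termination of access" at issue in the case.
        raise PermissionError("Platform access revoked or suspended")
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    try:
        print(fetch_content())
    except PermissionError as err:
        print("Access denied:", err)
```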

Continue Reading

In Blockbuster Ruling, Ninth Circuit Affirms hiQ Injunction — CFAA Claim Likely Not Available for Scraping Publicly Available Website Data

In a ruling that is being hailed as a victory for web scrapers and the open nature of publicly available website data, the Ninth Circuit today issued its long-awaited opinion in hiQ Labs, Inc. v. LinkedIn Corp., No. 17-16783 (9th Cir. Sept. 9, 2019). The crucial question before the court was whether once hiQ Labs, Inc. (“hiQ”) received LinkedIn Corp.’s (“LinkedIn”) cease-and-desist letter demanding it stop scraping public LinkedIn profiles, any further scraping of such data was “without authorization” within the meaning of the federal Computer Fraud and Abuse Act (CFAA). The appeals court affirmed the lower court’s order granting a preliminary injunction barring the professional networking platform LinkedIn from blocking hiQ, a data analytics company, from accessing and scraping publicly available LinkedIn member profiles to create competing business analytic products. Most notably, the Ninth Circuit held that hiQ had shown a likelihood of success on the merits in its claim that when a computer network generally permits public access to its data, a user’s accessing that publicly available data will not constitute access “without authorization” under the CFAA.
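
For readers unfamiliar with the mechanics, the scraping at issue is, at bottom, automated retrieval of pages that any browser can view without logging in. The sketch below is a minimal, hypothetical example of that kind of collection; the URL and page selectors are invented placeholders, it assumes the third-party requests and beautifulsoup4 packages, and it is not a description of hiQ’s actual methods.

```python
# Minimal, hypothetical sketch of scraping a publicly viewable profile page:
# an HTTP GET that any browser could make, followed by parsing visible fields.
# The URL and CSS selectors are invented placeholders, not LinkedIn's markup.
import requests
from bs4 import BeautifulSoup

PROFILE_URL = "https://www.example.com/in/some-public-profile"  # placeholder


def scrape_public_profile(url):
    resp = requests.get(url, timeout=10)  # no login or credentials involved
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # Selectors below are assumptions about the page structure.
    name = soup.select_one("h1.profile-name")
    headline = soup.select_one("p.profile-headline")
    return {
        "name": name.get_text(strip=True) if name else None,
        "headline": headline.get_text(strip=True) if headline else None,
    }


if __name__ == "__main__":
    print(scrape_public_profile(PROFILE_URL))
```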

In light of this ruling, data scrapers, content aggregators and advocates of a more open internet will certainly be emboldened, but we reiterate something we advised back in our 2017 Client Alert about the lower court’s hiQ decision: while the Ninth Circuit’s decision suggests that the CFAA is not an available remedy to protect against unwanted scraping of public website data that is “presumptively open to all,” entities engaged in scraping should remain careful. The road ahead, while perhaps less bumpy than before, still contains rough patches. Indeed, the Ninth Circuit cautioned that its opinion was issued only at the preliminary injunction stage and that the court did not “resolve the companies’ legal dispute definitively, nor do we address all the claims and defenses they have pleaded in the district court.” Continue Reading

Ninth Circuit Releases Another Important CDA Section 230 Opinion With Broad Application – Automated Content Recommendation and Notification Tools Do Not Make Social Site the Developer of User Posts

In the swirl of scrutiny surrounding the big Silicon Valley tech companies, and with some in Congress arguing that Section 230 of the Communications Decency Act (CDA) should be curtailed, 2019 has quietly been an important year for CDA jurisprudence, with a number of opinions enunciating robust immunity under CDA Section 230. In particular, there has been a trio of noteworthy circuit court-level opinions rejecting plaintiffs’ attempts to make an “end run” around the CDA based on the assertion that online service providers lose immunity if they algorithmically recommend content or recast it in another form to other users of the site.

This week, in a case with an unsettling fact pattern, the Ninth Circuit made it a quartet, ruling that a now-shuttered social networking site was immune from liability under the CDA for connecting a user with a dealer who sold him the narcotics on which he overdosed. The court found such immunity because the site’s functions, which included content recommendations and notifications to members of discussion groups, were “content-neutral tools” used to facilitate communications. (Dyroff v. The Ultimate Software Group, Inc., No. 18-15175 (9th Cir. Aug. 20, 2019)). Continue Reading

Recent Rulings Highlight Limits of CDA Immunity in Products Liability Cases against E-Commerce Platforms

UPDATE: On August 23, 2019, the Third Circuit granted Amazon’s petition for rehearing en banc in the Oberdorf case. Per the order, the opinion dated July 3, 2019 was vacated.

In early July, an appeals court ruled that Amazon should be considered a “seller” of goods under Pennsylvania products liability law and subject to strict liability for consumer injuries caused by defective goods sold on its site by third-party vendors. (Oberdorf v. Amazon.com, No. 18-1041 (3d Cir. July 3, 2019)). While the decision involved an interpretation of Pennsylvania law – and Amazon has previously prevailed on the “seller” issue in various courts around the country in recent years – the ruling is still noteworthy because it was based upon § 402A of the Restatement (Second) of Torts (which other states may follow) and may signal a willingness to reinterpret the definition of “seller” in the modern era of online platforms. The decision also highlights the limits of immunity under Section 230 of the Communications Decency Act (CDA) for online marketplaces when it comes to products liability claims based on a site’s sales activity, as opposed to editorial decisions related to third-party product listings. Continue Reading

Personal Email Management Service Settles FTC Charges over Allegedly Deceptive Statements to Consumers Regarding Its Access and Use of Subscribers’ Email Accounts

This week, the FTC entered into a proposed settlement with Unrollme Inc. (“Unrollme”), a free personal email management service that offers to assist consumers in managing the flood of subscription emails in their inboxes. The FTC alleged that Unrollme made certain deceptive statements to consumers, who may have had privacy concerns, to persuade them to grant the company access to their email accounts. (In re Unrollme Inc., File No. 172 3139 (FTC proposed settlement announced Aug. 8, 2019)).

This settlement touches on many relevant issues, including the delicate nature of online providers’ privacy practices relating to consumer data collection, the importance of consumers understanding the extent of data collection when signing up for and consenting to a new online service or app, and the need for downstream recipients of anonymized market data to understand how such data is collected and processed. (See also our prior post covering an enforcement action involving user geolocation data collected from a mobile weather app). Continue Reading

Finding Article III Standing, Ninth Circuit Declines to Do an About-Face in Illinois Biometric Privacy Class Action against Facebook

In an important opinion, the Ninth Circuit affirmed a lower court’s ruling that the plaintiffs in the ongoing Facebook biometric privacy class action have alleged a concrete injury-in-fact sufficient to confer Article III standing and that the class was properly certified. (Patel v. Facebook, Inc., No. 18-15982 (9th Cir. Aug. 8, 2019)). Given the California district court’s prior rulings denying Facebook’s numerous motions to dismiss on procedural and substantive grounds, and the Illinois Supreme Court’s January 2019 blockbuster ruling in Rosenbach, which held that a person “aggrieved” by a violation of the Illinois Biometric Information Privacy Act (“BIPA”) need not allege some actual injury or harm beyond a procedural violation to have standing to bring an action under the statute, the Ninth Circuit’s decision was not entirely surprising. Still, the ruling is significant because a federal appeals court has now ruled on important procedural issues in a BIPA action and found standing. The case will be sent back to the lower court with the prospect of a trial looming, and, given BIPA’s statutory damages provisions, Facebook may be looking at a potentially staggering damage award or a substantial settlement. Continue Reading

Facebook Shielded by CDA Immunity against Federal Claims for Allowing Use of Its Platform by Terrorists

In recent years, there have been a number of suits filed in federal courts seeking to hold social media platforms responsible for providing material support to terrorists by allowing members of such groups to use social media accounts and failing to effectively block their content and terminate such accounts. As we’ve previously written about, such suits have generally not been successful at either the district court or circuit court level and have been dismissed on the merits or on the basis of immunity under Section 230 of the Communications Decency Act (CDA).

This past month, in a lengthy, important 2-1 decision, the Second Circuit affirmed dismissal of federal Anti-Terrorism Act (ATA) claims against Facebook on CDA grounds for allegedly providing “material support” to Hamas. The court also declined to exercise supplemental jurisdiction over plaintiff’s foreign law claims. (Force v. Facebook, Inc., No. 18-397 (2d Cir. July 31, 2019)).  Despite the plaintiffs’ creative pleadings that sought to portray Facebook’s processing of third-party content as beyond the scope of CDA immunity, the court found that claims related to supplying a communication forum and failing to completely block or eliminate hateful terrorist content necessarily treated Facebook as the publisher of such content and were therefore barred under the CDA.  Continue Reading

Ticketmaster Reaches Settlement with Ticket Broker over Unauthorized Use of Automated Bots

In early July, Ticketmaster reached a favorable settlement in its action against a ticket broker that was alleged to have used automated bots to purchase tickets in bulk, thus ending a dispute that produced notable court decisions examining the potential liabilities for unwanted scraping and website access. (Ticketmaster L.L.C. v. Prestige Entertainment West Inc., No. 17-07232 (C.D. Cal. Final Judgment July 8, 2019)).

In the litigation, Ticketmaster alleged that the defendant ticket broker, Prestige, used bots and dummy accounts to navigate Ticketmaster’s website and mobile app and purchase large quantities of tickets to popular events for resale at higher prices on the secondary market. Under the terms of the settlement, Prestige is permanently enjoined from using ticket bot software to search for, reserve or purchase tickets on Ticketmaster’s site or app (at rates faster than human users can achieve using standard web browsers or mobile apps) and from circumventing any CAPTCHA or other access control measures on Ticketmaster’s sites that enforce ticket purchasing limits and purchasing order rules. Prestige is also barred from violating Ticketmaster’s terms of use, conspiring with anyone else to violate the terms, or engaging in any other prohibited activity. Continue Reading
