Epic Games, Inc. (“Epic”) is the publisher of the popular online multiplayer video game Fortnite, released in 2017. In recent years, Fortnite has gained worldwide popularity with gamers and esports followers (culminating in July 2019, when a sixteen-year-old player won the $3 million prize for winning the Fortnite World Cup). In one version of the game, players are dropped onto a virtual landscape and compete in a battle royale to survive. In the real world, Epic recently survived its own encounter – not with the help of scavenged weapons or shield potions – but through its well-drafted end user license agreement (“EULA” or “terms”).

Earlier this month, the District Court for the Eastern District of North Carolina granted Epic’s motion to compel individual arbitration of the claims of a putative class action.  The action arose in connection with a cyber vulnerability that allowed hackers to breach user accounts. The court concluded that the arbitration provision contained in the EULA was enforceable in this case, even where a minor was the person who ultimately assented to the terms. (Heidbreder v. Epic Games, Inc., No. 19-348 (E.D.N.C. Feb. 3, 2020)).   

Recently, the Ninth Circuit reinstated a $460,000 jury verdict against print-on-demand site Zazzle, Inc. (“Zazzle”) for willful copyright infringement, putting a final stamp (perhaps) on a long-running dispute that explored important DMCA safe harbor issues for online print-on-demand services. (Greg Young Publishing, Inc. v. Zazzle, Inc., No. 18-55522 (9th Cir. Nov. 20, 2019) (unpublished)). The appeals court found that Zazzle’s anti-infringement oversight mechanisms were insufficient during the period of infringement, when a number of plaintiff Greg Young Publishing, Inc.’s (“GYPI”) visual artworks were uploaded by users to Zazzle’s site without authorization.

With the online shopping season in full swing, the FTC decided that online retailers might benefit from a reminder as to the dos and don’ts for social media influencers.  Thus, the FTC released a new guide, “Disclosures 101 for Social Media Influencers,” that reiterates its position about the responsibility of “influencers” to disclose “material” connections with brands when endorsing products in online posts.  Beyond this new guide, which is written in an easy-to-read brochure format (with headings such as “How to Disclose” and “When to Disclose”), the FTC released related videos to convey the message that influencers should “stay on the right side of the law” and disclose, when appropriate, their relationship with the brand being endorsed.  This latest reminder comes on the heels of the FTC sending 90 letters to influencers in April 2017 notifying them of their responsibilities under the FTC’s Endorsement Guides, and its earlier publication of an Endorsement Guides FAQ. With the release of fresh guidance, now is a good time for brands that work with influencers to ensure endorsements are not deceptive and remain on the right side of the law.  Indeed, advertisers should have reasonable programs in place to train and monitor members of their influencer networks, and influencers themselves should remain aware of their obligations under the Endorsement Guides.

On October 11, 2019, LinkedIn Corp. (“LinkedIn”) filed a petition for rehearing en banc of the Ninth Circuit’s blockbuster decision in hiQ Labs, Inc. v. LinkedIn Corp., No. 17-16783 (9th Cir. Sept. 9, 2019). The crucial question before the original panel concerned the scope of the Computer Fraud and Abuse Act (CFAA): whether hiQ’s continued scraping of publicly available LinkedIn member profiles, after receiving LinkedIn’s cease-and-desist letter, constituted access “without authorization” under the statute.

UPDATE: On October 14, 2019, the parties entered into a Joint Stipulation dismissing the case, with prejudice.  It appears from some reports that Stackla’s access to Facebook has been reinstated as part of the settlement.
UPDATE: On September 27, 2019, the California district court issued its written order denying Stackla’s request for a TRO.  In short, the court found that, at this early stage, Stackla had only demonstrated “speculative harm,” and that its “vague statements” did not sufficiently show that restoring access to Facebook’s API would cure the allegedly imminent harm of Stackla losing customers and being driven out of business (“The extraordinary relief of a pre-adjudicatory injunction demands more precision with respect to when irreparable harm will occur than ‘soon.’”).  As for weighing whether a TRO would be in the public interest, the court, while understanding Stackla’s predicament, found that issuing a TRO could hamper Facebook’s ability to “decisively police its social-media platforms” and that there was a public interest in allowing a company to police the integrity of its platforms (“Facebook’s enforcement activities would be compromised if judicial review were expected to precede rather than follow its enforcement actions”) [emphasis in original]. This ruling leaves the question for another day, perhaps at a preliminary injunction hearing, after additional briefing of the issues.

The ink is barely dry on the landmark Ninth Circuit hiQ Labs decision. Yet, a new dispute has already cropped up testing the bounds of the CFAA and the ability of a platform to enforce terms restricting unauthorized scraping of social media content. (See Stackla, Inc. v. Facebook, Inc., No. 19-5849 (N.D. Cal. filed Sept. 19, 2019)).  This dispute involves Facebook and a social media sentiment tracking company, Stackla, Inc., which, as part of its business, accesses Facebook and Instagram content. This past Wednesday, September 25th, the judge in the case denied Stackla’s request for emergency relief restoring its access to Facebook’s platform. While the judge has yet to issue a written ruling, the initial pleadings and memoranda filed in the case are noteworthy and raise important questions surrounding the hot-button issue of scraping.

The Stackla dispute has echoes of hiQ v. LinkedIn. Both involve the open nature of “public” websites (although the “public” nature of the content at issue here appears to be in dispute).  Both address whether the Computer Fraud and Abuse Act (the “CFAA”) can be used as a tool to prevent the scraping of such sites. Both address how a platform may use its terms of use to prohibit automated scraping or data collection beyond the scope of those terms, although the discussion of that issue in hiQ was extremely brief.  And like hiQ, Stackla asserts that, without the ability to use Facebook and Instagram data, it would be out of business; thus, both disputes ask whether a court’s equitable powers should come into play when a platform’s termination of access would result in a particular company’s insolvency.  Given the Ninth Circuit’s opinion in favor of hiQ, Stackla’s lawyers likely believed that decision was their golden ticket in this case. The judge’s ruling on the request for emergency relief suggests they may be disappointed.

In a ruling that is being hailed as a victory for web scrapers and the open nature of publicly available website data, the Ninth Circuit today issued its long-awaited opinion in hiQ Labs, Inc. v. LinkedIn Corp., No. 17-16783 (9th Cir. Sept. 9, 2019). The crucial question before the court was whether once hiQ Labs, Inc. (“hiQ”) received LinkedIn Corp.’s (“LinkedIn”) cease-and-desist letter demanding it stop scraping public LinkedIn profiles, any further scraping of such data was “without authorization” within the meaning of the federal Computer Fraud and Abuse Act (CFAA). The appeals court affirmed the lower court’s order granting a preliminary injunction barring the professional networking platform LinkedIn from blocking hiQ, a data analytics company, from accessing and scraping publicly available LinkedIn member profiles to create competing business analytic products. Most notably, the Ninth Circuit held that hiQ had shown a likelihood of success on the merits in its claim that when a computer network generally permits public access to its data, a user’s accessing that publicly available data will not constitute access “without authorization” under the CFAA.

In light of this ruling, data scrapers, content aggregators and advocates of a more open internet will certainly be emboldened, but we reiterate something we advised back in our 2017 Client Alert about the lower court’s hiQ decision: while the Ninth Circuit’s decision suggests that the CFAA is not an available remedy to protect against unwanted scraping of public website data that is “presumptively open to all,” entities engaged in scraping should remain careful. The road ahead, while perhaps less bumpy than before, still contains rough patches. Indeed, the Ninth Circuit cautioned that its opinion was issued only at the preliminary injunction stage and that the court did not “resolve the companies’ legal dispute definitively, nor do we address all the claims and defenses they have pleaded in the district court.”

In the swirl of scrutiny surrounding the big Silicon Valley tech companies, and with some in Congress arguing that Section 230 of the Communications Decency Act (CDA) should be curtailed, 2019 has quietly been an important year for CDA jurisprudence, with a number of opinions enunciating robust immunity under CDA Section 230. In particular, a trio of noteworthy circuit court-level opinions rejected plaintiffs’ attempts to make an “end run” around the CDA based on the assertion that online service providers lose immunity if they algorithmically recommend or recast content in another form to other users of the site.

This week, in a case with an unsettling fact pattern, the Ninth Circuit made it a quartet – ruling that a now-shuttered social networking site was immune from liability under the CDA for connecting a user with a dealer who sold him the narcotics that led to his overdose. The court found such immunity because the site’s functions, which included content recommendations and notifications to members of discussion groups, were “content-neutral tools” used to facilitate communications. (Dyroff v. The Ultimate Software Group, Inc., No. 18-15175 (9th Cir. Aug. 20, 2019)).