We have been closely following the legal and legislative developments relating to biometric privacy, and in particular the flow of litigation under the Illinois biometric privacy law. It was interesting to see how the Illinois law (as well as a similar Texas law) influenced Google's offering of a new facial recognition feature on the Google Arts & Culture app. (It is also interesting to note that media coverage of the app has made the Illinois and Texas laws subjects of mainstream discourse.)

The Google Arts & Culture app, which was originally released a couple of years ago, offers users virtual tours of museums and a searchable database of other art-related content. What recently made it one of the hottest free apps is an entertaining new tool that compares a selfie to a database of great works of art and presents the results that most closely match the user's face. [Note: My classical art doppelgänger is "Portrait of a Gentleman in Red" by Rosalba Carriera. What's yours?]. However, out of an apparent abundance of caution, Google has disabled this art-twinning function in Illinois and Texas, presumably because those states have biometric privacy laws that regulate the collection and use of biometric identifiers like facial templates. While the Texas statute can be enforced only by the state attorney general, Illinois's Biometric Information Privacy Act (BIPA) contains a private right of action and remedies that include statutory damages. Interestingly, Washington users are able to access this tool, despite Washington having enacted its own biometric privacy law last year. Perhaps that is because, as described in the referenced blog post, compliance under the Washington statute is less demanding than under the Illinois or Texas statutes.

As we have previously written, there has been a wave of biometric privacy suits filed in Illinois state courts against Illinois employers and other businesses alleging procedural violations of BIPA, which generally regulates the collection, retention and disclosure of personal biometric identifiers and biometric information, requires businesses to obtain prior consent before collecting such data, and requires reasonable safeguards when storing it. The statute contains defined terms and limitations, and parties in ongoing suits are currently litigating what "biometric identifiers" and "biometric information" mean under the statute and whether the collection of facial templates from uploaded photographs using sophisticated facial recognition technology fits within its ambit. Indeed, Google itself is defending a putative class action alleging that the cloud-based Google Photos service violated BIPA by automatically uploading plaintiffs' mobile photos and allegedly scanning them to create unique face templates (or "faceprints") for subsequent photo-tagging without consent.

Google's Arts & Culture app selfie tool requires users to give consent before the app collects a selfie, and it informs users that the app uses the photo for comparison purposes only and does not store selfies afterward.

Therefore, while it is not clear whether Google complies with all of the requirements of Illinois and Texas law, it does appear that at least some of the basic requirements are fulfilled. Still, given the flood of BIPA litigation over the past year (which may see a downturn due to a recent Illinois appellate court opinion that dismissed procedural BIPA claims) and the ongoing disputes over mobile photo tagging and facial recognition features, it is not surprising that Google chose to block Illinois and Texas users to avoid legal risk. In fact, a facial recognition feature on Google's Nest Cam IQ security camera is also disabled in Illinois.

One wonders whether Google intended to send a veiled message to state legislatures (particularly Illinois): if you enact biometrics laws that give rise to endless litigation, residents of your state may not be offered all of the benefits of certain new technologies. To the extent others follow Google's lead in disabling biometric applications in certain states, that message, whether intended or not, could become a lot clearer.