The big data revolution is quietly chugging along: devices, sensors, websites and networks are collecting and producing significant amounts of data, the cost of data storage continues to plummet, public and private sector interest in data mining is growing, computational and statistical methods have advanced, and more and more data scientists are using new software and capabilities to make sense of it all. The potential benefits of big data are now well-known, but what are some of the legal, ethical and compliance risks, and when do modern data analytics produce unintended discriminatory effects? To explore these issues, the FTC held a workshop in September 2014 and, earlier this month, released a report, “Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues.”
Companies that use big data are likely already familiar with the myriad privacy-related legal issues — data collection and online behavioral tracking, notice and consumer choice, data security, anonymization and de-identification, intra-company data sharing, retail consumer tracking, and many others. But beyond these concerns, the FTC’s report discusses another set of issues surrounding big data: the risks that big data analytics create with respect to consumer protection and equal opportunity laws. The Report also directs companies to attempt to minimize the risk that data inaccuracies and inherent biases might harm or exclude certain consumers (particularly with respect to credit offers, and educational and employment opportunities). The Report outlines a number of potential harms, including:
- Individuals mistakenly being denied opportunities. Participants in the FTC’s workshop raised concerns that companies using big data to better know their customers may, at times, base their assumptions about a consumer disproportionately on comparisons with a generalized data set of individuals who share similar attributes.
- Ad targeting practices that reinforce existing disparities.
- The exposure of consumers’ sensitive information.
- The targeting of vulnerable consumers for fraud.
- The creation of new justifications for exclusion of certain populations from particular opportunities.
- Offering higher-priced goods and services to lower income communities.
Consumer Protection Laws Potentially Applicable to Big Data
The Report mentions several federal laws that might apply to certain big data practices, including the Fair Credit Reporting Act, equal opportunity laws, and the FTC Act.
Fair Credit Reporting Act
As the Report notes, the Fair Credit Reporting Act (FCRA) applies to companies, known as consumer reporting agencies or CRAs, that compile and sell consumer reports containing consumer information that is used or expected to be used for decisions about consumer eligibility for credit, employment, insurance, housing, or other covered transactions. Among other things, CRAs must take reasonable steps to ensure the accuracy of consumer reports and must provide consumers with access to their own information and the ability to correct any errors. Traditionally, CRAs included credit bureaus and background screening companies, but the scope of the FCRA may extend beyond traditional credit bureaus. See, e.g., United States v. Instant Checkmate, Inc., No. 14-00675 (S.D. Cal. filed Mar. 24, 2014) (website that allowed users to search public records for information about anyone and that was marketed for use in background checks was subject to the FCRA; entity settled FTC charges, paid a $550,000 civil fine and agreed to future compliance).
Companies that use consumer reports also have FCRA obligations, such as providing consumers with “adverse action” notices if the companies use the consumer report information to deny credit or certain other benefits. The Report notes, however, that the FCRA does not apply when companies use data derived from their own relationship with customers for purposes of making decisions about them. Big data has created a new twist on compliance, though. The Report mentions a growing trend in which companies purchase predictive analytics products for eligibility determinations, but instead of relying on traditional credit characteristics (e.g., payment history), these new products may evaluate creditworthiness by comparing non-traditional characteristics (e.g., zip code or social media usage) against an anonymized data set of groups that share the same characteristics. The FTC states that if an outside analytics firm regularly evaluates a company’s own data and provides evaluations to the company for eligibility determinations, the outside firm would likely be acting as a CRA, the company would likely be a user of consumer reports, and both entities would be subject to Commission enforcement under the FCRA. This new stance apparently runs counter to prior FTC policy, which had made an exception for anonymized data. In a footnote, the agency explains that its prior interpretation was inaccurate and that “if a report is crafted for eligibility purposes with reference to a particular consumer or group of consumers…the Commission will consider the report a consumer report even if the identifying information of the consumer has been stripped.”
Equal Opportunity Laws
Certain federal equal opportunity laws might also apply to big data analytics, such as the Equal Credit Opportunity Act (ECOA), Title VII of the Civil Rights Act of 1964, the Americans with Disabilities Act, the Age Discrimination in Employment Act, the Fair Housing Act, and the Genetic Information Nondiscrimination Act. Generally speaking, these laws prohibit discrimination based on protected characteristics. To prove a violation of such laws, plaintiffs typically must show “disparate treatment” or “disparate impact.” The Report offers an example: if a company makes credit decisions based on zip codes, it may be violating ECOA if the decisions have a disparate impact on a protected class and are not justified by a legitimate business necessity. The specific requirements of each federal statute are beyond the scope of this post, but the question of whether a practice is unlawful under equal opportunity laws is a fact-specific inquiry.
The FTC Act
Section 5 of the Federal Trade Commission Act prohibits unfair or deceptive acts or practices in or affecting commerce. The agency advises companies using big data to consider whether they are violating any material promises to consumers involving data sharing, consumer choice or data security, or whether companies have otherwise failed to disclose material information to consumers. Such violations of privacy promises have formed the basis of multiple FTC privacy-related enforcement actions in recent years. The Report states that companies that maintain big data on consumers should reasonably secure the data. The FTC also notes that companies may not sell their big data analytics products to customers if they know or have reason to know that those customers will use the products for fraudulent or discriminatory purposes.
Questions for Legal Compliance
In light of the above federal laws, the Report outlines several questions that companies already using or considering engaging in big data analytics should ask to remain in compliance:
- If you compile big data for others who will use it for eligibility decisions, are you complying with the accuracy and privacy provisions of the FCRA?
- If you receive big data products from another entity that you will use for eligibility decisions, are you complying with the provisions applicable to users of consumer reports?
- If you are a creditor using big data analytics in a credit transaction, are you complying with the requirement to provide statements of specific reasons for adverse action under ECOA?
- If you use big data analytics in a way that might adversely affect people in their ability to obtain credit, housing, or employment, are you treating people differently based on a prohibited basis, or do your practices have an adverse effect or impact on a member of a protected class?
- Are you honoring promises you make to consumers and providing consumers material information about your data practices?
- Are you maintaining reasonable security over consumer data?
- Are you undertaking reasonable measures to know the purposes for which your customers are using your data (e.g., fraud, discriminatory purposes)?
The Big Data Report also points to research that has shown how big data could potentially be used in the future to disadvantage underserved communities and adversely affect consumers on the basis of legally protected characteristics. To be sure, the potential risks of data mining are not new; they are inherent in any statistical analysis. To maximize the benefits and limit the harms, the Report suggests companies should consider the following questions raised by researchers as big data use increases:
- How representative is your data set? The agency advises that it is important to consider the digital divide and other issues of under-representation and over-representation in data inputs before launching a product or service to avoid skewed results.
- Does your data model account for biases? Companies should consider whether biases are being incorporated at both the collection and analytics stages of big data’s life cycle, and develop strategies to overcome any unintended impact on certain populations.
- How accurate are your predictions? The Report advises that human oversight of data and algorithms may be worthwhile when big data tools are used to make important decisions, such as those implicating health, credit, and employment.
- Does your reliance on big data raise ethical or fairness concerns? The Report states that companies should assess the factors that go into an analytics model and balance the predictive value of the model with fairness considerations.
With the issuance of its Big Data Report (and last year’s Data Broker Report), the FTC has signaled that it will actively monitor areas where data collection and big data analytics could violate existing laws and will push for public-private cooperation to ensure that the benefits of big data are maximized and the risks minimized. The Big Data Report is an important document for companies that provide big data analytics services or purchase such services to analyze consumer behavior or to aid in consumer eligibility decisions. It remains to be seen how the FTC’s policy statement will be received by industry (or subsequently reviewed by the courts), particularly the FTC’s assertion that certain uses of anonymized consumer data might implicate the FCRA. We have previously discussed the use of anonymized data for marketing and other purposes with respect to the Video Privacy Protection Act, and will continue to follow developments in this area closely to see how emerging practices mesh with privacy laws and regulations.