On January 6, the FTC issued a report on the commercial use of big data, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, summarizing the results of a September 2014 workshop and numerous public comments, including a paper and workshop comments by Alston & Bird Senior Counsel Peter Swire. The report addresses the commercial use of big data (as opposed to the collection, compilation, or analysis of such data) and cautions against uses that have the potential to be exclusionary or discriminatory, or that may violate applicable consumer protection laws. In its report, the FTC acknowledges the numerous benefits of big data, but highlights the potential legal and ethical risks involved in the use of such data, emphasizing that “big data may be used to categorize consumers in ways that can result in exclusion of [underserved or low income] populations[,]” whether because of inaccurate or incomplete data, hidden biases, or faulty or irrelevant correlations.
Compliance Considerations in Using Big Data
The FTC’s report acknowledges that the use of big data is nearly ubiquitous and focuses the majority of the discussion on how companies can use big data in a manner that “benefits them and society, while minimizing legal and ethical risks.” The report includes a lengthy discussion of the laws that companies should be aware of in connection with their use of big data (including examples of applicable FTC enforcement actions) and specifically notes that the FTC will continue to monitor for compliance and bring enforcement actions where appropriate. The laws discussed include the following:
- The Fair Credit Reporting Act (FCRA). The FCRA applies to consumer reporting agencies (CRAs) and also imposes certain obligations on those who use reports generated by CRAs. CRAs are entities that assemble and analyze consumer information to create and sell reports to third parties that will be used for credit, employment, insurance, housing, or other similar eligibility decisions (typically credit bureaus, employment background screening companies, etc.). The FCRA (among other things) requires CRAs to implement reasonable procedures to ensure the accuracy of consumer reports and to provide consumers with access to their own information and the ability to correct errors in such information. The FTC cautions that data brokers that compile non-traditional information, including social media information, may be considered CRAs subject to the FCRA if such information will be used for eligibility determinations. For example, the FTC investigated online data broker Spokeo for marketing consumer profiles to human resource departments specifically for employment eligibility purposes. The FTC’s complaint noted that Spokeo assembled personal information from multiple online and offline data sources, including social networks, which it merged to create detailed individual profiles (including name, hobbies, ethnicity, and religion) that it marketed for recruiting purposes. Thus, the FTC determined Spokeo was acting as a CRA subject to the requirements of the FCRA. The matter resulted in a consent decree that required Spokeo to pay $800,000 in civil penalties for failing to comply with the requirements of the FCRA.
The FTC further reminds us that, in addition to CRAs, companies that use consumer reports for eligibility-related purposes also have obligations under the FCRA. For example, if a company uses credit report information to deny an individual credit, housing or employment, or if it charges an individual more for credit or insurance products based on such information, it must provide the consumer with notice so that the consumer may correct any inaccuracies in the report. The FTC clarifies that the FCRA does not apply when a company uses its own customer data (derived from its own customer relationships) for purposes of eligibility determinations, but that it would apply if the company engaged a third party to evaluate such customer data on the company’s behalf, as the third party would then be acting as a CRA (and the company as a user of a consumer report).
Also of note in the report is the FTC’s reversal of its position on a key issue for many companies using big data: de-identification and its impact on consumer reports. In a 2011 guidance document entitled “40 Years of Experience with the Fair Credit Reporting Act,” the FTC took the position that “a ‘consumer report’ is a report on a ‘consumer’ to be used for certain purposes involving that ‘consumer.’ Information that does not identify a specific consumer does not constitute a consumer report even if the communication is used in part to determine eligibility.” Many companies interpreted this sentence to excuse FCRA compliance for datasets that might otherwise constitute consumer reports where the data was appropriately de-identified (i.e., where sufficient steps were taken to remove direct and indirect identifiers of specific individuals). However, in a particularly notable footnote, the FTC reversed its position on this issue, stating: “In 2011, FTC staff issued the 40 Years Report. In that report, staff stated that ‘[i]nformation that does not identify a specific consumer does not constitute a consumer report even if the communication is used in part to determine eligibility.’ The Commission does not believe that this statement is accurate. If a report is crafted for eligibility purposes with reference to a particular consumer or set of particular consumers (e.g., those that have applied for credit), the Commission will consider the report a consumer report even if the identifying information of the consumer has been stripped.” Although the new FTC report is not a binding regulation, companies that have relied on the Commission’s previous position in their use of de-identified datasets may need to revisit those practices in order to determine whether such datasets constitute consumer reports subject to additional FCRA compliance obligations.
- Federal Equal Opportunity Laws. Companies should also be aware of federal laws, such as the Equal Credit Opportunity Act (“ECOA”), that prohibit discrimination based on protected characteristics such as race, color, sex or gender, religion, age, disability status, national origin, marital status, and genetic information. The report includes a discussion of how a company’s use of big data may violate federal equal credit and employment opportunity laws if such use results in “disparate treatment” of, or a “disparate impact” on, consumers. For example, a creditor refusing to lend to single people based on research indicating that single people are less likely than married couples to repay loans may constitute disparate treatment in violation of ECOA. Similarly, a creditor who uses zip codes as a factor in determining whether to extend credit, which results in denial of credit to ethnic groups concentrated in particular zip codes, has a disparate impact on such ethnic groups and may be in violation of ECOA (unless the practice serves a legitimate business need that cannot reasonably be achieved by means that have a less disparate impact).
- Section 5 of the FTC Act. Section 5 of the FTC Act prohibits unfair or deceptive acts or practices affecting commerce, such as failing to honor statements contained in a privacy policy or failing to reasonably secure a consumer’s data where there is a significant risk of harm to the consumer if such data is disclosed. The FTC recommends that companies engaged in the use or storage of big data implement appropriate security measures using a risk-based approach (e.g., social security numbers and financial account data merit more stringent protections than name and email address). The FTC also notes that it is considered an unfair practice to sell data (including data-based products) if the company selling the data knows – or should reasonably know – that the buyer intends to use it for fraudulent purposes.
The report includes a list of questions for companies already using, or considering using, big data for analytics purposes to consider, which are designed to tease out potential compliance issues (e.g., if you receive “big data” from others and use it for eligibility determinations, are you complying with the FCRA provisions applicable to users of consumer reports?). The report concludes with the recommendation that, in addition to the compliance concerns, businesses consider a set of questions in connection with their use of big data designed to “maximize the benefits and limit the harms” associated with such use:
- How representative is your data set? Are you missing significant pieces of data from particular populations, or are there otherwise significant instances of over- or under-representation in your data sets?
- Does your data model account for biases? Companies should review data sets and algorithms for hidden biases that may have an unintended impact on certain groups.
- Are your predictions based on big data accurate? The FTC recommends “human oversight” to make sure correlations are relevant and meaningful, particularly where the data may be used to make decisions regarding a consumer’s credit, health or employment.
- Does your use of big data raise ethical or fairness concerns?
The report is a wake-up call to companies using big data, many of which may not even be aware that federal consumer protection laws may be implicated, particularly when the data is received from non-traditional CRAs or otherwise involves non-financial data or other data elements not typically associated with eligibility determinations. Companies should review their marketing and big data practices in light of the potential issues raised by the FTC’s report to ensure that their practices are compliant with consumer protection laws.