For all its potential benefits, big data can lead to discrimination and worsen economic disparity, the Federal Trade Commission warned in a new report that includes caveats and guidelines for businesses. Entitled "Big Data: A Tool for Inclusion or Exclusion," the report stems from a 2014 FTC workshop of the same name and incorporates the public comments that followed.
The report concludes that big data can benefit under-served populations by expanding opportunities for education, credit, health care and employment. On the flip side, it can also lead to reduced opportunities and the targeting of vulnerable consumers for fraud and higher prices.
Overall, big data can end up perpetuating existing economic disparities or creating new ones, the FTC said.
"Big data's role is growing in nearly every area of business, affecting millions of consumers in concrete ways," said Edith Ramirez, chairwoman of the FTC. "Businesses must ensure that their big data use does not lead to harmful exclusion or discrimination."
Toward that end, the FTC outlined some of the laws that apply to the use of big data, including the Fair Credit Reporting Act, the FTC Act and equal opportunity laws. It also offered a range of questions for businesses to consider when they examine whether their big data programs comply with these laws.
The report also proposes four key policy questions to help companies determine how to maximize the benefits of their use of big data while limiting possible harms.
Last year, the FTC's Bureau of Consumer Protection established the Office of Technology Research and Investigation, dedicated to understanding algorithmic transparency and related issues. This week at CES, Ramirez also urged companies to expand their privacy efforts.
Market researcher Gartner predicts that the improper use of big data analytics will cause half of all business ethics violations by 2018.
The FTC's new report is available for download from the FTC site.
The Equal Credit Opportunity Act has been a key tool in the battle against algorithmic discrimination, said Rachel Goodman, a staff attorney for the American Civil Liberties Union's Racial Justice Program, in an email.
"Communities of color have long been victimized by credit discrimination, alternately starved of credit or flooded with predatory loans, depending on the era," Goodman explained.
The law will be just as crucial for combating what's known as "digital redlining," the newest form of credit discrimination, she said.
Also important, Goodman pointed out, is that the report recognizes that predictive analytics may lead companies to engage in discrimination that violates civil rights laws.
"While it rightly urges companies to be careful not to discriminate, self-monitoring is not enough," Goodman said. "We need systems for auditing the proprietary algorithms that make crucial decisions about housing, credit and employment, in order to ensure that they treat everyone fairly."