When Big Data Becomes Bad Data
by Lauren Kirchner, ProPublica, Sep. 2, 2015, 12:24 p.m.

A recent ProPublica analysis of The Princeton Review's prices for online SAT tutoring shows that customers in areas with a high density of Asian residents are often charged more. When presented with this finding, The Princeton Review called it an "incidental" result of its geographic pricing scheme. The case illustrates how even a seemingly neutral pricing model can lead to inadvertent bias, bias that's hard for consumers to detect and even harder to challenge or prove.

Over the past several decades, an important tool for assessing and addressing discrimination has been the "disparate impact" theory. Attorneys have used this idea to successfully challenge policies that have a discriminatory effect on certain groups of people, whether or not the entity that crafted the policy was motivated by an intent to discriminate. It has been deployed in lawsuits involving employment decisions, housing and credit. Going forward, the question is whether the theory can be applied to bias that results from new technologies that use algorithms.

One unexpected effect of the company's geographic approach to pricing is that Asians are almost twice as likely to be offered a higher price than non-Asians, the ProPublica analysis shows.

The term "disparate impact" was first used in the 1971 Supreme Court case Griggs v. Duke Power Company. The Court ruled that, under Title VII of the Civil Rights Act, it was illegal for the company to use intelligence test scores and high school diplomas, factors which were shown to disproportionately favor white applicants and substantially disqualify people of color...