When Accurate Data Creates an Unfair Picture
Posted at 6:36 p.m. on June 19, 2014
What happens when a good-faith effort to move away from broad categorizations of people inadvertently leads to discrimination? A joint big-data event hosted by the White House Office of Science and Technology Policy and Georgetown University brought up an interesting example concerning car insurance.
Nicole Wong, the White House’s deputy chief technology officer, said it’s currently legal for companies to determine a consumer’s insurance rate based on age or gender. If you don’t like the idea of having your car insurance rates determined by these factors, technology now makes it possible to collect data directly from the car and base rates on actual driving performance, she said.
But what if the policyholder’s rates increase because, say, they drive at night or live in a poor neighborhood? Is that fair? asked Commissioner Julie Brill of the Federal Trade Commission.
Brill said it’s a matter “we ought to examine carefully and ask some real questions about.” The correlation may be accurate, and at some level it makes sense, but it “has a disparate impact,” Brill said.
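One common way regulators quantify the kind of disparate impact Brill describes is the “four-fifths rule”: if one group’s rate of favorable outcomes falls below 80 percent of another group’s, the practice may warrant scrutiny even when no protected attribute is used as an input. The sketch below applies that heuristic to invented telematics-based quotes; all data, group labels, and thresholds are hypothetical and do not come from the event or any real insurer.

```python
# A minimal, hypothetical sketch of a disparate-impact check on usage-based
# insurance pricing. Every number and label here is invented for illustration.

# Hypothetical quotes produced by a telematics-based pricing model,
# tagged with a demographic group the model never looked at directly.
quotes = [
    {"group": "A", "monthly_premium": 92.0},
    {"group": "A", "monthly_premium": 105.0},
    {"group": "A", "monthly_premium": 98.0},
    {"group": "B", "monthly_premium": 131.0},
    {"group": "B", "monthly_premium": 118.0},
    {"group": "B", "monthly_premium": 127.0},
]

HIGH_RATE = 115.0  # illustrative cutoff for a "high-cost" quote

def favorable_rate(group):
    """Share of a group's quotes that fall below the high-cost cutoff."""
    premiums = [q["monthly_premium"] for q in quotes if q["group"] == group]
    return sum(p < HIGH_RATE for p in premiums) / len(premiums)

# Four-fifths rule: if group B's favorable-outcome rate is less than 80% of
# group A's, the pricing may have a disparate impact even though no protected
# attribute was ever fed into the model.
ratio = favorable_rate("B") / favorable_rate("A")
print(f"Group B vs. Group A favorable-rate ratio: {ratio:.2f}")
print("Possible disparate impact" if ratio < 0.8 else "Within the 80% guideline")
```

The point of the sketch is only that a correlation can be “accurate” in the sense Brill concedes while still producing group-level disparities that a simple ratio test would flag.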