...the most significant findings in the report focus on the recognition that data can be used in subtle ways to create forms of discrimination — and to make judgments, sometimes in error, about who is likely to show up at work, pay their mortgage on time or require expensive treatment. The report states that the same technology that is often so useful in predicting places that would be struck by floods or diagnosing hard-to-find illnesses in infants also has “the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education and the marketplace.”
The report focuses particularly on “learning algorithms” that are frequently used to determine what kind of online ad to display on someone’s computer screen, or to predict a person’s buying habits when searching for a car or making travel plans. Those same algorithms can create a digital picture of a person, Mr. Podesta noted, that can infer race, gender or sexual orientation, even if that is not the intent of the software.
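To see how such an inference can arise even when a protected attribute is never given to the software, consider a minimal sketch. The data, feature names and numbers below are entirely hypothetical and are not drawn from the report; they only illustrate the proxy effect it describes: a model trained on a seemingly neutral input, such as a ZIP code, can still sort people by group whenever that input is correlated with group membership.

```python
# Hypothetical sketch of "proxy" inference: the protected attribute is
# never a model input, yet the model's decisions still track it.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute (0 or 1) -- never shown to the model.
group = rng.integers(0, 2, n)

# A "neutral" feature, ZIP code, correlated with group
# (residential segregation in miniature).
zip_code = np.where(rng.random(n) < 0.8, group, 1 - group)

# Historical outcome used as the training label. Here it depends only
# on ZIP code, but that is enough to encode the group difference.
label = (rng.random(n) < np.where(zip_code == 1, 0.7, 0.4)).astype(int)

# Train on the proxy plus an unrelated feature; `group` is absent from X.
income_noise = rng.normal(size=n)
X = np.column_stack([zip_code, income_noise])
model = LogisticRegression().fit(X, label)

approved = model.predict(X)
for g in (0, 1):
    print(f"approval rate, group {g}: {approved[group == g].mean():.2f}")
# Despite never seeing `group`, the model approves the two groups at
# very different rates, because ZIP code acts as a stand-in for it.
```

In this toy setup the model appears objective, since race or gender is nowhere in its inputs, yet its decisions reproduce the underlying group disparity through the correlated feature: the “patina of scientific objectivity” the report warns about.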
“The final computer-generated product or decision — used for everything from predicting behavior to denying opportunity — can mask prejudices while maintaining a patina of scientific objectivity,” the report concludes.