Though “algorithmic bias” is the popular term, the foundation of such bias is not in algorithms but in data. Algorithms are not biased; data is. Algorithms simply learn the persistent patterns present in their training data.

Can algorithms reduce bias against black defendants in criminal risk scores?
ProPublica’s analysis of bias against black defendants in criminal risk scores has prompted research showing that the disparity can be addressed if the algorithms focus on the fairness of outcomes.

Why did ProPublica use the COMPAS algorithm?
So ProPublica did its own analysis. We chose to examine the COMPAS algorithm because it is one of the most popular scores used nationwide and is increasingly being applied in pretrial and sentencing decisions, the so-called “front end” of the criminal justice system.

Does “Machine Bias” predict the future of criminals?
“Machine Bias,” the headline read, and the teaser proclaimed: “There’s software used across the country to predict future criminals. And it’s biased against blacks.”
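The disparity at the heart of ProPublica’s findings can be framed as a gap in false positive rates: among people who did not go on to reoffend, members of one group were labeled “high risk” more often than members of another. A minimal sketch of that group-wise comparison, using entirely synthetic records (the field names `group`, `high_risk`, and `reoffended` are illustrative assumptions, not the actual COMPAS schema):

```python
# Sketch: comparing false positive rates across groups on synthetic data.
# FPR = FP / (FP + TN): the share of people who did NOT reoffend
# but were nonetheless labeled high risk.

def false_positive_rate(records):
    """Compute FPR over a list of dicts with 'high_risk' and 'reoffended' keys."""
    fp = sum(1 for r in records if r["high_risk"] and not r["reoffended"])
    tn = sum(1 for r in records if not r["high_risk"] and not r["reoffended"])
    return fp / (fp + tn) if (fp + tn) else 0.0

# Synthetic records: group label, risk-score label, actual outcome.
records = [
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": True,  "reoffended": True},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": True,  "reoffended": True},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": True},
]

for g in ("A", "B"):
    subset = [r for r in records if r["group"] == g]
    print(g, round(false_positive_rate(subset), 2))
```

On this toy data, group A’s non-reoffenders are flagged high risk at a higher rate than group B’s; a fairness-of-outcomes approach would aim to equalize error rates like these across groups.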