•Data about an individual: body, behavior
•Profiles, demographic data, medical records, employment data
•Search, purchase history
•Tweets, texts, emails, calls, GPS location coordinates
•Accumulated just by living an ordinary life.
•"Economics of personal data": companies (e.g., Google) profit from personal data, but must balance profit against consumer trust.
COMPAS ranks defendants as "low," "medium," or "high" risk
The score is not intended to reveal whether a person is dangerous or whether they should go to prison; it is meant to inform what the probation conditions ought to be
Brennan (Northpointe's founder) said the score was not designed to be used in sentencing, yet judges cite scores in sentencing decisions
ProPublica found that black defendants are twice as likely as white defendants to be labeled higher risk yet not actually re-offend.
For white defendants, COMPAS makes the opposite error: they are more likely than black defendants to be labeled lower risk yet go on to re-offend
Consequence: black defendants who did not go on to commit crimes were nonetheless subject to harsher treatment by the courts
Northpointe's response: the tool is fair because, regardless of race, a given score means the same thing
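A toy computation can make the dispute concrete. The sketch below uses hypothetical numbers (not real COMPAS data) to show how both sides can be statistically correct at once: a score can "mean the same thing" for every group (equal precision of the high-risk label, Northpointe's sense of fair) while one group's non-reoffenders are still flagged high risk far more often (ProPublica's sense of unfair), whenever the groups' underlying re-offense rates differ.

```python
# Hypothetical illustration of calibration vs. equal error rates.
# All numbers are invented for the example; they are not COMPAS statistics.

def rates(records):
    """records: list of (labeled_high_risk, reoffended) boolean pairs."""
    high = [r for r in records if r[0]]
    no_reoffend = [r for r in records if not r[1]]
    # Precision of the "high risk" label: P(reoffend | labeled high risk)
    ppv = sum(1 for _, reoff in high if reoff) / len(high)
    # False positive rate: P(labeled high risk | did not reoffend)
    fpr = sum(1 for labeled, _ in no_reoffend if labeled) / len(no_reoffend)
    return ppv, fpr

# Hypothetical group A (higher base rate of re-offending): 100 defendants
group_a = ([(True, True)] * 30 + [(True, False)] * 20 +
           [(False, True)] * 20 + [(False, False)] * 30)
# Hypothetical group B (lower base rate): 100 defendants
group_b = ([(True, True)] * 12 + [(True, False)] * 8 +
           [(False, True)] * 10 + [(False, False)] * 70)

ppv_a, fpr_a = rates(group_a)
ppv_b, fpr_b = rates(group_b)
print(f"Group A: precision={ppv_a:.2f}, false positive rate={fpr_a:.2f}")
print(f"Group B: precision={ppv_b:.2f}, false positive rate={fpr_b:.2f}")
# The score is equally "accurate" for both groups (precision 0.60),
# yet group A's non-reoffenders are labeled high risk about 4x as often.
```

The point of the sketch: with different base rates, equal precision across groups mathematically forces unequal false positive rates, so "fair" depends on which metric you pick.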
Algorithmic sentencing: risk assessment tools meant to help judges make decisions
Flaws (not fair):
(1) the defendant does not have access to insight about this process, and
(2) the different risk assessment tools are not compared against one another, and there are no studies of their differences
Is the algorithm fair?
Is it equally fair to all groups of people? (Or is it biased? Discriminating?)
Is it fair to any defendant?
Is it fair to society?
Evaluate the technology in context
Structural inequalities reproduce themselves in technologies, with consequences for everyone
Changing technological capacities and changing social contexts together transform perceptions of what counts as good, fair, and appropriate technological use