Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor


Automating Inequality offers a detailed analysis of how high-tech tools are used to profile, police, and punish the poor. The book explores how technology is increasingly used to automate decision-making processes that have significant impacts on socioeconomically disadvantaged individuals and communities.

Virginia Eubanks examines in depth how technology is being used to perpetuate and exacerbate existing inequalities in society, taking a critical approach to the potential negative consequences of automating decision-making.

Cases of Automating Inequality

In her book, Virginia Eubanks scrutinizes a wide range of cases that illustrate how high-tech tools are being used for decision-making processes that disproportionately affect low-income and marginalized communities.

Here are five cases from the book that I found interesting:

  • Automated eligibility determination: Automated systems for deciding who qualifies for government assistance have proven flawed, denying benefits to many eligible individuals.
  • Predictive policing: Data-driven policing tools employed by law enforcement agencies have been shown to amplify deep-seated biases and to disproportionately target minority communities, making the technique controversial.
  • Child protective services: The use of predictive analytics in child welfare has been criticized for being inaccurate and leading to unnecessary removals of children from their homes.
  • Employment screening: Many employers use algorithms to screen job applicants, but these algorithms can be biased against individuals with certain backgrounds or characteristics.
  • Algorithmic risk assessments: Judges use these tools to gauge a defendant’s risk of reoffending, but questions remain about their accuracy and their potential for racial or socioeconomic bias. Their use in the criminal justice system warrants further scrutiny to ensure fairness and justice are upheld.

These cases show how sophisticated technology is being leveraged to automate decisions with enormous effects on people and communities, and they help us understand the implications of deploying such tools.

Through her analysis of these cases, Eubanks seeks to raise awareness of the risks of automating decision-making with technology, and pushes for a more equitable and reflective use of tech within the public sector.

The Human Cost of Automating Inequality

  1. DeeDee was an Indiana resident eligible for Medicaid whose benefits were denied by an automated eligibility determination system. Her disability made it impossible for her to work, leaving her unable to afford the medication she needed.
  2. Kim was a single mother living in LA whose children were taken away by CPS after an inaccurate predictive analytics tool labelled her family as high-risk. Despite the lack of any evidence of mistreatment or neglect, the system predicted otherwise.
  3. Janet was a welfare recipient in Allegheny County, PA, who was flagged as high-risk by an algorithm used by the county’s Department of Human Services. The algorithm incorrectly identified her as a potential child abuser based on her zip code and her history as a victim of domestic violence.

Impact of the System in Action

The example of the ‘Integrated Justice Information System’ (IJIS) in Los Angeles County illustrates the risks and hurdles of introducing advanced technology into the criminal justice system. From the beginning, the system was plagued by inaccurate data, user-unfriendly interfaces, and input errors, all of which significantly hindered its performance. It often supplied wrong or incomplete information to judges, attorneys, and other law enforcement personnel, with potentially major repercussions for defendants. This situation demonstrates how essential rigorous testing and safeguards are before high-tech tools are introduced in the public sector.

Criticisms of Automating Inequality

Automating Inequality has been praised by many, but the book and its arguments have also drawn critiques. Some critics contend that Eubanks oversimplifies the state of affairs, pointing to examples where technology has actually benefited low-income and marginalized groups; with the right execution, they argue, technological solutions can drive positive change. Others argue that instead of dwelling on the potential harms of technology, Eubanks should focus on providing viable solutions and alternatives. Despite these critiques, Automating Inequality remains a valuable contribution to the ethical debate surrounding the automation of decision-making in government offices, and its importance is hard to overstate given the stakes involved.

Impact and Solutions

The book, published in 2018, has been praised by critics and is frequently referenced in discussions of the ethical implications of using technology to automate decisions in the public sector. Although its exact impact is difficult to measure, it has clearly been well received.

Virginia Eubanks is often invited to speak at conferences and events, and her insights on the subject are highly sought after.

Eubanks believes that companies building these automated decision-making systems should continuously audit their algorithms to detect and address bias. Regular assessments can help improve the fairness of these systems.
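To make the idea of a "regular assessment" concrete, here is a minimal sketch of one common audit check: comparing a system's approval rates across demographic groups using a disparate-impact ratio. The group names, outcomes, and the 0.8 threshold (a rough rule of thumb, not a legal standard) are all illustrative assumptions, not material from the book.

```python
# Hypothetical audit sketch: flag groups whose benefit-approval rate
# falls well below the best-off group's rate. All data is made up.

def selection_rates(decisions):
    """decisions: dict mapping group name -> list of 0/1 outcomes
    (1 = benefit approved). Returns the approval rate per group."""
    return {g: sum(d) / len(d) for g, d in decisions.items()}

def disparate_impact(decisions):
    """Ratio of each group's approval rate to the highest group's rate.
    A common rough rule of thumb flags ratios below 0.8."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

decisions = {
    "group_a": [1, 1, 1, 0, 1, 1, 0, 1],  # 6/8 = 75% approved
    "group_b": [1, 0, 0, 1, 0, 0, 1, 0],  # 3/8 = 37.5% approved
}

for group, ratio in disparate_impact(decisions).items():
    flag = "FLAG" if ratio < 0.8 else "ok"
    print(f"{group}: ratio={ratio:.2f} [{flag}]")
```

A real audit would go far beyond this single metric, but even a check this simple, run on every model update, would surface the kind of skew Eubanks documents.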

She also believes that organizations using such algorithms should be held accountable when their decisions harm individuals, whether by providing legal aid or offering financial reparations. She acknowledges that this is a complex problem that must be studied and discussed in depth. There is no straightforward answer, but increasing public engagement and raising awareness of the issue can help us better understand it.
