As technology has become more integral to daily life, it has also been incorporated into how government interacts with the public. Across the country, technology is being used to streamline government programs in diverse areas, including efforts to reform outdated criminal justice practices such as cash bail. Cash bail is the bond system in which people who are arrested, even for low-level offenses, are jailed before trial unless they pay a fee. These fees can be arbitrary and are often beyond the means of those arrested, leaving the poorest arrestees, who are disproportionately minorities, locked up without ever having been convicted of a crime.
As this practice has come under increasing fire, many jurisdictions across the country are moving away from the cash system and toward one based on risk-assessment algorithms that attempt to calculate the likelihood that a given individual will commit a crime. However, it is becoming apparent that while these algorithms are intended to remove human biases, they do not actually do so. To ensure that these technologies are implemented fairly, the government must set regulatory standards guaranteeing their fairness and transparency.
While these algorithms are meant to reform the flaws of the old cash bail system, it has become clear that they, too, have biases. In the United States, for example, African Americans and Hispanics make up only approximately 32% of the population but almost 56% of all incarcerated people. Because these statistics reflect decades of police practices that included racial profiling and the surveillance of minority and lower-income communities, these algorithms have been trained on data that is fundamentally flawed. Additionally, some police departments have been found to have a culture of manipulating or falsifying data under intense political pressure to bring down crime rates. The results of these algorithms are therefore shaped by structural racism that distorts the underlying data.
The inner workings of these algorithms are difficult to understand, and the path from input data to output scores can seem inexplicable. However, there are principles that can be followed to correct these biases, starting with ensuring that the data the algorithms are trained on is representative of the entire population. Additionally, transparency can be achieved by regularly releasing information about the inputs and outputs of these algorithms so that trusted third-party organizations, such as civil rights groups, can audit them.
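To make the audit idea concrete, here is a minimal sketch of one check a third-party auditor could run on released inputs and outputs: comparing, across demographic groups, how often people who were never rearrested were nonetheless flagged as high risk. The record format, group labels, and data below are hypothetical illustrations, not drawn from any real tool.

```python
from collections import defaultdict

def false_positive_rates(records):
    """Per-group false positive rate: the share of people who were NOT
    rearrested but whom the tool nonetheless flagged as high risk.

    Each record is a tuple: (group, flagged_high_risk, rearrested).
    """
    flagged = defaultdict(int)    # non-rearrested people flagged high risk
    negatives = defaultdict(int)  # all non-rearrested people, by group
    for group, was_flagged, was_rearrested in records:
        if not was_rearrested:
            negatives[group] += 1
            if was_flagged:
                flagged[group] += 1
    return {g: flagged[g] / negatives[g] for g in negatives}

# Hypothetical audit records: (group, flagged_high_risk, rearrested)
records = [
    ("A", True, False), ("A", False, False), ("A", False, False), ("A", True, True),
    ("B", True, False), ("B", True, False), ("B", False, False), ("B", False, True),
]
rates = false_positive_rates(records)
# A large gap between groups (here, group B's rate is double group A's)
# would signal the kind of disparate impact an auditor should flag.
```

A real audit would use established fairness metrics and statistical tests, but even this simple comparison shows why regular release of inputs and outputs matters: the check is impossible without them.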
Finally, a series of checks should be put in place to ensure that these life-altering decisions, especially within the criminal justice system, are made neither by a human nor by a formula alone, but by a combination of both. For these principles to take hold, they must be transformed into regulation.
There is always a concern that this would simply make the government more intrusive and put more sensitive information into its hands, but giving the government the power to regulate this application of algorithms in public life will ensure some accountability. Leaving oversight to the private sector will hurt those directly affected by these decisions, as they will be left without any means of understanding why those decisions were made.
Of course, fixing these algorithms alone will not solve the broader societal flaws that have produced the racial and class disparities in incarceration. However, ensuring that these algorithms are fair is a step in the right direction toward using government data to achieve a positive goal.
While the discussion of criminal justice reform, such as the recently passed First Step Act, has focused on the federal government, the vast majority of incarcerated people are in fact held in facilities controlled by state and local governments. It is therefore essential to advocate for the regulation of algorithms at all levels of government, and ultimately to ensure that the algorithms used in every court in the country are held to the same standards of representativeness, fairness, and transparency.
This piece was originally published on The Morningside Post and slightly modified for the Columbia Public Policy Review.
Misra, Tanvi. “When Welfare Decisions Are Left to Algorithms.” The Atlantic. February 15, 2018. Accessed March 25, 2019. https://www.theatlantic.com/business/archive/2018/02/virginia-eubanks-automating-inequality/553460/
Day, Megan, and Bhaskar Shankara. “‘Modern Day Debtors’ Prisons’.” The New York Times. August 6, 2018. Accessed March 25, 2019. https://www.nytimes.com/2018/08/06/opinion/columnists/bernie-sanders-cash-bail.html
Hao, Karen. “Police across the US Are Training Crime-predicting AIs on Falsified Data.” MIT Technology Review. February 13, 2019. Accessed March 25, 2019. https://www.technologyreview.com/s/612957/predictive-policing-algorithms-ai-crime-dirty-data/
Carlisle, Madeleine. “The Bail-Reform Tool that Most Activists Want Abolished.” The Atlantic. September 21, 2018. Accessed March 25, 2019. https://www.theatlantic.com/politics/archive/2018/09/the-bail-reform-tool-that-activists-want-abolished/570913/
H.R. 5682, 115th Cong. (2018) (enacted). https://www.congress.gov/bill/115th-congress/house-bill/5682/text
Kann, Drew. “5 facts behind America’s high incarceration rate.” CNN. July 10, 2018. Accessed March 25, 2019. https://www.cnn.com/2018/06/28/us/mass-incarceration-five-key-facts/index.html