Date of Thesis

Spring 2022


Abstract

In today’s world, there is an increasing reliance on algorithms and automated systems for making decisions. Algorithmic decision-making permeates almost all aspects of our everyday lives, including critical contexts such as the criminal justice system. For a fair and just criminal justice system, it is imperative that these automated decisions be unbiased and that the decision process be transparent. This research examines one such algorithm, the Pennsylvania Additive Classification Tool (PACT), adopted by the Pennsylvania Department of Corrections (DOC). We illustrate the inner workings of the PACT by modeling the tool’s behavior, provide insight into how it impacts the lived experience of an incarcerated person, and investigate the DOC’s claim that the decisions made by the PACT are objective. Furthermore, this work contributes to the existing literature on algorithmic fairness by empirically applying existing theoretical fairness metrics and by establishing a novel way of analyzing such algorithms through a method called sensitivity analysis. The thesis shows that the PACT does not meet our definitions of fairness with respect to age, gender, and race, and demonstrates a level of arbitrariness in the PACT, indicating that it is not “objective” or unbiased. The larger goal of this research is to bring transparency to a portion of the criminal justice system that is largely kept secret and out of the public eye.


Keywords

Algorithmic fairness, Criminal Justice, Pennsylvania Additive Classification Tool, Modeling, Uncertainty

Access Type

Honors Thesis (Bucknell Access Only)

Degree Type

Bachelor of Science in Computer Science and Engineering


First Major

Computer Science

Second Major


First Advisor

Darakhshan J. Mir

Second Advisor

Vanessa Massaro