What is pretrial incarceration and what are its negative impacts?
Pretrial incarceration is the practice of holding people accused of crimes in jail while their cases are decided, from initial arrest through trial, dismissal, or a guilty plea. Generally, judges set bail that a person can pay to be released during this pretrial period, though, in some limited circumstances, judges may order a person held without the opportunity to pay bail. Judges may also release people without requiring bail.
Pretrial incarceration keeps people who have not been convicted of any crime in jail. By law, these people are presumed innocent, but they are punished with imprisonment based on mere allegations. Many are in fact innocent, and their cases will eventually be dismissed or they will be found “not guilty” at trial. In California alone, Human Rights Watch found that, from 2011 to 2015, over a quarter of a million people who were either held in custody following a felony arrest or who paid bail were never charged. Jailing innocent people costs taxpayers millions of dollars each year.
People who are incarcerated pretrial, and cannot afford to pay bail, suffer the misery of jail, including violence by guards and other prisoners, boredom, lack of exercise, bad food, lack of health care, uncomfortable and crowded living conditions, and the loss of self-determination. They lose their jobs, are separated from their families, miss school, disrupt medical treatment, and lose their homes and property. Many people, including those who are innocent, plead guilty simply to be released more quickly, accepting criminal convictions that limit future opportunities and probation sentences that may lead to future incarceration. Many prosecutors ask for bail to increase their leverage in extracting guilty pleas, and many judges set bail to facilitate rapid pleas. Those who resist pressure to plead and do fight their cases from jail are less likely to get good results because custody creates many barriers to effective representation.
Some people borrow money, or have their family or friends raise money, to pay bail. This often leaves them in crushing debt that will cause severe financial hardships for them and their families. They pay non-refundable fees to bail bond agents, who continue to collect payments long after the case is resolved. Even if charges are not filed, the accused still must pay.
Pretrial incarceration through money bail discriminates based on wealth and contributes to a system that greatly favors those with money, while harming poor people.
Are there efforts to change the system of money bail?
Across the US, growing recognition of these harms has led to efforts to change the money bail system. Several states, including New Jersey, Kentucky and Alaska, have enacted reform, with varying impacts on rates of pretrial incarceration. Other states, including California and New York, are considering bail reform legislation that would limit or eliminate the use of money bail. Senators Kamala Harris (D-CA) and Rand Paul (R-KY) have proposed a federal bill that would provide financial incentives to jurisdictions that move away from the money bail system.
Nearly all of the reforms currently implemented or under consideration seek to replace money bail with some use of profile based risk assessment. Some, like Kentucky and New Jersey, use profile based risk assessment tools to recommend release or incarceration for all detained defendants. Others, like California’s current bill, intend to use the tools only to recommend conditions of release for most defendants. One bill under consideration in New York seeks to reform bail without using profile based risk assessment.
What is Profile Based Risk Assessment?
Profile based risk assessment means the use of data and mathematical formulas to estimate the likelihood that an individual will commit some future misconduct. It is used in a variety of contexts in the criminal justice and other systems, including decisions on sentencing, parole dates, levels of probation supervision, child custody and pretrial incarceration. The formulas, called tools, generate risk scores for the person, not unlike a credit score. The risk score is associated with a recommended outcome. In the pretrial setting, a low risk score may be associated with release, while a high risk score may be associated with a decision to incarcerate or set a high bail. These tools are being used in an increasing number of jurisdictions and are seen by many as an important trend. This document only addresses their use for pretrial detention decisions, but the overall critique may apply to their use in other aspects of the system.
How Does Profile Based Risk Assessment Work in Pretrial Decision-Making?
Actuarial or profile based risk assessment tools apply statistical methods to aggregate data about a large population, identify factors that correlate with re-arrest or with missing a court date, and then create a mathematical formula or algorithm that predicts the statistical likelihood that a given individual, based on their own record, will do the same. In a highly simplified illustration, if 10 out of 100 people who have two misdemeanor convictions and three traffic violations missed a court date or got arrested again, and those were the only variables considered, then the tool would estimate a 10 percent risk that such an individual would miss a court date or get arrested again if released. Of course, the actual tools are more complex and sophisticated in their design (they use more factors and may weight those factors differently), but the illustration captures the basic idea.
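The simplified illustration above can be expressed in a few lines of code. This is a hypothetical toy model, not the algorithm of any real tool: it groups historical case outcomes by a small profile (misdemeanor convictions and traffic violations, the two variables in the illustration) and uses each group's failure rate as the "risk" estimate for any new individual sharing that profile.

```python
# Hypothetical sketch of profile based risk estimation, mirroring the
# simplified illustration in the text. Real tools use more factors and
# weight them, but the core logic is the same: group past cases by
# profile and treat the group's failure rate as an individual's "risk".

# Historical records: (misdemeanor_convictions, traffic_violations, failed),
# where failed = True means the person missed court or was re-arrested.
history = [(2, 3, True)] * 10 + [(2, 3, False)] * 90 + [(0, 0, False)] * 50

def estimated_risk(misdemeanors, traffic, records):
    """Failure rate among past cases sharing this exact profile."""
    group = [failed for m, t, failed in records
             if (m, t) == (misdemeanors, traffic)]
    if not group:
        return None  # no historical data for this profile
    return sum(group) / len(group)

# A person with two misdemeanors and three traffic violations receives
# the group's historical failure rate, not a prediction about them:
print(estimated_risk(2, 3, history))  # 0.1, i.e. "10 percent risk"
```

Note that the score says nothing about the individual beyond their membership in the group: everyone with the same profile receives the same number.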
Each tool uses a different set of factors, though it seems that most of the tools used in pretrial incarceration decisions tend to stress criminal history. Other characteristics known to be used by some tools include, among others, employment status, residential stability, and past drug use. Many tools are so opaque that information about the characteristics they use is not available. Presumably none explicitly use race, gender or social class, but instead use information about the person’s past. However, the factors used may embody racial and class bias, regardless of the intentions of those who created them.
What about individualized characteristics and context?
Profile based risk assessment tools only consider supposedly objective characteristics like criminal history, but are inherently unable to comprehend or weigh the many subjective elements that make up the unique context of a person’s life, or even of their criminal record. Risk assessment tools tend to evaluate binary questions: “Does the person have a prior conviction? Yes or no.”
The risk assessment tools currently in use do not make sufficient contextual distinctions between gradations of offenses or other conduct. For example, one of the most widely used tools scores for “prior violent conviction” without distinguishing between a misdemeanor battery involving a push and an attack with a knife causing serious injury. The same tool similarly scores for “prior felony conviction,” which can range from drug possession for personal use to murder. If someone missed a court date because they could not get child care, but came to court the next morning, this risk assessment tool would score the “prior failure to appear” the same as a person who missed one court date because they left the country to avoid prosecution.
The tools sort people into a range of very broad categories and make statistical estimates relative to those categories, without factoring in individual circumstances such as that a person has worked steadily and raised a family since a prior conviction. The tool referred to above scores people on a scale of one through six, while others simply categorize low, medium and high.
Conceivably, some future tool might be designed to account for some of these contextual differences, though the trend is toward making the tools simpler and more efficient. Even if tool developers were motivated to design formulas that account for greater context, courts would have to adopt recordkeeping that registers the contextual distinctions in ways that could be input into the computerized assessments. It is unlikely that courts could do so in ways that accurately reflect the nuances of individual cases, and without such nuanced recordkeeping, increased tool sophistication would do little to address the concern.
Do these tools accurately estimate the likelihood that someone will commit a crime?
Proponents of profile based risk assessment tools claim that they estimate the statistical chance of a person committing a new crime or missing a court date. However, they do not really predict new crime—they predict new arrest. This distinction is very significant. Predicting new arrest is in large part about the behavior of the police, not just of the individual being assessed. A person’s likelihood of being arrested depends very much on where they live and how aggressively police operate in that community. Racial and class bias in policing greatly influences who gets arrested.
For example, a middle class white person may illegally carry a gun in their car, but this is unlikely to be detected if police rarely come to their neighborhood and are unlikely to pull residents over even if they do. A black man who lives in a heavily patrolled neighborhood is much more apt to be stopped and have his car searched, leading to discovery of the gun and arrest. This is not just hypothetical: white and black people commit drug crimes at roughly equal rates, but police arrest black people at substantially higher rates for these offenses. What the tools assess as “risk” is, to a significant degree, a measure of police behavior: not a person’s actual likelihood of committing a new crime, but their likelihood of being arrested.
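The mechanism can be shown with simple arithmetic. The numbers below are hypothetical, chosen only to illustrate the point in the text: two groups offend at the same rate, but one is policed far more heavily, so arrest records (the data the tools actually learn from) diverge even though behavior does not.

```python
# Illustrative arithmetic with hypothetical numbers: identical offending,
# different policing intensity, divergent arrest records.
population = 1000
offenders = 100  # same number of offenders per 1000 people in each group

# Stops per 1000 residents: one neighborhood is far more heavily patrolled.
stops = {"lightly_policed": 50, "heavily_policed": 250}

# Assume an offense is detected only during a stop, so arrests scale with
# policing intensity rather than with offending:
arrests = {group: offenders * s // population for group, s in stops.items()}
print(arrests)  # {'lightly_policed': 5, 'heavily_policed': 25}
```

A tool trained on these arrest records would score the heavily policed group as five times "riskier," even though the underlying rate of offending is identical by construction.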
The tools also estimate the likelihood of missing a court date, in part by looking back at past instances of missed court appearances. Poor people tend to have higher rates of missing court dates for a variety of reasons, many of which are functions of poverty, like the inability to find transportation or child care. Because poor people and people of color are more likely to be arrested and charged with crimes, they are more likely to have multiple court dates, which increases their likelihood of missing court dates.
Since race often influences the risk these tools estimate, do they racially profile?
Because so much of the data risk assessment tools consider is to some degree a product of bias on the part of the criminal justice system and of society more broadly, risk assessment tools can lead courts that rely on them to discriminate based on race and class. The characteristics they evaluate, including criminal history, employment history, residential stability, and education levels, are all influenced by racial and class bias and by unequal opportunity in the United States. For example, the tools often place a particularly heavy emphasis on criminal history, which is greatly influenced by police enforcement policies, racial profiling by individual officers, and poverty. The tools may not be designed to be racist. But because they rely on racially biased inputs, their outputs or recommendations will reflect that bias.
Some proponents of these tools acknowledge that the tools’ recommendations reflect racial bias in the broader society and criminal system. However, they contend that reforming bail is so urgent that we may need to accept the use of risk assessments, even with their flaws and bias. They accurately point out that judges often make decisions based on race, class and other biases. However, since the tools invariably allow judicial overrides, they do not effectively replace that discriminatory discretion, and, because of their claim to scientific objectivity, they may provide a veneer of legitimacy to that discrimination.
It is also commonly argued that whatever their flaws, risk assessment tools cannot actually make racial bias worse. Because of the factors described immediately above, however, a tool could disproportionately persuade courts to keep people of one race or economic class in custody, making those people more likely to plead guilty. Risk scores would then rise across that demographic just as steadily as they would if all judges in the jurisdiction made biased decisions. Therefore, the tools may make racially disparate outcomes worse over time. Again, because the tools have the appearance of scientific objectivity, their racially influenced results are difficult to challenge.
Opponents of profile based risk assessment argue that even if the tools merely reproduced existing bias, courts should reject any “reform” that retains inherent discrimination.
What are some of the other major criticisms of profile based risk assessment tools?
Any significant reliance on these tools undermines due process rights by leading courts to judge—and incarcerate—accused people largely based on the actions of other people. Judges who rely entirely on the tools make crucial decisions about who goes free and who gets locked up based not on what the person has done or their individual circumstances, but on statistical estimates of what they might do, derived from data about what others have done in the past. Even if a judge considers other factors, to the extent the risk assessment score had any impact on the decision, the unfairness of using these statistical estimates diminishes the decision’s legitimacy.
The estimates are not necessarily accurate. A ProPublica study found that the tool it analyzed was only slightly more accurate than a coin toss. Another study found that this same tool was about as accurate as if random people made the decisions. The training data used by tools almost always comes from another time and place; without current, localized data to train a model, the predictions rest on assumptions about risk that may not reflect true risk. A tool may also fail to reflect how a given jurisdiction defines “risk.” Tools generally operationalize “risk” as the predicted likelihood of re-arrest or failure to appear, yet not all arrests, convictions or pretrial misconduct carry the same risk, and the tool may treat them identically. Tools may label a person “high risk” based on an estimated 10 percent chance that they will commit a new violent crime, likely resulting in pretrial incarceration recommendations for a large number of people who are statistically unlikely to cause harm. Ultimately, the court or whatever entity oversees the tool decides what probability of future misconduct will result in pretrial incarceration.
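The base-rate arithmetic behind that last point is worth making explicit. Using the 10 percent figure cited above (and taking the tool's own estimate at face value):

```python
# If a "high risk" label corresponds to an estimated 10 percent chance of
# a new violent offense, detaining everyone so labeled jails nine people
# who would not have offended for every one who would.
high_risk_group = 100     # people labeled "high risk"
estimated_chance = 0.10   # the tool's own estimated probability

would_offend = round(high_risk_group * estimated_chance)
would_not = high_risk_group - would_offend
print(would_offend, would_not)  # 10 90
```

In other words, even if the estimate were perfectly accurate, a detain-all-high-risk policy would incarcerate mostly people who would have caused no harm.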
The tools generally are not transparent. Most are developed by private entities that keep their algorithms and underlying research a proprietary secret, even from the courts or government agencies that use them. This lack of transparency means that a person incarcerated by a judge who has relied on or even just consulted the tool will not be able to learn how it arrived at the recommendation, and will not have a reasonable opportunity to challenge the recommendation. Without transparency, we do not know if the decision is fair. While most jurisdictions have not adequately demanded transparency, some are beginning to require it, at least in part. Full transparency would allow for some enhanced oversight. At a minimum, due process should require making the formulas, factors evaluated, and underlying data that led to the formulas, as well as decisions made in developing the tools, open to people who challenge incarceration decisions informed by these tools.
Will these tools lower pretrial incarceration rates?
Some proponents of profile based risk assessment tools argue that they will lower pretrial incarceration rates. Given the recognized harm of pretrial incarceration, that would be a benefit. They point to Kentucky and New Jersey as examples of bail reform regimes that have successfully reduced those rates. However, a recently published independent study of implementation in Kentucky found that the tools have not made a significant change. In New Jersey, which implemented bail reform in 2017, there have been noticeable and encouraging drops in pretrial incarceration rates. However, it is still early, and the improvement may not be attributable entirely to the tools, as the reforms include requiring the release of many arrestees with a citation instead of being taken into custody. The office of New Jersey’s Attorney General has adjusted its policy to increase the categories of crimes for which it is requesting pretrial incarceration, so the release numbers may continue to change. In Lucas County, Ohio, implementation of the Arnold Foundation tool led to increased rates of pretrial detention, and the percentage of people pleading guilty on their first court appearance doubled. 
The tools can be adjusted to lower or raise the rate of pretrial incarceration by changing the scoring system. Assume the court administration initially sets the system so that a risk score of 10 or below is considered “low risk”: anyone above 10 will be detained, and anyone with a score of 10 or below will be released. However, if the administration decides that too many people are getting out, it can simply change the cutoff so that only a score of 5 or below is “low risk.” Now all those with scores of 6 through 10 will also be detained.
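This adjustment can be sketched in a few lines. The scores and cutoff values below are hypothetical, following the example above: the underlying algorithm and the scores it produces never change; only the administrative cutoff moves, and with it the set of people detained.

```python
# Sketch of how a cutoff change reclassifies people without touching the
# underlying risk algorithm. Scores and cutoffs are hypothetical.

def decision(score, low_risk_cutoff):
    """Release anyone at or below the cutoff; detain everyone else."""
    return "release" if score <= low_risk_cutoff else "detain"

scores = [2, 4, 6, 7, 9, 11]  # risk scores for six hypothetical people

before = [decision(s, 10) for s in scores]  # cutoff: 10 or below is "low risk"
after = [decision(s, 5) for s in scores]    # cutoff lowered to 5

print(before)  # ['release', 'release', 'release', 'release', 'release', 'detain']
print(after)   # ['release', 'release', 'detain', 'detain', 'detain', 'detain']
```

The same people, with the same scores, go from mostly released to mostly detained by a single administrative choice.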
In Santa Cruz County, for example, the probation department found that the number of people released under supervision (such as electronic monitoring or requirements to check in regularly with a pretrial services agent) had dropped, so it adjusted the “decision making framework,” which dramatically increased the numbers. In San Francisco, the judges who control implementation of their tools have adjusted their “decision making framework” and moved various categories of people into higher risk classifications. There are other more complex ways to adjust the scoring, like changing the algorithm itself, which would be hard to monitor due to the lack of transparency of the formulas.
The tools can be used to justify reductions, maintenance or increases in levels of incarceration, release and supervision depending on the motivations of the entity in control of their implementation. It is, therefore, crucial to understand who controls the tools and what their institutional interests are.
Who will control the risk assessment tools and can they be trusted to use those tools to reduce incarceration rates?
In every jurisdiction that uses these tools, the judiciary and court administration control them, just as those institutions have controlled schedules and policies for money bail under the existing system. As they have used bail to maintain high levels of pretrial incarceration, it seems likely that they will use the new tools to do the same.
In theory, strong community input, along with regulations requiring oversight and attempting adjustment for racial or other bias, data collection and transparency, might make the tools less prone to biased influence or to tweaking in ways that tend to favor pretrial incarceration. In practice, community oversight tends to be advisory at best, often lacking in expertise or resources, and sometimes unrepresentative.
New Jersey has an advisory board that includes various advocacy organizations, like the Drug Policy Alliance and ACLU, which have access to expertise, favor lowering pretrial incarceration rates, and can give their opinions on how to use the tools. However, the judiciary has the final say and can ignore this board’s recommendations.
In practice, would the judiciary be motivated to use the tools to increase or decrease release rates?
Under the existing money bail system, judges already have the power to increase release rates by simply ordering release more often. Unfortunately, many judges set bail and use pretrial incarceration to pressure defendants to plead guilty quickly. Defendants often plead guilty to get out of jail sooner. Human Rights Watch analysis of data in six representative California counties revealed that between 70 and 90 percent of misdemeanor and non-serious felony defendants were pleading guilty to be released before their first possible trial date. On average, defendants who are out of custody fight their cases longer.
If a risk assessment tool recommended release in sufficient numbers and judges followed those recommendations, then people would not plead guilty as quickly and court calendars would slow considerably, giving judges an incentive to either disregard release recommendations or pressure court administrators to adjust the tools to recommend incarcerating more people. Risk assessment tools can easily be manipulated while maintaining the fiction that they are objectively assessing risk.
Additionally, individual judges always retain the right to override the tools’ recommendations. This is as it should be, but there is evidence that judges overall are more likely to override the tools to favor incarceration over release. The Kentucky study found that judges used this override power to maintain higher pretrial incarceration rates, even when the tool initially lowered them. Human Rights Watch research in California found that judges overwhelmingly overrode release recommendations in favor of incarceration and rarely overrode recommendations to detain.
Is there a better way to reduce pressures on judges’ court calendars than pressuring people for guilty pleas through pretrial detention?
Courts are overwhelmed in large part because we have turned to law enforcement and punishment, and therefore, court proceedings, to solve too many societal problems. Rather than invest in housing and appropriate services for unhoused people and those with mental illnesses, most jurisdictions enforce laws that criminalize behavior that results directly from those conditions. Human Rights Watch has called for decriminalizing drug use, which makes up a significant portion of caseloads in US courts. Addressing poverty, inequality, disinvestment from poor communities and communities of color would lower crime rates and lower the burden on courts.
Profile based risk assessment, even if it could be made less biased, less arbitrary and more transparent, remains a tool to efficiently process cases, which allows the criminal justice system to continue its current pattern of over-criminalization.
Despite their flaws, wouldn’t risk assessment tools be a better method than the existing system, especially if we can regulate them?
As noted above, many proponents of profile based risk assessment tools acknowledge their potential for racial bias and their lack of transparency, but argue that with safeguards they may be better than the current biased and unequal system.
Some strong critics of risk assessment demand independent validation, oversight and full openness of the data and algorithm. Some advocates call for limits on the judges’ discretion to override the tools’ recommendations, or building in some form of presumption of release. These measures may help reduce harm, but will not sufficiently address the inherent bias within the tools or the power relations that determine how they are used. For example, validation of the tool’s predictive ability does not change the degree to which the tool is predicting police behavior and not necessarily future crime.
Oversight has political limits. While bail reformers may initially have influence to curb many of the problems with these tools, over time, that influence may wane and social and political forces that historically have driven mass incarceration may well remove or soften any safeguards on the tools. Federal Sentencing Guidelines offer a cautionary lesson. They purported to remove bias and inconsistency from judges’ sentencing decisions, but resulted instead in overly harsh, racially discriminatory outcomes that did not account for individual circumstances and greatly increased incarceration. The tools themselves, like the sentencing guidelines, with their efficient operation and their appearance of objectivity, can become extremely effective instruments of over-incarceration.
The operation of the existing criminal justice system in the United States in many respects is disturbingly dehumanizing. The trend toward making decisions based on computerized formulas, using profiled characteristics and not the individual person’s actual situation, makes it even more dehumanizing, even if regulated. Jailing certain people because others like them have committed misconduct is not only unfair, but also reduces real people and their stories to statistics and commodities to be managed.
Replacing one biased system with another is not a solution that comports with human rights requirements. Accepting such an approach means giving up on the idea that the justice system should be fair and afford real due process, in favor of the idea that the best we can hope for is a mitigation of some of its worst tendencies.
What does Human Rights Law have to say about the Use of Profile-Based Risk Assessment?
Article 9(1) of the International Covenant on Civil and Political Rights (ICCPR), which the US has ratified, says that everyone has the right to liberty and security of person and that one’s right to liberty may not be curtailed arbitrarily, either through arbitrary laws or through arbitrary enforcement of the law. Courts that rely too mechanically on risk assessment tools may violate this guarantee by depriving defendants of their liberty based largely or entirely on a rigid application of the tools, rather than through a careful and individualized process of judicial decision-making. This holds true where courts purport to reach individualized decisions before imposing pre-trial detention, but in reality have displayed an inappropriate degree of deference to the risk assessment tool’s output. This analysis has made clear that courts face powerful institutional pressures to lapse into such behavior, and that it is not reasonable to assume that they won’t. The Inter-American Commission on Human Rights (IACHR), which interprets the American Convention on Human Rights, has emphasized that because “justice cannot function ‘automatically,’” pretrial custody decisions should not be made by reference to pre-set formulas, patterns or stereotypes, but, instead, must be grounded in reasoning that contains specific, individualized facts and circumstances justifying such detention.
The arguments in favor of risk assessment tools largely assume that courts will use them responsibly, and cast them as a source of valuable guidance and objective data. However, this analysis has described how the data the tools draw on to judge a defendant’s risk can be deeply skewed by patterns of racial and other bias in policing, prosecution and sentencing. To the extent that this is true, courts that rely significantly on risk assessment tools may violate defendants’ right to freedom from discrimination by introducing bias into their own decision-making.
The ICCPR mandates that all individuals are entitled to freedom from discrimination and to equal protection of the law. The Convention on the Elimination of all Forms of Racial Discrimination (CERD), which the US has also ratified, imposes an obligation to work towards the elimination of all forms of racial discrimination, and Article 5(1) of that Convention requires parties to guarantee “equal treatment” before courts and other judicial mechanisms. To the extent that risk assessment tools reflect and reinforce existing patterns of racial bias within the justice system, their use is directly at odds with that obligation.
Is there a rights-respecting alternative to money bail and to profile-based risk assessment that will reduce over-incarceration while providing for community safety?
The best way to reduce pretrial incarceration is to respect the presumption of innocence and stop jailing people who have not been convicted of a crime absent concrete evidence that they pose a serious and specific threat to others if they are released. Human Rights Watch recommends having strict rules requiring police to issue citations with orders to appear in court to people accused of misdemeanor and low-level, non-violent felonies, instead of arresting and jailing them. For people accused of more serious crimes, Human Rights Watch recommends that the release, detain, or bail decision be made following an adversarial hearing, with right to counsel, rules of evidence, an opportunity for both sides to present mitigating and aggravating evidence, a requirement that the prosecutor show sufficient evidence that the accused actually committed the crime, and high standards for showing specific, known danger if the accused is released, as opposed to relying on a statistical likelihood.
This system would limit pretrial incarceration to people shown to be an actual danger. Prosecutors would not be able to seek bail for every accused person in order to pressure them into guilty pleas; they would have to carefully choose which defendants truly warrant the time and resources necessary to secure their pretrial incarceration. Similarly, judges would not be able to default to locking people up before trial; instead they would have to reserve that option for cases where it is truly necessary, and to encourage prosecutors to do the same. Because pretrial incarceration would be limited to a far narrower group of people (those believed to pose an actual danger or flight risk that could not be mitigated by other means), courts would spend less time and local governments would spend less money addressing and jailing people accused of lower-level offenses.
More significantly, the burden and unfairness of pretrial incarceration, and incarceration in general, would diminish if we stopped using criminal law as a solution to societal problems. We should stop criminalizing behaviors like drug use and sex work. We should stop prosecuting homeless people and those with mental illnesses for acts that reflect their status. We should invest in health treatment for drug abuse, education, and community development to reduce gang and other crime, and community-based voluntary mental health care for those who need it. “Solutions” like profile based risk assessment tools that make the incarceration system more efficient miss the point—the system does not need to be made more efficient. It needs to be forced to respect the rights of the accused, and doing so would in turn encourage a far more fundamental reconsideration of behavior that should be prosecuted as crimes.
 Laura and John Arnold Foundation, “Public Safety Assessment: Risk Factors and Formula,” http://www.arnoldfoundation.org/wp-content/uploads/PSA-Risk-Factors-and-Formula.pdf.
 Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April, 2017, pp. 88-89.
 Public Safety Assessment, Laura and John Arnold Foundation. http://www.arnoldfoundation.org/initiative/criminal-justice/crime-prevention/public-safety-assessment/
 The tools do not predict new crime, because data on crime is primarily data on arrests, as there is no way to accurately quantify crime. Similarly, data on new crimes committed during the pretrial period, used to estimate the statistical likelihood a given defendant will commit a crime if released, is just data on arrests, not even on convictions. See David G. Robinson, “The Challenges of Prediction: Lessons from Criminal Justice,” I/S: A Journal of Law and Policy for the Information Society (October 16, 2017).
 Human Rights Watch, Every 25 Seconds: The Human Toll of Criminalizing Drug Use in the United States, October, 2016.
 Michael Tonry, Legal and Ethical Issues in the Prediction of Recidivism, Federal Sentencing Reporter, vol.26 (2014), 167-176.
 Elizabeth Hinton et al., “An Unjust Burden: The Disparate Treatment of Black Americans in the Criminal Justice System,” Vera Institute of Justice, May 2018. https://storage.googleapis.com/vera-web-assets/downloads/Publications/for-the-record-unjust-burden/legacy_downloads/for-the-record-unjust-burden-racial-disparities.pdf
 Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April 2017, pp. 97-98.
 Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April 2017, pp. 96-97.
 Julia Angwin et al., “Machine Bias,” ProPublica, May 23, 2016, https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
 Rowan Walrath, “Software Used to Make ‘Life-Altering’ Decisions Is No Better Than Random People at Predicting Recidivism,” Mother Jones, January 17, 2018. https://www.motherjones.com/crime-justice/2018/01/compas-software-racial-bias-inaccurate-predicting-recidivism/
 Justice System Partners, under contract with the Arnold Foundation, presented these figures to San Francisco County stakeholders. They calculated the chance that a person in the highest risk category would commit a new violent crime as 11.1 percent. Zach Dal Pra, “LJAF Public Safety Assessment—PSA,” Laura and John Arnold Foundation, Slide 33.
 The most prominent tool, developed by the Arnold Foundation, has some degree of transparency about the factors it considers and how they are weighted. http://www.arnoldfoundation.org/wp-content/uploads/PSA-Risk-Factors-and-Formula.pdf. However, Arnold has been criticized for not revealing how it developed its algorithms, why it used the data it chose to develop the system, whether it performed validation, and, if it did, what the outcomes were. John Logan Koepke and David G. Robinson, “Danger Ahead: Risk Assessment and the Future of Bail Reform,” March 17, 2018, p. 55, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3041622
 One difficulty in understanding the impact of profile-based risk assessment tools is that there is very little independent research. There is a great deal of research funded by the companies and foundations that created or promote the tools, but because of the lack of data transparency, it is difficult to evaluate the quality of those studies.
 Megan Stevenson, “Assessing Risk Assessment in Action,” George Mason Legal Studies Research Paper No. LS 17-25 (December 8, 2017), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3016088
 New Jersey Judiciary, 2017 Report to the Governor and the Legislature, February 2018. https://www.judiciary.state.nj.us/courts/assets/criminal/2017cjrannual.pdf
New Jersey Judiciary, 2017 Report to the Governor and the Legislature, February 2018, p. 19. https://www.judiciary.state.nj.us/courts/assets/criminal/2017cjrannual.pdf. Jerry DeMarco, “NJ’s Top Cop Revises Controversial Bail Reform Terms,” Hackensack Daily Voice, May 24, 2017. http://hackensack.dailyvoice.com/police-fire/njs-top-cop-revises-controversial-bail-reform-terms/711759/
 Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April 2017, p. 91.
 Santa Cruz County Probation Department, Alternatives to Custody Report 2015, April 2016, p. 11; Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April 2017, pp. 99-100.
 Correspondence with (name withheld), San Francisco County official, San Francisco, CA, 12/18/17, on file with Human Rights Watch.
 Correspondence with Roseanne Scotti, State Director, New Jersey Drug Policy Alliance, 5/25/17, on file with Human Rights Watch.
 Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April 2017, pp. 59-62.
 Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April 2017, p. 56.
 Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April 2017, pp. 60-61.
 Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April 2017, p. 61.
 Megan Stevenson, “Assessing Risk Assessment in Action,” George Mason Legal Studies Research Paper No. LS 17-25 (December 8, 2017), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3016088
 Human Rights Watch, Not in it for Justice: How California’s Pretrial Detention and Bail System Unfairly Punishes Poor People, April 2017, pp. 92-93.
 Hannah Sassaman, “Artificial Intelligence is Racist Yet Computer Algorithms are Deciding Who Goes to Prison,” Newsweek, January 24, 2018. http://www.newsweek.com/ai-racist-yet-computer-algorithms-are-helping-decide-court-cases-789296
 Inter-American Commission on Human Rights, Report on the Use of Pretrial Detention in the Americas, OEA/Ser.L/V/VII, Doc. 46/13 (2013), para 186, http://www.oas.org/en/iachr/media_center/PReleases/2014/001.asp (accessed May 30, 2018). The United States has signed, but not ratified, the American Convention, and as such is not legally bound by its provisions. However, the Inter-American Commission’s guidance is a useful and authoritative guide to the protection of fundamental human rights. This is particularly true in this area, because the American Convention’s due process guarantees are in many respects similar to those guaranteed under US law and by international instruments binding on the United States.
 Human Rights Watch, Every 25 Seconds: The Human Toll of Criminalizing Drug Use in the United States, October 2016.