The Legacy of Race
Predictable Prejudice: Predictive Policing Software Promises Unbiased Crime-Fighting, but Can It Deliver?
When a Homewood Police Department officer starts his shift, the laptop in his police cruiser is fed data from a program called PredPol. The data fills a city map with boxes where PredPol forecasts that property crimes are most likely to occur, and the officer is expected to give those areas extra attention during his shift.
Predictive policing software programs such as PredPol have grown in popularity among law enforcement agencies over the past decade, including adoption by the Homewood Police Department and Jefferson County Sheriff’s Office in 2016 and Birmingham Police Department in 2019.
These programs promise high-tech, efficient policing and reduced crime rates based on cold, hard data and algorithms. Amid renewed national attention on racism and bias in police departments, the seeming color-blindness of decisions made based on computer code sounds all the more alluring.
But many opponents of predictive policing say the technology isn’t as objective as it appears and is simply perpetuating discrimination in a new way.
How Predictive Policing Works
Two main kinds of predictive policing programs have been used in departments across the country.
Person-based and group-based systems use data on an individual’s previous criminal behavior, age, interactions and people they associate with as indicators of whether they’re likely to commit a crime. Andrew Ferguson, a law professor at American University Washington College of Law who has studied predictive policing since 2011, said people who make it onto lists based on that data are then subjected to additional surveillance or even home visits from officers.
Location-based programs such as PredPol are more common and analyze data from police call logs and incident reports to find trends in where and when particular types of crimes occur.
“This data-driven approach aids in assisting patrol operation with intel of the types of crimes being committed in a geographical area. The data is solely based on crimes reported, and this is subject to change with the rise and decline of crime,” Birmingham Police Department public information officer Sgt. Rodarius Mauldin said via email. Mauldin declined to answer further questions about the department’s use of PredPol.
Sgt. Joni Money of the Jefferson County Sheriff’s Office said PredPol initially used 10 years of Sheriff’s Office property crime data — burglary, robbery, vehicle burglary and vehicle theft — and “continuously ingests newly occurring offenses” to update its predictions.
As one example of the trend predictions, Homewood Police Sgt. John Carr said the program in his city has shown that those four types of property crimes are more likely to occur on a Tuesday.
“They pull our records for the past 180 days, and with their fancy (algorithm), they determine (hotspots) based on where those crimes occurred,” Carr said.
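PredPol's actual model is proprietary, so the details of how it turns incident reports into hotspot boxes are not public. As a rough illustration only, the general approach described above — a rolling window of reported property crimes, grouped by location, with recent reports mattering more — might be sketched like this (the grid cells, incident data and decay weighting here are all hypothetical, not drawn from any department's records or from PredPol itself):

```python
from collections import defaultdict

# Hypothetical incident records: (grid_cell, days_ago).
# A real system would derive cells from coordinates and report dates.
incidents = [
    ("A3", 2), ("A3", 5), ("A3", 40),
    ("B1", 1), ("B1", 90),
    ("C7", 170),
]

def hotspot_scores(records, window_days=180, half_life=30.0):
    """Score each grid cell by reported crime within the window,
    weighting newer reports more heavily (exponential decay)."""
    scores = defaultdict(float)
    for cell, days_ago in records:
        if days_ago <= window_days:
            scores[cell] += 0.5 ** (days_ago / half_life)
    # Highest-scoring cells become the "boxes" flagged for patrol.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(hotspot_scores(incidents))
```

Note that a sketch like this predicts only where crimes have been *reported*, not where crimes actually occur — the distinction critics return to below.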
Officers are expected to spend 10 to 15 minutes per hour during their shifts on “missions” to patrol the geographic boxes that PredPol identifies as likely property crime areas.
“They’re there to decrease the likelihood that crime is going to occur,” whether through the increased police visibility or through arrests, Carr said.
“I can pretty much guarantee you that on that day, at that shift, you’re not going to have a robbery” due to the extra patrols, he said.
Carr said PredPol recently has added features to provide more data analysis and points of interest based on previous offenses, such as domestic disturbances.
Money said the county Sheriff’s Office spent $125,000 on its initial three-year PredPol contract and $10,000 per year since then. The City of Homewood’s proposed budget for the 2021 fiscal year includes $12,000 for PredPol.
What You Give Is What You Get
Ferguson said predictive policing is not actually forecasting crimes; it’s predicting patterns in police activity such as dispatches and incident reports. The quality of the data police departments put in, he said, will influence the accuracy of the predictions they receive.
Racial biases can show up in where officers choose to spend extra time patrolling and stopping people, as well as when citizens choose to call 911 to report suspicious behavior.
“Predictive policing tools, despite the name, aren’t about the future, but are about the past. That’s because the systems have to rely on historical data to accomplish anything. And what does that historic data tell us? That the system of policing in the United States is structurally racist,” said Logan Koepke of Upturn, a nonprofit focused on “equity and justice in the design, governance and use of technology.”
Any biases that appear in police records and call logs, whether intentional or not, are aggregated by predictive programs. That can create a feedback loop in which the same areas are targeted for enforcement again and again, regardless of whether more actual crime is occurring in those “hotspots” than in neighboring areas.
“That changes where (officers) go and how they see the neighborhoods around them,” Ferguson said.
Because records of criminal charges and convictions are not part of the data, there is no way to tell how many of those incident and dispatch records were linked to a proven crime.
“Differences in data collection can affect the results, projecting a certain outcome or tipping priorities toward certain crimes,” a 2013 American Bar Association Journal article says.
An April 2020 report by the Brennan Center for Justice said: “Relying on historical crime data can replicate biased police practices and reinforce over-policing of communities of color, while manipulating crime numbers to meet quotas or produce ambitious crime reduction results can give rise to more policing in the neighborhoods in which those statistics are concentrated.”
This feedback loop is especially dangerous when the data coming out of the predictive software is viewed as scientific, objective and reliable, according to a report by the Electronic Frontier Foundation.
Rigorous oversight of the software’s predictions can be hard to achieve, since companies often consider their predictive algorithms proprietary knowledge and are not open about how the predictions are made. This makes it challenging to spot flaws in the data going into the system, error rates in the predictions or limitations of what the algorithms can accurately predict.
“These new technologies are giving additional power to police and unchecked power and untransparent power,” Ferguson said.
The predictions can become self-fulfilling prophecies, according to the foundation, which focuses on rights, free speech and privacy related to technology.
Once given a hotspot to patrol, an officer may become primed to expect a crime to occur because the computer told her it would. That mindset may cause her to put extra suspicion on behavior that would be unremarkable in another part of her beat, leading to more searches and seizures, arrests and incident reports.
“Not every crime committed has an equal chance of being recorded by police. Locations heavily patrolled by police are over-represented in police data. Low-income areas of color are disproportionately targeted and patrolled by police, so using police records to predict future crimes will lead to higher predictions of criminality in these over-policed communities,” Karen Gullo, an analyst and senior media relations specialist with the foundation, said in an email.
Many groups, including the American Civil Liberties Union and the Brennan Center, have called predictive policing an infringement of civil rights because it subjects certain residents of a city to extra police scrutiny and suspicion, regardless of their own criminal background, because they live in an area identified by an algorithm as a potential hotspot.
Ferguson called it an “invasion of freedom.”
Is Predictive Policing Effective?
In light of many of these concerns, some cities have abandoned their predictive programs.
Ferguson said Chicago found its person-based predictive program to be discriminatory. Santa Cruz, California, which became one of the earliest adopters of PredPol in 2011, put a moratorium on the technology in 2017 and banned it outright this summer “because of their recognition that this was not the appropriate way to think about public safety.”
According to a 2019 Los Angeles Times article, Palo Alto, California; Rio Rancho, New Mexico; and Hagerstown, Maryland, are among cities that have ended their predictive policing contracts due to concerns about effectiveness or budget constraints.
“I think many of the critiques (of predictive policing) that have sort of risen to the fore in the big cities … haven’t necessarily made it into the other cities that should be having the conversation,” Ferguson said.
Carr acknowledged the concerns that had been raised, but he felt those instances of bias or inaccuracy were less likely to be a problem in Homewood than in larger cities. One possible explanation, he said, is that most property crimes in the city are not committed by Homewood citizens, so the PredPol hotspots are less about neighborhood residents and more about characteristics that make those areas appealing targets for theft and burglary.
“Maybe it’s more successful here because we’re being put in areas where visibility is increased and it deters the criminal element from coming here,” Carr said.
Money said these concerns have not arisen at the Jefferson County Sheriff’s Office either. She noted that putting patrol officers in the hotspots determined by PredPol is more about preventing crime through visibility than about making arrests.
“As far as the actions that the deputies take, their mere presence is thought to be a deterrent, but if they take any enforcement action, that action should be based on articulable, reasonable suspicion or probable cause and not based on this software’s predictions,” she said via email.
Research on predictive policing’s effectiveness has been challenging to complete, both because of the proprietary nature of the algorithms and because it’s hard to conduct full, controlled experiments using an active police department.
A 2012 National Institute of Justice study in Shreveport, Louisiana, found little difference in crime rates between districts that were patrolled using predictive programs and those patrolled using traditional methods. However, the study did note that use of the predictive program was inconsistent during the study, making definite conclusions impossible.
A 2019 internal study in the Los Angeles Police Department also was inconclusive about the effectiveness of predictive software. The LAPD ended its 10-year predictive policing program, which had cost millions of dollars, in 2020 due to budget constraints caused by the COVID-19 pandemic. Activists had pressured the police department for months to end predictive policing because of racial bias concerns, though Police Chief Michel Moore continued to praise predictive analytics even as he announced the end of the PredPol contract.
“If we don’t know that this system works, why are you spending taxpayer money on using it?” Ferguson asked, noting that the money could be spent on social services, education and other routes to reduce crime.
Homewood and Birmingham both have had decreases in crime rates since they began using PredPol, though these trends are influenced by multiple factors.
In Homewood, all types of crime have trended downward since 2015, not just property crimes tracked by PredPol. Carr said that, during that same time period, the city also hired a new police chief, Tim Ross, and increased staffing and other policing efforts.
“It’s obviously a combination of things,” he said.
BPD also has put more officers on the street and changed some of its policies in the same time period that it has been using PredPol.
“The Birmingham Police Department has had an overall decrease in crime by implementing various strategies as well as resources,” Mauldin said via email.
Though the exact level of effectiveness is hard to determine, Carr said predictive policing seems to be working for Homewood and will likely continue to be part of its policing work in the future.
“This is a tool. This is one tool in a toolbox that we use,” Carr said. “It doesn’t dictate what we do every single day. It has a place and a benefit but it’s not the only thing that we use. It doesn’t replace a highly trained and proactive police officer out there working.”