Chicago cops begin using algorithm identifying ‘propensity for violence’ to tamp down shootings
The Chicago police department is using a unique algorithm in its battle against armed violence to figure out who is most likely to be involved in a shooting, whether as victim or perpetrator.
The computer program takes into account various factors such as criminal records, gang affiliations, gunshot wounds already suffered, and the number of past arrests.
Its evaluations are used to create a database called the “Strategic Subject List,” which is supposed to help police battle the bloodshed in the city brought on by retaliatory gang violence.
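The actual model behind the Strategic Subject List is secret, but the factors reported above can be illustrated with a minimal sketch. Everything here — the field names, the weights, and the cutoff — is invented for explanation and is not the Chicago Police Department's real scoring method.

```python
# Hypothetical illustration only: the real Strategic Subject List model is
# undisclosed. Factors and weights below are invented for explanation.
def risk_score(record):
    """Combine the publicly reported input factors into one number."""
    score = 0.0
    score += 2.0 * record.get("prior_arrests", 0)        # number of past arrests
    score += 5.0 * record.get("gunshot_wounds", 0)       # shootings already suffered
    score += 4.0 if record.get("gang_affiliated") else 0.0
    score += 3.0 * record.get("violent_convictions", 0)  # criminal record
    return score

# A list like the SSL would rank people by score and keep those above a cutoff.
people = [
    {"prior_arrests": 6, "gunshot_wounds": 1, "gang_affiliated": True},
    {"prior_arrests": 1, "gunshot_wounds": 0, "gang_affiliated": False},
]
ranked = sorted(people, key=risk_score, reverse=True)
```

Even this toy version shows why critics worry: whoever chooses the weights effectively decides who ends up on the list.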
But the exact nature of the criteria used by the predictive algorithm is secret and controversial. The program's principal designer, Miles Wernick of the Illinois Institute of Technology, did not respond when contacted by AFP.
Critics say the secretive system violates freedoms by stigmatizing people as having an “alleged propensity for violence.”
But police justify the use of the algorithm by saying it ensures they focus resources on people who are most likely to commit gun violence or be threatened by it.
“Chicago is the most racially segregated major city in the United States. It has the largest and the most persistent American gang problem,” said David Kennedy of the John Jay College of Criminal Justice in New York.
– Daily shootings –
Since early 2016, gun violence has already left about 250 people dead and 1,150 injured in the city. The vast majority of these victims were on the Strategic Subject List.
“We know we have a lot of violence in Chicago, but we also know there’s a small segment that’s driving this stuff,” Eddie Johnson, the new police chief of the city of 2.7 million people, said recently in an interview with The New York Times.
Appointed in late March, Johnson, who is African-American, wants to boost the image of the police department following the shooting death of a black teenager by a white police officer.
Under his guidance, the Chicago Police Department last week launched a drug and gang raid, arresting 140 people, more than 80 percent of whom were on the list.
“For a long time now American police forces have been using computer technology and data analysis to focus on high crime areas and to focus the resources there, to allocate more officers. It’s actually on the whole pretty effective in reducing crime,” said Robert Weisberg of the Criminal Justice Center at Stanford University.
But in Chicago, he said, “this goes a step farther in terms of actually listing individuals.”
– ‘Preventive visits’ –
The algorithm, regularly updated since its launch three years ago, is mainly used to identify people who might benefit from a personal visit from authorities.
These visits give police officers or social workers the opportunity to propose rehabilitation and a way out of gangs, as well as drug treatment programs and other aid. And they also warn those on the list of the potential consequences of gun-related crimes.
Studies have shown that offenders were often unaware of the penalties for particular offenses.
Thus, someone with a criminal history who is caught in the streets of Chicago with a round of ammunition could be prosecuted in federal court and automatically sentenced to 15 years in prison.
Some in Chicago fear that the list could unjustly result in tougher prosecutions and heavier convictions for some.
Others worry about the secrecy surrounding how the algorithm operates, and question its effectiveness given the statistics: shootings are on the rise in Chicago, with the threshold of 1,000 shooting victims crossed one or two months earlier in 2016 than in previous years.
The question left unanswered is whether the shooting statistics would have been worse without the algorithm.
“People literally don’t know what goes into the underlying analysis that produces the folks on the list. The list is produced by people in university circles in Chicago. It’s produced by academics,” said John Jay College’s Kennedy.
“People not part of their group don’t know how these calculations are made, so there is real concern about transparency and questions of underlying bias.”