New evidence that we're solving more murders

By Matthew Yglesias — July 26, 2023

Is the great decline in homicide clearance rates all a big misunderstanding?


Real-world police departments are, unfortunately, much worse than their fictional counterparts at solving serious violent crimes.


A lot of that comes down to the types of offenses fiction tends to focus on. Over the past six months, I’ve read four of Michael Connelly’s Harry Bosch novels, and across all those capers in greater Los Angeles, basically only one case amounted to “guy in a gang killed another guy in a gang.” It’s of course difficult to fully characterize crimes that aren’t solved. But as best anyone can tell, a very large share of unsolved murders are the result of lethal but routine gang violence — crimes that are difficult to solve because witnesses tend not to cooperate, and where even if authorities are pretty sure Crew X did this shooting and Crew Y did that shooting, they still need to identify a specific triggerman to close the case.


At any rate, a lot of unsolved murders stay unsolved. And as anyone who’s ever looked up clearance rates knows, not only are there a lot of unsolved murders, but clearance rates have fallen precipitously over time.



This is normally taken as a sign of falling police productivity. Either witnesses are becoming less cooperative, murderers are getting better at avoiding detection (by using guns, for example), or detectives are getting less skilled. Another interpretation is that Warren Court jurisprudence made it harder for investigators to crack cases. Or perhaps it’s the opposite, and prosecutors used to railroad innocent people but now they don’t.


An important new article available now in pre-print from the Annual Review of Criminology suggests that this is all wrong.


Instead, in “The Sixty-Year Trajectory of Homicide Clearance Rates: Toward a Better Understanding of the Great Decline,” Duke’s Philip J. Cook and the University of South Carolina’s Ashley Mancik argue that the decline in clearance rates is completely benign.


This is because, if you look through the clearance rates to actual convictions, it turns out that the ratio of people convicted of murder to murders has, if anything, gone up. The authors are careful to avoid too much speculation, but to put it in my own terms, the missing clearances aren’t solved cases or innocent people behind bars — they’re sloppy arrests that didn’t result in prosecutable cases. Police have, in fact, gotten better at bringing cases against perpetrators.


What it means to “clear” a case

I’ve written previously about how the quality of the data available about crime and criminal justice in the United States is really low.


Unlike our economic data, which though imperfect is generally quite good, crime and criminal justice data is just a total mess. If you want to know the answer to a really basic question like “how many people have been murdered in the U.S. so far in 2023,” the official sources have no idea.


The best source of information on this question is Jeff Asher, who obtains timely homicide counts from the 100 largest cities. We know from his work that murder is down by about 12% year-to-date in those cities, and history tells us the national trajectory almost always tracks the large-cities database. But that’s an empirical regularity, not a law of nature. And if something changed such that the crime trends diverged, it would take us a while to find out. The FBI didn’t report national crime stats for 2021 until October of 2022, so we should find out whether Asher’s generalizations about 2023 hold up sometime in the fall of 2024.


By the same token, one of the main upshots of this Cook/Mancik paper seems to be that the FBI’s data on clearances is misleading.


According to the FBI, there are two ways a case can be cleared — by arrest or by exceptional means. Cleared by exceptional means is supposed to be reserved for situations in which a prosecution is logistically impossible because the suspect is dead or has fled to a jurisdiction that won’t extradite. Cleared by arrest means the suspect has been arrested, charged with a crime, and “turned over to the court for prosecution.” But there’s no requirement that the arrestee actually be convicted or even prosecuted. The FBI is basically asking police departments to report whether they arrested someone. And one arrest is as good as another for those purposes.


I can see a case for counting it this way. Prosecutors might decline to prosecute a case for all kinds of reasons, including the basic reality of limited prosecutorial and judicial resources. That’s not a reflection on the success or failure of the police investigation. On the other hand, if a detective arrests someone on the basis of inadmissible evidence and the killer goes free, that really is a failure of the investigation. Or if they work the case for a while, become convinced Guy X probably did it, arrest him in hopes of gathering more evidence (or “breaking” him in interrogation), but then it doesn’t come together, that’s a failed investigation. In particular, when it comes to homicides, it’s very unlikely that prosecutors are letting cases slide because they are too busy. Even the most hardcore progressive prosecutors want to win murder cases.


And the new paper shows that the image of falling success rates in the second half of the 20th century is basically an illusion.


Conviction rates have gone up, not down

Here, though, we run into another data problem. If I ran the zoo, rather than defining clearance in one particular way, we would simply record all the information. There would be a big data frame where each crime gets its own row. One column would say whether the crime was cleared by exceptional means; another would say whether there was an arrest. A third column would say whether there was a prosecution, a fourth whether there was a conviction or a guilty plea, and a fifth whether there was prison time. So instead of asking whether the case was cleared, you could ask specifically: was anyone arrested? Was he convicted? Did he go to prison?
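To make that concrete, here’s a minimal sketch of the kind of incident-level table I have in mind, written in Python with pandas. The column names and the three example rows are invented for illustration; this isn’t anyone’s actual reporting schema.

```python
# A minimal sketch, not any agency's actual schema: one row per homicide,
# with the outcomes recorded separately instead of a single "cleared" flag.
# Column names and example rows are invented for illustration.
import pandas as pd

homicides = pd.DataFrame(
    [
        {"case_id": 1, "cleared_exceptional": False, "arrest": True,
         "prosecution": True, "conviction": True, "prison": True},
        {"case_id": 2, "cleared_exceptional": False, "arrest": True,
         "prosecution": False, "conviction": False, "prison": False},
        {"case_id": 3, "cleared_exceptional": True, "arrest": False,
         "prosecution": False, "conviction": False, "prison": False},
    ]
)

# Each question can now be asked directly.
print("share with an arrest:    ", homicides["arrest"].mean())
print("share with a conviction: ", homicides["conviction"].mean())
print("share with prison time:  ", homicides["prison"].mean())
```

With a table like that, “clearance” stops being a single reporting category and becomes something you can compute in whatever way is most useful.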


Sadly, in the real world, we do not have this information.


What we have instead is somewhat spotty data from the Bureau of Justice Statistics about who is in prison and why through surveys conducted in 1970, 1991, 2000, and 2010. This lets us see how many people are sent to prison for murder, which is useful. Unfortunately, not every state participated in all four surveys. So in this table, the paper’s authors report results two ways: one uses all the data available, even though it isn’t strictly comparable, and the other uses the consistent data even though it isn’t comprehensive. The trends come out the same.


The big thing you see is that even though the clearance rate plunged between 1970 and 2000, the convict-to-victim ratio soared. Over the next 10 years it fell slightly, but remained much higher than it had been back in the good old days.



Their basic conclusion is that contrary to the impression given by the falling clearance rate, the odds of being held accountable for a murder seem to have gone up. The “response %” lines show what share of the national population and what share of homicides are covered by the survey. There’s a big leap in coverage between 1970 and 1991, so you might think the rising accountability ratio is some kind of statistical artifact. But the 19-state consistent sample covers about half the nation’s murders and shows the same trend. It’s a pretty miscellaneous collection of states — California, Colorado, Georgia, Hawaii, Illinois, Kentucky, Maryland, Minnesota, Mississippi, Missouri, Nevada, New York, Ohio, Oklahoma, South Carolina, Tennessee, Utah, Washington, and West Virginia — so there’s no obvious reason to think the sample is biased.
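To spell out the arithmetic behind that ratio, here’s a toy calculation in Python with invented numbers (not the paper’s figures). The point is that as long as both the prisoner count and the homicide count are restricted to the states a given survey actually covers, the convicts-per-victim ratio can be compared across survey years even as coverage changes.

```python
# Toy arithmetic with invented numbers, not the paper's actual figures.
# The key methodological point: keep the numerator (people imprisoned for
# murder) and the denominator (homicide victims) restricted to the same
# set of surveyed states, so the ratio is comparable even as coverage grows.

def convicts_per_victim(convicts_in_surveyed_states, homicides_in_surveyed_states):
    return convicts_in_surveyed_states / homicides_in_surveyed_states

# Hypothetical early survey year with low coverage.
early = convicts_per_victim(2_000, 6_000)

# Hypothetical later survey year with much broader coverage.
later = convicts_per_victim(7_000, 14_000)

print(f"early year: {early:.2f} convicts per victim")  # 0.33
print(f"later year: {later:.2f} convicts per victim")  # 0.50
```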


Again, ideally, someone would be tracking convictions in a more precise way. But given the data available, I think it’s convincing: the falling clearance rate almost certainly reflects a higher standard for arresting someone, not a falling likelihood of being held accountable.


Other possible interpretations

A slight fly in the ointment here is that there is not a one-to-one correspondence between homicide victims and homicide perpetrators.


Some people kill multiple victims, and sometimes multiple people are legally responsible for the death of one person. Without comprehensive linkage of individual inmates to individual crime reports, we can’t definitively establish that the odds of someone being convicted of any given homicide have gone up. I feel pretty sure that’s what happened, but it’s possible in principle that there was some kind of exogenous rise in the number of perpetrators per crime that led to a rising convicts-to-victims ratio without the odds of accountability rising. But that seems very unlikely to me.


And once we have the correct facts at hand, we can re-interpret what we know about other changes in American society.


For starters, whatever concerns one might have about the Warren Court’s criminal justice jurisprudence, those rulings do not seem to have created a situation where lots of guilty people were suddenly going free. Either the quality of investigations rose enough to counteract the impact of new legal protections for defendants, or else — perhaps more plausibly — most of the borderline cases were never getting prosecuted anyway. Perhaps the main thing that changed was clarifying to everyone what the standard was, such that police departments started getting more rigorous about which arrests they would make in the first place.


A related issue is that this should perhaps alter how we think about the (previously documented) divergence in clearance trends based on the race of the victim, since the decline has been driven by decreasing clearance of murders involving Black victims.



Discussion of this point has generally taken for granted that the falling clearance rate represents a genuine decline in the efficacy of police investigations. That then suggests a conversation about either departments ignoring Black victims, or perhaps anti-police sentiment in the Black community making it harder to find cooperative witnesses. Those both seem broadly plausible as partial explanations for the gap, but they’ve never sat well with me as explanations of the trend. Did police departments really get more racist during the final third of the 20th century? That doesn’t sound right.


With the new macro-scale interpretation of clearance trends, I think we can line up the racial gap with the more intuitive idea that police departments have become less cavalier about making low-quality arrests of Black suspects — arrests that were so low-quality they rarely led to prosecutions and convictions anyway. In “Devil in a Blue Dress,” the LAPD arrests Easy Rawlins on very thin evidence, cursorily tries to beat a confession out of him, then lets him go because they have nothing. That’s not quality police work, and the fact that it’s become less common shouldn’t be mistaken for a decline in investigative quality.


Solving more crimes would be good

This is all basically good news about policing in America. The bad news is that even if the long-term trend turns out to be benign, there does seem to have been a real short-term decline in clearances during the 2020 murder boom, and all things considered, the clearance rate is still a lot lower than we’d like it to be.


I’ve written about this before, but one good way to solve more crimes seems to be to have more detectives. A very telling study out of Boston looked at why the Boston PD was so much more likely to solve murder cases than non-fatal shootings. Part of the answer is that homicide is a more serious crime than “shot a guy but he lived,” so they invested more detective hours in trying to solve the homicides. Philip Cook, Anthony Braga, Brandon Turchan, and Lisa Barao looked at how this played out in practice and found that for the first two days, the department deployed similar amounts of manpower whether the shooting was fatal or not. But for non-fatal shootings, the investigation ended after day two, while for murders, they kept working the case. And while about half of all cleared homicides were solved within that two-day window, the other half were solved through prolonged investigation. Inside that window, fatal and non-fatal shootings were solved at the same rate. But fatal shootings were twice as likely to be solved overall because of the value of continued investigation.
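To see how equal effort in the first two days can still produce a two-to-one gap in overall clearance, here’s a toy calculation with invented numbers. Only the two-to-one structure comes from the study; the 20% starting figure is an assumption for illustration.

```python
# Toy numbers, invented to illustrate the mechanism described above; these
# are not the Boston study's actual figures.
early_window_clearance = 0.20  # assumed share of cases solved in the first two days

# Non-fatal shootings: the investigation effectively stops after day two.
nonfatal_cleared = early_window_clearance

# Homicides: roughly half of cleared cases come from work done after day
# two, so continued investigation doubles the overall clearance rate.
homicide_cleared = early_window_clearance * 2

print(f"non-fatal shootings cleared: {nonfatal_cleared:.0%}")  # 20%
print(f"homicides cleared:           {homicide_cleared:.0%}")  # 40%
```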


This ties in to ideas I’ve written about many times. To actually improve policing, we’re going to need to spend more money on it. But we also can’t conjure competent detectives up out of nothing. We need ideas like Police for America that improve the police recruiting pipeline. Anyone who urges younger, better-educated, and more liberal people to take policing careers seriously as an option is doing America’s cities a favor.


Last, though, I think everyone should be open to the idea of more surveillance. As Jennifer Doleac explained, surveillance technologies — everything from DNA databases to cameras to GPS and beyond — seem to actually deter crimes rather than merely expand the number of people who end up in the criminal justice system. But they do deter crime largely by raising the prospect that you’re likely to get caught and face consequences if you break the rules. And that’s the kind of system we should be aiming for — one where acts of violence are very likely to be punished, and therefore only rarely occur. The good news from this new study is that we have, in fact, made progress toward this goal rather than regressing. But we still have a long way to go.

