OSD 223: You know too much
For some, the returns on additional knowledge are positive. For others, they're not.
When a bullet is found in a victim or at a crime scene, if police can get their hands on the gun suspected to have fired the bullet, they can do a simple test. Fire a new bullet from the gun, and compare the striations on it to those on the bullet from the crime scene. The principle is that due to microscopic manufacturing inconsistencies, each gun leaves a unique pattern of striations on the bullets it fires, similar to a fingerprint. If the striations on the two bullets match, you know you’ve got the gun that fired the bullet at the crime scene.
Over on his Substack, Radley Balko wrote this week about how bullet matching might actually be bullshit:
For more than a century, forensic firearms analysts have been telling juries that they can match a specific bullet to a specific gun, to the exclusion of all other guns. This claimed ability has helped to put tens of thousands of people in prison, and in a nontrivial percentage of those cases, it’s safe to say that ballistics matching was the only evidence linking the accused to the crime.
But as with other forensic specialties collectively known as pattern matching fields, the claim is facing growing scrutiny. Scientists from outside of forensics point out that there’s no scientific basis for much of what firearms analysts say in court. These critics, backed by a growing body of research, make a pretty startling claim — one that could have profound effects on the criminal justice system: We don't actually know if it's possible to match a specific bullet to a specific gun. And even if it is, we don't know if forensic firearms analysts are any good at it.
The article is fairly damning. It makes the case that while bullet matching does have some predictive value, it falls far short of both the confidence with which it’s portrayed and the certainty that should be required to send somebody to prison.
In a way, this shouldn’t be surprising. Bite mark evidence turned out to be bullshit. So did much of arson fire forensics. Hell, even fingerprints aren’t as reliable as people think.
The most surprising part of Balko’s piece is where he notes that, because of these concerns, a judge in Chicago ruled bullet matching evidence inadmissible in a case before him.
Traditionally, courts aren’t great with math. For example, in a Ninth Circuit ruling upholding the lifelong gun ban on people who’ve been involuntarily committed, the court reasoned that “the scientific evidence supported … that those who have been committed involuntarily to a mental institution still pose an increased risk of violence even years after their release from commitment.” But the ruling went on to cite just one study showing increased risk of suicide, and that study only followed patients for 8.5 years. The plaintiff had been mentally healthy for 20 years.
Courts might be the only professional setting where this level of innumeracy remains acceptable. It wouldn’t survive in a setting where the decision-makers have to bear the cost of their mistakes. Judge Roger Benitez even called this out in his original Freedom Week ruling in 2019’s Duncan v. Becerra. On page 47 of the ruling, he laid into California for the shoddiness of the statistics it submitted as evidence — a survey by Mayors Against Illegal Guns and a dataset compiled by Mother Jones:
This is federal court. The Attorney General has submitted two unofficial surveys to prove mass shootings are a problem made worse by firearm magazines holding more than 10 rounds. Do the surveys pass the Federal Rule of Evidence Rule 403 test for relevance? Yes. Are the surveys admissible under Federal Rule of Evidence Rule 802? No. They are double or triple hearsay. No foundation has been laid. No authentication attempted. Are they reliable? No. Are they anything more than a selected compilation of news articles – articles which are themselves inadmissible? No. Are the compilers likely to be biased? Yes.
Where are the actual police investigation reports? The Attorney General, California’s top law enforcement officer, has not submitted a single official police report of a shooting. Instead, the Attorney General relies on news articles and interest group surveys. Federal Constitutional rights are being subjected to litigation by inference about whether a pistol or a rifle in a news story might have had an ammunition magazine that held more than 10 rounds.
He continued in a footnote:
This Court has observed that the quality of the evidence relied on by the State is remarkably thin. The State’s reliance and the State’s experts’ reliance on compilations such as the Mother Jones Magazine survey is an example. The survey is found in the Attorney General’s Opposition to Plaintiff’s Motion for Summary Judgment at Exhibit 37. It purports to be a survey of mass shootings. It does not indicate how its data is selected, or assembled, or tested. It is unaccompanied by any declaration as to its accuracy. It is probably not peer-reviewed. It has no widely-accepted reputation for objectivity. While it might be something that an expert considers in forming an admissible opinion, the survey by itself would be inadmissible under the normal rules of evidence.
You’d think the noteworthy thing here is that government lawyers showed up to court with data that would fail a middle school stats class. That’s actually the normal part. The noteworthy part is that a judge didn’t allow it.
This generalizes to gun stuff broadly: science-based ideas for reducing shootings weirdly often live at the Dunning-Kruger sweet spot. Science-y enough to sound good, but that’s where the critical thinking stops.
An example is the catchy, recurrent idea to track credit card purchases and flag buying habits that only a mass shooter-in-the-making would have. Quick math problem. Imagine you’re searching for 5 targets in a population of 100 million. You’ve built the world’s best bad guy detector by several orders of magnitude, with a false positive rate of 0.1% and a false negative rate of 0.
Your detector flags one of the accounts in your database. What are the odds that that account is one of your 5 targets?
Answer: the detector flags all 5 targets, but it also flags about 100,000 innocent accounts (0.1% of the other ~100 million). So a flagged account is innocent with probability 100,000 ÷ 100,005, a 19,999-in-20,000 chance, or 99.995%. Ignoring this is called “base rate neglect”. Tl;dr: it’s very hard to find rare targets in a large population without getting overwhelmed by false positives.
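The arithmetic is worth seeing laid out. Here’s a quick sketch, using the hypothetical numbers from the example above (100 million accounts, 5 targets, a 0.1% false positive rate, and no false negatives):

```python
# Base rate neglect, with the hypothetical numbers from the example above.
population = 100_000_000
targets = 5
false_positive_rate = 0.001  # 0.1% of innocent accounts get flagged
false_negative_rate = 0.0    # the detector never misses a real target

true_positives = targets * (1 - false_negative_rate)
false_positives = (population - targets) * false_positive_rate

# Probability that a flagged account is actually a target.
p_target_given_flag = true_positives / (true_positives + false_positives)

print(f"Total flags raised: {true_positives + false_positives:,.0f}")
print(f"P(target | flagged):   {p_target_given_flag:.5%}")
print(f"P(innocent | flagged): {1 - p_target_given_flag:.5%}")
```

Even with a detector this good, roughly 20,000 flags come up for every real target. Making the detector ten times better only shrinks the pile of false positives to ~10,000; the base rate dominates.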
The rebuttal might be, “Fine, but it’s worth it if you catch the target.” But consider how this would work in real life. Dealing with an alarm that’s wrong 99.995% of the time is expensive. The false positives would so consume the system’s resources that the true positives would fade right back into the crowd.
So why would otherwise extremely smart people advocate this quixotry? If you think of it like, “Let’s figure out if gun rights are a net positive or net negative”, this makes no sense. The incentive should be to learn all the technical details you can, to be an expert on all aspects of the field. So ignorance would be hard to explain.
But if instead of an open “are guns noxious?”, someone’s starting point is “guns are noxious, now how do we stop that?”, this starts to make a lot more sense. Now technical details become dangerous, not helpful. The foundational premise is that guns are noxious. So it would be game theoretically stupid to learn about them, because that knowledge can only cut against your premise. So a rational actor would acquire only enough technical knowledge to sound convincing to the laity, and no more.
This is really a question about the incentives of seeking — or avoiding — detailed technical knowledge about guns and related stats. We are lucky that the game theory shakes out such that gun rights benefit from the spread of knowledge. Gun control, conversely, works better the less people know about the technical details.
The more people learn about guns, the more they tend to see things differently from what they might have been taught to believe. Or, more to the point, taught to fear. The lesson is clear, then. Teach more info to more people. The deeper they go, the better off we’ll all be.
This week’s links
The Slow Mo Guys and Kentucky Ballistics blowing up a gun at 187,000 frames per second
That’s 6233x slower than real life.
$7184 to get carry permits in 11 states and D.C.
From School of the American Rifle. Through reciprocity, those permits cover 44 states.
“Firearms classes taught me, and America, a very dangerous lesson”
New York Times op-ed. And here’s a counterpoint from David Yamane.
OSD Discord server
For those who enjoy talking about gun stuff and want a welcoming place to do so, join our Discord server. The OSD team and readers are there. Good vibes only.
Merch
Gun apparel you’ll want to wear out of the house.
Office hours
If you’re a new gun owner, thinking about becoming one, or know someone who is, come to OSD office hours. It’s a free 30-minute video call with an OSD team member to ask any and all your questions.