This weekend’s mass murder at a supermarket in Buffalo has rekindled the debate over whether killers are enabled by the constitutional enshrinement of a natural right to possess weaponry. Some are once again calling for Congress to impose restrictions that will end the US’s uniquely permissive legal regime around access to military-grade weapons.
That’s right, New York Gov. Kathy Hochul is calling for more limits on racists’ access to modern communication tools:
For Hochul, the massacre reflected a failure not just to limit access to guns but to curb the ability to openly share and distribute hate speech.
The governor told ABC that the heads of technology companies “need to be held accountable and assure all of us that they’re taking every step humanly possible to be able to monitor this information.”
“How these depraved ideas are fermenting on social media — it’s spreading like a virus now,” she said, adding that a lack of oversight could lead to others emulating the shooter.
We talk about guns here, but we’ve also talked a lot about how, if you’re worried about decentralized power benefiting bad actors, computer technology should be even scarier. From “OSD 87: You are holding a weapon of war right now”:
[Up through the Vietnam War] encryption had always been first and foremost a weapon of war. In World War 2, encryption research happened at the most secretive levels of each side’s war effort, and people lived and died by encryption as surely as they did by guns and bombs. It was so important that sharing encryption algorithms with the enemy was punishable as espionage — i.e. punishable by death.
There’s a big difference between those wartime encryption algos and the ones you’re using right now to read this post: your encryption algos are far more powerful. Before the 1970s, encryption wasn’t widely available to the public, and the few available forms were trivially breakable by any powerful actor, and certainly by any interested government. But by the ‘70s, theoretical advances in computer science and practical advances in microprocessor manufacturing led to a world-historical first: publicly available encryption that nobody, no matter how powerful or how well-resourced, could break. (The specific algorithmic advance is called public-key cryptography.)
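To make the idea above concrete, here is a toy sketch of public-key (RSA-style) encryption. This is a hypothetical illustration with tiny textbook primes, not a secure implementation and not any particular production algorithm; the point is just the asymmetry: the public key `(e, n)` can be handed to anyone, and only the holder of the private key `(d, n)` can decrypt.

```python
# Toy RSA sketch — illustrative only, never use small primes like this in practice.
p, q = 61, 53            # two small primes (real keys use primes hundreds of digits long)
n = p * q                # the modulus, shared by both keys
phi = (p - 1) * (q - 1)  # Euler's totient of n
e = 17                   # public exponent, chosen coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e mod phi (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)   # anyone can encrypt with the public key
decrypted = pow(ciphertext, d, n) # only the private key recovers the message
print(decrypted)  # 42
```

The asymmetry is the world-historical part: publishing `(e, n)` gives an adversary everything they need to encrypt, and (at real key sizes) nothing they can use to decrypt.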
That’s powerful. And that wasn’t lost on the entities whose monopoly on power it undermined. There’s a set of regulations in the US called International Traffic in Arms Regulations (ITAR) that defines the rules for exporting military weapons — to whom, by whom, with which approvals, and so on.
Until the mid-‘90s, encryption algorithms were on the ITAR list. Under the same rules that make it illegal for you to export missiles and landmines, it was illegal for you to export — by simply, say, posting a description of the math on the internet — public-key encryption algorithms. As web browsers started building in those algos (to do things like protect your credit card info when you shop online), the regulations became unenforceable. So they ultimately fell by the wayside — but not for lack of trying from the government’s end. To this day, it remains the government’s position that unbreakable encryption should be illegal — i.e. that this math is a weapon of war, that only the government can be trusted with it, and that if push comes to shove, you should be imprisoned for having it. (As is tradition, anti-terrorism is the sharp end of the wedge.)
This promiscuous distribution of power is such a common feature of technological progress that the two are actually inseparable. The purpose of technology is that it’s a force multiplier for its users. And the nature of it is that it gets steadily cheaper and more widespread. Therefore, a workable definition of tech is “a force multiplier that becomes more accessible over time”.
If you’re thinking about how that’s dangerous with respect to guns, or to strong encryption, or to social media, you’re thinking too small. They’re all just examples of a force multiplier becoming more accessible. That’s what all tech is. So you mustn’t be afraid to dream a little bigger and address the underlying question: is tech itself dangerous?
This is really a philosophical bet about whether the benefits of technology accrue more to constructive uses or destructive ones.
Of course, it’s not quite that simple. Many people would answer “well it depends, and the role of law is to prevent destructive uses while allowing constructive ones”. Matt Parlmer captures one response to that here:
The implication is that technological discovery requires the very same messy exploration that legal crackdowns exist to stop. Kind of an “everything is permitted until it’s banned vs. everything is banned until it’s permitted” proposition.
The response to that, understandably, is “‘Messy exploration?’ That sounds romantic in a newsletter, but here in real life, ‘messy’ means that people die because some unfettered new tech empowered a malicious or incompetent actor. We can’t afford that kind of messiness.”
That’s actually a great point! The naive answer is “Regulate smarter and more surgically.” But it’s a red herring to debate the correct shape of regulation, because remember the premise: tech is a force multiplier that becomes more accessible over time. The whole “the Net interprets censorship as damage and routes around it” thing. So what do you do with a tool whose defining feature is that access to it can’t be limited?
There are two levels to answering that. Individual-level and system-level.
At an individual level, this is a classic prisoner’s dilemma. We would all be better off if, magically, nobody had the capacity to use force against us. We would all be worse off if others had that capacity and we didn’t. But irrespective of what others do, we can always improve our outcome by retaining that capacity vs. relinquishing it. So “retain that capacity” is the dominant strategy.
And, welp, that might sound depressing. “True peace is not for this world, so stay strapped.” Not exactly a Hallmark card.
But zooming out to the system level, this starts to look rosier.
It’s easy to paint wide access to tech as scary. But wide accessibility is the engine that drives improvements. “Given enough eyeballs, all bugs are shallow”. Through that lens, free access to powerful tech isn’t the regrettably optimal solution to a prisoner’s dilemma — it’s the most effective process for good actors to keep bad actors in check. The solution to bad actors weaponizing past technological advances wasn’t to limit access, it was to expand access and spur innovation. That only becomes more true over time.
This week’s links
FieldCraft Survival on women’s fashion while carrying a gun
Amber on some gun-compatible sartorial concepts. For carry to keep getting more popular, it has to work for people who aren’t going to compromise the fit and fashion of their clothes in order to carry a gun. That describes most people!
Redditor makes a good point about pedantic objections to the term “assault weapon”
Engage the substance of other people’s ideas.
Clark Neily’s 45-minute class on Second Amendment law
Review for many of you, but this is a good overview that Clark recorded last Friday with the National Constitution Center.
TIL in the 1800s, the Papal States had its own custom rifle caliber
🤌 12.7x45mmR Remington Pontificio 🤌
OSD office hours
If you’re a new gun owner, thinking about becoming one, or know someone who is, come to OSD office hours. It’s a free 30-minute video call with an OSD team member to ask any and all your questions.
Support
Like what we’re doing? You can support us at the link below.
Merch
And the best kind of support is to rock our merch and spread the word. Top-quality hats, t-shirts, and patches with a subtle OSD flair. Even a gallery-quality print to hang on the wall.
Guns & gun parts not covered by a ban are just as capable and lethal as those that are.
If you ban certain guns, fewer people will die with THOSE guns.
Just do not expect overall murder rates to change.