OSD 269: The social credit system has missiles now
In which we attempt to discuss OSD-relevant topics in Gaza while staying out of the politics.
Guns are about individual power. Every disagreement about gun ownership comes down to who’s allowed to be unilaterally powerful. That’s what we focus on around here.
We don’t talk much about the cutting edge of government empowerment. What’s the state of the art there in 2024?
Here’s one recent data point on that from +972 Magazine, writing about how Israel picks bombing targets in Gaza. They use a software system called Lavender. It uses 24/7 real-time location tracking of every cell phone in Gaza, with some data running back for years, to track associations between people. Starting with the cell phone of, say, a known Hamas leader, it attempts to predict the role of other associated cell phone owners, eventually spidering out to map the entire society.
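The association-mapping step the article describes is, at its core, a graph traversal outward from seed nodes. Here's a minimal sketch of that abstract idea. Everything in it (the function name, the graph shape, the decay-based scoring) is a hypothetical illustration, not a detail reported about Lavender:

```python
from collections import deque

def spider_associations(contact_graph, seeds, decay=0.5, max_hops=3):
    """Breadth-first walk from seed phones, assigning each phone an
    association score that decays with distance from the seeds.

    contact_graph: dict mapping a phone -> set of phones it contacts.
    seeds: phones assumed (hypothetically) to belong to known operatives.
    """
    scores = {s: 1.0 for s in seeds}
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        phone, hops = frontier.popleft()
        if hops >= max_hops:
            continue
        for neighbor in contact_graph.get(phone, ()):
            new_score = scores[phone] * decay
            # Keep the highest score seen for each phone.
            if new_score > scores.get(neighbor, 0.0):
                scores[neighbor] = new_score
                frontier.append((neighbor, hops + 1))
    return scores
```

The point of the sketch is the shape of the system, not its details: once you have a contact graph, "spidering out to map the entire society" is a few lines of code.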
If the reality is what the article claims, these excerpts are useful for understanding how the system works:
During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male….
…
The sources said that the approval to automatically adopt Lavender’s kill lists, which had previously been used only as an auxiliary tool, was granted about two weeks into the war, after intelligence personnel “manually” checked the accuracy of a random sample of several hundred targets selected by the AI system. When that sample found that Lavender’s results had reached 90 percent accuracy in identifying an individual’s affiliation with Hamas, the army authorized the sweeping use of the system. From that moment, sources said that if Lavender decided an individual was a militant in Hamas, they were essentially asked to treat that as an order, with no requirement to independently check why the machine made that choice or to examine the raw intelligence data on which it is based.
“At 5 a.m., [the air force] would come and bomb all the houses that we had marked,” B. said. “We took out thousands of people. We didn’t go through them one by one — we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house.”
…
B., a senior officer who used Lavender, echoed to +972 and Local Call that in the current war, officers were not required to independently review the AI system’s assessments, in order to save time and enable the mass production of human targets without hindrances.
“Everything was statistical, everything was neat — it was very dry,” B. said. He noted that this lack of supervision was permitted despite internal checks showing that Lavender’s calculations were considered accurate only 90 percent of the time; in other words, it was known in advance that 10 percent of the human targets slated for assassination were not members of the Hamas military wing at all.
For example, sources explained that the Lavender machine sometimes mistakenly flagged individuals who had communication patterns similar to known Hamas or PIJ operatives — including police and civil defense workers, militants’ relatives, residents who happened to have a name and nickname identical to that of an operative, and Gazans who used a device that once belonged to a Hamas operative.
…
B. said that the reason for this automation was a constant push to generate more targets for assassination. “In a day without targets [whose feature rating was sufficient to authorize a strike], we attacked at a lower threshold. We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us. We finished [killing] our targets very quickly.”
The cutting edge of state empowerment technology in 2024 is a system that can do the following:
1. Complete government surveillance of all electronic activity within a territory, and real-time 24/7 location data on all cell phones.
2. Use the data from #1 to assign every person in the territory a score.
3. Automatically kill them and their family when their score crosses a threshold.
4. Lower the threshold as desired, to use up a target quantity of munitions.
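Steps 3 and 4 reduce to a simple scoring loop. A purely hypothetical sketch of the threshold-plus-quota logic (names and numbers invented for illustration):

```python
def select_targets(scores, threshold, quota):
    """Pick everyone whose score is at or above the threshold (step 3);
    if that yields fewer than the quota, take the next-highest scorers
    regardless of the threshold (step 4). Purely illustrative.

    scores: dict of person -> risk score.
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    targets = [p for p in ranked if scores[p] >= threshold]
    if len(targets) < quota:
        # Step 4: the quota, not the threshold, drives the output.
        targets = ranked[:min(quota, len(ranked))]
    return targets
```

Note what the quota branch implies: the threshold stops being a safeguard the moment there's pressure to produce a fixed number of targets.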
Gun rights advocates get mocked for imagining the far-fetched ways that a social credit system, or even the basic power to subpoena cell phone location data, could play out. Lavender ends the debate on how far-fetched those ideas are. They’re not only possible, they’re deployed in the wild today. So there’s no longer a question of what a government can do with this data. It’s only a matter of which governments do it.
How will this play out?
Well, a few ways.
In authoritarian states it has been playing out for years and will continue to do so. Usually in much more mundane ways than in a war. Take, for example, this excerpt from a recent episode of the Angry Planet podcast about China (at 18:22). The speaker is Jane Perlez, former Beijing bureau chief for The New York Times:
I know of friends who were outside the Forbidden City late at night a couple months ago and the cops asked, “Why are you here?” They had a pretty good reason, and the cops just pressed a button on their cell phone and they could trace where these people had come from in the previous half-hour.
In war settings, it’ll play out in the above ways, where cell phone association data is used for targeting. But since the stakes are higher, the returns on active countermeasures will be higher. That’s where radio jamming technology gets interesting, as seen in the ongoing cat-and-mouse jamming game between Russia and Ukraine.
The most unclear bit is how it’ll play out in the handful of societies with strong individual rights. The cat is out of the bag. We often discuss how weapons technology can’t be uninvented and only becomes more available over time. That’s just as true for governments as it is for individuals. So it is inevitable that weaponized surveillance systems will (a) get more powerful over time and (b) eventually get into the hands of all governments.
Above we said, “So there’s no longer a question of what a government can do with this data. It’s only a matter of which governments do it.” So the only way to not end up on the wrong end of one of these systems is to make sure your government doesn’t point it at you. This is where your impulse might be to invoke some come-and-take-it bravado. But fighting a motivated state is not as glamorous as internet commenters say. Most of the Wolverines will die, and the survivors will get to rule over a pile of rubble. Not a great first choice.
The best way to win a battle is to never have to fight it in the first place. People are not kept free by fighting government action. They’re kept free by the government tripping over its own feet and not managing to take action in the first place. At a 2011 Senate hearing, Antonin Scalia said this about why individuals in the US remain uniquely free compared to other countries:
The real key to the distinctiveness of America is the structure of our government. One part of it, of course, is the independence of the judiciary. But there’s a lot more.
There are very few countries in the world, for example, that have a bicameral legislature…. Very few countries have two separate bodies in the legislature, equally powerful. That’s a lot of trouble, as you gentlemen doubtless know, to get the same language through two different bodies elected in a different fashion.
Very few countries in the world have a separately elected chief executive. Sometimes I go to Europe to talk about separation of powers, and when I get there, I find that all I’m talking about is independence of the judiciary. Because the Europeans don’t even try to divide the two political powers, the two political branches, the legislature and the chief executive. In all of the parliamentary countries, the chief executive is the creature of the legislature. There’s never any disagreement between them and the prime minister, as there is sometimes between you and the president. When there’s a disagreement, they just kick him out. They have a no-confidence vote, a new election, and they get a prime minister who agrees with the legislature.
The Europeans look at [our] system and they say, “Well, it passes one house, it doesn’t pass the other house, sometimes the other house is in the control of a different party. It passes both and then this president who has a veto power vetoes it.” They look at this and they say, “It is gridlock.”
And I hear Americans saying this nowadays. There’s a lot of it going around, they talk about “a dysfunctional government” because there’s disagreement. And the framers would have said, “Yes! That’s exactly the way we set it up. We wanted this to be power contradicting power, because the main ill that beset us, as Hamilton talked about in The Federalist when he talked about a separate senate, he said, ‘Yes, it seems inconvenient. But inasmuch as the main ill that besets us is an excess of legislation, it won’t be so bad.’”
This is 1787, he didn’t know what an excess of legislation was. So unless Americans can appreciate that and learn to love the separation of powers — which means learning to love the gridlock which the framers believed would be the main protection of minorities. The main protection. If a bill is about to pass which really comes down hard on some minority, they think it’s terribly unfair, it doesn’t take much to throw a monkey wrench into this complex system.
Americans are uniquely free where gridlock prevents government action (e.g. gun rights), and they’ve lost their freedoms in exactly those areas where government moves swiftly (e.g. the administrative state, the highly efficient plea bargaining system, or Fourth Amendment exceptions for guns and drugs). So the best bet for keeping AI-powered airborne death robots in check isn’t paper rights, it’s paperwork. Distribute veto power promiscuously and the death robots won’t work because the people running them won’t be able to agree on what to do.
The other safeguard is competition. Innovating in individual empowerment even faster than in state empowerment. The pitch for the Palantirs of the world is, “Technology is inevitable. So given that <scary state empowerment innovation> is going to exist, do you want the Chinese government to be the only ones that have it? Or do you want the US to be able to hold its own?”
Framed like that, sure, the US government is going to continue to develop increasingly powerful tech. But true American dynamism is about individuals, not government. Gun rights are valuable for that. They promote the ethos that the consumer market should be innovating in the defense space. And as long as that ethos is alive and well, the consumer market will be able to move faster than government vendors.
This week’s links
“Why America (Still) Loves Guns”
From the Horses channel on YouTube. h/t Discord subscriber Helvetism.
First time at a gun range
Periodic reminder: go on YouTube a couple times a year and watch videos of people shooting guns for the first time. It’s a great way to remind yourself what first-timers are thinking about. Also check out our guide on taking a newbie shooting.
More about Open Source Defense
Merch
Rep OSD.
OSD Discord server
If you like this newsletter and want to talk live with the people behind it, join the Discord server. The OSD team is there along with tons of readers. See you there.
"The cutting edge of state empowerment technology in 2024 is a system that can do the following:
1. Complete government surveillance of all electronic activity within a territory, and real-time 24/7 location data on all cell phones.
2. Use the data from #1 to assign every person in the territory a score.
3. Automatically kill them and their family when their score crosses a threshold.
4. Lower the threshold as desired, to use up a target quantity of munitions."
"AI-powered airborne death robots"
I've personally imagined a system created from the intersection of the following realities:
Massive widespread surveillance and record-keeping
An ever-increasing body of laws and regulations so large and complicated that nobody actually knows its full extent, a la "Three Felonies a Day".
Artificial intelligence systems that look for patterns.
Combine this with the story from OSD 267 regarding identification of gun owners from credit card data.
A cell phone, when called, transmits its presence in response to a "wake-up" (paging) signal from the nearest tower. This happens before the phone even rings. Remember the old days when your stereo would buzz just before your phone rang? That's what was going on.
The result is a terrifying situation that is basically "Show me the man, and I'll show you the crime, and while we're at it, let's just have a drone drop a radio-guided Hellfire on him."
You've been identified as a high probability bad actor based on an algorithmic system that compares your lifetime of activities and social connections against a superprosecutor's understanding of the entirety of International, Federal, State, and local laws and regulations. You're a known gun owner even though there's no registry, so you're a "high risk" target. The system has all of your vital data as well as the ability to track your phone and vehicle, which can generate a real-time location, and feed it to a high endurance drone carrying weaponry that can home in on your phone's signal. Mercifully, the Hellfire is capable of supersonic speed, so you won't even hear it coming. Just "boom".
This can also be fully automated so the conversation can just be: "Kill the guy with the highest score" and the system will do it and GPT will type up a very good press release explaining why.
Welcome to the future.
Person of Interest was a documentary...Change my Mind