OSD 187: The framework of frameworks for content moderation
On September 9, the International Organization for Standardization voted to create an MCC (merchant category code, the four-digit number that categorizes what business a merchant is in) specific to gun shops. On September 10, Visa announced that they’ll adopt that MCC for transactions on their network. (The other card networks are likely to follow close behind.)
That caused concerns about deplatforming, and Visa released a statement on the topic. Titled “Protecting legal commerce”, it’s about as strong a defense of commerce in guns as you could hope for from a company of Visa’s size:
We do not believe private companies should serve as moral arbiters. Asking private companies to decide what legal products or services can or cannot be bought and from what store sets a dangerous precedent. Further, it would be an invasion of consumers’ privacy for banks and payment networks to know each of our most personal purchasing habits. Visa is firmly against this.
As we do when ISO creates a new merchant code, Visa adopts the standards that apply to our industry. For us, that means working with our financial institution clients to enable them to implement this new MCC when ISO makes it available.
A fundamental principle for Visa is protecting all legal commerce throughout our network and around the world and upholding the privacy of cardholders who choose to use Visa. That has always been our commitment, and it will not change with ISO’s decision. Our rules require financial institutions involved in transactions to evaluate and process all legal transactions. Our network does not allow any financial institution member to deny transactions for the purchase of legal goods or services based on which MCC they fall under.
Visa provides our services to everyone, everywhere, so long as they are used for legal purchases. We believe that is the appropriate standard.
A few years ago, Ben Thompson wrote “A Framework for Moderation”, where he laid out the idea (not original to him, but he summed it up well) that the deeper in the stack you go, the more the standard should simply be “anything that’s legal is allowed”.
In this framework, at the top of the stack you’ve got, say, sites generating their own content, which obviously exercise a lot of editorial control. At the bottom of the stack you’ve got DNS, where the only editorial decision is “Is it legal?” And everything else goes in between.
This is useful, but incomplete.
First, most glaringly, user-generated content platforms don’t map cleanly to it. They’re kind of top-of-the-stack, since they’re end-user-facing. But as platforms, they’re also a kind of infrastructure.
Second, as soon as you deviate from an “is it legal?” standard, you’re making judgement calls. That’s not necessarily bad. All real-world systems require judgement calls. It just means that you can’t rely on different actors to reach consistent conclusions.
But there’s a funny thing about that: different actors do reach consistent conclusions. YouTube, Facebook, and Twitter have functionally indistinguishable policies (in both opacity and outcomes) on gun content. For some reason, their one-off judgement calls tend to all line up.
The reason is that these calls aren’t made in a vacuum. Companies talk to each other, and more importantly they are all subject to the same external pressures. That’s a natural part of being a company. People of all stripes tell you what they want you to do, and you decide what to do with that input. But they will couch their input in terms of what you have said is important. You say you’ll only allow legal content? People will litigate what’s legal. You say you don’t allow promotion of violence? People will debate the borders of that definition. You don’t allow misinformation? Get ready for rules-lawyering about what that means. And so on.
That dynamic highlights something that goes undiscussed. It seems like the big decisions in content moderation are the individual judgement calls. But that’s too late to have much of an effect. The actual big decision is which framework to use in the first place. Legality? Misinformation? Good taste? That foundational decision pretty much sets all the Schelling points for you, because it defines the Overton window. By the time you get to individual judgement calls, you’re just tidying up.
By making legality their standard, Visa essentially says they won’t entertain a deplatforming discussion unless it’s centered on showing that a particular item is illegal. This won’t be the end of the story — for example, the incoming ATF rule about pistol braces is going to spark a debate about whether braces are “officially” illegal enough to be cause for a merchant getting kicked off the card networks — but it’s not too shabby for a company the size of Visa. And it sets the turf on which the discussion has to take place.
Legality is a permissive standard, and one that generally works well for gun rights. Visa deserves some kudos for setting this standard proactively, and we should encourage more infra providers to do the same.
This week’s links
OSD Range Day in San Jose this Saturday, September 24
This is an all-day pistol course, and we’ve got two spots left. Reply to this email for details!
Isaac from T.Rex with a hands-on history of gun technology.
Prof. Yamane teaches a “Sociology of Guns” course at Wake Forest every year, and he takes the students to the range as part of it. Always interesting to read their writeups.
OSD office hours
If you’re a new gun owner, thinking about becoming one, or know someone who is, come to OSD office hours. It’s a free 30-minute video call with an OSD team member to ask any and all your questions.
Like what we’re doing? You can support us at the link below.
And the best kind of support is to rock some merch and spread the word. Top-quality hats, t-shirts, and patches.
Thanks for reading Open Source Defense! Subscribe to get a new post every Monday.