Designing Distribution For The Information Age
Is the old guard of the internet wrong? Was it all a huge mistake? I don’t think so, other than a certain stubborn shortsightedness. And there’s still time to make amends.
I do think information still wants to be free. The alternative is much worse — we are seeing the results of censorship, de facto and de jure both, and it’s not a pretty picture. Censorship inherently favors those in power, centralizing command to a smaller and smaller subset.
We must maintain the ability of people to communicate freely; without it, democracy is only an accident, a wart quickly to be removed.
What we need to do is stop handing the information equivalent of a machine gun to those who would do ill.
The mistake of the past — and it took some heavy blinkers to think this way, given that all this growth happened in the context of the free-for-all that was Usenet — was forgetting that misinformation is still information.
Truth has no special privilege.
Truth doesn’t sound different, despite what you might think. A lie and a truth, put behind glass, look the same. Indeed, a juicy lie can be much more powerful than the truth. And we knew this. “A lie is halfway around the world before truth has its boots on” is not an internet-era saying. And yet, we pretended that sharing more would necessarily make the truth louder.
But information that is freely shareable is information easily stripped of its context. Context helps us understand the background, biases, and credibility of a source; but once cut-and-paste was invented (and its cousin, the screenshot), context became easy to discard and hard to recover.
This showed up in relatively harmless ways first — internet “just-so” stories, FOAFs, urban legends, newly reinvigorated by a fluent distribution mechanism. Like the little brother of a friend of mine, who totally died after eating pop-rocks and soda. (Turns out it’s Mentos and soda, but close enough.) Or the guy in my friend’s other gaming group, the one who tried to shoot a gazebo, or the guy who died to protect his pet fox from his staff sergeant. (Oh, wait, that was an ancient Greek urban legend.)
These were distributed as fact, but with a wink and a nod; on the internet, the wink and nod sometimes got discarded, and people took them as truth instead. And sometimes, frustratingly, they changed the world, at least for a while. This, again, was nothing new — look up the “Satanic Panic” of the 80s for a pre-internet example — it was just easier, quicker, and more widespread with email and newsgroups.
The next attack was brigading; and again, this one should have been spotted earlier.
Early Web 2.0 sites were built on the idea that people would come to your site and contribute, as opposed to Web 1.0, where you made a static website for people to discover information and leave. Web 2.0 was based around the idea that there is one truth and many lies, and therefore, like incoherent light not contributing to your reflection in a mirror, the lies would cancel out and the truth would self-reinforce.
And it did, for a while. Wikipedia, one of the few unalloyed vestiges still available, is the result of a concerted effort to find and make truth visible, or at least, sourced information.
But people are clever, and realized that if enough people insist on something, then it becomes seen as truth. Once this was weaponized, we were deeply into the information war we’re in today. And companies — notably Facebook and Google, but many others — were left scrambling to change how they defined truth. (Not truth in the philosophical sense; rather, how their algorithms decide what to promote and what to discard.)
Wikipedia was an early battleground for this; tireless effort by a small host of editors acts as its daily immune system, rejecting lies where possible, reverting vandalism, banning users who attempt to be one-man brigades or who go on pointless crusades or climb the Reichstag dressed as Spider-Man.
Brigading is the simple act of recruiting people, willingly or unwillingly, to re-transmit your message. That’s always been around; what’s changed is how easy it is to recruit new people, how easy it is to dupe them, and how wide an audience they can reach. We’re in the “Black Death” era of the internet right now: a worldwide marketplace of ideas that makes it trivial to spread misinformation to populations not yet inoculated against it.
It’s surprisingly effective. Folks get recruited instantly to spread misinformation, to put their stamp of authority (or at least familiarity) on that information. Your uncle repeated it; it must have some veracity, right? But no.
We haven’t figured out a way to combat it yet. One method may be via the very social graph that binds us; brigaders are often tied together tightly in that graph. If ten people say something and they’re all friends only with each other, that might count as one “vote” — in comparison to ten people, each of whom has a rich, varied, and only slightly overlapping social circle.
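To make the "one vote per cluster" idea concrete, here is a minimal sketch of how such a heuristic might work. Everything here — the function name `effective_votes`, the input shapes, and the rule that one connected cluster of endorsers counts as a single vote — is an assumption for illustration, not any platform's actual algorithm.

```python
# Toy sketch: collapse each tightly-knit cluster of endorsers into one vote.
# All names and the weighting rule are hypothetical, for illustration only.
from collections import defaultdict


def effective_votes(endorsers, friendships):
    """Count votes for a claim, treating each connected cluster of
    mutually-befriended endorsers as a single vote.

    endorsers:   set of user ids who repeated the claim
    friendships: iterable of (a, b) friendship pairs
    """
    # Build adjacency restricted to the endorsers themselves.
    adj = defaultdict(set)
    for a, b in friendships:
        if a in endorsers and b in endorsers:
            adj[a].add(b)
            adj[b].add(a)

    # Each connected component of endorser-to-endorser friendships
    # counts as one "vote": ten mutual friends repeating the same
    # claim carry no more weight than one person.
    seen, votes = set(), 0
    for user in endorsers:
        if user in seen:
            continue
        votes += 1
        stack = [user]
        while stack:  # depth-first walk over this cluster
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            stack.extend(adj[u] - seen)
    return votes


# Ten endorsers who are all friends only with each other -> 1 vote.
clique = set(range(10))
clique_edges = [(i, j) for i in clique for j in clique if i < j]
print(effective_votes(clique, clique_edges))  # 1

# Ten endorsers with no ties among one another -> 10 votes.
print(effective_votes(set(range(10)), []))  # 10
```

A real system would need to be far subtler — brigades can add sock-puppet accounts with fabricated diverse friendships — but the sketch shows why graph structure, not raw head count, is the signal.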
And so we’ve gotten to the point where the medium must maintain the message.
The medium — the distributor of information — is not blameless, is not neutral, cannot be neutral. And the mistaken belief that it could be has led to infestations that could have been designed out, but weren’t.
So here we are; time for us to do better.