In some ways they’re right. Pushing back against falsehood is not the same thing as being the source of that falsehood. None of these tweeters and journalists intend to pollute. The information ecology, however, doesn’t give a shit about anyone’s intentions. What matters most is consequence. And the consequence of those retweets, litanies, and articles is to spread the pollution further.
Some will resist this idea. Some will think, “But my audience understands that RTs aren’t endorsements, that the theories aren’t true, and that I’m just fact-checking! Anyway people need to know what’s happening—otherwise how can we start pushing back?” Hand to heart, I hear you. I’m sure the bulk of your audience does understand. I also agree that people need to know what’s happening; we can’t have a functioning democracy otherwise. And yes, 100 percent, we can’t resist things that haven’t been named.
In a perfect world, the conversation could end there. But ours is not a perfect world. Basing our decisions on the networks we wish we had, rather than the ones we actually do have, won’t get us any closer to the shared goal of a less-terrible internet.
Within these networks, the very notion of an intended audience disintegrates: the result of context collapse (the unpredictable commingling of audiences), Poe’s Law (the difficulty of determining meaning online), and an attention economy that incentivizes the fastest possible spread of the most possible information. We can talk about “our audience” all we want, of course, and tailor our messages to what we think they need to hear. But we often have no way of knowing how other, wholly unintended audiences will react to the things we post. In response to our messages, retweets, invectives and, yes, WIRED articles, maybe some will respond exactly as we hope. But others could be further emboldened because they’ve triggered a lib, lol, and that means they must be onto something. Still others could begin to wonder if there’s any truth to the idea—“I mean, Graham is a U.S. senator; don’t they have access to top-secret information?”—and start Googling, in the process encountering even worse and more misleading information.
These are just a few possibilities; there are so many network variables, we often can’t even know what we don’t know about our unintended audiences. Once we publish or send or retweet, our messages are no longer ours; in an instant, they can ricochet far beyond our own horizons, with profound risks to the environment. At least potentially. On the other hand, if we only published or sent or retweeted things when we knew with absolute certainty what would happen next, we’d never say anything. That would be bad!
Maybe we’ll never know exactly where our posts and our retweets will travel. We can reflect, however, on the kinds of outcomes to avoid—the easy victories for chaos agents. Far-right operatives, for example, need and want liberals to help spread their messages; they rely on the mainstream majority for signal-boosting. This was true in Iowa, and it will be true throughout the 2020 election. The question then becomes: What messages work least well as free publicity for them?
There are no one-size-fits-all solutions here; that’s not how things work in a complex ecosystem. Nor is it practical to demand zero pollution. When pollution is avoidable, it should be avoided: certain things simply don’t need saying. That’s particularly true when someone is posting a response for the sake of responding, or when they don’t realize that the very concept of joking online belongs in scare quotes. Other times, there’s no way to avoid the sludge; certain things do need saying, because the messages add context or nuance or moral clarity, even as they help publicize the source-pollution. That pollution might be unavoidable, but you can reduce the runoff by considering two separate waste sites: the spot where you’re standing, and the areas downstream. What pollution might your message mitigate, and for whom? How does that compare to the pollution it might generate? Whose bodies might be nourished, and whose might be poisoned? Are those costs worth the benefits?