This New Yorker piece from Sheon Han mourning the end of Twitter is, well, it’s something. Look, I don’t begrudge an ex-employee sharing such an idealized vision of what his old company was. Certainly not considering the circumstances. But that rose-colored perspective hints at the fundamental problem with not just Twitter but the Web 2.0 principles that underlie it, a problem Han’s affection seemingly prevents him from understanding. Elon Musk is now a problem with Twitter, but even had the company been bought by some benevolent philosopher king, the core dynamics of a constant mass-broadcasting service for everyone would be deeply unhealthy. “Everybody in the same room all the time” social networking is problematic on its face and uniquely pernicious to journalism and media. If Twitter as we knew it really is over, this is a great time for the industry to finally grapple with what it has wrought.

In fairness to Han and others, it’s not Twitter itself; in a different universe, some other social network that exposes everyone to each other’s opinions all the time might have wrecked media. In this one, it was Twitter. But the basic concept itself is inimical to what journalism and commentary have to do, no matter the name of the network, who owns it, or what its terms of service are. Forcing everyone into one discursive space inevitably creates peer pressure and conformity, in an industry where independence of thought has always been a treasured virtue.
Very early in Twitter’s rise, it became a matter of holy writ that writers, journalists, editors, pundits, and everyone else involved in the reporting and opinionating business had to tweet. Even when I started writing for a public audience in 2008, when the service was only a couple of years old and the blogosphere was still puttering along, the idea that you simply had to be on Twitter was already growing, and within a few years it became inescapable. If you didn’t tweet, as a writer, you didn’t exist. This wasn’t just a media phenomenon, either. I went through grad school in the early 2010s, and even then you heard that Twitter was becoming mandatory for academics; professors at conferences would frequently ask you for your “at.” The message was relentless: if you weren’t on Twitter, you weren’t part of The Conversation, and you couldn’t get your work out there, regardless of what kind of work it was. (Hence the rise of, for example, Twitter realtors and Twitter life coaches and Twitter nutritionists.) And, indeed, everybody being in the same room together all the time did make it easy to share your work, and there was a chance it would be amplified and echoed across the reach of the network. There are great pieces I found on Twitter that I probably would never have read otherwise. But I’m afraid almost every other consequence was a bad one.