221 Comments

I get it, but it strikes me as a teen thing. You are trying to define yourself as you climb out of the burrow and look around. I climbed out of the burrow in the late 70s and was horrified esthetically, and latched onto punk rock like it was a life preserver. Yes! Whew! I’m in! Finally! Sign me up!

And what you latch onto when you are 14 has a lasting imprint, but ideally you keep looking around you and figuring things out for yourself. Ideally!


The UK census has just revealed that the share of those who identify themselves as Christian has dropped below 50% for the first time.

I don't think anything you describe is necessarily a new phenomenon, although it's illuminatingly described. Only the consumerist objects or the political and sexual identities are new.

When it was religion, these people would be called pious. There were people who made religion their identity to greater and lesser degrees, but it was those who were most overtly devoted who held their heads highest in church.

Clout chasing seems to me a part of human nature going back ages.


One of my favorite things you've written


Some thoughts:

"...as a species we are famously unaware of ourselves ..."

I agree that many people's self-perception does not always match others' perception of them, but as a species? It seems more likely that we are the most self-aware of any species. I mean, certainly my teenager is much more acutely aware that she is unbelievably lazy than my cat is, despite the fact that they are equally unproductive.

"The question I come back to is, if you stripped away my Marxism, if you left aside my identity as a writer, if you looked past my pretenses to being a dissident, if you ignored my mental illness, who would I be?"

The crankiness. You'd be left with the crankiness.


The word "loathe" in here should be "loath". Great piece, loved it.


<< ...The world is a fabric we weave daily on the great looms of information, discussions, films, books, gossip, little anecdotes. Today the purview of these looms is enormous—thanks to the internet, almost everyone can take part in the process, taking responsibility and not, lovingly and hatefully, for better and for worse. When this story changes, so does the world. In this sense, the world is made of words.

How we think about the world and—perhaps even more importantly—how we narrate it have a massive significance, therefore. A thing that happens and is not told ceases to exist and perishes. This is a fact well known to not only historians, but also (and perhaps above all) to every stripe of politician and tyrant. He who has and weaves the story is in charge.

Today our problem lies—it seems—in the fact that we do not yet have ready narratives not only for the future, but even for a concrete now, for the ultra-rapid transformations of today’s world. We lack the language, we lack the points of view, the metaphors, the myths and new fables. Yet we do see frequent attempts to harness rusty, anachronistic narratives that cannot fit the future to imaginaries of the future, no doubt on the assumption that an old something is better than a new nothing, or trying in this way to deal with the limitations of our own horizons. In a word, we lack new ways of telling the story of the world.

We live in a reality of polyphonic first-person narratives, and we are met from all sides with polyphonic noise. What I mean by first-person is the kind of tale that narrowly orbits the self of a teller who more or less directly just writes about herself and through herself. We have determined that this type of individualized point of view, this voice from the self, is the most natural, human and honest, even if it does abstain from a broader perspective. Narrating in the first person, so conceived, is weaving an absolutely unique pattern, the only one of its kind; it is having a sense of autonomy as an individual, being aware of yourself and your fate. Yet it also means building an opposition between the self and the world, and that opposition can be alienating at times.

I think that first-person narration is very characteristic of contemporary optics, in which the individual performs the role of subjective center of the world. Western civilization is to a great extent founded and reliant upon that very discovery of the self, which makes up one of our most important measures of reality. Here man is the lead actor, and his judgment—although it is one among many—is always taken seriously. Stories woven in first person appear to be among the greatest discoveries of human civilization; they are read with reverence, bestowed full confidence. This type of story, when we see the world through the eyes of some self that is unlike any other, builds a special bond with the narrator, who asks his listener to put himself in his unique position... >> ~~ from Olga Tokarczuk's Nobel Lecture: The Tender Narrator (2019)


"...the poets down here don't write nothin' at all, they just stand back and let it all be..." ― The Boss, "Jungleland" (1975)


I hope we are not the mere sum of our self-selected memberships and fandoms.


Very related, I think. It’s one thing to have internal dialogue and introspection; it’s quite another to do actual self-definition. I think that is actually at the heart of the post today - these superficial identity things just exist but are not a coherent or intentional statement on “identity” because that level of defining has not been done.


Excellent piece. For a lot of people, their Twitter feed is who they are. So now that we have Elon running things, they want to leave but can't. And deBoer's description of what happens when you become captive to that "one thing" is better than anything I could write, that's for sure.


'On Cinema' is one of the most in-depth explorations of this concept, I'd wager. However, Freddie, you'd probably not like it, for reasons similar to why you were a bit nonplussed by Nathan Fielder's stuff.


As a 45-year-old gay man myself, I can say there are plenty of gays who never grew out of the Super Gay phase and they are insufferable humans.


To engage with the actual text: good post, I think it's a nice roundup capstone tying-together of various themes you expound on frequently. The "Your Identity Doesn't Work 2.0", to borrow titling from the education series. I'm reminded of the similar classic arguments made by Paul Graham*: be careful adding on externally-defined identity modules, because it's easy to lose the true You beneath all the precut pieces. You don't wanna become *overly* predictable, cause then you're just a tower of assumptions, and no one likes a caricature! The whole game is to differentiate each of us as individuals, and while obvious signifiers can aid in comprehensibility and first-level filtering, there has to be some actual core there, there. A composite personality is, ultimately, less than the sum of its brands...since it tells you the important fact of, this is a person too immature/insecure to Do The Work of building their own brand. (Yeah, I don't like the corporatization of identity either, but those particular metaphors of self-as-brand are easily understood by <s>capitalists</s> people experiencing capitalism. Comparative advantage, niche-finding, specialization.)

The Super Gay thing...yeah, haha, lived through that a few times. For a while it's all ****s and rainbows, Pride this, straight-bashing that. (Or the equivalent for The Forbidden Topic.) And then some point closer to 30, everyone realizes they're grown-ass adults, and conversations turn from Teh Fashions and Cishet Oppression to...paying the bills, how's the bullshit job, wow how about those Chargers last night. I think this is the charitable interpretation of social conservatives painting gayness as an "alternative lifestyle": yes, it's a big deal and a major component of one's life direction, but so much of quotidian living has at best specious relation to one's amorous inclinations. Not mastering those basics leads to a world of pain, and no amount of protected-class largesse can fill that hole, either. Also part of why I don't make those communities a centerpiece of my social circle anymore: they fulfill an important role when first coming out, but ultimately tend to feel stifling. As you note with Macs, certain aesthetic tastes and hobbies tend to become strongly associated, which can end up limiting if one's actual interests lie outside those favoured by the community. That's the thing I like most about normies: they're incredibly, incredibly diverse! In a more holistic way, rather than the narrow axes we usually define that spectrum...

I do wonder if there's any meat on the bone of the (admittedly conspiratorial) take that capitalism doesn't just try to sell you Hole Fillers, but goes out of its way to drill more holes in the first place, making it harder and harder to live in an authentically load-bearing way. Certainly there are plenty of indignities associated with capitalism, but somehow I suspect people living under communism or socialism or whatever have suspiciously similar troubles. Being human is just hard, and offloading that struggle to "because socioeconomics" or whatever other meta-condition is just another way of kicking the can down the road. I think. The seductive appeal of Systemic Forces arguments is that they absolve the individual of any personal responsibility, you know? But nothing can be you but you, like your subtitle says.

*http://paulgraham.com/identity.html


As a veteran of the Mac-PC Wars (from the Mac side), I think I can provide context as to why it cooled.

Many don't remember that the Mac was on death's door throughout the '90s, and its future was uncertain even after the return of Jobs and the success of the iMac. Depending on whose numbers you believed, market share was as low as 3%, and certainly not above 10%; and the vast majority of software (minus some specialized AV and graphic design stuff) was Windows exclusive. This was a terrible chicken-and-egg problem: many were nervous to buy Macs because they didn't run most software, while software vendors didn't bother porting to Mac because the market was too small.

So there's a sort of "material class analysis" angle, in addition to "product id-pol". Beneath all of the bluster and smug superiority about the Mac's virtues (some of it defensible, some of it not) was a desperation, a fight for legitimacy and the right to (digitally) exist, with the dreaded alternative being a Microsoft hegemony over all computing, seemingly forever. (Nevermind that Apple would later show its true colors by pursuing its own quasi-monopoly, in ways far more egregious IMO.)

So what changed? Sure, people did move on to other forms of product-based identity, and/or political tribalism. But there were also exogenous changes to computing itself; the biggest being the Cambrian explosion of the web, which made native software less and less important. Even before web apps became viable replacements, knowing that your expensive candy-coated appliance could always be useful to browse websites, if nothing else, mitigated that fear of investing in a dead platform. After more time and iteration, it became completely irrelevant whether Mac had native Microsoft Office (a huge concern for a long time, and a pivotal point in the '97 deal between Apple and MS), because Google Docs could do everything most users would ever need.

The other change was behind the scenes, hidden from consumers, but important all the same: it got easier and easier to write multi-platform software. A myriad of tools and frameworks proliferated, which lowered costs (the two big ones now being Electron for apps, and Unity for games), changing the math for many software vendors (aided by the still-low, but significantly better, Mac market share numbers). Apple also met them halfway in its OS switch to UNIX underpinnings, and a chip switch to Intel, both of which helped make Mac ports more viable.
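
To make the Electron point concrete, here is a minimal sketch (not from the original comment; the window size and URL are placeholders) of why such frameworks lowered porting costs: the same small TypeScript entry file is packaged unchanged for macOS, Windows, and Linux, with Electron supplying the platform-specific windowing and event loop.

```typescript
// main.ts — a minimal Electron entry point, shared across all desktop platforms.
import { app, BrowserWindow } from "electron";

function createWindow(): void {
  // One window definition covers every platform Electron targets.
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadURL("https://example.com"); // placeholder; real apps usually load a bundled HTML file
}

app.whenReady().then(createWindow);

// Quit when all windows are closed, except on macOS, where apps
// conventionally stay active until the user quits explicitly.
app.on("window-all-closed", () => {
  if (process.platform !== "darwin") app.quit();
});
```

The Unity side is analogous: one project in the editor, multiple build targets, so shipping a Mac version stops being a separate engineering effort.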

Everything in this piece stands; but the Mac-PC tension had an extra level of intensity for reasons beyond Ford-vs-Chevy, PS-vs-Xbox, which shiny consumer good represents your personality. A (somewhat) pluralist computing ecosystem was largely a technological achievement, and one that's easy to now take for granted.


>(Seriously, you have to try pretty hard to get a virus these days.)

Not that long ago I ran into one of these True Believers in the wild...she was at the bank, refusing to touch any tablets (because they transmit viruses, so insecure, steals your info), and insisted on being able to utilize the bank's own desktops to access her online banking account. Then the subject of 2FA security came up, and she started into the whole anti-Android evangelism spiel (because they transmit viruses, so insecure, steals your info). Everyone should get an iPhone! The only Truly Safe Device. Why, I get my identity stolen multiple times a year, I don't know what I'd do without LifeLock(tm) and Apple tech support to help me out and reboot my devices...the irony, of course, being that this particularly gullible woman had virus issues far above the median rate for typical Windows/Android users, and Apple had nothing to do with it. PEBKAC, as we used to say. Was a real throwback experience...

>For some, being into Apple products really was like a religion, a defining element of the self[1].

"We're calling it iPhone S, for Same!" This was the footnote or link I was expecting, from way back in Those Bad Old Days: https://www.digitaltrends.com/computing/apple-causes-religious-reaction-in-brains-of-fans-say-neuroscientists/


Take it from an old guy: it doesn’t get better unless you work at it.

“I don't like getting old. I hate it, in fact. I don't know an honest person who likes it. You just thin out and all your energies go toward surviving or moving safely from one room to another. But the mind thrives, thank God. Or mine does. I used to try very hard not to regret. I thought that regrets were a waste of time, a sign of weakness. I think only the most insensitive of people have no regrets, because in this time, this slower time, your mind goes back to so many instances when there should have been more kindness, more attention paid to others. I missed so many opportunities to be a better friend, a better mother, a better actress. Of course I can't remember now what I was in such a hurry to get to that I grew so bad at the important things. So I regret and I think. Old age is the big index to the foolish young people we were.”— (Deborah Kerr/Interview with James Grissom.)
