23 Comments

I was so engrossed in the story of the stick that I forgot this was supposed to be a book review and by the time the review came around I was disappointed because I only wanted more about the stick.

Please, please tell me that there is more visual documentation of the stick, and please share if so.

Maybe after work I will get back to the actual book review, but right now I'm just delighted by the spontaneous roadside-shrine-ness of the thing, before it was destroyed by senseless philistinism.

I loved the story about the stick.

I can't even read beyond the stick story, yet. I need to sit with it for a while. The book review itself can come later.

FdB is modeling how to do a great book review here, guys: pull 'em into a micro-moment, then show them how it's related to the whole world. Anyway, the stick story reminded me of this, which I use for half a class period in AP Literature to connect to The Scarlet Letter:

Sticks

by George Saunders

Every year Thanksgiving night we flocked out behind Dad as he dragged the Santa suit to the road and draped it over a kind of crucifix he'd built out of a metal pole in the yard. Super Bowl week the pole was dressed in a jersey and Rod's helmet and Rod had to clear it with Dad if he wanted to take the helmet off. On the Fourth of July the pole was Uncle Sam, on Veteran’s Day a soldier, on Halloween a ghost. The pole was Dad's only concession to glee. We were allowed a single Crayola from the box at a time. One Christmas Eve he shrieked at Kimmie for wasting an apple slice. He hovered over us as we poured ketchup saying: good enough good enough good enough. Birthday parties consisted of cupcakes, no ice cream. The first time I brought a date over she said: what's with your dad and that pole? and I sat there blinking.

We left home, married, had children of our own, found the seeds of meanness blooming also within us. Dad began dressing the pole with more complexity and less discernible logic. He draped some kind of fur over it on Groundhog Day and lugged out a floodlight to ensure a shadow. When an earthquake struck Chile he lay the pole on its side and spray painted a rift in the earth. Mom died and he dressed the pole as Death and hung from the crossbar photos of Mom as a baby. We'd stop by and find odd talismans from his youth arranged around the base: army medals, theater tickets, old sweatshirts, tubes of Mom's makeup. One autumn he painted the pole bright yellow. He covered it with cotton swabs that winter for warmth and provided offspring by hammering in six crossed sticks around the yard. He ran lengths of string between the pole and the sticks, and taped to the string letters of apology, admissions of error, pleas for understanding, all written in a frantic hand on index cards. He painted a sign saying LOVE and hung it from the pole and another that said FORGIVE? and then he died in the hall with the radio on and we sold the house to a young couple who yanked out the pole and the sticks and left them by the road on garbage day.

"And yet I recognize the confusion and hypocrisy of the situation"

->leather jacket and perfectly tousled hair all the way down<-

Agree with Freddie on how annoying memes are.

Years ago I volunteered to help build a small community garden spot, with +/- 30 raised beds, lined up, each 10'x10'. We built a shed, put up a fence, spread cardboard then woodchips. That garden bed was remarkably productive, with full sunshine, new compost, and unlimited watering available. Once the major tasks were accomplished, our group quickly devolved into micromanagement, as we created bylaws, which banned things like gambling in the community garden. Also no running children. There were long debates about how to make sure our composted manure was locally sourced, and how to prove it. We spent dozens of hours building handicap accessible beds that were never, ever used. I would frequently get emails about my plot that started with "you really ought to consider..." I was super busy with a new job and little kids, so for a few years I just planted 100 square feet of fingerling potatoes, which I would weed once every two weeks and mound up the rows as the plants grew.

I was breaking the unwritten rules, the need to have a picturesque plot ready for social media photos. There was a shared aesthetic that my ugly garden was ruining. Eventually we moved to a new house with a south facing yard so I no longer needed the community plot. "To break the rules effectively, you have to do so well" is a great sentence. To this day I regret not renting the plot for one more summer, rolling out perfectly smooth turf, and making a putting green that would last until the bylaws were updated.

For some people, you built an outlet for their gardening hobby, for some an outlet for their advocacy hobby, and for others, for their management hobby.

Thank you for this fascinating piece. The imagery of your story about the stick, uh, stuck with me through the whole review, making me appreciate the book review itself on a deeper level than I think I would have without that anecdote.

Lots to think about here. Thanks again.

I'd like to pick on one particular part of this review (and, it seems, the underlying book). Freddie writes:

> Core to Chayka’s point is the obvious but essential fact that these algorithms can do little else but recommend to us that which other people have already clicked on, suggesting an implicit (and deeply ideological) assumption that we always want more of what we’ve gotten before. If it’s true that our culture can create nothing new, maybe it’s because everyone is constantly fed a steady diet of stuff that’s like the other stuff that was like the stuff that came before it. “Under algorithmic feeds, the popular becomes more popular, and the obscure becomes even less visible,” Chayka writes. The fact that code is forever telling me that I’ll like something because I liked something else that’s superficially similar is deeply insulting.

I am pretty sure the word "assumption" in the first sentence is wrong. Tech companies do not "assume" that you're more likely to look at similar content than dissimilar content; they have proven it over and over again across the past few decades. If you came up with an algorithm that reliably got more clicks and views than the current ones by showing people fresh, challenging work, companies would absolutely switch over.

I think this mirrors a point Freddie makes about standardized tests: it's not that opponents of the SAT actually dislike the tests, it's that they dislike what the tests tell them, and then they deflect that feeling onto the test itself. Unfortunately, I think it's very likely that most people genuinely prefer to look at stuff that is similar to what their friends looked at last week. That might be disappointing from a social, creative and aesthetic point of view, but it's not Instagram's fault.

In fact, recommendation systems don't necessarily even know anything about the content they're recommending: they may end up recommending content that's "superficially similar" to what you've liked in the past, not because that was the intent of their designers, but simply as a consequence of recommending things that are popular with other people whose tastes seem similar to yours.

Imagine a record store, where the guy behind the counter offers recommendations: you tell him what some of your favorite albums are, and he recommends other albums you should try. Now, he *could* be an uber-hipster who knows everything about music, and when he hears your list of favorites, he deduces what you like about those albums ("female vocalist, uptempo, cathartic bridges"), and then thinks of other albums that have the same qualities... but he doesn't have to be! If he's worked there long enough, and has a good enough memory, he could recommend albums simply based on the preferences he's heard from other customers ("people who say they like X also say they like Y").

That's the uncomfortable truth they reveal: that most people's tastes are mostly consistent, both internally (if you like five albums with these qualities, you'll probably like this sixth album with the same qualities) and externally (if you're a Person Who Likes X, you'll probably have similar tastes to other People Who Like X).
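To make the record-store version concrete, here's a minimal Python sketch of that "people who say they like X also say they like Y" logic. All of the names and albums are invented, and real systems do this at vastly larger scale with fancier math, but the point stands: nothing in the code ever looks at the music itself.

```python
from collections import Counter

# Toy "customer preferences": each person names a few favorite albums.
# Every person and title here is made up for illustration.
favorites = {
    "alice": {"Album A", "Album B", "Album C"},
    "bob":   {"Album A", "Album B", "Album D"},
    "carol": {"Album B", "Album C", "Album E"},
    "dave":  {"Album A", "Album D"},
}

def recommend(my_favorites, all_favorites, top_n=3):
    """Suggest albums that show up most often among people with overlapping taste."""
    scores = Counter()
    for person, theirs in all_favorites.items():
        overlap = len(my_favorites & theirs)
        if overlap == 0:
            continue  # no shared favorites, so this person's taste tells us nothing
        for album in theirs - my_favorites:
            scores[album] += overlap  # weight by how similar our lists look
    return [album for album, _ in scores.most_common(top_n)]

print(recommend({"Album A", "Album B"}, favorites))
# something like ['Album C', 'Album D', 'Album E'] --
# whatever people who share my picks already liked
```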

However... there's still a problem, and it has to do with how these systems try to gauge what your preferences are. Most of the time, they aren't just using an explicit list of favorites you've given them; they're trying to deduce what you like from your actions.

Several years ago, I spent a few months hooked on a couple mobile games. Any "objective" measurement of my gaming preferences would conclude that I liked those games, and recommend me more of them - and I might have played the games it recommended. But they were terrible games! Playing them felt like a chore. It just happened to be a chore that I did reliably, until I finally freed myself by deleting them.

Similarly, when a recommendation system serves you content based on what it thinks your revealed preferences are, it can end up recommending content that you'll diligently consume (in a way that looks like success to the system), even if you don't come away from the experience feeling like it was a good use of your time.

In other words, the systems optimize for some goal like "getting clicks" or "getting views" or "getting watch time" or "getting positive sentiment in comments", which their creators might assume is aligned with enjoyment. But over time, they become unaligned, especially as content creators learn how to target them directly (clickbait, engagement bait, etc.).
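For what it's worth, that proxy-metric problem is easy to see in a toy example: if the ranker can only observe engagement, the compulsive-chore item wins even when it's the one thing the user regretted. The titles and numbers below are purely made up.

```python
# Toy catalog: each item has an engagement signal the system can observe
# and a hidden "was this actually worthwhile?" signal that it can't.
items = [
    {"title": "Compulsive mobile game", "minutes_engaged": 540, "felt_worthwhile": False},
    {"title": "Thoughtful documentary", "minutes_engaged": 90,  "felt_worthwhile": True},
    {"title": "Challenging new album",  "minutes_engaged": 45,  "felt_worthwhile": True},
]

# The ranker optimizes the proxy it can measure...
ranked = sorted(items, key=lambda item: item["minutes_engaged"], reverse=True)

for item in ranked:
    print(f'{item["title"]}: {item["minutes_engaged"]} min')
# The game tops the list even though it's the one thing the user regretted:
# the proxy (engagement) has come apart from the goal (enjoyment).
```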

But the "assumption" the algorithms provide is that we "always" want more of what we've gotten - its so obvious as to be useless to note that isn't true 100 percent of the time for 100 percent of the people, yet the algorithms by design know how to provide nothing else.

Something that I've noticed is that since covid, the algorithms have been really bad at suggesting to me new music I would like. Basically The Last Dinner Party is the only new artist that's been suggested to me since 2020 that I've liked. Even before then, the algorithms spent decades telling me I should like Jane's Addiction, which I absolutely hate.

1.) The only meme I can think of off the top of my head that didn't get incredibly irritating almost immediately was the Bernie mittens one. That was nice. Might be biased, though.

2.) I bought Elites and liked it, but in regards to the organic people power thing you mention, it felt a little difficult to recommend with the title and, to a lesser extent, the chapter names. Saying "How the ah, Elites, ate the, ah, social justice movement" out loud at the granola bookstore when I picked it up made me wince a bit.

not my decision I'm afraid

They didn't let you pick the title??

oh you sweet summer child

I don't understand the point of these kinds of books. The degree to which algorithms have affected the built environment and the arts is an empirical question. You can string together as many anecdotes as you like with as much beautiful language as you like, but you aren't proving anything. All you're doing is coming up with a theory that you never have any intention of testing and that nobody else will ever test. It might as well be fiction, and if I'm going to read a piece of fiction, I can think of a lot of better options.

If the sales of these books are underwhelming, it's probably because nobody finds this kind of humanities-mysticism speculation all that compelling, or at least the only people who do are those who have been acculturated into this way of thinking via their undergraduate majors, and their numbers are dwindling.

How would you design a test for this thesis that would accurately confirm causality?

Who cares if it can be tested? Most philosophical inquiries couldn't be tested until hundreds or thousands of years after they were framed. Does that mean they weren't worth writing? Newton's laws were retained despite being contradicted for decades by the motions of the perihelion of Mercury and the perigee of the Moon, and the theory of relativity later showed further holes in Newton's theory, yet we don't stop using Newtonian formulas, because they work. Democritus largely got the general concept of atoms right even though it would not be testable for well over a thousand years. This pedantic focus on falsifiability as some necessary precondition for whether anything is worth discussing is something I never got. Even for a lot of things that science later proved or falsified, the theories created before falsifiability was available guided later scientists in developing their frameworks and hypotheses.

Another example: Bertrand Russell once wrote, “One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision.” That quote inspired Dunning and Kruger to do their famous peer-reviewed study. This happens often: a lot of the things scientists eventually test and prove were influential ideas floating around before they could be tested.

I've actually been reading Filterworld myself and think I've come away from it a little more down on it than even you are. I've been a physical-media die-hard my whole life, and constantly rolled my eyes as my friends got iPhones and iPods all throughout the last twenty years. Even if the anecdotes and observations he's made are often spot on, it's hard to gel with the book when the author admits he willingly threw out his CD collection and embraced Uber and AirBNB as they came out.

Or, to put it as a meme so you can better enjoy it, Freddie:

Me sowing: this rules

Me reaping: this sucks!

(Quick aside: was Amazon’s grunt-work gig site for human identification tasks, called Mechanical Turk, mentioned in the book? Seems like a weird omission if not.)

Initially I was puzzled by your desire to tear down the gauche addition to The Stick, since you say earlier that part of the creation of art is being willing to fail, but you recognize it later, and sometimes what’s initially hypocrisy becomes an invitation. I think the end of your review makes this apt for Chayka as well.

I heard him give an interview with Willa Paskin on the podcast Decoder Ring, which delves into weird aspects of pop culture, and those stories are also rife with “path dependency”.

This concept of cultural collapse reminded me of Retromania: https://www.theguardian.com/books/2011/may/29/retromania-simon-reynolds-review Note the year: 2011, just before Chayka dates contemporary algorithmic curating. Is this the same kind of “sameiness” lamented by Chayka? If everyone can explore the last century of culture, when would anyone have the time to make something new?

(I did just see a theater performance I loved that was new to me: Message In A Bottle. Yes, it’s Sting’s music (though I hope Summers and Copeland get a cut as well), so “derivative”, but entirely in dance, about a family that become refugees, and their losses, and attempts to start over elsewhere. That description doesn’t do it justice; the show works, it just does.)

Slightly tech take, to help clarify things: as others have pointed out, what’s lamented here isn’t really “tech companies” “dictating” paths, it’s people deciding what they want to look at, at such a large scale that it feels like something else. Ironically, there *have* been times when companies have put their thumbs on the scales (“pivot to video”), but these don’t seem to be mentioned? Despite those moments of deliberate meddling, the question of aesthetics is more likely to revolve around the more typical case of “giving the people what they want”. Jon Stokes did a good job of showing a little of what’s behind the curtain, in a slightly techy way, here: https://www.jonstokes.com/p/googles-colosseum

The problem, as Stokes presents it, is that all of these systems have resource constraints, so you have to cut bait at some point and get “close enough”. This will deemphasize edge cases, where the “rule breaking” usually happens.

From an end user standpoint, the less you let the system “know” about you, the more generic the recommendations. I’m personally OK with that, and am happy to ignore the umpteenth ad for the same thing because I looked at it on Amazon the day before. This does mean that it’s more likely that everyone else’s taste will be recommended to you well before your revealed preferences get taken into account.
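As a rough sketch of what “generic” means here: with no history to go on, a recommender of the kind sketched earlier in this thread can really only fall back on whatever is globally popular, so everyone with a thin profile gets roughly the same list. The data below is again invented for illustration.

```python
from collections import Counter

# Same invented toy data as the earlier sketch.
favorites = {
    "alice": {"Album A", "Album B", "Album C"},
    "bob":   {"Album A", "Album B", "Album D"},
    "carol": {"Album B", "Album C", "Album E"},
    "dave":  {"Album A", "Album D"},
}

def recommend_cold_start(my_favorites, all_favorites, top_n=3):
    """With no history, the only safe bet is whatever is popular with everyone."""
    if not my_favorites:
        popularity = Counter(
            album for prefs in all_favorites.values() for album in prefs
        )
        return [album for album, _ in popularity.most_common(top_n)]
    # ...otherwise, hand off to taste-matching logic like the earlier sketch.
    return []

print(recommend_cold_start(set(), favorites))
# something like ['Album A', 'Album B', 'Album C'] -- everyone gets the same list
```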

I think from an individual perspective, you treat these systems as tools and do more exploration with intent (search for who you want to read; keep trying to expand your world). However, on a mass scale I don’t know what the answer is; go out more?
