I'm always surprised when I hear that people look forward to driverless cars. I love long drives. They're relaxing. If I'm alone I get to listen to music or podcasts; if I'm with my family we get to chat without distractions.
You know when you get your driverless car it's going to end up being just another satellite office, right? You'll answer emails, give presentations, all the same invasive shit that most tech advances end up being.
I remember when I was a kid in the eighties the big selling point of all the fancy new office tech was how much easier it was going to make clerical work. With affordable computers, printers, photocopiers, fax machines, etc., secretaries would be able to get their whole work day done in three hours, and then go home! They printed articles on that in like Time Magazine and the NY Times. I remember it distinctly because it was the first time I ever thought "what I'm reading is nonsense. Nobody is going to pay a secretary for eight hours of work if she only works three. They're just going to give her more work."
The utopian/dystopian conversation is fun, but I don't believe either is coming, so I never find it useful. When we start to figure out ways to give people TIME back - when we work less and spend more time with our family, friends and community without worrying that it's costing us our career or costing us too much money - then we're actually advancing. The rest of this stuff, whether it's driverless cars or refrigerators that order my snow boots or whatever the fuck, is, always, always, going to free us up to work more so that we can afford more stuff that frees us up to work more.
Hilariously the only bright spot on this topic that I've lived through is the white-collar office revolution since the pandemic. I've never seen anything like it. We got hours of our day back by not commuting (I get to make breakfast for my kids in the morning and eat dinner with them in the evening, neither of which was true when I was commuting into NY), and people are actually refusing to give that up. I would never have put money on that.
It happened where I work. They told us all to report to work 4 days a week starting in July (the on-site five day work week is definitely history for the professional crowd, which is lovely), AND ALMOST EVERYONE REFUSED. So they knocked it down to 2-3 days, and then 1 day a week "to ease us back into it", and then gave up and said we'll try again in January. That's kind of wild, isn't it? I can't think of an equivalent.
Oh god, the texting while driving makes me NUTS. I think Marc Maron put it best: "texting while driving is more dangerous than driving while drunk. At least when you're drunk driving someone is actually DRIVING THE CAR."
Sorry - I definitely didn't mean to come off as aggressive. :) It was supposed to be conversational. Fucking internet. Tone is impossible to convey.
I look forward to driverless cars for two reasons. 1) Fewer accidents, as Sage points out. Having a teen who just got his license get into a car and drive somewhere is one of the most terrifying moments as a parent. 2) Efficiency. I fantasize about a time when you can get in the car at night with seats that flatten into beds, go to sleep, and wake at your destination.
I've got three kids. They're not getting driver's licenses until they're 80, and by then they won't be allowed to. Yes I know this makes no sense, but I still get panicked when I put my oldest on the BUS, and I'm not even sure why anymore.
My two are almost 20 and 21 and I still worry when they drive anywhere far. The 20 yr old is driving home from San Diego this weekend and I won’t relax until he’s home safe. Not sure when that fear will end. Maybe never
"You know when you get your driverless car it's going to end up being just another satellite office, right? You'll answer emails, give presentations, all the same invasive shit that most tech advances end up being."
...
"Nobody is going to pay a secretary for eight hours of work if she only works three. They're just going to give her more work."
When we went from producing documents that were manually typed in by secretaries (two sheets of paper at a time with carbon paper in between) and had to be hand collated and stuffed into envelopes to be mailed (and later faxed), to composing, saving and sending documents to infinite numbers of people in a second, all it meant was that we were now tasked with producing and circulating more documents in the same work day (now extended, see below).
The cell phone and email (and of course email on your cell phone) also meant that a previously 9-12 hour work day became a 15-18 hour work day because we were now accessible and expected to read and respond to documents at all hours. In my profession, technology massively increased our productivity and also our work load and stress and work intensity levels, and severely ate into down time. In 22 years I haven't taken a single vacation where I didn't work. If you tell someone you'll be gone for a few days, they interpret that as being away from your office desk, but you'll still have your iPhone and laptop so you're really not gone at all.
Yungins, let me know if you'd like "carbon paper" explained.
"Yungins, let me know if you'd like 'carbon paper' explained."
Oh, it's that wonderful stuff you use when you're not working or doing computer-aided art, and you give yourself the luxury of artisanally transferring and improving drawings by hand! Love it! And I think I last used it... sometime in college.
I couldn't resist a chance to poke fun at my young'un-ness. I saw my grandma squeeze dagwoods of carbon and onion paper through her typewriter and whale away on it to eke out as many copies as possible. When she could no longer reuse her carbons, she'd give them to us for art supplies. When I was young enough, I thought grandmas were where carbon paper came from.
And I forgot to mention.... Never mind that technology didn't let secretaries do their job in 3 hours and then go home. It decimated the secretarial profession. It made secretary positions largely obsolete. Up until about 40 years ago, executives in corporations, and partners and associates in professional organizations each had a secretary. Sometimes it was a secretary to share with one other person, but the most "important" person in an organization had their own secretary. Over the last 40 years technology has made secretaries less and less necessary, especially as the workforce started to fill up with those who already knew how to use a computer and the relevant business software, and didn't need someone taking dictation (remember that?), typing letters, answering phones, or scheduling appointments.
When I was in high school, my school actually offered secretarial training as an elective. The ones who took the elective learned how to write shorthand (something I found incomprehensible, especially since I can't even decipher my own regular handwriting). That was a skill unique to secretaries, but I assume it's a dead skill now, obsoleted by technology.
Yes, very good point. In STEM academia, secretaries used to type papers on typewriters. Now the secretaries are all gone, and the professors type their own papers on computers.
Typing was the most valuable thing I learned in high school (early 70s). I took a summer class. It was me and 29 girls. I was there because I thought typewriters were cool tech, the girls were there to get secretarial jobs some day.
>I'm always surprised when I hear that people look forward to driverless cars. I love long drives. They're relaxing. If I'm alone I get to listen to music or podcasts; if I'm with my family we get to chat without distractions.
I have family that lives about three and a half hours away from me. I've been making the drive down to see them eight or so times a year for about a decade and a half. I can't wait not to have to actually manage that drive.
I, too, enjoy all the things you mention about long drives, but none of those things are the act of keeping your attention on the road and managing the car. I would love to be able to give my full attention to sightseeing, or an extra movie night with the kids, or conversation, or whatever else I'd like to do while my family is on the way to visit my mom, grandmother and sister. But I have to stay stapled to the driver's seat and keep a good deal of my attention on the road. The actual driving is the least enjoyable aspect of a long drive, and I look eagerly forward to off-loading it.
This strikes me as a bit utopian. The economy isn't capitalist because the capitalists are in charge of making day-to-day investment and purchasing decisions, it's capitalist because the benefits of those decisions accrue to the capitalists (I am aware of the seeming tautology thanks to the terminology, but I trust you can look past that), and I don't know why algorithms would change that. I mean thanks to managerial capitalism, for a lot of capitalists it probably already largely resembles your vision anyway. Who cares who is making those decisions, whether a computer or an MBA; the money still flows upward in ever-greater proportions. The fundamental change will still have to be political. Cybersyn wouldn't have been socialist without the decision to spread its benefits among the population.
I don't think it's utopian; the computers could create an even more unequal and unjust economy than we have now. And the humans in charge will say, "Hey, the computers make the decisions...."
I see. It's not clear in the post that you see this as value-neutral, given your history and the implications of the phrase "command economy." I'm still not really sure what makes your idea so very much different from what we have now, though. Even the "hey the computers make the decisions" line has a present-day reflection in, "well, that's the policy." I've often said we've already created the paperclip-maximizing AI, and it's the global industrial capitalist system itself (currently furiously making crypto "currencies," which are somehow less useful than paperclips). Right now decisions are made largely on what will generate the most profits for capitalists. On what other basis will those decisions be made by these computers if the relations of production don't change?
The computers didn't create the inequality. The humans in charge allowed the growth of the monopolies, Amazon, Facebook, Google, and Big Tech in general. There's no reason Amazon has to own Amazon Cloud Services and Prime Streaming. There's no reason Facebook needs to own WhatsApp and Instagram. There's no reason Google needs to own YouTube and 90% of web advertising. The laws need to be changed to break up these monopolies.
The president who did the most to lessen income inequality, at least by the statistics, was Trump. Socialists spent 4 years lying about him to get rid of him. I'm not saying Trump wasn't rude and arrogant, but his policies worked. The current Socialist Green New Deal Policies have increased inequality, unemployment and inflation.
The Socialist idea of equality is everybody is equally poor, except Party members of course.
Sounds like what's become of humanity in Wall-E, and it's difficult to see how it could fail to be dystopic. It's easy to find groups across ideologies who also expect it and anticipate dystopia. Hopefully opt-out possibilities are established.
On this scenario it seems like an arms race between the disastrous effects of climate change and technical solutions preventing the worst. Disaster likely means the automated revolution will never be completed.
How likely is significant voluntary resistance or even more self-conscious guidance? I don't know how Marxian your communism is, but I think he's right that it is not productive activity people resist so much as never ending drudgery and the threats that prevent ending it. To the extent that people by and large do need productive activity, genuine agency, etc., Wall-E world seems not only bad, but likely to be resisted.
I think you're exactly right. You could say the same thing about online privacy (in fact, privacy in general.) 1984 relied on an all-powerful, dystopian one-Party state and its ability to install intrusive telescreens in everyone's apartment. We now have two-way telescreens everywhere - all of our phone records are recorded, all of our online chats and communications are recorded and scanned, and on the consumer side there was no one monolithic corporation that imposed its iron will on us. It happened piece by piece, Amazon by Facebook, preferences by marketing list, opt-in by microphone-enabled.
Our art, our fiction, our video games - I've been re-playing Civ 6 lately, enjoying the firm, determinative clicks that transition my little nation from Oligarchy to Theocracy - focus mostly on great historical moments and men of action, crossings of Rubicons and peace in our time. It's seldom that clear-cut, is it? But it makes for a hell of a story when it is. Our switch to a managerial economy run by The Algorithm may even sound chillingly future-tech, very Blade Runner, but it'll be mundane and rubbish, won't it?
While I generally concede the logic, I always roll my eyes at most "AI takes over the world" predictions and don't actually expect them to happen, and this post has crystallized a big part of why that is for me. I feel like there's a general idea that AI is a single entity, and there's one paperclip maximizer that will rule everything, but...why? Netflix has AI, Amazon has AI, Apple has AI, and as they get bigger and more sophisticated, there's no reason they should be able to cooperate any better than humans do. Collective action dilemmas don't exist because people are stupid, they exist because oftentimes incentives genuinely line up that way, and it won't be any less true for AI.
But this is kind of the point - you don't need one central system of control for a significant aspect of the economy to no longer be operating in a market system; you just need enough systems taking over human decisions
But if a bunch of AIs are making economic decisions and using prices to guide and implement those decisions, isn't that a market economy? I think the logic that it becomes a post-human economy makes perfect sense, but there's an additional step I'm unconvinced of to turn that into a command economy. Admittedly it's my bias as an economist that I think a market system is the most efficient way to organize production and AIs would continue to use it, but even absent that bias, if we're talking about gradual change there should be some reason AIs would stop using it.
Yes, the AI economy Freddie articulates would still have significant knowledge dispersal in the Hayekian sense that mattered to the debate over command efficiency. But meaningful agency in the market would be diminished. So the two come apart in an interesting way
I'm not really seeing this, with one potential exception.
It seems to me that in most cases the rise of AI is no different from the rise of earlier forms of mechanization. We didn't shift from a capitalist to a socialist system when farmers started harvesting their crops with tractors instead of scythes, so how exactly is AI going to change the nature of our economy? (You could argue that subscribing to a hairbrush takes away the consumer's decision-making role, but I'd say what it does is automate the implementation of the decision: not really the same thing.)
The exception would be this. If AI started to make decisions about investment--by which I mean actual business decisions by corporations, not just algorithms for trading shares of stock--then in a real sense we might cease to live under capitalism.
One of the ways in which real-world capitalism is different from an economist's mathematical model is that owners of capital don't robotically maximize their own profits. They exercise agency, and that agency is what puts business leaders at the top of the human social hierarchy (a thing that's not even visible when you look at the equations in a model).
If software someday takes that away, so that nobody wants to read about Elon Musk or Mark Zuckerberg in the papers any more, we'll probably notice that we're living under a totally new regime.
All of these problems are government political failures, not something any AI could fix. The People's Republic of California voted to put Democrats in charge, and recently rejected a recall that was a chance to change things. President Joe Biden could suspend the state regulations until the ports are cleared, but he won't for political reasons. Is Freddie saying an AI would have avoided these unintended consequences?
The current economy doesn't function the way it does because hundreds of thousands of people sit at a desk each day and say, "Ah, let's implement a market economy with strong state intervention and a robust import-export model. And then lunch!" It functions the way it does because of a confluence of historical factors and current incentives. The same will be true of AI. It'll take all the little things that have led Netflix (a non-player here), Facebook (a big one), Amazon (a big one), the CCP, the US security apparatus, whatever else there is. "Taking over the world", in this context, doesn't mean a Dr. Evil-style theory of everything with frickin' lasers on its head. It means hegemony, or at the least some kind of multipolarity, of systems and not people. And it'll be as imprecise and messy as today, only with less accountability and more annoying spam mails.
An AI-managed economy is absolutely in our future, but it is a huge jump to assume that it will be centrally managed.
It's counter-intuitive to most, but the more the economy depends on AI (really math and programming), the lower the barrier to entry becomes. Renaissance Technologies, by far the most successful investors of all time, is a couple dozen Math and physics PhDs. Jane Street, a massive player in trading, has 1% the headcount of major banks. There are countless examples of a few programmers with VC backing upending entire industries in a few years. A single paper with a few authors can shift the direction of the entire field of Machine Learning.
The future I see is that of a few big players doing what they always have: undercutting or just buying out the competition. But also an endless stream of new competing AIs that are better.
I am not much of an economic thinker, but "AI" stuff drives me crazy. I think we should strike out "artificial intelligence" from our usage completely -- it's too hyped and imprecise (and in my opinion speaks of something as a given that I believe is impossible). And even "algorithm" has been corrupted. We just mean computer programs, right? Some of which are machine learning programs, which behave a little differently than traditional ones, but yeah still computer programs, which are written and maintained by people and are just electrical signals performing binary math to represent information that is intelligible to people.
I know it quickly gets to a point where the complexity of the operations being done and the speed at which they're done can make it difficult and time-consuming for people to untangle what a computer is doing, but the computer itself actually does not know what it is doing. It needs human assistance for everything it does. Behind the magic feeling of Amazon or Apple or automated trading or whatever is thousands of people working on clunky machines all day.
Whenever Goldman Sachs invites me to imagine a magical mechanized future with a hyped-up PR statement masquerading as an academic lecture at Harvard, the kind of industry press that exists to get investors excited, where they brag that ...
" 'In 2000, we had 600 humans making markets in U.S. stocks,' Chavez told the crowd in Harvard Science Center. 'Today, we have two people and a lot of software.' ...
Now, one in three Goldman Sachs employees are engineers, Chavez said.
'The future of the financial industry lies in virtual machines and strong API contracts,' Chavez said. 'We are redesigning our businesses around those principles.' "
... I have mental sirens blaring. What the fuck does the Chief Financial Officer at Goldman Sachs know about "strong API contracts" ???
Ok, yes, the nature of their business has changed, and so also has the nature of the economy. But computers are just code and code is written by people. And code, though it feels magic when it works, is actually very weak, messy, dumb, vulnerable, and constantly changing, constantly in need of human help.
Anyway this is tangential to your point, which is that people like it when computers do shit for them, which I think is very true. And I think you're right that the spread of more "automated" experiences in everyday life will be gradual, like with anti-lock brakes or cruise control.
But man I get angry when I read "AI" hype !! Computers are not "intelligent"! There is no such thing as a "driverless" car and never will be! People like it when computers do shit for them, but they shouldn't. In fact it is not the computers doing the shit, but the code, which was written by other people, and I guess in the case of your linked article, Goldman Sachs people. Anyway sometimes I think about developing this general anger into more detailed arguments, but I'll stop now.
I couldn't agree more. I work for Acme Inc.; we provide knowledge services for a particular segment of professionals (sorry for the vagueness). In the past all the creation and organization of this knowledge was done by people. Now, with our much vaunted and hyped machine learning, all the creation and organization is... done by the same people.
The machine learning helps a lot with search algorithms (though people still have to fix those). Other than that it helps organize our tasks really well, but the work is the same as it's been for about eighty years.
I think like half of machine learning is done so that clueless business leaders can proudly tell other clueless business leaders that they're using machine learning
I really agree. And on a long enough timeline it always morphs from "machine learning" to "artificial intelligence" via detours through marketing and the C-suite which just gives the wrong impression on what's actually going on at the company.
I've worked at two companies: a fortune 500 and a smaller founder-led tech firm. In the latter, the CEO clearly knows what's going on in the company and even understands a lot of the tech. In the fortune 500, no leader I spoke to even had the faintest idea of what's going on. And I'm not even talking about the C-suite, even your standard SVP lacked even basic knowledge of their business unit. I think their heads are up their asses on immaterial financial stuff, and they got their positions by politicking, not by talent.
I work at a fortune 500 writing financial software. They keep encouraging us to incorporate "machine learning" and I'm like...for what? The bulk of the programming we do is making sure our APIs work with our downstream partners and that they comply with the law. This doesn't seem like the sort of area where you want a computer making judgment calls about what it thinks guardrail laws probably are in the state of Rhode Island. There is no point of our process where machine learning is applicable, and trying to shove it in is going to open us to lawsuits.
3. Reinforcement learning: Arguably a subset of #1, but I think of it differently. This is like a chess AI where it tries something, fails, and then adjusts its moves to get more of the reward. I imagine self-driving cars use a really complex reinforcement learning algorithm.
If it's not one of those three things, then the person saying "machine learning" probably has no idea what they're talking about.
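The "tries something, fails, and then adjusts its moves" loop described above can be sketched with a tiny tabular Q-learning example. This is a hypothetical toy, not anything from the thread: an agent with two possible moves learns by trial and error to prefer the one that pays off.

```python
import random

# Toy setup (made up for illustration): two possible moves, one of which
# pays a reward of 1 and the other nothing. The agent doesn't know this
# and has to discover it by trying moves and adjusting its estimates.
ACTIONS = [0, 1]
REWARDS = {0: 0.0, 1: 1.0}

def train(episodes=500, alpha=0.1, epsilon=0.2, seed=0):
    rng = random.Random(seed)
    q = {a: 0.0 for a in ACTIONS}  # estimated value of each move
    for _ in range(episodes):
        # epsilon-greedy: mostly exploit the best-known move,
        # occasionally explore a random one ("tries something")
        if rng.random() < epsilon:
            a = rng.choice(ACTIONS)
        else:
            a = max(q, key=q.get)
        r = REWARDS[a]                 # observe the reward (or the "failure")
        q[a] += alpha * (r - q[a])     # nudge the estimate toward what happened
    return q

q = train()
best = max(q, key=q.get)  # after training, the agent prefers move 1
```

Real systems (a chess engine, let alone a self-driving car) replace the two-entry table with enormous function approximators, but the adjust-toward-observed-reward loop is the same shape.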
Yep. One of the higher ups at my work read about it online and wanted to hire a team. It was a totally inappropriate application, not that we even had the right data in the first place. Anyway, we convinced him it would be racist, and he dropped it.
We consider the current system a capitalist one because companies more or less try to maximize profits, individuals make fairly free decisions about purchases and employment, and the government to whatever degree stays out of all of those company and individual decisions. It's actually a mixed economy, but we can leave that aside for now. I don't see those fundamental characteristics changing much as a result of delegating more of the specific tasks, so I'm not sure why we wouldn't still call the overall system capitalist. We could consider employees as an analogy. When the owner of a firm delegates a decision to an employee, the intent remains more or less the same. Sure, there are some agency problems, but that's a side effect. I would overall expect decisions to remain fundamentally capitalistic in their goals and criteria even if they're made by an AI rather than an employee.
Perhaps we would see some emergent changes in the system once most decisions are made by AIs (e.g., more implicit collusion), but that seems purely speculative at this point.
I agree broadly about the AI command economy, but I think the implications for humanity are pretty stark, especially when considered together with the prospect of climate change. Luxury Gay Space Communism or whatever will be a thing, but only for a small population that survives with enough capital to justify their place in that order.
I'm pretty sure Amazon already had a pilot program where they just shipped you things based on what it thought you'd want. The way it worked is you could keep the stuff you wanted but then just seal the box back up and ship back what you didn't want.
I can see our consumption habits becoming more automatic and controlled by AI. That’s certainly Amazon’s goal – they advertise products I want, and they really push subscriptions. Amazon once tried to get me to subscribe to a hairbrush (like, the default selection was that I’d get a new hairbrush every three months).
But I’m not sure how the change will impact workers. I assume the purpose of sending me products is to bill me for them – and that the products will stop coming if I can’t pay. I will still need a job. We could use all this technology to free people from the 40+ hour work week, and to provide everyone with what they need regardless of their ability to pay -- but it’s sort of hard to imagine that we will.
Though I've always been a techie and early adopter of just about everything I can reasonably get my hands on, I'm also the one that won't give Apple or anyone else my fingerprints, I cover up cameras on all my devices, until needed, and I never ever use or buy anything Alexa, Siri, Google, or anything else that purports to "smart" my life by spying on me and collecting all data about me and my household as is possible. I puke all over the smart fridge that tells you when you need to replenish eggs, and the Amazon tokens that auto reorder essentials. I know tech is still collecting all my data it can get its hands on, but I'm not willingly handing any of it over. There is massive resistance to the conveniences companies are trying to sell us, because they are creepy as hell. I don't see people jumping en masse on the grocery-reordering-fridge. I don't know how long that product will last before it ends in the failed-products dump heap. All the tech I gladly adopted throughout my life made my life more pleasurable and efficient in some way, brought something new I didn't have before, did something in a better, faster way. The current spy-tech is designed to wring value out of me, under the guise of giving value to me, value I reject for being undesirable.
I wonder what portion of the overall economy is devoted to moving around non-essentials. Utilities are a standard and largely unembellished, one-size-fits-all essential (we don't shop for the best electricity or water). You can say health care is an essential that doesn't need to be all that customized. Housing is an essential, but it's far from standard in its particulars. Just about everything else that I can think of falls into the highly customizable purchase category, in terms of quantity, quality, and other utilitarian or stylistic features. It's one thing to delegate tiresome chores (transportation) and decisions about essentials (including how to produce and deliver it) to AI. It's quite another to hand over decision making over non-essentials, a big chunk of the economy, and our entitlement to pleasures and escaping the hell that's sameness.
Less accidents, until the Big Accident when the central computer controlling all the cars goes down one day.
"You know when you get your driverless car it's going to end up being just another satellite office, right? You'll answer emails, give presentations, all the same invasive shit that most tech advances end up being."
...
"Nobody is going to pay a secretary for eight hours of work if she only works three. They're just going to give her more work."
Yes yes yes exactly.
When we went from producing documents that were manually typed in by secretaries (two sheets of paper at a time with carbon paper in between) and had to be hand collated and stuffed into envelopes to be mailed (and later faxed), to composing, saving and sending documents to infinite numbers of people in a second, all it meant was that we were now tasked with producing and circulating more documents in the same work day (now extended, see below).
The cell phone and email (and of course email on your cell phone) also meant that a previously 9-12 hour work day became a 15-18 hour work day because we were now accessible and expected to read and respond to documents at all hours. In my profession, technology massively increased our productivity and also our work load and stress and work intensity levels, and severely ate into down time. In 22 years I haven't taken a single vacation where I didn't work. If you tell someone you'll be gone for a few days, they interpret that as being away from your office desk, but you'll still have your iPhone and laptop so you're really not gone at all.
Yungins, let me know if you'd like "carbon paper" explained.
"Yungins, let me know if you'd like 'carbon paper' explained."
Oh, it's that wonderful stuff you use when you're not working or doing computer-aided art, and you give yourself the luxury of artisanally transferring and improving drawings by hand! Love it! And I think I last used it... sometime in college.
Never thought of carbon paper and artisanal in the same sentence, but you sure made it sound wonderful! I think I know what you mean.
I couldn't resist a chance to poke fun at my young'un-ness. I saw my grandma squeeze dagwoods of carbon and onion paper through her typewriter and whale away on it to eke out as many copies as possible. When she could no longer reuse her carbons, she'd give them to us for art supplies. When I was young enough, I thought grandmas were where carbon paper came from.
And I forgot to mention.... Never mind that technology didn't let secretaries do their job in 3 hours and then go home. It decimated the secretarial profession. It made secretary positions largely obsolete. Up until about 40 years ago, executives in corporations, and partners and associates in professional organizations each had a secretary. Sometimes it was a secretary to share with one other person, but the most "important" person in an organization had their own secretary. Over the last 40 years technology has made secretaries less and less necessary, especially as the workforce started to fill up with those who already knew how to use a computer and the relevant business software, and didn't need someone taking dictation (remember that?), typing letters, answering phones, or scheduling appointments.
When I was in high school, my school actually offered secretarial training as an elective. The ones who took the elective learned how to write shorthand (something I found incomprehensible, especially since I can't even decipher my own regular handwriting). That was a skill unique to secretaries, but I assume it's a dead skill now, obsoleted by technology.
Yes, very good point. In STEM academia, secretaries used to type papers on typewriters. Now the secretaries are all gone, and the professors type their own papers on computers.
Typing was the most valuable thing I learned in high school (early 70s). I took a summer class. It was me and 29 girls. I was there because I thought typewriters were cool tech, the girls were there to get secretarial jobs some day.
Given the boy to girl ratio in that class, I'm surprised more boys didn't sign up for some summertime typewriting.
>I'm always surprised when I hear that people look forward to driverless cars. I love long drives. They're relaxing. If I'm alone I get to listen to music or podcasts; if I'm with my family we get to chat without distractions.
I have family that lives about three and a half hours away from me. I've been making the drive down to see them eight or so times a year for about a decade and a half. I can't wait not to have to actually manage that drive.
I, too, enjoy all the things you mention about long drives, but none of those things are the act of keeping your attention on the road and managing the car. I would love to be able to give my full attention to sightseeing, or an extra movie night with the kids, or conversation, or whatever else I'd like to do while my family is on the way to visit my mom, grandmother and sister. But I have to stay stapled to the driver's seat and keep a good deal of my attention on the road. The actual driving is the least enjoyable aspect of a long drive, and I look eagerly forward to off-loading it.
Sort of reminds me of "People's Republic of Walmart" (https://www.versobooks.com/books/2822-the-people-s-republic-of-walmart)
The parts of capitalism that won't go away because of AI are the ideas and the risk. How can AI replace those?
This strikes me as a bit utopian. The economy isn't capitalist because the capitalists are in charge of making day-to-day investment and purchasing decisions, it's capitalist because the benefits of those decisions accrue to the capitalists (I am aware of the seeming tautology thanks to the terminology, but I trust you can look past that), and I don't know why algorithms would change that. I mean thanks to managerial capitalism, for a lot of capitalists it probably already largely resembles your vision anyway. Who cares who is making those decisions, whether a computer or an MBA; the money still flows upward in ever-greater proportions. The fundamental change will still have to be political. Cybersyn wouldn't have been socialist without the decision to spread its benefits among the population.
I don't think it's utopian; the computers could create an even more unequal and unjust economy than we have now. And the humans in charge will say, "Hey, the computers make the decisions...."
I see. It's not clear in the post that you see this as value-neutral, given your history and the implications of the phrase "command economy." I'm still not really sure what makes your idea so very much different from what we have now, though. Even the "hey the computers make the decisions" line has a present-day reflection in, "well, that's the policy." I've often said we've already created the paperclip-maximizing AI, and it's the global industrial capitalist system itself (currently furiously making crypto "currencies," which are somehow less useful than paperclips). Right now decisions are made largely on what will generate the most profits for capitalists. On what other basis will those decisions be made by these computers if the relations of production don't change?
I don't know how I could have missed that. Maybe it was added in after my comment?
Haven't edited this one since it went up!
The computers didn't create the inequality. The humans in charge allowed the growth of the monopolies, Amazon, Facebook, Google, and Big Tech in general. There's no reason Amazon has to own Amazon Cloud Services and Prime Streaming. There's no reason Facebook needs to own WhatsApp and Instagram. There's no reason Google needs to own YouTube and 90% of web advertising. The laws need to be changed to break up these monopolies.
The president who did the most to lessen income inequality, at least by the statistics, was Trump. Socialists spent 4 years lying about him to get rid of him. I'm not saying Trump wasn't rude and arrogant, but his policies worked. The current Socialist Green New Deal Policies have increased inequality, unemployment and inflation.
The Socialist idea of equality is everybody is equally poor, except Party members of course.
Sounds like what's become of humanity in Wall-E, and it's difficult to see how it could fail to be dystopic. It's easy to find groups across ideologies who also expect it and anticipate dystopia. Hopefully opt-out possibilities are established.
On this scenario it seems like an arms race between the disastrous effects of climate change and technical solutions preventing the worst. Disaster likely means the automated revolution will never be completed.
How likely is significant voluntary resistance or even more self-conscious guidance? I don't know how Marxian your communism is, but I think he's right that it is not productive activity people resist so much as never ending drudgery and the threats that prevent ending it. To the extent that people by and large do need productive activity, genuine agency, etc., Wall-E world seems not only bad, but likely to be resisted.
I think you're exactly right. You could say the same thing about online privacy (in fact, privacy in general.) 1984 relied on an all-powerful, dystopian one-Party state and its ability to install intrusive telescreens in everyone's apartment. We now have two-way telescreens everywhere - all of our phone records are recorded, all of our online chats and communications are recorded and scanned, and on the consumer side there was no one monolithic corporation that imposed its iron will on us. It happened piece by piece, Amazon by Facebook, preferences by marketing list, opt-in by microphone-enabled.
Our art, our fiction, our video games - I've been re-playing Civ 6 lately, enjoying the firm, determinative clicks that transition my little nation from Oligarchy to Theocracy - focus mostly on great historical moments and men of action, crossings of Rubicons and peace in our time. It's seldom that clear-cut, is it? But it makes for a hell of a story when it is. Our switch to a managerial economy run by The Algorithm may even sound chillingly future-tech, very Blade Runner, but it'll be mundane and rubbish, won't it?
"Most things are never meant.
This won’t be, most likely; but greeds
And garbage are too thick-strewn
To be swept up now, or invent
Excuses that make them all needs.
I just think it will happen, soon."
Philip Larkin - "Going, Going"
While I generally concede the logic, I always roll my eyes at most "AI takes over the world" predictions and don't actually expect them to happen, and this post has crystallized a big part of why that is for me. I feel like there's a general idea that AI is a single entity, and there's one paperclip maximizer that will rule everything, but...why? Netflix has AI, Amazon has AI, Apple has AI, and as they get bigger and more sophisticated, there's no reason they should be able to cooperate any better than humans do. Collective action dilemmas don't exist because people are stupid, they exist because oftentimes incentives genuinely line up that way, and it won't be any less true for AI.
But this is kind of the point - you don't need one central system of control for a significant aspect of the economy to no longer be operating in a market system; you just need enough systems taking over human decisions
But if a bunch of AIs are making economic decisions and using prices to guide and implement those decisions, isn't that a market economy? I think the logic that it becomes a post-human economy makes perfect sense, but there's an additional step I'm unconvinced of to turn that into a command economy. Admittedly it's my bias as an economist that I think a market system is the most efficient way to organize production and that AIs would continue to use it, but even setting that bias aside, if we're talking about gradual change there should be some reason AIs would stop using it.
Yes, the AI economy Freddie articulates would still have significant knowledge dispersal in the Hayekian sense that mattered to the debate over command efficiency. But meaningful agency in the market would be diminished. So the two come apart in an interesting way
I'm not really seeing this, with one potential exception.
It seems to me that in most cases the rise of AI is no different from the rise of earlier forms of mechanization. We didn't shift from a capitalist to a socialist system when farmers started harvesting their crops with tractors instead of scythes, so how exactly is AI going to change the nature of our economy? (You could argue that subscribing to a hairbrush takes away the consumer's decision-making role, but I'd say what it does is automate the implementation of the decision: not really the same thing.)
The exception would be this. If AI started to make decisions about investment--by which I mean actual business decisions by corporations, not just algorithms for trading shares of stock--then in a real sense we might cease to live under capitalism.
One of the ways in which real-world capitalism is different from an economist's mathematical model is that owners of capital don't robotically maximize their own profits. They exercise agency, and that agency is what puts business leaders at the top of the human social hierarchy (a thing that's not even visible when you look at the equations in a model).
If software someday takes that away, so that nobody wants to read about Elon Musk or Mark Zuckerberg in the papers any more, we'll probably notice that we're living under a totally new regime.
Right now, all those AIs are failing to straighten out the supply chain mess. The main causes of the California mess are:
1. Restrictions on trucks that are legal to operate in CA.
2. AB5 restrictions on independent contractors that have essentially outlawed owner operators in California.
3. Underinvestment in the port facilities of Los Angeles/Long Beach, and the transportation infrastructure into and out of the ports.
4. Vaccine mandates forcing transportation sector layoffs.
All of these problems are government political failures, not something any AI could fix. The People's Republic of California voted to put Democrats in charge, and recently rejected a recall that was a chance to change things. President Joe Biden could suspend the state regulations until the ports are cleared, but he won't for political reasons. Is Freddie saying an AI would have avoided these unintended consequences?
*sigh* AB5 does NOT "outlaw" owner-operators, it just requires those who hire them to provide the same benefits that employees would receive.
That provision makes owner operators uncompetitive. My statement was figurative, not literal.
The current economy doesn't function the way it does because hundreds of thousands of people sit at a desk each day and say, "Ah, let's implement a market economy with strong state intervention and a robust import-export model. And then lunch!" It functions the way it does because of a confluence of historical factors and current incentives. The same will be true of AI. It'll take all the little things that have led Netflix (a non-player here), Facebook (a big one), Amazon (a big one), the CCP, the US security apparatus, whatever else there is. "Taking over the world", in this context, doesn't mean a Dr. Evil-style theory of everything with frickin' lasers on its head. It means hegemony, or at the least some kind of multipolarity, of systems and not people. And it'll be as imprecise and messy as today, only with less accountability and more annoying spam mails.
An AI-managed economy is absolutely in our future, but it is a huge jump to assume that it will be centrally managed.
It's counter-intuitive to most, but the more the economy depends on AI (really math and programming), the lower the barrier to entry becomes. Renaissance Technologies, by far the most successful investors of all time, is a couple dozen math and physics PhDs. Jane Street, a massive player in trading, has 1% the headcount of major banks. There are countless examples of a few programmers with VC backing upending entire industries in a few years. A single paper with a few authors can shift the direction of the entire field of Machine Learning.
The future I see is that of a few big players doing what they always have: undercutting or just buying out the competition. But also an endless stream of new competing AIs that are better and keep the system decentralized.
I am not much of an economic thinker, but "AI" stuff drives me crazy. I think we should strike out "artificial intelligence" from our usage completely -- it's too hyped and imprecise (and in my opinion speaks of something as a given that I believe is impossible). And even "algorithm" has been corrupted. We just mean computer programs, right? Some of which are machine learning programs, which behave a little differently than traditional ones, but yeah still computer programs, which are written and maintained by people and are just electrical signals performing binary math to represent information that is intelligible to people.
I know it quickly gets to a point where the complexity of the operations being done and the speed at which they're done can make it difficult and time-consuming for people to untangle what a computer is doing, but the computer itself actually does not know what it is doing. It needs human assistance for everything it does. Behind the magic feeling of Amazon or Apple or automated trading or whatever is thousands of people working on clunky machines all day.
Whenever I am welcomed to imagine a magical mechanized future by a hyped-up PR statement that just sounds like industry press to make investors excited, masquerading as an academic lecture at Harvard from Goldman Sachs, where they brag that ...
" 'In 2000, we had 600 humans making markets in U.S. stocks,' Chavez told the crowd in Harvard Science Center. 'Today, we have two people and a lot of software.' ...
Now, one in three Goldman Sachs employees are engineers, Chavez said.
'The future of the financial industry lies in virtual machines and strong API contracts,' Chavez said. 'We are redesigning our businesses around those principles.' "
... I have mental sirens blaring. What the fuck does the Chief Financial Officer at Goldman Sachs know about "strong API contracts" ???
Ok, yes, the nature of their business has changed, and so also has the nature of the economy. But computers are just code and code is written by people. And code, though it feels magic when it works, is actually very weak, messy, dumb, vulnerable, and constantly changing, constantly in need of human help.
Anyway this is tangential to your point, which is that people like it when computers do shit for them, which I think is very true. And I think you're right that the spread of more "automated" experiences in everyday life will be gradual, like with anti-lock brakes or cruise control.
But man I get angry when I read "AI" hype !! Computers are not "intelligent"! There is no such thing as a "driverless" car and never will be! People like it when computers do shit for them, but they shouldn't. In fact it is not the computers doing the shit, but the code, which was written by other people, and I guess in the case of your linked article, Goldman Sachs people. Anyway sometimes I think about developing this general anger into more detailed arguments, but I'll stop now.
I couldn't agree more. I work for Acme Inc.; we provide knowledge services for a particular segment of professionals (sorry for the vagueness). In the past all the creation and organization of this knowledge was done by people. Now, with our much vaunted and hyped machine learning, all the creation and organization is... done by the same people.
The machine learning helps a lot with search algorithms (though people still have to fix those). Other than that it helps organize our tasks really well, but the work is the same as it's been for about eighty years.
I think like half of machine learning is done so that clueless business leaders can proudly tell other clueless business leaders that they're using machine learning
I really agree. And on a long enough timeline it always morphs from "machine learning" to "artificial intelligence" via detours through marketing and the C-suite which just gives the wrong impression on what's actually going on at the company.
I've worked at two companies: a fortune 500 and a smaller founder-led tech firm. In the latter, the CEO clearly knows what's going on in the company and even understands a lot of the tech. In the fortune 500, no leader I spoke to even had the faintest idea of what's going on. And I'm not even talking about the C-suite, even your standard SVP lacked even basic knowledge of their business unit. I think their heads are up their asses on immaterial financial stuff, and they got their positions by politicking, not by talent.
Yes
I'd put it closer to 75%, if not higher.
I work at a fortune 500 writing financial software. They keep encouraging us to incorporate "machine learning" and I'm like...for what? The bulk of the programming we do is making sure our APIs work with our downstream partners and that they comply with the law. This doesn't seem like the sort of area where you want a computer making judgment calls about what it thinks guardrail laws probably are in the state of Rhode Island. There is no point of our process where machine learning is applicable, and trying to shove it in is going to open us to lawsuits.
Right, I don't think most of these people know what machine learning or AI is, they just saw it on their business blogs or at a conference.
I think of ML as being one of three tasks
1. Supervised learning: i.e., prediction
2. Unsupervised learning: Essentially, finding similarities: dimensionality reduction, clustering, similarity matching, market basket analysis, network analysis
3. Reinforcement learning: Arguably a subset of #1, but I think of it differently. This is like a chess AI where it tries something, fails, and then adjusts its moves to get more reward. I imagine self-driving cars use a really complex reinforcement learning algorithm
If it's not one of those three things, then the person saying "machine learning" probably has no idea what they're talking about.
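For concreteness, the three buckets can be sketched in toy, dependency-free Python. This is a hypothetical illustration (all function names are made up, and real systems are vastly more elaborate): a least-squares fit for supervised learning, a tiny 1-D two-means clustering for unsupervised learning, and an epsilon-greedy bandit for the "try, fail, adjust" loop of reinforcement learning.

```python
import random

# 1) Supervised learning: fit y ~ w*x from labeled (x, y) pairs.
def fit_slope(xs, ys):
    """Closed-form least-squares slope for a no-intercept linear model."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# 2) Unsupervised learning: split unlabeled 1-D points into two clusters.
def two_means(points, iters=20):
    """Toy 2-means: alternate assigning points to the nearer center and
    recomputing centers. (No empty-cluster handling; it's a sketch.)"""
    a, b = min(points), max(points)  # initial centers
    for _ in range(iters):
        left = [p for p in points if abs(p - a) <= abs(p - b)]
        right = [p for p in points if abs(p - a) > abs(p - b)]
        a = sum(left) / len(left)
        b = sum(right) / len(right)
    return a, b

# 3) Reinforcement learning: epsilon-greedy bandit -- learn which "arm"
#    pays best purely from trial-and-error reward, no labels at all.
def bandit(true_payoffs, steps=5000, eps=0.1, seed=0):
    rng = random.Random(seed)
    estimates = [0.0] * len(true_payoffs)
    counts = [0] * len(true_payoffs)
    for _ in range(steps):
        if rng.random() < eps:  # explore occasionally
            arm = rng.randrange(len(true_payoffs))
        else:                   # otherwise exploit the current best guess
            arm = max(range(len(true_payoffs)), key=lambda i: estimates[i])
        reward = true_payoffs[arm] + rng.gauss(0, 0.1)  # noisy payoff
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return max(range(len(true_payoffs)), key=lambda i: estimates[i])

print(fit_slope([1, 2, 3], [2.1, 3.9, 6.0]))            # slope close to 2
print(two_means([1.0, 1.2, 0.8, 9.0, 9.3, 8.7]))        # centers near 1 and 9
print(bandit([0.2, 0.8, 0.5]))                          # index of best arm
```

The point of the toy is the shape of each task, not the sophistication: the first needs labeled answers up front, the second finds structure with no labels, and the third only ever sees a reward signal after acting.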
Yep. One of the higher ups at my work read about it online and wanted to hire a team. It was a totally inappropriate application, not that we even had the right data in the first place. Anyway, we convinced him it would be racist, and he dropped it.
...that's epic
We consider the current system a capitalist one because companies more or less try to maximize profits, individuals make fairly free decisions about purchases and employment, and the government to whatever degree stays out of all of those company and individual decisions. It's actually a mixed economy, but we can leave that aside for now. I don't see those fundamental characteristics changing much as a result of delegating more of the specific tasks, so I'm not sure why we wouldn't still call the overall system capitalist. We could consider employees as an analogy. When the owner of a firm delegates a decision to an employee, the intent remains more or less the same. Sure, there are some agency problems, but that's a side effect. I would overall expect decisions to remain fundamentally capitalistic in their goals and criteria even if they're made by an AI rather than an employee.
Perhaps we would see some emergent changes in the system once most decisions are made by AIs (e.g., more implicit collusion), but that seems purely speculative at this point.
I agree broadly about the AI command economy, but I think the implications for humanity are pretty stark, especially when considered together with the prospect of climate change. Luxury Gay Space Communism or whatever will be a thing, but only for a small population that survives with enough capital to justify their place in that order.
I'm pretty sure Amazon already had a pilot program where they just shipped you things based on what it thought you'd want. The way it worked is you could keep the stuff you wanted but then just seal the box back up and ship back what you didn't want.
Sounds great for the environment.
The thing that stops your car is a brake, not a break
Glad I won't be around to see it. Though I would have liked driverless cars.
I can see our consumption habits becoming more automatic and controlled by AI. That’s certainly Amazon’s goal – they advertise products I want, and they really push subscriptions. Amazon once tried to get me to subscribe to a hairbrush (like, the default selection was that I’d get a new hairbrush every three months).
But I’m not sure how the change will impact workers. I assume the purpose of sending me products is to bill me for them – and that the products will stop coming if I can’t pay. I will still need a job. We could use all this technology to free people from the 40+ hour work week, and to provide everyone with what they need regardless of their ability to pay -- but it’s sort of hard to imagine that we will.
Your duty is clear. To build and maintain those robots!
More corporate bullshit jobs, I guess
Though I've always been a techie and early adopter of just about everything I can reasonably get my hands on, I'm also the one that won't give Apple or anyone else my fingerprints, I cover up cameras on all my devices, until needed, and I never ever use or buy anything Alexa, Siri, Google, or anything else that purports to "smart" my life by spying on me and collecting all data about me and my household as is possible. I puke all over the smart fridge that tells you when you need to replenish eggs, and the Amazon tokens that auto reorder essentials. I know tech is still collecting all my data it can get its hands on, but I'm not willingly handing any of it over. There is massive resistance to the conveniences companies are trying to sell us, because they are creepy as hell. I don't see people jumping en masse on the grocery-reordering-fridge. I don't know how long that product will last before it ends in the failed-products dump heap. All the tech I gladly adopted throughout my life made my life more pleasurable and efficient in some way, brought something new I didn't have before, did something in a better, faster way. The current spy-tech is designed to wring value out of me, under the guise of giving value to me, value I reject for being undesirable.
I wonder what portion of the overall economy is devoted to moving around non-essentials. Utilities are a standard and largely unembellished, one-size-fits-all essential (we don't shop for the best electricity or water). You can say health care is an essential that doesn't need to be all that customized. Housing is an essential, but it's far from standard in its particulars. Just about everything else that I can think of falls into the highly customizable purchase category, in terms of quantity, quality, and other utilitarian or stylistic features. It's one thing to delegate tiresome chores (transportation) and decisions about essentials (including how to produce and deliver it) to AI. It's quite another to hand over decision making over non-essentials, a big chunk of the economy, and our entitlement to pleasures and escaping the hell that's sameness.