There's good reason to think 1 is likely - if only because This Can't Go On https://www.cold-takes.com/this-cant-go-on/ (highly recommend his other posts in the Most Important Century series)
A man is walking through Washington Square Park when he spots his favorite blogger playing chess...with a dog. He watches for awhile, and although far short of expert level himself, he knows enough to see that the dog has some game. "Incredible," he exclaims. "A chess-playing dog!".
"Aw he's not so great," replies Freddie. "I beat him two out of three."
Good read, thanks! I agree with you about the irresponsible hype around artificial intelligence. Much of it stems from a common misunderstanding of the word "intelligence." As many physicists and mathematicians will tell you, AI is not creating an 'intelligence' that will somehow morph into "Artificial General Intelligence" and take over the world. That is a misnomer. It's a computational process that manipulates bits of information. That's very different. If we misperceive it as intelligence, we will be misdirected in how we deal with it and deploy it. If we think it's doing the same thing that real human beings do, we will be misdirected in how we use it to understand the natural world (for instance, taking its advice at face value). It is a great tool, to be sure, and it is remarkable in many ways. Although it can mimic behavior that appears "intelligent" at the output (surface) level, it's not doing what natural systems do, and many people doubt that it ever will. As I said, it's a tool. You can use it wisely or not. Here's another perspective:
"ChatGPT, Lobster Gizzards, and Intelligence"
https://everythingisbiology.substack.com/p/chatgpt-lobster-gizzards-and-intelligence
Thankfully, it’s the end of the world so I don’t have to contemplate it.
Since we have the capacity to accidentally destroy life on earth with nuclear weapons, biohazards, etc, and have enjoyed more than one near-miss, I'm interested to know why you (or anyone) *doesn't* think we live in the most *potentially and actually consequential* period of human history.
I read the Free Press for some possible insight and I read Freddie DeBoer for sanity.
We tend to severely overestimate the impact a new technology will have in the next 20-40 years, but we tend to severely underestimate its impact over a historical timeframe. Some woman rubbing two sticks together to create a tiny fire paved the way for us currently attempting nuclear fusion. There will come a time when humans regard us and our pre-AI lives the same way we regard chimpanzees using a basic stone tool.
I agree that we are wired to think we are more important than we are, and that 'everything will be forgotten' and so on, but just looking at childhood (the child is the father of the man, etc.) it seems pretty clear that something essential about humanity has begun to change. And not just for the people who escaped poverty. For everyone, to a lesser or greater degree. But AI aside, the biosphere is crumbling, and that is a first (for humanity, not for the earth). So yes, I think maybe we are living in the 'most important' time of humanity. Will we fiddle? Will we fight for it? That is the question. LOVE your writing. Thank you.
I don't think you understand how remarkable I am, compared to most of my fellow humans throughout history. I live in a palace, attended by invisible servants who cool and heat me and bring me water. My chariot is drawn by 100 horses. I carry the Library of Alexandria in my pocket and I have a magic seeing stone on my desk. So why shouldn't amazing things like true AI happen in my lifetime?
We in the WEIRD regions are isolated from most of the historical problems. We don't expect to bury 1-2 of our kids, we're not 2 lean years from starvation, war is something on TV, and we're fairly confident that a doctor can fix anything that does break down. Even our last plague killed <1% of the population; our medieval ancestors would call that a mild year. But we're socially atomized, increasingly isolated, continuously pumped full of anxiety, and feeling like this luxury world isn't quite all there is. But at the same time, we like it. And this is another thing that brings Change. Change is good - in retrospect. But in the moment, it's disruption in an already fragile and strained system. Sure, our descendants will inherit the earth - at least the survivors will.
Best comment ever.
I think the reason the AI hype seems so over the top is because the media world is where it is hitting the hardest. Most people are going about their lives unawares. AI isn't going to mow the lawn, fix the tractor, thin the trees and burn the slash. It isn't going to wash the dishes or decide that the cat is sick and needs to be taken to the vet. It isn't going to take care of our elderly or our babies.
However, I do think the ramifications in technology and crime will be profound. The ability to access and use massive worldwide data will greatly speed the development of new ways of doing things. I'm not sure how well our species is going to deal with it. Most likely it will be the usual combination of bad and good.
It's hit a group of creative people who thought they had something that set them apart: creative skill. And now they're finding out that all their hard work doesn't matter; audiences (and bosses) don't CARE if it's mediocre, they just want to churn out content, and attention to detail can fuck off. The fact that this is cannibalizing their creative works is just an extra kick.
Best case scenario: Maybe what will happen is that artists will be forced to disengage from traditional commerce / market values and go back to the time when they had PATRONS (czars, the Church, Elon Musk, whatever)... hopefully with patrons who will not force an agenda on them all the time, only some of the time. This concept does exist already, even in Hollywood and such. I am saddened by what AI has done to visual art, but I think ChatGPT etc. is still pretty shit and not a real threat to creative writers - as opposed to 'content creation'... Yet.
Respectfully disagree. Our grandparents’ generation was the first to develop the tools which could enable a small handful of people to wipe human civilization from the face of the earth.
In just the last 200 years, we’ve gone from a world lit by fire to one blazing with electricity, a world where no one could travel faster than a horse or sailing ship to one where a person can traverse a continent in hours, and a message can cross the globe in milliseconds. A world where the population exploded from one billion to eight billion. A world which has sent emissaries to the moon. All of this is just a sample of what’s happened in the last 0.1% of the era of Homo sapiens. This *is* an extraordinary time.
That’s not narcissism; *I* had nothing to do with it and take no credit. In certain respects, it’s still true that there’s nothing new under the sun. People are still people, and certain fundamental truths about life and humanity still obtain. In the grand scheme of human history, though, I think it’s hard to deny that this is an era of remarkable and accelerating change.
Things are getting faster, and more complex, and more chaotic. We're more connected, but further from each other. No one knows exactly how to change anything; any action taken by the ordinary person, from voting to protesting to smashing windows, seemingly has no effect.
If human vanity and regression to the mean are going to be the lens we look through to try to understand AI and the future, then we should also look at what AI truly is: a human-made tool. From fire to the wheel to the ship to the splitting of the atom, each of these tools came with both good and bad intentions. Fire made food; the wheel, commerce; the ship made discovery possible; and the splitting of the atom ended large-scale wars. They also contributed greatly to the misery of humankind. AI will be no different. It will bring about much that is good and useful, but under our current system of governance in the West, which has inarguably become highly militarized by the capture of both the media and government, it's not difficult to see where those with bad intentions will want to lead this man-made tool: power and control. Is it so difficult to see how AI can and will be weaponized in the future, if for no other reason than that humans incentivized toward such a use of the tool exist in large numbers, especially in the US Pentagon? We don't have to predict what AI will ultimately become; we only have to look at the same historical motivations that repeat themselves, and at the humans incentivized to use powerful tools like AI, to understand what it can become.
I think the line of thought here is that A.I. is the next great step from the world wide web, which was the next step from the printing press/ "linear mind" which was the biggest step since the creation of civilization, which was the biggest step since fire.
So if that logic tracks, we're seeing exponential growth in some ways: a hundred thousand years ago, six thousand years ago, five hundred years ago, twenty-five years ago, this year.
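The acceleration the comment describes can be sanity-checked with quick arithmetic. The dates below are the commenter's rough approximations, not sourced figures, but the pattern they imply is clear: each gap between leaps is an order of magnitude or so shorter than the one before, which is what a roughly exponential speedup looks like.

```python
# Approximate "years ago" for each leap the comment lists (rough figures).
steps = {
    "fire": 100_000,
    "civilization": 6_000,
    "printing press": 500,
    "world wide web": 25,
    "AI": 1,
}

# Ratio between successive intervals: how many times shorter
# each gap is than the gap before it.
ages = list(steps.values())
ratios = [earlier / later for earlier, later in zip(ages, ages[1:])]
print(ratios)  # each successive gap is roughly 12-25x shorter
```

If the gaps shrank by a constant factor instead, that would be plain exponential acceleration; here the shrink factor itself wobbles upward, which is the "in some ways" hedge doing real work.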
Information increases, and the rate at which it accumulates and spreads increases, and it multiplies and mutates a billion different ways. And so fast - people around the globe can become aware of your funny cat, social faux pas, or revolutionary idea. And there is no control; capital can try to move information towards its ends, but the Internet is too big and wild for any power to tame.
Hmmm, I think Freddie is doing a bit of safe uncertainty here, and perhaps ignoring even the possibility of a fat-tail or black swan event.
And while I think the title of the Free Press article leans into false-dichotomy territory, Bari Weiss’s actual conversation with Sam Altman is considerably more nuanced than that. I would suggest judging the conversation and not just the title/byline.
I'm reminded of Nietzsche's observation, that we are tempted to adopt a fallacious "belief in opposites" instead of recognizing that the course of human affairs is defined by gradations and subtleties. Of course, gradations and subtleties don't sell subscriptions to The Free Press . . .
One of the things that Marx intuited about capitalism is its capacity to generate social and cultural changes that are unpredicted and are never really decided on by any central political decision-maker. When did any government of any advanced capitalist democracy decide that the fertility rate of that country would be below replacement level by 2020? Yet now this is the case in every one of them with the exception of Israel. When cell phones (or mobile phones as we call them in the Antipodes) began to appear in the 1980s, who was predicting that by 2020 possession of one would be a quasi-compulsory condition of social citizenship, not to mention all the possibilities (positive and negative) that they have opened up?
"One of the things that Marx intuited about capitalism is its capacity to generate social and cultural changes that are unpredicted and are never really decided on by any central political decision-maker."
Interesting sentence and idea... What are the alternatives? A top-down command economy? Who is at the top? It was a disaster when it was attempted in the 20th century. It might work in traditional societies, but reverting to the idyllic self-sufficient villages that never existed means the return of nasty things like blood feuds, small-scale genocide, starvation... and the loss of running water. Our lifestyle requires lots of flowing electrons, and that won't happen without huge collective societies.
How should they be governed and by whom? We aren't governed by capitalism. Capitalism is a process of dividing goods and services. The ills of capitalism could be solved by the people in the society. Citizens in a democratic society could collectively decide that having criminals hiding their loot in Delaware is wrong, and make them show a driver's license when they open up an LLC. (Is there a legit purpose for a shell company?) We could decide that it's important to have competition in the marketplace and demand politicians put an end to monopoly.
I totally believe in the rightness of from each according to their ability, to each according to their need (or however that beautifully goes), but who decides need and ability? Who makes the rules by which we play the game? We either have a king or some form of democracy. If there's a third way, what is it? AI?