AGI, Immortality, & Visions of the Future with Adam Becker
54 min
• Nov 28, 2025
Summary
Adam Becker, physicist and author of 'More Everything Forever,' critiques tech billionaires' visions of AGI, Mars colonization, and immortality as scientifically flawed and driven by hype rather than feasibility. The episode explores how science fiction is being misinterpreted by tech leaders to justify unrealistic promises and massive wealth concentration.
Insights
- Tech billionaires conflate business success with scientific expertise, leading to incoherent predictions about AGI and space colonization that lack scientific grounding
- Moore's Law and similar exponential growth trends are business decisions, not laws of nature, and semiconductor advancement has already plateaued
- Mars colonization faces insurmountable biological challenges (radiation, life support, communication delays) that make near-term settlement implausible despite billionaire claims
- Science fiction is being weaponized as marketing narrative rather than cautionary tale, with tech leaders creating 'Torment Nexus' scenarios they claim to prevent
- Wealth concentration enables disproportionate power over policy, legislation, and societal direction without corresponding scientific or ethical accountability
Trends
- Tech billionaires using science fiction narratives to justify massive R&D spending and government subsidies without realistic timelines
- Shift from solving Earth-based problems to escapist space/digital immortality solutions as wealth concentration increases
- AGI hype cycle driving venture capital and government funding despite lack of consensus on feasibility or safety mechanisms
- Misinterpretation of dystopian sci-fi (Blade Runner, Neuromancer, Skynet) as aspirational blueprints rather than warnings
- Progressive taxation and wealth caps re-emerging as policy discussion in response to billionaire-driven technological direction
- Critical gap between scientific literacy and business/political influence among ultra-wealthy tech leaders
- Consciousness uploading and digital immortality gaining mainstream venture backing despite neuroscience consensus on implausibility
Topics
- Artificial General Intelligence (AGI) feasibility and timelines
- Mars colonization technical and biological barriers
- Moore's Law plateau and semiconductor advancement limits
- Consciousness uploading and digital immortality
- Singularity theory and Ray Kurzweil's predictions
- Tech billionaire influence on policy and legislation
- Science fiction as cautionary tale vs. business blueprint
- Radiation exposure in deep space missions
- Transhumanism and life extension technology
- Wealth concentration and power asymmetry
- AGI safety and control mechanisms
- Space-based energy collection (Dyson spheres)
- Self-driving vehicle oversight and safety
- Progressive taxation policy effectiveness
- Scientific literacy in tech leadership
Companies
OpenAI
Sam Altman, CEO, claims AGI is achievable within two years and will solve all problems, including climate change
Tesla
Elon Musk's company; discussed regarding Mars colonization goals and Cybertruck design inspired by Blade Runner
SpaceX
Elon Musk's rocket company pursuing Mars colonization with goal of 1 million people by 2050
Amazon Web Services (AWS)
Jeff Bezos controls a majority of cloud infrastructure; represents the hidden power of tech billionaires beyond consumer-facing products
Blue Origin
Jeff Bezos's space company; mentioned as billionaire-owned rocket venture alongside SpaceX
Google
Parent company of Waymo autonomous vehicles; discussed regarding remote human supervision of self-driving cars
Waymo
Google's autonomous vehicle division; uses remote human supervision despite claims of autonomous capability
Intel
Co-founder Gordon Moore formulated Moore's Law; company drove semiconductor advancement that has now plateaued
Andreessen Horowitz
Venture capital firm led by Marc Andreessen; major funder of tech billionaire ventures and AGI research
Machine Intelligence Research Institute (MIRI)
AI research group inspired by Singularity theory; doesn't offer 401Ks to employees due to belief in an imminent technological singularity
People
Adam Becker
Physicist with PhD in computational cosmology; author of 'More Everything Forever' critiquing tech billionaire visions
Elon Musk
Tesla/SpaceX CEO; primary target of criticism for Mars colonization claims and dismissal of empathy as valuable
Sam Altman
OpenAI CEO; claims AGI achievable in 2 years and will solve climate change and create space exploration jobs
Ray Kurzweil
Futurist and singularity theorist; predicted singularity arrival in 2045 based on Moore's Law extrapolation
Marc Andreessen
Venture capitalist and tech industry influencer; represents the billionaire class driving AGI and space colonization narratives
Jeff Bezos
Amazon/Blue Origin founder; controls AWS cloud infrastructure and pursues space ventures
Neil deGrasse Tyson
Astrophysicist and StarTalk host; moderates discussion and provides scientific perspective on tech claims
Andy Weir
Author of 'The Martian'; discussed regarding perchlorates on Mars surface that would poison crops
Gordon Moore
Intel co-founder; formulated Moore's Law describing semiconductor advancement that has now reached physical limits
Rod Serling
Twilight Zone creator; cited as example of using science fiction to critique society through allegory
Ursula K. Le Guin
Science fiction author; example of using speculative fiction to examine poverty, capitalism, and gender
William Gibson
Cyberpunk novelist; Neuromancer cited as warning about wealth concentration and technology-enabled power asymmetry
Frank Gorshin
Actor who played Riddler in Batman; referenced for Star Trek episode about racial prejudice
Julius Caesar
Invoked as a historical dictator; used as an analogy for billionaires creating an uncontrollable AGI
Franklin D. Roosevelt (FDR)
Referenced for 90% top progressive tax rates that funded public infrastructure while still allowing wealth accumulation
Quotes
"We don't know whose hands are on the steering wheel, we don't know who's shaping this future, and that's why there's concern."
Neil deGrasse Tyson•Opening segment
"The whole idea is kind of nonsense to begin with. Intelligence is like a really complicated thing. It's not one number."
Adam Becker•Singularity discussion
"If you did build one that could solve global warming and you turn it on and said, how do you solve global warming? I'm pretty sure the first thing it would do is say, well, you shouldn't have built me."
Adam Becker•AGI climate change discussion
"They're so stupid, they have hundreds of billions of dollars and you don't. But that's what makes them so stupid."
Chuck Nice•Tech billionaire critique
"The problem isn't science fiction, the problem isn't science. The problem is like critical reading comprehension skills. Yeah, and money."
Adam Becker•Science fiction misinterpretation discussion
"You can't escape politics. You can't escape human nature."
Adam Becker•Mars colonization discussion
Full Transcript
Love me some future talk. But they spook me though, with where they take the end of civilization. But we don't know whose hands are on the steering wheel, we don't know who's shaping this future, and that's why there's concern. Well, I know who's saving it, and I'm scared to death. All right. It's you. Let's watch Chuck be scared to death as we discuss all the ways tech will be shaping our future. Coming right up, Star Talk, special edition. Welcome to Star Talk, your place in the universe where science and pop culture collide. Star Talk begins right now. This is Star Talk, special edition. Neil deGrasse Tyson, your personal astrophysicist. And when I say special edition, it means I turn to my right. And Gary O'Reilly is sitting there. I know. Hi. Where'd you get your British accent? Stolen. Just a thief. Chuck. Hey. You're otherwise known as Lord Nice, but we can call you Lord of Comedy. Lord of Comedy. Can we do that? Okay. Let's do that. So today, we're gonna explore a vision of the future. And everybody's got their take on the future. Yes. Everybody's got everybody. But they all have different takes because they come in from a different place. So you gotta hear it all if you're gonna assimilate it into something that you're gonna take action on. True. So either make something happen or prevent something else from happening. Yeah. So set us up, Gary. All right, so what does the future hold for us? That'll include scientists, science fiction authors, tech CEOs, and the so-called futurists. Everyone has their own idea for future technologies: visions of AGI, nuclear fusion, the singularity, transhumanism, living on Mars. We gotta get to the moon first. Stuff we don't. No, you go straight to Mars. Do we? So don't get me started. You're gonna ask to Mars. Oh! And there you have it. Stuff we talk about all the time on StarTalk. So, in the face of new technological developments, we're quickly going from science fiction to science reality. 
But are we headed towards the utopian? Or are we headed towards dystopia? That we'll get into. Are these technologies as close as they claim? Is science fiction always a guiding light, or can it be a blueprint for those in power? So on that note. So who do we have today? Adam Becker, Adam, welcome to Star Talk. Thanks for having me. It's good to be here. All right, you got a PhD in computational cosmology. Yes. Love it. That was back in 2012. And you wrote a book in 2018. Called What is Real? Yep. That's audacious. The unfinished quest for the meaning of quantum physics sounds like there's a little bit of philosophy in there. Yeah, yeah. Or a lot of philosophy. Oh, there's some. Yeah. And you just came out with a new book because you've been a science-popularizing maniac ever since your PhD. Pretty much. Yeah, yeah. So here's the title. I love this: More Everything Forever. Mm-hmm. AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity. Oh, I got a better title. We're Effed. You know, you should have had that title. Yeah, you know, we considered it. But we just didn't think that it would really, you know, sell. So my researchers told me that we've corresponded before. Cause I've only just met you now. So what? Yes, that's true. So they told you we corresponded, but they didn't tell you, wow. Oh, okay. So what happened? So they set me up. That's what happened. Oh, someone like us. Oh, for real? What? Oh, God. Well, what happened was. What happened was, exactly. I was a snot-nosed kid in grad school and came to visit the museum and noticed what I thought was a mistake on one of the plaques. And so I emailed just the general astronomy department email here at the Rose Center. And then two weeks later, you wrote back. Oh. And what did he say? Yeah. Well, I'm sure I wouldn't even pull out. Here's my question: was there a mistake, and did Neil school you if there wasn't? Whether there was a mistake or not was a matter of some debate. 
Oh, really? Yeah, so the question was the size of the universe. Oh, yeah. It's a plaque. Oh, okay. There it is. Well, that's still a debate today. Well, no, not in the way that he's described. No, not like this. Yeah, yeah, yeah. So ideal and observables, okay. Right. As do many practical scientists. Right. So when you speak of what the universe is doing, you speak of what you see it's doing. Right. And we can see galaxies whose light has been traveling for 14 billion years, 13.8 billion. And so we will loosely say, it's definitely the age of the universe, but we speak of the size. We can be a little sloppy and say it's 13.8 billion light years to the edge of the universe. But that's not strictly accurate. What you have to do is since then, the universe has been expanding. So where's that galaxy now? It's like 45 billion light years away, but you can't see it. So you have to stick it into a model of the expansion rate of the universe and come out with a number that you cannot observe. So you being snotnosed about that. But that's fine. Tell me I was polite, because I think I'm polite. You're polite. You're polite. We had a little back and forth. And eventually, I think you were getting a little impatient. Oh, really? And you said, why don't you make a presentation of this to a wide audience in a way that you think is suitable. I remember that in correspondence now. Okay, okay, I get it. So you said that. And then I went up and did a podcast about it and sent it to you. Okay. All right, so in your book, Overlord Space Empires, you just go all out and you're coming to it from a physicist with a philosophy flavor. So you're gonna see this in ways pure tech people wouldn't or politicians or just regular everyday folk walking up and down the street. So how did you prepare for this book? 
I read a lot of really bad writing by tech CEOs and people defending tech CEOs online, writing long essays and books about why the future is inevitably going to be all about super intelligent AI. Why the future is going to inevitably be about space-based- Yeah, exactly. Some of these things were ideas that I had the expertise to say, okay, no, that's not true and here's why. But some of them were ideas about biology or areas of physics that I don't have expertise in or other things. And so then I went and interviewed a bunch of people who have expertise in those areas. Now you're playing journalist in that capacity. Yeah, yeah, yeah, yeah. And like read books on those subjects and pulled out what I needed and stuff like that. And then I tried to interview the tech CEOs themselves and almost all of them said no. No, right. Yeah, because they're looking at you as somebody who is intellectually honest with some integrity and they're like, we can't talk to you. Okay. Because they know they're full of crap. Yeah, well, I think that they just didn't see any reason to, right? You know, like I was very honest. I said, you know, like this is a book that's going to take a critical look at your ideas. That was your problem. Well, if you had gone in and said, I'm enamored of the fact that AI is going to be such an integral part of the next chapter in human history. And that you guys are the progenitors of this amazing tech. They would have been like, come on in, let's talk for a second. No, you don't need an appointment, stop by anytime. That was the first mistake. Yeah, man. Well, but you know, I got journalistic integrity. I think. All right, well good. Well, look, let's explore some of the scenarios that are going to be potentially the reality of us as a human race. It's going to go right on down the list. Yeah, the laundry list. Okay, well, let's look at Mars in 2050. Oh yeah. Are we saying maybe, maybe not, you're kidding me. Oh, that's definitely going to happen. 
Yeah, Elon Musk, he has said he wants to put a million people on Mars by 2050 to have a self-sustaining civilization that will survive there, even if, you know, the rockets from Earth stop coming because there's been an asteroid strike or nuclear war or something here. That's definitely not happening. There are a lot of reasons why that's not happening. Getting anyone to Mars by 2050 and bringing them back alive or just having them live there for a while, that would be incredibly difficult. The challenges just to put boots on Mars, the way that we did on the moon are enormous, right? Just learning how to keep someone alive in deep space that far away from Earth for as long as it takes to get to Mars, stay on Mars, come back. We do not know how to do that yet. Chuck, that's the problem. They want to put boots on Mars instead of sneakers on Mars. Sneaker contract. They'll pay the whole way. Nike would have been there by now. They just do it. Yeah, exactly. Absolutely. They just do it. Well played. It ain't about boots, it's about sneakers. I mean, so the- What are the biggest challenges of going that far into space? Is it radiation or- Yeah, there's radiation. And that's not just when you're in space. That's also when you're on Mars, right? You know, the two things that primarily protect us from radiation here on Earth are our, you know, the Earth's magnetic field and the thick atmosphere that Earth has. Mars doesn't have either of those things. So when you're on the surface of Mars, you're getting pretty much the same radiation dose that you do, like, out in space. And that's not good, right? You know, like the thing that I tell people is, the movie The Martian is science fiction. One of the things that's science fiction about it is, if Mark Watney really, you know, had to do all the stuff that he did in that movie, he'd come home and he'd be dead of cancer in a couple of years because he had too much radiation from his year hanging out on Mars. Mm. 
I'm Brian Futterman and I support Star Talk on Patreon. This is Star Talk with Neil deGrasse Tyson. What about the ISS? If Scott Kelly could stay up there for a year. One of the twins. One of the twins. One stayed in space and one on Earth. Right? And couldn't you just extend that for whatever time necessary to go to Mars? Even if it's not to live there, if it's just to go there and dig a hole and come back. Right, so there's a couple of things. First of all, on the ISS, they're still in the Earth's magnetic field. They still have a bunch of the shielding. Oh, wait, and what's that called, Neil? Wait, the field that goes all the way out like that? Oh, yeah. Oh, it's called the, oh. Magnetic tube. No, it's not the magnet, it's the magnetosphere. Yeah, yeah, think of X-Men. Yes, the magnetosphere. Yeah, exactly, it is like the X-Men. Yeah, they've still got that protection. Also, if something goes wrong on the ISS, they'll be back on the surface of the Earth in a matter of hours. Like they can just abort and come back home. And most, I mean, you can, you come out, you're down within a half hour. Exactly. The hours is you want to line up so you don't land in the middle of sharks. Right, exactly. Yeah, yeah, yeah, yeah. So you can get out easy. And you can also have a real time conversation with people on the ground. Okay. Because they're not that high up. And so the speed of light delay with the conversation doesn't matter. On Mars, it's a minimum of something like, I think, eight minutes each way and a maximum of something like 15 or 20 one way. And so if you send out a message, you are waiting at least 15, 20 minutes to get a message back, maybe more like 45. Better be a good message. Yeah. It better not be like, so how's it going? Over. Yeah. Yeah, put some context in there. Or watch out for the cliff. Yeah, exactly. Yeah. 
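The speed-of-light delay discussed here is easy to rough out. A minimal sketch; the distances below are standard Earth-Mars orbital extremes, assumed here rather than quoted in the episode:

```python
# One-way light travel time between Earth and Mars.
# Distances are approximate orbital extremes (assumed values,
# not figures quoted in the episode).
SPEED_OF_LIGHT_KM_S = 299_792.458

def one_way_delay_minutes(distance_km: float) -> float:
    """Light travel time in minutes for a given distance in km."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60

closest_km = 54.6e6   # near closest approach
farthest_km = 401e6   # near superior conjunction

print(f"closest:  {one_way_delay_minutes(closest_km):.1f} min one way")
print(f"farthest: {one_way_delay_minutes(farthest_km):.1f} min one way")
```

At the far end of that range, a single question-and-answer round trip runs about 45 minutes, matching the "maybe more like 45" figure in the conversation.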
And the other thing is like, if you have a problem on the surface of Mars and you want to come back, that's going to take you at least nine months, maybe more if you happen to be near a launch window where the earth and Mars are like in the right positions. If you're not near a launch window, it could be well over a year before you can come home. And a full up round trip mission to Mars with ideal launch and return parameters is multiple years. Yeah. Right. But you can get to the moon and back in a week. Yep. In like a new cycle. Yep. Right. So if we overcome the logistics of getting from earth to Mars, if big if, where are they going to live? Because they're not going to go out there and start building. Yeah. And they're... And why don't you just build a little sort of half underground thing that shields you from, sure, radiation. Would you live in a tent? Yeah. Well, so then you have other problems, right? Yeah. You know, there's no air. You can bring in oxygen or, you know, do some sort of reaction to make oxygen on the surface, which yeah, you can do that, but it's not the easiest thing. You got to bring in all your food. Can't grow it there. The Martian surface, the dirt on Mars is filled with toxic chemicals. You're going to have a hard time getting it out of stuff because it's very fine. It's not... It's going to be, that's going to be here on earth soon too. So chemical... I mean, let's be for real. But we know you can grow poop potatoes on Mars. Right. Yes. We know that. Yes, exactly. Yeah, there was a proof of concept in the movie The Martian. It does. No, but actually it's funny though. The guy who wrote the book, what's his name? Andy Weir. Andy Weir, yeah. In fact, we had him on the show. He's in our archives. He has said that, you know, the discovery of these particular poisonous compounds in the Martian surface called perchlorates, he didn't know about that when he wrote the book because it wasn't widely known. 
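The launch-window spacing described above follows directly from the two planets' orbital periods. A quick sketch; the period values are standard figures assumed here, not numbers from the episode:

```python
# Earth-Mars launch windows recur once per synodic period:
# the time it takes Earth to "lap" Mars in their orbits.
EARTH_YEAR_DAYS = 365.25  # assumed standard orbital periods
MARS_YEAR_DAYS = 687.0

def synodic_period_days(p_inner: float, p_outer: float) -> float:
    """Synodic period of two bodies with orbital periods p_inner < p_outer."""
    return 1 / (1 / p_inner - 1 / p_outer)

window_gap = synodic_period_days(EARTH_YEAR_DAYS, MARS_YEAR_DAYS)
print(f"launch windows about every {window_gap:.0f} days "
      f"(~{window_gap / 30.44:.0f} months)")
```

Miss a window and the next one is roughly 26 months away, which is why a crew that isn't near a launch window could wait "well over a year" before coming home.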
And so now we know if you tried to, you know, farm poop potatoes on Mars, they'd be poisonous. Mm, yeah. So that's the unknown unknown. Yeah, yeah. Right, exactly. And that's still out there. Okay, that's not going to work. We're not thinking that for some time. Functional immortality. And there's a lot of ways we can get there. I mean, the biological immortality of growing organs in pigs and things and then transplanting is one thing, but are we getting towards singularity? Yeah. So, I mean, the biological replacing organs thing, you know, you can't replace the brain. Well, not yet. Yeah. I mean, not yet. We'll never do it with that attitude. No! Sonny! That was more a British approach to things rather than American. Someone needs a better attitude about things. But yeah, this idea of the singularity that we're going to get to this point where technology in general and AI in particular gets faster and faster and smarter and smarter until it gains god-like powers. It's a science fiction story. What does it have to do with living forever? Well, so the idea is that then you get this god-like AI that grants us immortality. It has essentially magic powers. Or it finds a way to take human minds. I mean, it's just smart enough to figure out how to make us live forever. Oh, okay. I can solve the problem. Yeah, immortality. Yes. I got you. But whose mortality problem is it going to solve? Yeah. Well, it doesn't have one. So... I'm just saying. There's no problems at all. Are there a select few or is this open for everybody? Oh, well, we know for a fact that it's going to be for a select few and it's going to be for the people who are the gatekeepers to AI. We're already seeing that now, but go ahead. Well, but the other thing is that the whole idea is kind of nonsense to begin with. Like this idea of singularity is, well, it's based on a few really seriously flawed ideas. First, this idea that there is this single thing called intelligence. 
You can just ramp it up or down in a computer and it can just make itself more and more intelligent. That's not really how intelligence works. Intelligence is like a really complicated thing. It's not one number. And also the usual way talking about the singularity that the main popularizer of the singularity, Ray Kurzweil has. Who's been on the show? Who's been on the show. In our archives. I love these little commercials. I love that. Okay, go on. But yeah, Kurzweil, he says it's... He came out with a second book. The first one was The Singularity Is Near. Yes. You know the title of his second book? Yeah, The Singularity Is Nearer. Nearer? Yeah. That's so funny. Now I tell people that and they don't believe me. I'm like, go look it up. It's out. And his next book coming out is going to be called Almost There. Yeah. No, he thinks that Moore's Law, this idea that computer chips are just gonna get faster and more powerful and like double in speed every 18 months. He thinks that this is this specific instance of a more general like law of accelerating returns in technology and in nature. And he says he's traced it all the way back to the beginning of the universe and that it shows that a Singularity is coming in like 2045 or something like that. Precisely. Yeah, precisely. On October 12th. Oh man, that sounds to me like the end is near. Yeah, I know. Sounds to me like the people who are like, I don't need a bank account. You know Jesus is coming back next week. You know, it's funny that he's... The end is near. That's what he should have made the title. So it's funny that you say that, right? Because he's got a picture of himself and the Singularity is Near with one of those like poster boards on him that says the Singularity is Near. Oh. Yes. And there's an AI research group that's inspired by these ideas of Singularity called the Machine Intelligence Research Institute, MIRI. They don't give their employees 401Ks because they think the end is near. That's some cheap ass. 
Yeah, I know, right? Oh, right. Wow. Yeah. Okay, I still want to get to the immortality because you haven't addressed the fact that right now with or without AI, there's a lot of research on not just replacing organs, although that might be in the offing, but delaying the aging functions of yourselves. Totally. That could work out to extend human lifespan or health span a certain amount. For a long period. I like the health span. Yeah, yeah. And I'm really interested in the health thing. That's a good word. Yeah, yeah, yeah. What do the Galapagos, what do they live to be? Like a hundred and... The tortoises? The tortoises, yeah. Like 200 or something. 200 or something, about 80 is the answer. They have AI, so that's how they live. Yeah, yeah, yeah. Remember we spoke with Venki Ramakrishnan. Yes. He was telling us about the Greenland shark. Yes. Being about 800 years of age. 800 years, yeah, that's crazy. Yeah, it is possible that some, you know, biotechnology will be developed that will radically extend human lifespan or health span. Maybe, yeah. But what these guys are talking about with singularity, they're generally not talking about that as the end game. The end game they have in mind is not just an extended lifespan, but real immortality by uploading their consciousness into the future. I was gonna say, that's really where this is going. Oh, yeah. So before we get to this. That's the immortality. Immortality of your mind. Yeah, yeah, yeah, yeah. Forget your body, yeah. Before we get to that, do we visit transhumanism? Do we get, and how are we defining? What is transhumanism? That's what I'm just saying, define that. Yeah, it's this idea that you can use technology to transcend like the limits of human biology and physics. Are we kind of already doing that, which is why we live twice as long as people 150 years ago? Yeah, no, there's definitely a sense of which. 
We're kind of already, if we told them what we're doing, we know about nutrition, vitamins, they'll say, what's a vitamin? Right, we've got vaccines. We got this, we got, what's a vaccine? We got, right? Are we already transhuman? Yeah. Compared to what the age nature would require us to be dead at? Totally, yeah, like, look, I think that we have used technology to make many things much better about being alive. Like, that's just true. The question is, does that trend continue indefinitely? No, because RFK is gonna make sure we go back. Back. So when we lived half as long as we do, that's what's happening. Let's be clear, that's RFK Junior. Yeah, that's RFK Junior. If we go to this uploaded consciousness and that becomes reality, that just doesn't exist without a power source. Right, the thing about the singularity and like Kurzweil's idea about like this accelerating returns and Moore's law just going on forever and you know, this power source thing, right? The idea that it would need increasing levels of power as well. And so this leads to this sort of exponential drive for materials and power. And the thing that Kurzweil forgets is exponential trends are not like laws of nature. The law of nature about exponential trends is they end. Right, they have to end. They have to end. And so, and it's because ultimately limited resources like energy, right? And so. Although, yeah. When we talk about how much longer a charge in our computer lasts today compared with the early days of laptops. Part of that is better batteries, but also part of that is more efficient chips. And when we get to quantum computing where much more computing happens in much less with much less of an energy draw, it could be that we're coming at it from the other side where the energy needs are dropping, thereby not requiring the power supplies necessary. I have long, well, in my memory, not your memory, I got a few years on you. A room this size was necessary. 
To cool a computer, otherwise a computer would overheat and the computer's doing like four function mathematics. So the efficiencies matter. All these tubes that had to be kept cool. So it's not obvious that it's a linear exponential. Can I say that? Where the exponential is just gonna hit a limit because you can come at it from other directions. Yes. However, the other thing is though, in nature, the exponential acceleration, it's more like the law of diminishing returns is more likely than the law of exponential acceleration. Well, no, that's actually exactly right. Yeah, because there's, if you look at the history of Moore's law, like how it is that the semiconductor industry. Oh yeah, named for Gordon Moore. Gordon Moore. Co-founder of Intel. Yes. If you look at how Intel and other semiconductor companies actually made the chip smaller and faster over that time, it's not a law of nature. It's a decision, a business decision that these companies made. And in order to keep that trend going, they had to invest more and more and more money just to keep the same sort of level of doubling to keep that exponential trend going. And eventually it did stop, right? Moore's law is done. It's over. Because you can't make silicon transistors smaller than an atom of silicon. Right. Yeah, and what they're doing now is just adding more chips. So the more powerful computers are not smaller and denser, they're just bigger now. Yep. Right. Yeah, and they're putting them on top. You're stacking them, right? So the solution in the minds of these tech billionaires is to arrive at a superintelligence to get an AGI. Yeah, yeah, an artificial general intelligence. Sam Altman is saying within the next two years that will be achievable. Yeah, I think he said, yeah. All right, so they're looking at that as being the solution to this problem where we're saying we're not sure if it will be exponential, we're not sure where the endpoint is. 
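The point that you can't shrink silicon transistors past the atoms themselves puts a hard ceiling on any doubling trend. A back-of-the-envelope sketch; the start year, feature size, and doubling cadence below are illustrative assumptions, not figures from the episode:

```python
import math

# If transistor density doubles every `cadence_years`, the linear
# feature size shrinks by sqrt(2) per doubling. It cannot shrink past
# roughly one silicon atom, so the trend has a hard floor.
# Starting point and cadence are illustrative assumptions.
start_year = 1971          # early microprocessor era
start_feature_nm = 10_000  # ~10 micron process
atom_nm = 0.2              # rough silicon atomic spacing
cadence_years = 2.0        # one density doubling per ~2 years

# sqrt(2) linear shrink per doubling -> 2 doublings per halving of feature size
doublings = math.log2(start_feature_nm / atom_nm) * 2
end_year = start_year + doublings * cadence_years
print(f"~{doublings:.0f} doublings possible; trend must end by ~{end_year:.0f}")
```

The real curve bent well before this hard floor, for exactly the cost reasons described above, but the floor makes the point: an exponential in physical hardware eventually runs into atoms.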
They're looking at it as the solution to every problem. None of the tech bros have a degree in physics the way you do. So what are you bringing to the table that they don't see? I mean, they believe that AGI, I mean, Altman has said that AGI is gonna solve every problem, including like global warming, which is crazy. But it's crazy. Well, because... If it's smarter than you and you can't solve it, why is it crazy to think it could solve it? Well, first of all, the artificial intelligence systems that they're building now are just drawing more and more and more energy. If you did build one that could solve global warming and you turn it on and said, how do you solve global warming? I'm pretty sure the first thing it would do is say, well, you shouldn't have built me. Yeah, you turned me off. You turned me off. Yeah. That'll help. That would be a good test of its own self-preservation. If you are causing most of our global warming, what's the best solution? Does it turn itself off? Yeah. But I mean, the other thing is that we don't need AI to tell us how to solve it. We already know what the solution is. The issue is not like that insufficient intelligence has been thrown at the problem. The issue is primarily not even a technological problem at all at this point, aside from carbon capture. The main issue is, yeah, it's human behavior. It's greed. Exactly. Yeah, it's greed. It's greed. And it's... Greed is good. Oh, God. Yeah. Yeah. But the other thing is just that when Altman talks about, he's talked about things like, oh, AI means that in 10 years, college graduates are gonna be getting cool jobs exploring the solar system, right? And I can just look at that and say, well, that's bullshit. Or he says, AI is gonna discover new laws of physics and that's gonna remove limitations that we have in the world today. And I'm like, well, discovering new laws of physics, I mean, putting aside whether or not the AI can do that, that does not always remove limitations. 
Sometimes new laws of physics, in fact, a lot of times, the laws of physics... Create a limitation. Create a limitation, exactly. Einstein, with relativity, discovered a limitation in the speed of light, right? Newton didn't know that there was any such limitation. Right. Yeah, right. So is it possible that all of these, I'll call them postulates, that they're making, right, are just a means of hyping up what they're doing to keep the revenue stream coming to them? Like, let's be honest. If I tell you this thing is gonna solve everything, right? I'll give you my money. You give me, right, you give me some money. I mean, it's kind of like... Not just me, the government will give you money. No, that's what I'm saying. Everybody's gonna give you money. It's kind of like... It's 21st-century snake oil. Is that what you're telling me? I was about to say something different. I was playing... It's kind of the evangelical business model of a television evangelist. Like, the whole idea is, hey, you got problems. And these problems can be addressed, they can be solved. All you gotta do is send me some money. That's all you gotta do. And I'm gonna send you this little blessing cloth. And you know... Chuck was a preacher in his earlier career. And if he wasn't, he will be in the future. He will be. I mean, all I'm saying is, I made the wrong decision. Preaching is good money. Everything we've discussed has been about being somewhere else. About not solving problems here. So are these people looking at Earth and going, you are completely screwed, we're out of here, and we're the ones that can afford it and we're the ones with the tech to be able to achieve it? Why are they walking away? What's in their mind? What's their thinking about turning their back and moving? Well, they think that... But they didn't grant you an interview. Yeah, they didn't grant you. So you don't really know what's on their mind. Oh, you can read enough of their stuff to have a sense of it.
Yeah, I was gonna say, to infer what's on their mind. They've given other people interviews, people who are nicer to them. I could be a nice guy. Are you guys charming enough? Yeah, apparently not. I mean, I just sent an email. A couple of them were almost willing to do it, and then they changed their minds, probably because they read the email again and they're like, oh, he's gonna disagree with us, why should we talk to him? But whatever. Some of them are being very cynical, like the way that you were talking about, right? And saying, oh, I can just do this. I can claim that all of these things are coming in the future, and this is a way of generating more profit and getting people to give me more money. Some of them, I think, genuinely believe it. The idea that the future has to be elsewhere, I think some of it is just from this sense that they have that things are bad here on Earth, and that trying to solve problems here on Earth would be complicated and messy and difficult, and that somehow going to space would give them a fresh start, which is not true. You can't escape politics. You can't escape. We're still human. Yeah, exactly. You can't escape human nature, exactly. How does an AI overlord plug into these scenarios? Well, the idea is that they build a sort of AI God, and it just does whatever they want. It puts them- And this would be an AGI. Yeah, yeah, artificial general intelligence. So when we normally think of AI, we think of a task-driven AI. It can drive a car. It can make a perfect cup of coffee. It can fly an airplane. But AGI transcends all of that. It can just learn about anything and might even achieve consciousness. Yeah, well, it would- Like Skynet. Yeah, absolutely, because, I mean, we are AGI. That's what we are. Yeah, we are just the equivalent of what they want as AGI. We're, like, AGI 0.10? Yeah, 0.10. No, 0.1. 0.1, yeah, okay. So- Yeah, no, that's right.
But the point is, whatever we are, we're AGI. So how long did it take you to go to school, open up all your textbooks, learn from them, and pass an exam? AGI will do what? Yeah, it's supposed to be able to do all of that much faster. In 20 minutes. Yeah, 20 minutes. Yeah, exactly, right. You'll get your college degree in 20 minutes. Well, and the other thing is, like, those AI systems, all the things that you mentioned, right? Make a perfect cup of coffee, fly an airplane, drive a car. The AI systems we have right now can't do any of those things without human supervision, right? Even those self-driving cars that are all over the streets of San Francisco, there's actually a human remotely supervising and intervening pretty frequently. So they tell you. Yeah. Oh, that's funny. You talking about the Waymos? Yeah, yeah, yeah, yeah, exactly. Waymo is Google, if I remember correctly. Yes, yeah, yeah, yeah, that's right. And so AGI is supposed to be able to do all of these things, like, independently, right? And then get smarter and smarter. And the human is not even in the equation. Human's not in the equation, and you can just make it go, you can have it do all the things that a human does, but you can, like, overclock it, make it go faster, think faster than a human. And then the idea is it gets smarter and smarter and achieves these, like, superhuman, superintelligent powers. The idea then is that for the billionaires controlling it, it's like a genie. And for the rest of us, it's an overlord. It's an overlord. Yeah. But this is where they're so stupid, and this is where all really rich people- They're so stupid, they have hundreds of billions of dollars and you don't. But that's what makes them so stupid. I agree. I'm serious. It's the fact that they have all this money and they've convinced themselves that they can transcend anything. It's evidence of their own genius. Right. So their hubris is their downfall.
It's like the first dictator. The very first dictator was a guy, a little guy by the name of Julius Caesar. But Julius Caesar was the very first dictator. You know how he became dictator? They said, all right, how about you be dictator, but you do it for a year. If you create a God-like being, whether it's artificial general intelligence or whatever, and you think that you're gonna control it, it is not a God at that point. You are the God. And that's really what they're saying. They're saying, we're gods. Yeah. And the thing is, like, that's true. If they somehow did achieve it, who would be controlling it? But also, it's an incoherent idea. The good news for the rest of us is that it's not, like, something that's actually coming, because it doesn't make any sense. That's funny. It's like these guys are hanging their hats on this, and you're just like, yeah, man, it's just a dumb idea. Yeah, it is. No, no, incoherent sounds way more of a beat-down. Yeah. Your idea is incoherent. You know what I'm saying? It's dunking on somebody right there. Is this the misconception of the science, the misconception of science fiction? Yeah, I mean, I think a lot of the ideas, and this goes back to, like, why are they trying to go somewhere else? I think they just get these ideas from science fiction and they just take it way too literally. They don't read it well, right? Like, my favorite science fiction, the science fiction I grew up with, was Star Trek, right? Yeah. The thing about Star Trek is, yeah, okay, they're on the Starship Enterprise, they're out there exploring strange new worlds, new life, new civilizations, all that jazz, right? Finish it. To boldly go where no one has gone before. Thank you. Not to go boldly. Yeah, not to go boldly, no, we can split the infinitive. Split that infinitive. Yeah, hell yeah. The thing is, though, Star Trek was never really about space. It's about, like, us, here, now, right?
And it was always an allegory, and not even a particularly veiled one, right? I seem to recall an episode where Kirk and Spock were literally punching Nazis, swastikas and all. And then there was also the episode with the two dudes, and one of them, like, the left half of his face was white and the right half was black, and the other one, it was switched. That's Frank Gorshin. Yeah, Frank Gorshin, yes. He played the Riddler in Batman. Yeah, yeah, yeah, yeah, exactly. So. It's obvious why we persecute them. They're black on the right side. We're black on the left side. That was kind of blunt. Yeah, exactly, but Star Trek is always blunt, right? And that's kind of part of the fun, right? It is. But these guys watch Star Trek and they're like, oh yeah, warp drive is cool, let's do that. And they miss the whole point of Star Trek. Exactly the problem. Yeah, right. Because Star Trek is utopian ideals in a galaxy that's descending towards dystopia, and they're fighting it every step of the way. Could it be that to become a tech bro in the first place, you had to be really focused, on a level to the exclusion of your social life and possibly even your personal hygiene? As a result, you achieve these places, and part of your life's training did not include the emotions and feelings of others, or how people think about the world, or what their desires are. And you think that what you accomplish is for their best interest, even though you have no idea who they are. Oh, wow. I think that's right. Right? Like, the way I like to talk about it is, for someone who claims to care about humanity so much, Elon Musk doesn't really seem to care very much about humans. And isn't he the guy who said empathy is a bad thing? Right, yeah. But he's also said, I'm gonna save humanity. Yeah, he did. But he's also said, I'm gonna save humanity by taking us to Mars. And, like, buddy, first of all, no. And second, like, I don't think that you actually care that much about other humans.
I think that what you said is exactly right, except you also have to add in, they think that the fact that they succeeded in business, and a lot of that's just luck, is proof of their- And government contracts. Right, government contracts, exactly. Through government subsidies, yeah. For the car business as well as the rocket business. Absolutely. But, like, Musk and others, right? Like Altman and Andreessen and these other people, they all think that this is proof that they are, like, the smartest people who've ever lived. These are the richest people who've ever lived. And that's just not how anything works. Just remind me, Altman is OpenAI? Yeah, Sam Altman is CEO of OpenAI. Just let's go down that list. Yeah, absolutely. And Marc Andreessen is the head of Andreessen Horowitz, the biggest tech venture capital firm. Oh, so you need that to clear up the confusion. So OpenAI is what gives us ChatGPT, got it, got it. And then of course we all know Elon. Is Branson a player? Branson is less of a player in Silicon Valley. What role in the tech sector does Bezos play? I mean, you know, he's... I mean, he's got his own rockets. Yeah, he's got his own rockets. He also, like, owns most of the infrastructure of the worldwide web. This I think is something that... AWS. AWS. Yeah, AWS. I think people don't recognize this. Which stands for what? Amazon Web Services. Basically, most of the cloud, most of the actual computers that compose the cloud, belong to Jeff Bezos. So Amazon.com is like window dressing on a whole other operation that matters to him. The real operation. Yeah, exactly. That's why he doesn't have to make a buck selling you a... He's gonna sell you a book for 80% off. Right. Well, after this, I'll be lucky if he sells my book at all. No, no, no, we're going there. Yeah, let me remind... But I'll say these guys... Let's get the title of the book back in here. Just give us the title again.
Yeah, it's More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity. That's what we're talking about. Yeah. Man, okay. So, when we think about the science fiction, and I think Neil's point about the isolation of these people growing up, if we think about the science fiction, and you think about certain parts of Star Trek and then maybe The Matrix or Blade Runner, and you go through the laundry list, how have they co-opted these and kind of bolted this to that and that to this... You think they were influenced by, say, sci-fi? Oh, yeah, totally. I mean, Elon Musk tweeted that science fiction shouldn't remain fiction forever. Okay, that's fair. Yeah, I sort of understand what he means, but which science fiction? Like, Blade Runner's a dystopia. And then he comes out and says that the Cybertruck, that ugly piece of crap, looks like something that Blade Runner would drive, and Blade Runner is not the name of any character in Blade Runner. Hopefully we can put that aside. It's just a profession in Blade Runner. Yes, exactly. That's great. But there's a number of dystopias. But they're all dystopic. Ready Player One is another one. They're all dystopic. Some of them are, I mean, right? There's aspirational science fiction like Star Trek, right? Right, but even that is, like I said... And they're the bright spot. Yeah, the bright spot. The Federation's definitely the bright spot. Yeah. No, no, no, that's true. But, like, there's a tweet that lives rent-free in my head about a thing called the Torment Nexus. This is actually in my book, at the very beginning. The Torment Nexus. The Torment Nexus. I'm afraid to ask what this is. But we're going there. We're going there, we're going there. Let's do it. So the tweet goes like this. Science fiction author: in my book, I created the Torment Nexus as a cautionary tale. Tech billionaire: at long last, we've created the Torment Nexus from classic sci-fi novel Don't Create the Torment Nexus.
This is what these guys are doing, right? The Skynet. Yeah, right, they're Skynet. But also, if you go back and look at the classic cyberpunk novels by somebody like, say, William Gibson, right, who I think is a great novelist, a lot of those novels, like Neuromancer, are about the concentration of wealth and power, and the way that the wealthy can and will use technology to remove themselves from the rest of us and accumulate wealth and power while insulating themselves from the consequences. And that's exactly what we see happening. And so when they say that they want to make science fiction into reality, we need to ask, okay, which ones? Because if you want to make Neuromancer reality, man, that's bad news for everyone who's not you. So how much of science fiction has always been the silent alarm call, a silent warning? Yeah, I mean, science- We go back to Fritz Lang and Metropolis, back in 1927. Yeah, absolutely, yeah. Metropolis is very much a movie about the need for emotional intelligence to keep pace with technology, right? I didn't get that out of it, but I believe you. I'm just saying, that's a deep read. To me, it was just a weird alien bot. I mean, it's a weird movie, for sure. But, like, at the end, they say the heart and the hand must work together, or something like that, right? Yeah, yeah, yeah, yeah. And so that's how I read that, at least. I think that science fiction, a lot of it, has always been about looking at the world as it is now and saying, okay, what if we push that a little bit? If we want to take a look at this situation in a different context and understand it in a different way by removing it from all of the sort of social and cultural connotations that a particular thing has here and now, we put it somewhere else and maybe we can see it more clearly, right? That's what Star Trek does. It's what, like, oh, my favorite science fiction author, Ursula K. Le Guin, did, right?
She did this over and over again, looking at poverty and inequality, capitalism, gender, you name it, right? So, Rod Serling, back in 1959, he's interviewed about this new show called The Twilight Zone, and he said, look, there are stories I'm telling that you could not tell in just a dramatic way. It has to be set at a time and a place that is not here and now, otherwise I couldn't get away with these stories. And only then do people say, wait, might that be me? But if it's blatant and in your face, you reject it. And he said, in the end, we're just trying to sell soap. He understood the situation: tell an entertaining story, but set it in another place. I just looked up when Skynet achieved consciousness. It was 2:14 a.m. Eastern time, August 29th, 1997. Because the movie was 1984 or- Yeah, the first movie I think was '84. So that was only 13 years in the future. I hate when they do that. Go far enough where- Oh, I have a whole list. Well, that was the whole thing with Star Trek. It was set something like 200 years in the future. Yeah, that was safely in the future. I got a whole list. I'm saying, go safely into the future. Do you know Soylent Green was set in 2022? Oh, God. Well, that's why I'm eating people now. It's people! Everybody doesn't realize that's what the pandemic was about. It happened in Black Panther. I enjoy that burger. So what else is in your laundry list here? Basically, I think that what they want, this vision that they have, is this idea of going to space and living forever, right? And so a lot of it is really about space colonization, going out and expanding to take over the universe. Because they don't want to just stop with Mars. They want to put Dyson spheres around every single star in the observable universe and collect all of that energy. And this is not gonna happen, man. That'd be a Kardashev scale five, I think. Where you control all the energy output of all stars in the known universe.
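(Editor's aside: the Kardashev scale mentioned here grades civilizations by energy use. As a rough illustration, the standard order-of-magnitude thresholds, plus Carl Sagan's interpolation formula, put humanity somewhere around 0.7 today; all wattage figures below are textbook approximations.)

```python
import math

# Kardashev-scale thresholds (order-of-magnitude textbook values, in watts).
KARDASHEV_WATTS = {
    "Type I (one planet)":   1e16,  # solar power intercepted by an Earth-like planet
    "Type II (one star)":    1e26,  # total output of a Sun-like star (~3.8e26 W)
    "Type III (one galaxy)": 1e36,  # rough total output of a large galaxy
}

# Carl Sagan's continuous version: K = (log10(P in watts) - 6) / 10
def kardashev_number(power_watts: float) -> float:
    return (math.log10(power_watts) - 6) / 10

# Humanity's present consumption is roughly 2e13 W (~20 terawatts).
k_now = kardashev_number(2e13)
print(f"Humanity today: Kardashev ~{k_now:.2f}")
```

Harvesting every star in the observable universe, as described above, would sit far beyond Type III on this scale.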
But doesn't the Borg have some similar energy? Yeah, the Borg want to assimilate everything. And what was it, the phrase? They want to take your biological and technological distinctiveness and add it to their own. Speaking as a scientist, I kind of like what science brings society. And shouldn't that be enough? Why does everyone go to science fiction? Is there some morbid fascination with science gone bad? And isn't that a problem with us, not with the storytellers themselves? I mean, I don't think that science fiction is in and of itself the problem, right? Like, I'm a huge- That's what I'm getting at. Yeah, no, I'm a huge sci-fi fan. I'm also a scientist, by training at least. The reason people find science fiction more compelling than science has a lot to do with the fact that it's not really about the future, that it's sort of these interesting what-if scenarios that reflect on where we are right now, right? If you tried to make a very realistic TV show about what life could actually be like a hundred years from now, and made it as realistic as possible, people probably wouldn't watch it, because it would involve so much slang that doesn't make any sense to us right now, and shifts in language. OMG! Right, yeah. WTF! Yeah, exactly, and so many little things like that, right? It's not meant to be a realistic depiction of the future. So, I mean, part of me wants to say the problem isn't science fiction, the problem isn't science. The problem is, like, critical reading comprehension skills. Yeah, and money. Yeah, exactly. So the accumulation of wealth by a very few is always going to be a very bad thing for any society. But right now, unfortunately, there's a global society of billionaires that has popped up, and their- When did you become a Marxist? What's that? When did you become a Marxist? I'm not a Marxist, believe me. I'm pretty cool with capitalism. I'm just all about guardrails.
And I also believe that $2 billion is all you get to have, okay? Now that was pretty Marxist. Yeah, I think that was in Das Kapital: you only get to have $2 billion. You sound like a Marxist. Yeah. No more. No more. So, no. So basically you're saying, not enough pure science education but a hell of a lot of money. Right. Is a bad combination. Yeah, and I agree completely. So give us the takeaway thesis of your book. Oh, I mean, I actually do end the book saying that we should, you know, limit the amount of money that people should be able to have. I don't think you meant that. I don't think you meant that. What you mean is, we should limit how much power the people who have money have. Yeah, absolutely. The problem with that is, the more money you have... Money is, and they call it soft power, but it's not. It is straight hard power, because you are able to influence every single person in the world. And I don't think that's a good thing. You have that hard power because you are able to influence every corridor of power that there is when you have enough money. No, this is the same, man. You should follow me around and just say this stuff. Yes, and look better. Right, it's good. What you need to do is, this is where progressive taxation is a good thing. And we found that out under FDR, back in the day, where basically they were like, yeah, we're gonna take 90% of that, okay? And we're gonna take it and we're gonna do stuff, because you wouldn't have been able to get that much money without all the things that we now want to support with the money that we helped you make. But you get to keep, up until that point, pretty much all your money; when you get to this level, you're gonna give us that money. Give me that money.
But yeah, no, I think that we as a society, because it's not just the billionaires, it's also that we as a society buy into this idea that the ultra wealthy know what they're talking about when it comes to something other than the ins and outs of having more money. And they can be complete dumbasses. Yeah, exactly. You know what they're good at? They're good at rigging the game for them to make more money. That's what they're good at. And everybody thinks they're gonna be rich one day. Okay, and so I did the calculation for a billionaire, what it takes just to make a billion dollars. And I think I used $500 an hour, which is a very good amount of money. $500 an hour, 24 hours a day, seven days a week. And I think it came out to, like, you'd have to work 2,300 years. It's ridiculous. It's a ridiculous amount of money. That's what I'm saying. Well, you wanna know what makes it even more ridiculous? Turn it around. You have a billion dollars, you wanna get rid of it. You spend $500 an hour. An hour. 24 hours. 24 hours. And it takes you many times longer than a human life. What do you need more than a billion dollars for? That's my point. So you get $2 billion. And that's it. I did that calculation with Elon Musk's wealth. Oh, did you? Yeah, you turn all of his money into $100 bills. And lay them end to end. You ask, how far does it go around the earth? You can go several times around the earth with $100 bills. And then there's some leftover money. Tape them together into a ribbon. And you'll have enough leftover to go to the moon and back. Yeah, see, that's ridiculous. That's my point. That's freaking ridiculous. So now we feel like we have to protect these people. This is what I don't understand. The issue is the outsized power they have over laws, legislation, politicians. Absolutely. And the rest of us. I don't mind rich people, provided they're not trying to control my life. Exactly. Okay. No, I'm with you. We gotta land this plane. Okay.
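(Editor's aside, checking the arithmetic in the exchange above: at $500 an hour around the clock, a billion dollars takes about 230 years; the 2,300-year figure quoted matches a $50-an-hour rate. The ribbon-of-bills sketch assumes a US $100 bill is about 15.6 cm long and uses an illustrative, not current, net worth of $400 billion.)

```python
# How long to earn (or spend) $1 billion at a flat hourly rate, 24 hours a day?
def years_for_a_billion(rate_per_hour: float) -> float:
    return 1_000_000_000 / (rate_per_hour * 24 * 365)

print(f"$500/hr nonstop: {years_for_a_billion(500):.0f} years")  # ~228 years
print(f"$50/hr nonstop: {years_for_a_billion(50):.0f} years")    # ~2,283 years

# The ribbon of $100 bills: bill length ~0.156 m; the net worth is an
# illustrative assumption, not a quoted figure.
net_worth = 400e9
ribbon_km = (net_worth / 100) * 0.156 / 1000
earth_circumference_km = 40_075
moon_distance_km = 384_400
print(f"{ribbon_km:,.0f} km of bills, versus {earth_circumference_km:,} km "
      f"around Earth and {2 * moon_distance_km:,} km to the Moon and back")
```

Either direction, the point of the arithmetic stands: the sums involved dwarf any human timescale of earning or spending.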
I think I figured out what's going on here. A lot of smart people, a lot of wealthy people, a lot of people with influence, trying to figure out what kind of future we will have, what kind of future we should have. And we all know that future will pivot on advances in science and technology, as civilization has always pivoted on science and technology. But we're at a point now, and maybe we've been at this point before, so is this really any different? I don't know. But it seems like we have the future in the palm of our hands. And in the end it comes down to not how advanced the science is, not how clever anybody is. It's not related to any of that. It has to do with how wise we are in the face of our own creations. And wisdom, I think, is an undervalued factor in all the brilliance people are exhibiting in their creations, in their discoveries, in the forces shaping what the future of civilization will be. So, if I may, an appeal: not only think about how great your inventions and discoveries are, but think about how you might harness them as you harness a horse. An unharnessed horse runs wild. You don't know what it's gonna do next. A harnessed horse is still a horse, but it gets to do exactly what you need it to do and what you want it to do. And that is a dose of wisdom coupled with our ingenuity. I'd like to think there's more of that in our future. Maybe we'll avoid the disasters that the science fiction writers always portray. And that is a cosmic perspective. Dude, thank you for being on StarTalk. Thanks for having me. This has been a lot of fun. Good luck with the book. Just give me the title of the book again. More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity. You got it right. He did, right? Yeah. All right, back to Berkeley you go. Yes. And keep us thinking about the future. I will. There's not enough of that going on. Thank you.
Yeah, I'd be happy to come back anytime. All right, this has been StarTalk Special Edition. You put together another one with your peeps. Lane Unsworth as well. Take a large slice of credit. There you go. All right, Chuck. Always a pleasure. Well, all good here. Neil deGrasse Tyson for StarTalk Special Edition. Bidding you to keep looking up.