The Joe Rogan Experience - July 10, 2024


Joe Rogan Experience #2174 - Annie Jacobsen


Episode Stats

Length

2 hours and 50 minutes

Words per Minute

156.3

Word Count

26,635

Sentence Count

2,127

Misogynist Sentences

9


Summary

In this episode, Joe Rogan talks with Annie Jacobsen, author of "Nuclear War: A Scenario," about what would happen if nuclear deterrence failed. Jacobsen walks through the 72-minute timeline from a single launch to nuclear holocaust, the nuclear triad and the command-and-control system built around it, why interceptor-based missile defense offers little real protection, the close calls that nearly ended in annihilation, and the nuclear winter that would follow. They also discuss UFO narratives and Carl Jung, the military-industrial complex and Eisenhower's warning, the Reagan reversal and the case for disarmament, and the state of media, podcasts, and public attention.


Transcript

00:00:13.000 Good to see you again.
00:00:15.000 Hi Joe, thanks.
00:00:16.000 What's happening?
00:00:17.000 Thanks for having me back.
00:00:18.000 A lot's happening.
00:00:19.000 My pleasure.
00:00:20.000 I've heard a lot about your book.
00:00:21.000 I haven't read it, but I've heard a lot about your book from a lot of people that you freaked out.
00:00:26.000 Okay, well hopefully you can read it and then you can decide if you're of the freaked out crowd or of the really freaked out crowd.
00:00:33.000 Oh, there's only two options?
00:00:35.000 I think so.
00:00:37.000 It doesn't end well.
00:00:38.000 That's the spoiler alert.
00:00:39.000 Okay, hold the book up.
00:00:40.000 It's called Nuclear War.
00:00:42.000 Yeah.
00:00:43.000 What motivated you?
00:00:45.000 Nuclear War, a scenario.
00:00:47.000 And a very plausible scenario from what I understand from all the defense officials I interviewed.
00:00:54.000 What motivated you to write this?
00:00:57.000 Well, six previous books on war, weapons, and U.S. national security secrets.
00:01:05.000 Imagine how many people told me they dedicated their lives to preventing nuclear World War III. And so during the previous administration, with the fire and fury rhetoric, I began to think what happens if deterrence fails,
00:01:22.000 that idea of prevention.
00:01:24.000 What happens?
00:01:25.000 And I took that question to the people who advise the president, who work at STRATCOM, who, you know, command the nuclear sub-forces, and learned that it doesn't end well.
00:01:40.000 Not only does it not end well, five billion people are dead at the end of 72 minutes.
00:01:47.000 Jesus.
00:01:49.000 You begin to realize when you—well, you quickly realize as you read the book that— Jamie, your mic's on or something?
00:01:57.000 Something just made a weird noise.
00:02:01.000 Oh, yeah.
00:02:02.000 It's Carl.
00:02:03.000 Oh, it's Carl.
00:02:04.000 I was like, what's going on?
00:02:05.000 It's the animal humor in the difficult subject.
00:02:10.000 You know, there's literally hundreds of thousands of people in nuclear command and control who practice, 24/7, 365, what would happen if deterrence failed and we had a nuclear war?
00:02:26.000 They are practicing this, Joe.
00:02:28.000 And it's like, talk about being behind the veil.
00:02:32.000 No one knows.
00:02:33.000 It's why I think the response to this book, it's been out for three months, has been so extraordinary.
00:02:40.000 And from both sides of the aisle.
00:02:42.000 Because people now are beginning to realize if nuclear war begins, it doesn't end until there is a nuclear holocaust.
00:02:51.000 And it happens so fast.
00:02:52.000 There's no quickly going to your secret bunker you have.
00:02:58.000 Yeah, all that's nonsense.
00:03:00.000 These people think, like, Zuckerberg's building a bunker in Hawaii.
00:03:02.000 He's going to survive.
00:03:03.000 He's building a hurricane shelter.
00:03:06.000 Yeah.
00:03:07.000 Even that might work.
00:03:08.000 Unless he happened to be there in the exact moment when all of this went down.
00:03:12.000 Yeah, you'd have to know in advance that we're about to launch.
00:03:15.000 The whole thing is...
00:03:17.000 Terrifying.
00:03:18.000 It's also...
00:03:20.000 I don't see a way that it...
00:03:23.000 Like, if you think about the timeline between 1945 and today...
00:03:29.000 It's kind of extraordinary that no one has launched a nuke.
00:03:33.000 But it almost seems like a matter of time.
00:03:38.000 It'll be a blip in history.
00:03:40.000 We look at it now.
00:03:40.000 Oh, because of mutually assured self-destruction or mutually assured destruction, that's what's prevented people from using nuclear weapons.
00:03:48.000 When?
00:03:49.000 Until now?
00:03:50.000 What is 80 years in terms of human history?
00:03:53.000 It's nothing.
00:03:54.000 It's a tiny little blink.
00:03:56.000 It's a little nothing.
00:03:57.000 And it could go sideways at any moment, and then we're back to the cave era.
00:04:02.000 We are hunter-gatherers again.
00:04:04.000 In the words of Carl Sagan, who is the author of Nuclear Winter.
00:04:08.000 But I think what's also crazy, to your point about 1945, is when this was all set up, when the nuclear arsenals were beginning, and I take the reader through it really quickly, because I want them to just get to what happens at nuclear launch.
00:04:24.000 I mean the book is really about nuclear launch to nuclear winter.
00:04:27.000 But the buildup is fascinating because first of all, it happened so incredibly fast and it happened under incredibly secret classified terms.
00:04:37.000 So there was no like outside opinion.
00:04:40.000 And originally, nuclear war was set up to be fought and won, which itself is absurd, and we know that now.
00:04:51.000 So the rules of the game have fundamentally changed, and yet the systems are all the same.
00:04:57.000 That's what I think is the most dangerous component of today versus, let's say, 1952, when the thermonuclears began.
00:05:07.000 And when you say the systems are the same, what do you mean exactly?
00:05:10.000 So the system of nuclear war, this idea that – well, okay, so let's start with some basic facts.
00:05:20.000 There's a nuclear triad.
00:05:21.000 You know what that – okay, so triad, really simple, three.
00:05:26.000 So we have missile silos.
00:05:28.000 There are ICBMs inside of them, 400 of them.
00:05:31.000 Then we have nuclear-powered nuclear submarines that launch nuclear missiles.
00:05:40.000 There are 14 of them.
00:05:41.000 Then we have the bomber force, 66 nuclear-capable bombers, triad.
00:05:48.000 The president chooses what elements of that triad he's going to use when he launches a counterattack if we ever are attacked.
00:05:57.000 That system essentially exists.
00:05:59.000 That was what was being developed in the 50s.
00:06:02.000 The only difference was in the old days it was we're going to actually use these and fight and win a nuclear war.
00:06:08.000 And now it's we're going to have these all sitting around ready to launch.
00:06:14.000 We have 1,770 nuclear weapons on ready-for-launch status.
00:06:22.000 Joe, they can be launched, some of them, in 60 seconds.
00:06:27.000 Jesus.
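A minimal sketch of the triad numbers quoted above, using only the figures from this conversation (400 silo-based ICBMs, 14 missile submarines, 66 nuclear-capable bombers, 1,770 ready-to-launch warheads, and the roughly 12,500-warhead total she mentions later); the ready-to-launch share is an editorial back-of-the-envelope addition, not a figure from the book:

```python
# Figures as quoted in the conversation; treat them as conversational,
# not authoritative inventory numbers.
triad = {
    "silo-based ICBMs": 400,
    "ballistic-missile submarines": 14,
    "nuclear-capable bombers": 66,
}

ready_warheads = 1_770    # quoted ready-for-launch figure
total_stockpile = 12_500  # rough total quoted later in the conversation

for leg, count in triad.items():
    print(f"{leg}: {count}")

# Rough share of the stockpile kept ready to launch (editorial estimate).
print(f"ready-to-launch share: {ready_warheads / total_stockpile:.0%}")  # ~14%
```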
00:06:30.000 And they're all pointed?
00:06:32.000 Do they need a coordinate?
00:06:36.000 Or are they all, like, aimed at a specific area already?
00:06:40.000 Important question.
00:06:45.000 Targeted to sea.
00:06:46.000 Out at sea.
00:06:47.000 So they don't have a specific target.
00:06:49.000 But when the command goes...
00:06:52.000 So is that if they accidentally go off?
00:06:54.000 Or go to the sea?
00:06:56.000 Jesus Christ.
00:06:57.000 Imagine you're out there on a sailboat.
00:07:00.000 Just enjoying your time.
00:07:01.000 What a beautiful place to be in the middle of the ocean.
00:07:04.000 And you see...
00:07:05.000 Um...
00:07:15.000 What do you think about the stories of UFOs hovering over nuclear bases and shutting down their weapons?
00:07:24.000 I know you've done a lot of research on this stuff.
00:07:27.000 How much of that is bullshit?
00:07:29.000 You know, I actually haven't done a lot of research on that specific narrative.
00:07:34.000 I know of it.
00:07:35.000 I know of it for sure.
00:07:36.000 And it's – I think – I mean I always approach the UFO phenomena with – or I try to at least with like the eye of – or the point of view of Carl Jung.
00:07:49.000 This idea that it's – that what leads here is our – perception of things and our sort of deep shadow self of fear.
00:07:59.000 And once the nuclear weapon was invented, Man, I mean, our grandparents had to confront this new, fundamental new reality that just simply didn't exist before.
00:08:16.000 And then it was, that's with the atomic weapons.
00:08:19.000 And then in the 50s, once thermonuclear weapons were invented, and the thermonuclear weapon is essentially an atomic, a thermonuclear weapon is so powerful, it uses an atomic bomb like from Hiroshima as its triggering mechanism.
00:08:34.000 Jesus.
00:08:35.000 And so the order of magnitude of destruction in an instant, according to Carl Jung, who looked at the UFO phenomena and the nuclear weapons phenomena hand in hand – I encourage anyone to read his stuff about it because he has a much sort of,
00:08:56.000 you know, bird's eye view of it all about why that's so terrifying to people.
00:09:03.000 So the narratives, to my eye, the narratives of nuclear, of, you know, alien ships hovering over nuclear bases, I don't, I have never spoken to a firsthand witness who experienced that.
00:09:16.000 But I would see that in terms of the narrative of Carl Jung.
00:09:22.000 So Carl Jung's perception was that he believed that it was essentially people perceiving these things or hallucinating these things.
00:09:32.000 And it was almost like a mass hallucination.
00:09:37.000 Part of the...
00:09:38.000 I don't know if he went that far.
00:09:41.000 I think he left a lot more open to interpretation.
00:09:44.000 I think his...
00:09:45.000 My read of his analogy was more like the way that hundreds of years ago or thousands of years ago when Christianity was first being developed, people saw existential threats as part of the narrative of God.
00:10:03.000 So my read of Carl Jung is that he's saying now in the mechanized modern world, the existential threats, the sort of damnation is tied to machines, which is easily tied to little machines or big machines from outer space.
00:10:25.000 That was his take on it, which I think is interesting.
00:10:28.000 It's interesting, but also Jung wrote this when?
00:10:32.000 The 60s?
00:10:33.000 Yeah.
00:10:34.000 So I think we know a lot more now than we knew then.
00:10:39.000 The reason why I'm bringing this up is the people that have hope, one of the hopes is that aliens are observing us and they're going to wait until we are about to do something really stupid.
00:10:50.000 And then they're going to come down and shut everything down.
00:10:54.000 That is an interesting narrative, too.
00:10:57.000 But again, that's a bit, to my eye, like the deus ex machina idea, that God would intervene and save the faithful.
00:11:10.000 In this situation, it might be that he's going to save those people that are paying attention.
00:11:16.000 Well, just save the human race from its folly.
00:11:21.000 What is Carl doing, bro?
00:11:23.000 You can't let the little fucker run around.
00:11:25.000 I think Carl likes my shoes.
00:11:26.000 Oh, yeah.
00:11:26.000 He likes to play.
00:11:28.000 Carl didn't get enough exercise this morning.
00:11:30.000 Marshall wasn't here.
00:11:31.000 He's not worn out yet.
00:11:33.000 So, the Jung thing is interesting, but we know more now.
00:11:39.000 We know more now about possible...
00:11:44.000 Other dimensions that we can't access.
00:11:46.000 We know more now about planets in the Goldilocks zone.
00:11:51.000 We know more now about all these whistleblowers that have come out and talked about crash retrieval programs where they're back-engineering these things and trying to understand what these things are.
00:12:04.000 Diana Pasulka's work where she's talking about how they're essentially donations, that these crafts are donations, that people are being given these things so that they could see this extraordinary technology and try to figure out how to make it.
00:12:19.000 But that's one of the only ways that I – like, if we did get to a point that we launch nuclear weapons at each other.
00:12:31.000 Everything is over so fast.
00:12:34.000 If I was an alien species, an intelligent species from somewhere else, and I recognize that this is a real possibility and that the Earth has all these different forms of life other than human beings that are going to get destroyed as well.
00:12:46.000 You know, it's going to wipe out Who knows how many different species?
00:12:52.000 It's going to kill everything.
00:12:55.000 And even the people that are left over, what are you left with?
00:12:58.000 Whatever 2 billion people that still survive, where and what?
00:13:03.000 What do you have left?
00:13:04.000 What is the environment like?
00:13:06.000 How polluted is everything?
00:13:09.000 What kind of mutations are going to come from their offspring?
00:13:12.000 I get into that in the end of the book.
00:13:14.000 So I write the book in essentially three acts, like the first 24 minutes, the next 24 minutes, the last 24 minutes, and then nuclear winter.
00:13:23.000 So nuclear winter is very well described by a fellow called Professor Brian Toon, who I interview in the book.
00:13:32.000 He was one of the original five authors of – do you remember the nuclear winter theory of our sort of high school years?
00:13:40.000 Yes.
00:13:40.000 Right?
00:13:41.000 So that was – Carl Sagan was the lead author on the paper.
00:13:44.000 Toon was the young student.
00:13:46.000 And he's dedicated decades to looking at nuclear winter.
00:13:50.000 Now, originally it was very pooh-poohed by the Defense Department.
00:13:54.000 It was said, this is Soviet propaganda.
00:13:56.000 This is never going to happen.
00:13:58.000 And the computer systems and climate modeling have changed to the degree where we can see not only is nuclear winter what was thought in the 80s, it's actually much worse.
00:14:09.000 So whereby originally they thought there would be a year of ice sheets across large bodies of water from Iowa to Ukraine across the mid-latitudes of the globe.
00:14:20.000 Now that could be up to seven or ten years.
00:14:24.000 So think about that much frozen land for that long.
00:14:30.000 You have the death of agriculture.
00:14:33.000 You have, like you are talking about, you have the complete disruption of the ability for people to grow food and eat food.
00:14:42.000 And so man reverts to his hunter-gatherer state.
00:14:46.000 And this is all...
00:14:47.000 But hunting and gathering what?
00:14:49.000 That's the real problem.
00:14:50.000 Well, exactly.
00:14:51.000 And also you have, you know, man has to go underground.
00:14:54.000 I take you through what this is like in detail because it's so bizarre to think about that you have a man-made problem, nuclear weapons.
00:15:06.000 And yet this is – people are not paying attention to the fact that whatever is created by man essentially would theoretically have a man-made solution.
00:15:17.000 Like there is a solution to the nuclear weapons threat.
00:15:20.000 It's not – although the results of a nuclear war would be very much like an asteroid striking the United States or the world anywhere.
00:15:31.000 There's a solution to nuclear war.
00:15:33.000 There's a solution called disarmament.
00:15:36.000 We have a total of 12,200 and some odd nuclear weapons.
00:15:41.000 And is the concern that if we did that, other countries would not do that?
00:15:46.000 We would be defenseless?
00:15:48.000 Absolutely.
00:15:49.000 But, you know, things happen in sort of inches and feet.
00:15:52.000 They don't have to happen overnight.
00:15:54.000 Once upon a time, there were 70,000 nuclear warheads in 1986. Oh, so we're making progress.
00:16:02.000 We are making progress.
00:16:02.000 Well, we were making progress until this past year.
00:16:06.000 And these treaties are all, you know, at risk and people are just very busy seeing everybody else as the enemy.
00:16:14.000 This past year specifically because of what?
00:16:17.000 Well, you know, Putin's saying he's not going to be involved in the treaty anymore.
00:16:22.000 Donald Trump said he pulled us out of the treaty.
00:16:25.000 So there's, like, leaders are threatening this is all on the table right now.
00:16:28.000 It could and should be looked at.
00:16:30.000 But do you remember back when we were in high school when Reagan and Gorbachev got together for the Reykjavik summit?
00:16:39.000 Yes.
00:16:40.000 Remember that?
00:16:40.000 Yes.
00:16:41.000 That was the beginning of a movement Toward disarmament.
00:16:49.000 That was the beginning of this idea of, wait a minute, 70,000 nuclear weapons is just an accident waiting to happen.
00:16:56.000 And so we are at 12,500 today because of that.
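Since the drawdown she describes is just arithmetic, here is a quick sketch of it; both endpoint figures (70,000 warheads in 1986, roughly 12,500 today) are the ones quoted in the conversation, and the implied annual rate is an editorial calculation, not a figure from the book:

```python
# Rough arithmetic on the post-Reykjavik drawdown described above.
peak_1986 = 70_000
today = 12_500
years = 2024 - 1986

reduction = 1 - today / peak_1986
annual_rate = (today / peak_1986) ** (1 / years) - 1

print(f"Total reduction since 1986: {reduction:.0%}")               # ~82%
print(f"Implied average annual change: {annual_rate:.1%} per year")  # ~-4.4%/yr
```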
00:17:01.000 And we should also say that there have been some very close calls.
00:17:06.000 The Secretary General said last year or something, we are one misunderstanding.
00:17:14.000 One miscommunication away from nuclear annihilation.
00:17:17.000 He's not kidding.
00:17:18.000 You know, there was this one incident where the Soviet Union thought that we were attacking them.
00:17:23.000 And there was one guy who resisted launching a counter-strike.
00:17:28.000 And that one guy prevented nuclear annihilation.
00:17:32.000 One guy said, I think this is an error.
00:17:34.000 This doesn't make sense.
00:17:36.000 I'm not doing this.
00:17:38.000 And that one guy saved us.
00:17:42.000 I don't remember what the incident was.
00:17:44.000 Like, what do you remember?
00:17:45.000 What was the exact issue?
00:17:47.000 So his name was Petrov, and it was in 1983. So what's even more remarkable about him is this was at a time of, you know...
00:17:57.000 Absolutely.
00:18:15.000 And he saw what the radar screen, the radar scope was reading as five ICBMs coming from Wyoming.
00:18:28.000 Five.
00:18:29.000 He knew that...
00:18:31.000 We would send a thousand missiles if we were going to launch.
00:18:35.000 And so he questioned the data, which is just so remarkable in its own conception when you think about that.
00:18:43.000 He questioned it and he didn't send it up the chain of command as a missile attack.
00:18:49.000 So what was it?
00:18:51.000 Well, I get into this in the book, which is terrifying.
00:18:54.000 So let me back up for a second of how good our technology is.
00:18:58.000 So we have a system in space, a satellite system called SBIRS, the Space-Based Infrared System.
00:19:09.000 It's like the Paul Revere of the 21st century.
00:19:12.000 It is parked over our enemies that have nuclear weapons, and it can see and detect a nuclear launch of an ICBM in a fraction of a second, Joe.
00:19:25.000 Confirmed fact.
00:19:27.000 That's why nuclear war begins and ends in 72 minutes, because the SBIRS satellite system sees the launch, and then the U.S. nuclear command and control begins.
00:19:40.000 And by the way, an ICBM cannot be redirected, and it cannot be recalled.
00:19:45.000 What about these hypersonic weapons that can adjust their trajectory?
00:19:50.000 They can move to different places.
00:19:53.000 They look like it's going to Arizona and it goes to Chicago.
00:19:56.000 So ballistic missiles are hypersonic.
00:19:59.000 So a little bit of a misnomer there.
00:20:02.000 And also a hypersonic missile, let's just say it went from Russia to the United States, it might take an hour.
00:20:08.000 A ballistic missile launched from a launch pad outside Moscow takes 26 minutes and 40 seconds to get to Washington, D.C. That number's not going to change.
00:20:18.000 That's gravity.
00:20:18.000 That's physics.
00:20:19.000 That's what it was in 1958, 59, and that's what it is today.
00:20:23.000 But isn't the new technology that it can alter its course?
00:20:27.000 Yes, but our...
00:20:29.000 Okay, so if you go with that logic and you say, well, it can move around, so it would be harder to shoot down.
00:20:35.000 Right.
00:20:36.000 As I explain in the book, and again, as was relayed to me by defense officials...
00:20:42.000 We can't shoot down ballistic missiles, long-range ballistic missiles, with any kind of certainty or accuracy.
00:20:49.000 It's not like the Iron Dome or anything like that?
00:20:52.000 The Iron Dome is almost like terrible for nuclear war, you know, for people to understand how dangerous nuclear war is because the Iron Dome can... Sure.
00:21:26.000 I write for the layman.
00:21:28.000 I think part of the reason why nuclear war is not spoken about in the general public is because it's set up to be intimidating.
00:21:38.000 You'll hear a lot of defense people and analysts using very esoteric language and And it kind of excludes the average Joe or Jane, Joe or Annie.
00:21:51.000 So I ask really basic questions like, how does a ballistic missile work?
00:21:54.000 And it's very simple.
00:21:55.000 That 26 minutes and 40 seconds I told you about.
00:21:58.000 So there's three phases of a ballistic missile.
00:22:01.000 It launches.
00:22:02.000 It has boost phase.
00:22:03.000 First five minutes.
00:22:05.000 Imagine a rocket.
00:22:06.000 You've seen launches.
00:22:07.000 That fire coming out the bottom.
00:22:10.000 That boosts the rocket for five minutes.
00:22:13.000 That's when it's detectable from space.
00:22:16.000 Then it enters mid-course phase, which is going to be 20 minutes, arcing across the globe to its target.
00:22:25.000 That is the only place where the interceptor missile can get it, if it can.
00:22:31.000 And it's 500 miles up.
00:22:34.000 And it's traveling at something like Mach 23, 14,000 miles an hour.
00:22:39.000 Okay?
00:22:40.000 So that's 20 minutes.
00:22:42.000 And then the last phase is called terminal phase, appropriately so, 100 seconds.
00:22:48.000 When the warhead, the nuclear warhead reenters the atmosphere, boom, explodes over its target.
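The three phases she lays out can be checked with simple arithmetic. A short sketch, assuming only the figures quoted here (5-minute boost, 20-minute midcourse, 100-second terminal, ~14,000 mph midcourse speed); the roughly 4,900-mile Moscow-to-Washington great-circle distance is an editorial assumption used only as a sanity check:

```python
# Sanity check: the three quoted phases should sum to the quoted
# 26-minute-40-second Moscow-to-Washington flight time.
boost_s = 5 * 60       # boost phase: rocket burn, detectable from space
midcourse_s = 20 * 60  # midcourse: arcing ~500 miles up toward the target
terminal_s = 100       # terminal: warhead re-enters and detonates

total_s = boost_s + midcourse_s + terminal_s
print(f"Total flight time: {total_s // 60} min {total_s % 60} s")  # 26 min 40 s

# 20 minutes at the quoted ~14,000 mph (~Mach 23 at altitude) covers
# roughly the Moscow-to-Washington distance (editorial assumption: ~4,900 mi).
midcourse_mph = 14_000
miles_covered = midcourse_mph * (midcourse_s / 3600)
print(f"Distance covered in midcourse: ~{miles_covered:,.0f} miles")  # ~4,667 mi
```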
00:22:55.000 The interceptor system is designed to take out the missile in mid-course phase.
00:23:03.000 So we have 44 interceptors.
00:23:08.000 Remember I told you we have 1,700, let's say, nuclear missiles on ready for launch status.
00:23:16.000 Russia has about the same.
00:23:18.000 We have 44 interceptor missiles.
00:23:22.000 How are 44 interceptor missiles going to go up against more than a thousand Russian nuclear weapons coming at us?
00:23:31.000 Never mind the fact that each interceptor has a 50% shoot down rate.
00:23:36.000 And that's by the missile defense agency's spokesperson.
00:23:41.000 So there's this perception that we have a system like the Iron Dome that could take out these incoming missiles and we simply don't, which is why when nuclear war begins, it only ends in nuclear Armageddon.
00:23:58.000 Jesus.
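The mismatch she describes between 44 interceptors and a thousand-plus incoming warheads can be made concrete with a toy probability model. This is an editorial sketch, not anything from the book: it generously assumes every interceptor engages a distinct warhead, independently, at the quoted 50% single-shot kill rate:

```python
# Toy model of "44 interceptors against a thousand warheads."
n_interceptors = 44
p_kill = 0.5      # quoted single-interceptor shoot-down rate
incoming = 1_000  # her order of magnitude for a full Russian launch

expected_kills = n_interceptors * p_kill
leakers = incoming - expected_kills
p_all_44_succeed = p_kill ** n_interceptors

print(f"Expected warheads intercepted: {expected_kills:.0f} of {incoming}")  # 22
print(f"Warheads expected to get through: {leakers:.0f}")                    # 978
print(f"Chance even all 44 shots succeed: {p_all_44_succeed:.1e}")           # ~5.7e-14
```

Even under these deliberately generous assumptions, roughly 978 of 1,000 warheads arrive, which is the point being made in the conversation.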
00:24:03.000 How disturbing was it for you to write this, do all this research, and to come to these conclusions and realize that we're in a lot worse shape than anybody thinks we are?
00:24:15.000 I mean, you know, when you're reporting or writing, you kind of take your hat off of the emotional or the sentimental part of things, where you, you know, the mother in me, you know, you can't think like that.
00:24:29.000 You just have to tell the story, I believe.
00:24:31.000 I also believe that If I can be as factual and dramatic as possible, then I will have the most readers, which is the point.
00:24:43.000 I am actually not trying to save the world as a journalist.
00:24:46.000 I'm trying to get you to read what I write because I found it super interesting reporting it and learning about it.
00:24:54.000 And also the whole process for me that I think is the most interesting is going to some of these people who are truly some of the smartest scientists in the world and getting them to explain it in the most basic – like I say you have to tell it to me like I'm a kid because I don't have a science mind.
00:25:13.000 And that part is – so that excitement part of it balances out with the terror of it because I do also understand why most people don't want to know about this.
00:25:23.000 It's too dreadful.
00:25:25.000 But they also don't want to know because they end up feeling sort of looked down upon, I think, if they ask basic questions like, wait a minute, how does a missile work?
00:25:36.000 Or like you said, can't the hypersonics – shouldn't we invest in hypers – well, who really wants to be lectured?
00:26:05.000 And then going back to all my sources, I mean, like, okay, here's an example.
00:26:09.000 We haven't even talked about submarines, but the submarines are completely—you cannot find them in the sea.
00:26:18.000 They are stealth beyond stealth.
00:26:21.000 I interviewed the former commander of the nuclear subforces, a guy called Admiral Conner.
00:26:28.000 Never given an interview like this before.
00:26:30.000 And I said, like, how hard is it to find a nuclear-armed sub?
00:26:35.000 And he said, Annie, it's easier to find a grapefruit-sized object in space than a nuclear sub under the sea.
00:26:46.000 Jesus.
00:26:47.000 And these things are, by the way, I have a map in the back of the book that shows you how close our adversaries, enemies, call them what you will, China and Russia, how close they come to the East Coast and the West Coast of the United States regularly,
00:27:04.000 which means it reduces that launch time I told you about of 26 minutes, 40 seconds.
00:27:10.000 That reduces it down to sort of 10 minutes or less.
00:27:20.000 What did you think when you saw the Russian subs that were outside of Cuba?
00:27:25.000 I thought, wow.
00:27:27.000 I mean, when I began reporting this book a couple years ago, never did I think that I would see that while I was talking about my book with people like you after publication.
00:27:43.000 But in the same manner, I never thought I would hear the president of Russia threatening to use a nuclear weapon.
00:27:50.000 I mean, he said he's not kidding that he might use WMD. That was his paraphrase quote.
00:27:57.000 Did he specify like in what way?
00:28:00.000 He just said WMD. Against America?
00:28:04.000 Yeah.
00:28:04.000 Well, you know, having to do with intervention in Ukraine.
00:28:08.000 Yeah.
00:28:10.000 I'm sure you saw the drones that blew up on a beach, the bombs that were launched that killed civilians in Russia.
00:28:20.000 You didn't see this?
00:28:22.000 Is this like recently, yesterday?
00:28:23.000 No, I didn't know.
00:28:25.000 Jamie, see if you can find that.
00:28:27.000 But Russian civilians, including one young girl that was showing in this article, were killed by these cluster bombs.
00:28:38.000 They were launched by drones that are ours.
00:28:42.000 Right.
00:28:43.000 You know, that Ukraine has.
00:28:45.000 And now they've launched them on Russian civilians.
00:28:50.000 And it's like, here it is.
00:28:53.000 Crimea video shows Russian tourists flee beach.
00:28:57.000 What is that word?
00:28:58.000 A-T-A-C-M-S? Bomblets rained down?
00:29:01.000 What does that mean?
00:29:02.000 Do you know what that means?
00:29:03.000 A-T-A-C-M-S? Well, I'm guessing they're small cluster bombs that are in the nose cone of the warhead.
00:29:13.000 Make that larger so I can read the whole thing, Jimmy.
00:29:15.000 The video shows the beach in Sevastopol, Crimea, which was struck by a series of explosions on June 23rd.
00:29:23.000 The footage captured by a security camera shows hundreds of people beginning to run away from the water before the impact of cluster warheads starts.
00:29:31.000 What's happening in Ukraine is so profoundly dangerous for everyone.
00:29:37.000 So this is the scene right here.
00:29:40.000 So these things just drop down on the water.
00:29:49.000 It's just pure terrorism.
00:29:52.000 Well, it's also remarkable that we have so much available footage and so much citizen journalism that people can see these events and discuss them.
00:30:06.000 It says here, the event was caused by Russian air defenses shooting down a series of cluster warhead missiles, one of which altered course as a result.
00:30:14.000 The Russian Ministry of Defense said that four of the five missiles launched were shot down, adding Another missile as a result of the impact of air defense systems at the final stage deviated from the flight path with the warhead exploding in the air over the city.
00:30:28.000 The detonation of the fragmentation warhead of the fifth American missile in the air led to numerous casualties among civilians in Sevastopol.
00:30:44.000 Do we know what this was about?
00:30:47.000 Where they were launching them towards?
00:30:50.000 I don't know.
00:30:51.000 I don't know.
00:30:52.000 I'm not following the ground war in Ukraine right now with my focus on this.
00:30:57.000 But what I do know is that the ratcheting up of the rhetoric and the use of third party weapon systems is complicating an already incredibly volatile situation.
00:31:13.000 This says a spokesperson for the U.S. State Department denied the accusation saying that the claims were ridiculous and hyperbolic.
00:31:19.000 The U.S. supplies weapons to Ukraine in the ongoing war with Russia and recognizes Crimea as a part of Ukraine despite Russia's annexation.
00:31:27.000 Ukraine has previously outlined plans to use long-range weapons supplied by America in Crimea specifically to target infrastructure supporting the Russian invasion.
00:31:43.000 This is just terrifying stuff.
00:31:45.000 It's terrifying because it can all be happening while you're just going about your business, walking your dog.
00:31:51.000 You have no idea that the entire world is in grave danger.
00:31:56.000 You mean if things suddenly go nuclear?
00:32:00.000 Yeah.
00:32:00.000 Well, even just this, just like these escalations.
00:32:03.000 Well, I think the big picture that frightens me most is that when we see the president of Russia going to the president of North Korea, our two, air quotes, arch enemies right now having a new alliance.
00:32:20.000 And then I consider that the current president of the United States hasn't spoken To the president of Russia in two years.
00:32:29.000 And I think back to that time in history, what's known as the Reagan reversal, where Reagan went from this incredible hawk to learning about nuclear weapons in, of all things, an ABC television movie called The Day After.
00:32:48.000 Having the crap scared out of him.
00:32:52.000 And then realizing – this is the president of the United States – realizing we cannot continue on this path.
00:32:59.000 It is too dangerous.
00:33:00.000 And that is why Reagan reached out to Gorbachev and that's why we have the Reykjavik summit.
00:33:05.000 It was called the Reagan Reversal.
00:33:06.000 So in other words, my point is Reagan, who – you know, the evil empire speech, like this idea of seeing your enemy as the arch-evil villain – had to change for him when he understood.
00:33:40.000 One has to imagine that the current president, with all his decades in office, understands all of this.
00:33:46.000 And so I don't fundamentally understand why there is no communication.
00:33:50.000 It is way too dangerous.
00:33:55.000 Hence, what you just showed us, the facts will come in of whose weapon systems those are.
00:34:01.000 But either way, the perception, to your point – the fact that a misperception could ignite nuclear war, could ignite that situation that is irreversible,
00:34:18.000 that should be astonishing to all of us.
00:34:22.000 It's terrifying.
00:34:25.000 Well, it's terrifying, but the one hopeful part of it would be, again, going back to the Reagan – the Reagan reversal, by the way, is the only glimmer of hope I ever found in all of this.
00:34:36.000 Don't you think though that politics in general and certainly world leadership, especially United States leadership, is much more compromised today than it was then?
00:34:46.000 And a guy like Reagan doesn't really exist today.
00:34:50.000 Tell me what you mean when you say compromised.
00:34:52.000 I mean the military defense contractors are making so much money and they want to continue making so much money and they have great influence over the politicians and over policy and over what gets done.
00:35:09.000 And this money that they don't want to stop making is completely dependent upon the continuing to build, continuing to sell, continuing to have these weapons and future systems and more advanced systems and better systems.
00:35:29.000 And there's so much money and momentum behind this.
00:35:33.000 That I don't know if there's a Reagan available now.
00:35:38.000 I don't know if that's an option.
00:35:39.000 If there's a person that can have some sense that can say that we are on a path to self-destruction and we need to stop and we need to reverse this path.
00:35:51.000 You know, you're going to have people in the military, in the Defense Department, that are being influenced by these contractors.
00:35:58.000 There's plenty of places we can move things around and get things done.
00:36:05.000 And don't you know about these guys?
00:36:06.000 These guys are bad guys.
00:36:07.000 We need to get over there.
00:36:08.000 We need to do something about this.
00:36:09.000 We need to do something about that.
00:36:10.000 And this escalation is motivated by the fact that they're making fucking ungodly amounts of money by making and selling these weapons.
00:36:22.000 And this is a massive part of our economy.
00:36:24.000 It's a massive part of the structure that runs the government itself.
00:36:31.000 Absolutely.
00:36:32.000 So then you have to ask yourself, what is also going to happen now that these big contracting organizations – Boeing, Raytheon, Lockheed – are now being threatened by Silicon Valley,
00:36:48.000 by the new defense contractors that are coming into the pipeline, that are threatening their contracts because they can do it faster and cheaper.
00:36:57.000 And so I fear that you will see even more of that entrenchment that you're talking about.
00:37:02.000 Even more of the, you know, the bureaucracy churning out more weapons under the guise of defense.
00:37:09.000 Because they'll have to ramp it up.
00:37:09.000 Yeah, because there's a competition, you know, which is that double-edged sword because competition is what makes America great.
00:37:15.000 I believe in that truth, you know.
00:37:16.000 Yeah.
00:37:17.000 But I do also think what's interesting is, like, someone I interviewed here in the book was Leon Panetta.
00:37:27.000 So not only was he a former SecDef, but he was former CIA chief and he was former White House chief of staff under Clinton.
00:37:38.000 And in our interview, I learned a lot from him about those three kind of elements of the national security, advising the president, you know, being SecDef, being in charge of all of this, and being CIA chief from the intelligence point of view.
00:37:56.000 But what was even more interesting about interviewing Panetta was that he said to me at the end of our interview, it's good that you're doing this.
00:38:08.000 The American people need to know.
00:38:10.000 That's a direct quote from him.
00:38:12.000 So here's a guy who has spent his entire life entrenched in that system that you're talking about.
00:38:17.000 And then outside of it, once he retires, puts on his, shall we say, grandfather's hat.
00:38:26.000 The human hat and is suddenly like, this is really going in the wrong direction.
00:38:34.000 I would hope that that would lead to more people thinking wisely about what it is they're doing when they're in office as far as nuclear war is concerned.
00:38:47.000 Yeah, but the thing that concerns me is they're not good at anything.
00:38:51.000 Why would they be good at this?
00:38:53.000 But tell me more.
00:38:54.000 It brings me back to Eisenhower's speech when he left office.
00:38:59.000 The threat of the military industrial complex warning the United States that there is an entire system that is now in place that profits off a war and wants war and wants to keep creating these weapons and wants to keep escalating things because that's their business.
00:39:18.000 That's the business.
00:39:19.000 We're not good at regulating any businesses.
00:39:24.000 We're not good at anything that's detrimental.
00:39:27.000 We're not good at regulating overfishing of the oceans.
00:39:30.000 We're not good at environment.
00:39:32.000 We're not good at energy.
00:39:34.000 We're not good at manufacturing.
00:39:35.000 We're not good at regulating anything.
00:39:38.000 Everything we've done has been a for-profit thing.
00:39:43.000 When we allowed jobs to go overseas and decimated American manufacturing, there was no regulation of that.
00:39:50.000 They didn't think that out.
00:39:51.000 They didn't do a good job of managing that.
00:39:54.000 No, they completely devastated manufacturing here.
00:39:57.000 And we saw during the COVID crisis, during the lockdowns, oh, my God, we can't get anything because everything's overseas.
00:40:03.000 Everything's made overseas.
00:40:04.000 Medicine, computer chips, everything.
00:40:07.000 Like, they're not good at anything.
00:40:10.000 But what's – I agree with you.
00:40:13.000 The first part of Eisenhower's speech is spot on and that is absolutely true.
00:40:19.000 And yet at the same time – this is why I think people stop talking about things like nuclear war or they move on to another more interesting subject that might be more entertaining because who really wants to hear about this problem that seems to be cyclical and there is – because you have to have a strong defense.
00:40:39.000 You have to have a national security, otherwise you get walked all over.
00:40:42.000 I think people kind of agree that.
00:40:44.000 Yes.
00:40:44.000 You can't really have a peace force.
00:40:46.000 Right.
00:40:47.000 Agreed.
00:40:48.000 Especially with the state of the world.
00:40:49.000 Right?
00:40:50.000 Right.
00:40:52.000 But the second part of Eisenhower's speech is important to me, and it's why I get people to talk to me in my books, because he says there's an antidote to the military-industrial complex, and that is an alert and knowledgeable citizenry.
00:41:10.000 Which is in essence what we're doing now by talking about this.
00:41:13.000 It's what you do on your podcast.
00:41:15.000 By having an alert and knowledgeable citizenry.
00:41:19.000 And the word alert I think means engaged.
00:41:21.000 You have to be able to talk to people.
00:41:25.000 In a way that they can, oh wow, that's interesting.
00:41:28.000 Well, I don't understand how that works.
00:41:30.000 How does that work?
00:41:30.000 And have these conversations.
00:41:32.000 So in that regard, I would say that's a positive sign in the right direction.
00:41:37.000 I think people are much more open to having an opinion about all of this than they were in, say, the 1950s.
00:41:44.000 But there's a balance because now opinion somehow seems to be taking over the Department of Facts in many regards.
00:41:56.000 The Department of Facts.
00:41:57.000 That's an interesting way to look at it.
00:42:02.000 It just doesn't seem like the general public is completely aware of how dangerous these threats are and how close we are inching towards it.
00:42:14.000 Like even the nuclear subs off the coast of Cuba was barely a blip in the news cycle.
00:42:19.000 You know, it was replaced by Taylor Swift and her boyfriend.
00:42:23.000 You know, it's like it just goes in and out quickly.
00:42:27.000 Joe, I do not write about politics.
00:42:31.000 I just don't talk about – I talk about POTUS, the president of the United States.
00:42:36.000 And I talk about moves that certain presidents made.
00:42:40.000 But I'm amazed at how much time is spent.
00:42:43.000 Talking about these two individuals and their families and what they ate for breakfast.
00:42:51.000 That I find, you know, at least the Taylor Swift of it is like slightly, you know, entertaining or uplifting.
00:42:58.000 But the way in which America seems to me to have almost become, like, you know, the way that the UK used to be obsessed with the royal family.
00:43:12.000 Yeah, that's our royals as celebrities and nonsense.
00:43:16.000 And whether it's the president who's a celebrity or congressperson who's become a celebrity, you know, AOC, it's not about her policies.
00:43:22.000 It's about her saying stupid shit.
00:43:24.000 It's like that's all people care about.
00:43:26.000 It's a reality show.
00:43:28.000 And we're kind of conditioned by reality shows, right?
00:43:30.000 We have so many that we watch and so many things that we pay attention to that are nonsense, that distract us, that we like to sort of apply those same viewing habits to the whole world.
00:43:41.000 But I'm amazed by the phenomena of podcasts, I must say, because I'm old enough to remember when they weren't around.
00:43:59.000 And so I exist in these two different worlds of media that are, you could say, traditional media forms.
00:44:07.000 And when you consider how radically these different forms of communication are changing, I sell as many e-books and audio books as I do hardcovers.
00:44:19.000 And I have a feeling that if those markets didn't exist, I would sell half as many books, if that makes sense.
00:44:25.000 Yes.
00:44:26.000 And so then when you throw the podcast into the mix, I cannot tell you how many people know about my work as a journalist, as a national security reporter because of podcasts.
00:44:41.000 That is remarkable to me.
00:44:43.000 It makes things so much more accessible to so many more people.
00:44:49.000 Everybody's listening to a podcast driving around, listening, you know, when they're at the gym, when they're on a hike.
00:44:55.000 And as someone who cares about an alert and knowledgeable citizenry as a fundamental – first of all, because I think people that are curious tend to be less furious.
00:45:09.000 If you can get your curiosity satiated, you don't become so angry.
00:45:17.000 And again, I have to be an eternal optimist, particularly writing the kind of books that I do, or my thinking would take a negative turn.
00:45:29.000 And so I am an eternal optimist, and I do look to conversation and new media as a means to a better way or a means to a way out of this kind.
00:45:42.000 I believe the tide will turn.
00:45:45.000 Well, it's certainly one of the only uncompromised conversations that's available.
00:45:49.000 And it happens to be the biggest.
00:45:51.000 You know, that's the wildest thing is mainstream media is just falling apart.
00:45:56.000 No one cares anymore.
00:45:58.000 No one believes them.
00:45:58.000 The faith in mainstream media and the trust in mainstream media is at an all time low.
00:46:05.000 And podcasts are at an all-time high.
00:46:07.000 How many of them are there, Jamie?
00:46:09.000 Like five million?
00:46:10.000 I think there's something like that, like monthly, you know?
00:46:14.000 Yeah.
00:46:14.000 Yeah.
00:46:14.000 But also what's remarkable is you hear people often say, like, people have lost their attention spans.
00:46:20.000 They watch TikTok.
00:46:21.000 Well, I mean, people listen to your podcast for three hours.
00:46:27.000 That is a very long attention span.
00:46:49.000 ...it on a hike.
00:46:50.000 And I think that it is a very different kind of mental stimulus, curiosity, in a new way forward than the old days of reading a newspaper.
00:47:01.000 You know, it takes you this amount of time to read and then you, I mean, newspapers barely exist anymore.
00:47:05.000 Right.
00:47:06.000 And then the other problem with television shows is that they're on a specific time.
00:47:09.000 And people don't want to be locked into having to watch something at a very specific time.
00:47:15.000 And now because of things like, you know, YouTube and Spotify, you can just watch it anytime you want.
00:47:20.000 And just stop it when you go to the bathroom.
00:47:22.000 Stop it when someone calls you.
00:47:23.000 Stop it when you have to go somewhere.
00:47:25.000 Restart it again when you're at the gym.
00:47:29.000 It's just a different thing.
00:47:31.000 It's a different way to communicate.
00:47:34.000 This idea that people don't have attention spans anymore.
00:47:37.000 How is that possible?
00:47:38.000 They're just people.
00:47:39.000 People didn't change.
00:47:40.000 That's so stupid.
00:47:41.000 If people always had attention spans and all of a sudden they don't, maybe they're just getting distracted by things that are very easy to absorb and very addictive, like TikTok videos.
00:47:53.000 It doesn't mean that the human mind is different.
00:47:55.000 It's been altered forever.
00:47:57.000 And then now people are no longer interested in long-form conversations.
00:48:01.000 That's just stupid.
00:48:03.000 I've rejected that from the beginning.
00:48:05.000 Like, one of the first things with this podcast was, even my good friends were telling me, like, you have to edit it.
00:48:09.000 I'm like, I'm not editing shit.
00:48:10.000 Like, you have to make it shorter.
00:48:11.000 Like, why?
00:48:12.000 No one's going to listen to something for three hours.
00:48:14.000 Then don't listen.
00:48:14.000 That was my take.
00:48:15.000 I was like, I don't care.
00:48:16.000 I listen to things.
00:48:18.000 I've always listened to, like, lectures and old Alan Watts speeches.
00:48:23.000 Like, I listen to things.
00:48:25.000 Who's Alan Watts?
00:48:26.000 Alan Watts is, I guess you could call him a psychedelic philosopher.
00:48:33.000 Very fascinating Englishman who said some very wise things.
00:48:36.000 But just a brilliant person, very interested in Buddhism and just a very, very wise person who still today people send me clips of things that he said and quotes of things that he said.
00:48:50.000 I've always listened to fascinating people who have conversations.
00:48:53.000 Terence McKenna.
00:48:54.000 I listened to a lot of his speeches and a lot of the different lectures that he gave.
00:49:00.000 I don't think people have changed.
00:49:01.000 I think that's nonsense.
00:49:02.000 I get hooked on YouTube reels or Instagram reels.
00:49:07.000 I get hooked on them.
00:49:08.000 I'll be sitting there.
00:49:09.000 If I have nothing to do, I'll be like, what is that?
00:49:10.000 Why is he doing that?
00:49:11.000 What's that?
00:49:12.000 Oh, look at that guy.
00:49:13.000 It's just a part of being a human being.
00:49:16.000 We're easily distracted.
00:49:17.000 It doesn't mean we don't have an attention span anymore.
00:49:19.000 That's stupid.
00:49:20.000 That's ridiculous.
00:49:21.000 There's still people that are graduating from universities with PhDs.
00:49:24.000 There's still people right now that are in the residency in medical school.
00:49:28.000 There's still people that are learning how to engineer fighter jets.
00:49:33.000 Like, there's people that have attention.
00:49:35.000 This is nonsense.
00:49:36.000 The idea that human beings have radically changed because of this one medium that's addictive is just so stupid.
00:49:41.000 Well, I also think there's something to be said as an individual when you start to be a little bit conscious of your own habits in viewing and thinking and reading and information.
00:49:55.000 So you get absorbed in the TikTok and then you get to say to yourself, like, what am I doing?
00:49:59.000 I want to actually change this habit.
00:50:01.000 And we all benefit from seeing how easy it is to develop a habit and how hard it is to sort of move yourself away from a habit as you become entrenched in it.
00:50:15.000 And so I think there's complete value in that.
00:50:18.000 People suddenly realize, I've got to stop watching TikTok videos and I've got to go to the gym, which is another, you know...
00:50:25.000 Right, but that's a difficult sort of an adjustment, and most people don't like difficult things.
00:50:29.000 So if you get 100 people addicted to TikTok, what number out of those 100 people are going to go, you know what, I'm going to change my life?
00:50:37.000 It's probably like three or four.
00:50:39.000 And those people are extraordinary.
00:50:40.000 And you hear about them and you get inspired by them.
00:50:42.000 Like, wow, you got a flip phone?
00:50:44.000 That's crazy, Bob.
00:50:45.000 Why'd you do that?
00:50:46.000 You know what, I realized my mind was getting taken up by these things and now I have my mind back.
00:50:51.000 I like it.
00:50:51.000 People want to call me, they can call me.
00:50:53.000 But I'm not watching things and reading things and absorbing things.
00:50:57.000 But then there's the argument like, okay, but now you're out of the cultural conversation.
00:51:01.000 I have friends that have flip phones and I'll try to ask them, did you see this new thing about the new quantum computer that's like...
00:51:08.000 100 billion times better or 100 million times better than the last one they released in 2019, you know, and they're like, no, what? So they're missing some things too.
00:51:18.000 So the key is, like, mitigation. Like, you have to figure out how much information makes you anxious, and how much information is just where you sit there and you scroll and you waste your time and then you're like, what did I do with my life?
00:51:34.000 I'm wasting hours.
00:51:35.000 And then you look at your screen time at the end of the day, it's six hours.
00:51:38.000 Like what?
00:51:39.000 Six hours of looking at my phone?
00:51:41.000 Is that real?
00:51:42.000 So you have to do that, but then also you don't want to miss out on things.
00:51:47.000 So you do want to kind of be informed, and part of my job is to be informed.
00:51:53.000 I can't be the guy who people have to tell things about because I don't know anything.
00:51:58.000 Like, what?
00:51:59.000 What's going on?
00:51:59.000 Crimea?
00:52:00.000 Where's that?
00:52:01.000 I can't.
00:52:02.000 So I have to have some involvement.
00:52:06.000 I have to have some input where I'm getting input from social media and from all these different things.
00:52:13.000 As a comedian, I have to know the temperature of the country.
00:52:16.000 I have to know, like, what to make fun of, like, what's ridiculous, what people are accepting that doesn't make any sense.
00:52:22.000 You just have to know, like, when you're getting sucked in to the point where it's becoming detrimental.
00:52:27.000 And I think that's where people struggle.
00:52:29.000 People really struggle with that.
00:52:30.000 Like, figuring out what's, how much, like, you can eat a cookie.
00:52:35.000 Nothing wrong with eating a cookie, but you shouldn't eat a whole bag of cookies.
00:52:40.000 You know, you shouldn't eat cookies every day.
00:52:42.000 That's not good.
00:52:43.000 But if you have dinner and you want to get dessert, yeah, I'll get a piece of tiramisu.
00:52:46.000 Okay, you're gonna be fine.
00:52:48.000 You're gonna be fine.
00:52:49.000 You eat tiramisu every day, you're gonna die.
00:52:51.000 You know, and that's what social media is.
00:52:54.000 It's dessert.
00:52:55.000 It's candy.
00:52:56.000 It's things that are kind of fun.
00:53:00.000 Up to a point, but you just got to know what that point is and how to manage your own attention span and just also have sovereignty over your mind.
00:53:12.000 You have to control your mind.
00:53:14.000 You have to be able, like if your mind starts getting anxious, okay, I know we're getting weird.
00:53:19.000 It's time to work out.
00:53:20.000 Okay, I know maybe we should meditate.
00:53:22.000 Maybe we should do this.
00:53:22.000 Maybe we should do that.
00:53:23.000 Don't just keep scrolling.
00:53:25.000 You gotta know when and when, but most people don't have that kind of self-control and discipline.
00:53:31.000 They don't have to.
00:53:32.000 All most people have to do is when the alarm goes off, get up, wash yourself, brush your teeth, eat something, go to work, do whatever minimal amount you have to do to keep that job. And then on bathroom breaks and whenever no one's looking,
00:53:49.000 look through your phone, be distracted, come home, watch Netflix, go to sleep, repeat.
00:53:56.000 That's most people.
00:53:57.000 So they don't have to do anything because they haven't set up their life in a way that requires serious attention and an objective sense of your perspective and your interaction with humans and the way the world is working.
00:54:15.000 They don't have the time.
00:54:16.000 They have family problems, job problems, their car's fucked, something's wrong with their house they gotta fix, they have bills, everything's piling up.
00:54:26.000 People are immensely distracted.
00:54:29.000 So what social media does for them is it gives them a brief rest from their own problems to just look at some fucking drag queen reading stories to kids and get outraged, or some new thing that's going on with some girl that made some crazy video and now everybody's talking about it.
00:54:49.000 Like...
00:54:50.000 It's just most people don't have much discipline.
00:54:54.000 And they don't have to.
00:54:55.000 And they've gotten through life being overweight, eating processed foods, and drinking too much, and smoking too much, and taking all kinds of pills to mitigate all these problems that they have because they've not taken care of themselves.
00:55:10.000 So they're on anti-anxiety medication, and anti-depression medication, and anti-this and that.
00:55:16.000 And they're trying to lose weight, so they're on Ozempic.
00:55:19.000 And that's most people.
00:55:21.000 It's most people.
00:55:22.000 You know the number one drug in America is a peptide that helps you lose weight?
00:55:26.000 What is that?
00:55:28.000 Ozempic.
00:55:29.000 It's the most profitable drug in the country.
00:55:31.000 It's maybe the most profitable drug ever.
00:55:34.000 They can't sell enough of it.
00:55:36.000 They estimate that by...
00:55:39.000 What is the number?
00:55:41.000 I think they're saying within five years, 30% of the population is going to be on Ozempic.
00:55:48.000 They can't make it fast enough?
00:55:49.000 Can't make it fast enough.
00:55:50.000 It's flying off the shelves.
00:55:52.000 Okay, here's a strange...
00:55:55.000 Parallel thought for that, which is that Raytheon has gotten rid of its marketing department.
00:56:01.000 It doesn't need it anymore.
00:56:03.000 They can't make enough missiles fast enough.
00:56:06.000 Imagine we have a marketing department for missiles.
00:56:09.000 That is so crazy.
00:56:10.000 What you just told me is like people's physical being and their existential defense threats are aligned in terms of that there's no need.
00:56:22.000 There's too many orders to fill.
00:56:25.000 If you can get people to believe bullshit and keep feeding them bullshit, you turn them into infants.
00:56:31.000 And if they just accept the fact that you're feeding them bullshit and they don't employ any critical thinking and they don't look at outside sources of information and really try to assess what's actually going on because they generally don't have the time.
00:56:47.000 Interesting.
00:56:48.000 You create a nation of infants, and there's a lot of us in this country that exist almost like children that are hoping daddy's going to take care of everything.
00:56:57.000 But I'm always interested in the people that are those 3% you talked about, that suddenly have that moment, the catalyst, where they realize, oh my goodness, I have to change.
00:57:08.000 Things have to change.
00:57:09.000 That becomes, you know, maybe not everybody changes.
00:57:11.000 There's more of us now, I think, than ever before.
00:57:15.000 That change.
00:57:15.000 Yes.
00:57:16.000 That suddenly realize.
00:57:16.000 That realize something's going on.
00:57:18.000 And I think that's also because...
00:57:21.000 Of social media, the good aspects of social media, real honest discussions, revelations, things being released on Twitter and, you know, the Twitter files with Elon Musk, where they found out the FBI was trying to suppress information, the Hunter Biden laptop story, and then going through the COVID disinformation and now seeing the congressional hearings where Fauci's lying in front of Congress about gain-of-function research and whether or not they deleted emails and all that stuff.
00:57:45.000 I think more people are now going, what the fuck is actually going on, than ever before.
00:57:52.000 I think there were always people, like during the Vietnam War, there were always people that distrusted the government, but they didn't have the kind of access to information that we have today.
00:58:03.000 I mean, I'm interested in the individual stories of people who change always because I think that it's too – I don't want to say depressing for me, but it's too – like if I think of America as this big, giant situation with problems,
00:58:19.000 it becomes overwhelming.
00:58:22.000 I just try to focus on people's really interesting, cool stories.
00:58:25.000 I'll tell you one that comes to mind, from the guy who was my Uber driver the other day.
00:58:29.000 And we're chatting away and he tells me that he used to be like 350 pounds.
00:58:33.000 And he was like this thin dude.
00:58:35.000 And he was talking to me about driving an Uber at night so he could save money for one of his kids to go to college.
00:58:41.000 And I was like, wow, how did you suddenly lose weight?
00:58:43.000 And he told me that he was a security guard at a military base.
00:58:48.000 And, you know, he said I was giant.
00:58:51.000 And the dudes in the military were totally fit.
00:58:55.000 And one of them said to me one time, dude, you gotta lose some weight.
00:58:59.000 And I listened to him and he let me go into the military gym.
00:59:03.000 Wow.
00:59:04.000 One guy.
00:59:05.000 One guy.
00:59:06.000 Sometimes you just need to hear it.
00:59:08.000 Be that guy.
00:59:09.000 Yeah.
00:59:09.000 Be that guy.
00:59:10.000 But that's to our point of, like, well, if you're not allowed to say, dude, you're overweight. He didn't say that outright, but essentially, by saying, dude, you should come to the gym,
00:59:20.000 the subtext there is, dude, you're overweight.
00:59:23.000 Yeah.
00:59:23.000 Like you're, you know, unhealthily overweight.
00:59:27.000 That fine line is the stuff I find really interesting: talking about things that make people uncomfortable.
00:59:32.000 I don't think that should be so taboo.
00:59:35.000 No, it shouldn't be.
00:59:35.000 I think you should be able to be honest, assertive, and direct.
00:59:40.000 Especially if you can do it and be kind.
00:59:42.000 You know, just telling someone they need to lose weight doesn't mean you're mean.
00:59:45.000 And this whole body positivity thing is not good for anybody.
00:59:49.000 It's not good for anybody.
00:59:50.000 There's no one who benefits from that.
00:59:51.000 You benefit in the short term where you don't feel as bad about the fact that you're obese.
00:59:56.000 But you know you're obese.
00:59:58.000 Everyone knows you're obese.
00:59:59.000 You're just not dealing with this very obvious problem.
01:00:01.000 Speaking of someone saying you need to lose weight, I was watching this TikTok video where there was this lady who was upset because she was going to her doctor.
01:00:11.000 And she has all these autoimmune problems and she was severely, morbidly obese, like giant.
01:00:17.000 And she said that the doctor started body shaming her.
01:00:20.000 And she was so upset, she felt uncomfortable that the doctor was telling her that she needed to do something about her weight and recommended perhaps bariatric surgery or Ozempic or any of these things.
01:00:32.000 And this person was talking about this.
01:00:34.000 It's like, what a betrayal.
01:00:36.000 That their healthcare provider was calling them obese and that they did not feel safe with them anymore.
01:00:44.000 The comments were interesting because almost everyone in the comments is like, that fucking person is trying to save your life.
01:00:51.000 Interesting.
01:00:51.000 Like, you have all these autoimmune issues.
01:00:54.000 Well, guess why?
01:00:55.000 Guess why?
01:00:56.000 You're 500 pounds.
01:00:58.000 Like, you shouldn't be 500 pounds.
01:01:00.000 That's not a normal weight for a person, especially a woman who's like 5'8".
01:01:04.000 This is crazy.
01:01:05.000 Like, what you're doing is crazy.
01:01:07.000 And for you to be upset that your doctor is telling you you're doing something crazy.
01:01:12.000 It's like a person coming into the doctor's office getting screened for lung cancer with three cigarettes lit in their mouth.
01:01:20.000 That's what it's like.
01:01:22.000 It's like, hey, what do you think is causing this lung cancer?
01:01:24.000 Do you think maybe it's this fucking poison that you're sucking on every day?
01:01:28.000 Like, yeah, that's probably it, right?
01:01:30.000 Yeah, well, that's the same thing.
01:01:32.000 You're just doing, it's a slower poison, but it's a very obvious poison.
01:01:36.000 Like, you're consuming poison, you've gotten your body to the point where it's dying, and she's telling you, this doctor's telling you, hey, you've got to do something about this.
01:01:46.000 And your response is to go on social media and talk about the horrors of being body shamed.
01:01:53.000 There's a great saying that I love, I try to live my life by, and I definitely write about it in all my books, which is, you can't fix what you can't face.
01:02:05.000 That's a very good quote.
01:02:07.000 Yeah, that's true.
01:02:08.000 Alternatively, you can fix what you're willing to face.
01:02:11.000 Yes.
01:02:12.000 And that's so much of what transformation is about.
01:02:13.000 If it's fixable.
01:02:16.000 I think you can fix anything, even if it's just a mental fix or a spiritual fix.
01:02:21.000 But on the technology front, do you ever read James Burke?
01:02:25.000 No.
01:02:26.000 He wrote a lot in the 80s.
01:02:29.000 And he was writing about science and technology.
01:02:32.000 He's a historian of science and technology.
01:02:34.000 And he goes, like, way back.
01:02:36.000 You know, like the bubonic plague, he'll write about how
01:02:38.000 that changed industry across Europe in these really general, easy-to-digest, fascinating ways.
01:02:45.000 But in terms of your point about technology, TikTok, is the world going to hell in a handbasket?
01:02:50.000 I think of him because he said that, and this is a great analogy, one I think of with my kids and social media: when the printing press was invented, absolutely all of society thought the world was going to go to hell in a handbasket.
01:03:09.000 But before that, the only people who could read, really, were the priests.
01:03:15.000 And so they kept all this information and they doled it out according to their line of thinking.
01:03:21.000 And then the printing press came along and the hoi polloi could read.
01:03:26.000 That began really...
01:03:29.000 The birth of mass populations being able to read, which is where we are today.
01:03:35.000 And sometimes I, like, think about James Burke, and I think about that as an analogy to where we are today, that what is going on is just an upheaval, like the printing press, in terms of making a lot more people more literate.
01:03:52.000 And so maybe it's not even...
01:03:54.000 I mean, I used to think of literacy by its true definition, which is actually reading.
01:03:59.000 And I remember when audiobooks came out and I read all my own audiobooks.
01:04:03.000 And I originally thought that listening to me read my own book meant that somehow you wouldn't have the same experience as reading it yourself.
01:04:13.000 And then I realized I was applying my own standard.
01:04:17.000 I actually enjoy reading.
01:04:19.000 I like to read.
01:04:20.000 That's the way I'm wired.
01:04:21.000 I can't do math, but I can read.
01:04:25.000 And now I realize, with the number of people that listen to my audiobooks, listen to your podcast, that maybe that is a new 21st-century form of literacy.
01:04:37.000 Which really makes my head go in interesting places because language is very different than reading.
01:04:46.000 You know, communicating.
01:04:48.000 But it's all information.
01:04:50.000 Yes.
01:04:51.000 And it's digesting the information.
01:04:53.000 And so where I think the social media parts of it are dangerous is it's, like you said, it's too fast, too disparate.
01:05:03.000 You go from one thing to the next thing and that's the way it's all set up.
01:05:06.000 Whereas a longer-form podcast, you're asking people to stay with you with your ideas.
01:05:13.000 You might go off on a riff about obesity.
01:05:15.000 I might go off on a riff about literacy.
01:05:17.000 But the brain is being stimulated.
01:05:23.000 The brain is being curious.
01:05:24.000 And then that carries over to your own life.
01:05:27.000 Yes.
01:05:29.000 Yeah, I think what's going on also is that this entertainment form, whether it's podcasts or audiobooks, is something that's being consumed while people are doing other things where they normally would not get this information.
01:05:45.000 Like driving, going to the gym, working, doing menial labor, doing things where you can listen to a podcast.
01:05:52.000 That is a new thing.
01:05:54.000 It's a way to be entertained while you're doing other things.
01:05:58.000 And that's a big part of this.
01:06:00.000 And that's a whole area that wasn't addressed before.
01:06:04.000 I mean, it kind of was with talk radio.
01:06:06.000 So people listen to talk radio in their cars.
01:06:08.000 But nobody listens to talk radio at the gym.
01:06:10.000 Nobody listens to talk radio on an airplane.
01:06:14.000 Now you can download things and consume them anytime you want.
01:06:17.000 And most of the time people are consuming these things while they're being forced to sit in the doctor's waiting room, while they're doing something where ordinarily they would just be bored.
01:06:27.000 Right.
01:06:28.000 And the other argument to that, your friend with the flip phone, I've heard this director, Christopher Nolan, who made the Oppenheimer movie, talk about this, where he says he believes that the experience of sitting in the waiting room is what he wants.
01:06:47.000 So I think there are very few rarefied people that can actually do that – the way they're built, the way they're engineered, the way they are, the way they've become allows for them to sit in the waiting room and be super interested in observing.
01:07:03.000 Maybe you're an elite director to do that.
01:07:06.000 But most people are going to be restless, irritable, and discontent, and therefore the podcast, the audiobook – It is, but it's also a way to consume new information.
01:07:21.000 It's a way to get educated.
01:07:23.000 You can say what Christopher Nolan is doing is the right way, but it might be the right way for him.
01:07:29.000 He's obviously a brilliant man, and I don't believe he even uses email.
01:07:34.000 Is that the case?
01:07:35.000 Find out if Christopher Nolan uses email.
01:07:37.000 I'm pretty sure he's one of those guys that's like, no phone, no email, nothing, completely disconnected.
01:07:42.000 And I would imagine if you want to be very creative, that's probably a very good strategy.
01:07:50.000 You don't want the stimulus.
01:07:52.000 Yeah.
01:07:52.000 I don't have an email address.
01:07:53.000 I've never used email, Nolan said.
01:07:56.000 I don't have a smartphone.
01:07:56.000 I will carry a pay-as-you-go dumb phone thing.
01:07:59.000 Yeah.
01:08:00.000 Okay, and that's the quote from him.
01:08:02.000 One of the reasons why he's probably so good.
01:08:04.000 What does it say?
01:08:04.000 His burner phone is inspired by what?
01:08:07.000 The Wire?
01:08:08.000 I'm going to say The Wire.
01:08:09.000 What does it say?
01:08:12.000 Ah!
01:08:13.000 I got it!
01:08:14.000 I nailed it!
01:08:18.000 I have a feeling that...
01:08:22.000 In the future, there's going to be way less of those people.
01:08:30.000 And, you know, there were a lot of people I remember back in the day that had no email, and they thought it was cool.
01:08:30.000 I don't even have email, man.
01:08:32.000 You can call me.
01:08:33.000 You know, I don't answer.
01:08:34.000 Like, my friend Joey would not answer text messages.
01:08:36.000 He'd get mad at you if you sent him a text message.
01:08:38.000 Call me.
01:08:39.000 But now he texts me.
01:08:41.000 Everybody texts.
01:08:42.000 I think it's going to be harder and harder to be that guy.
01:08:45.000 But kudos to him.
01:08:46.000 Okay, so my son Jet is now 19, and I remember when he was probably nine.
01:08:53.000 What a cool name for a son, by the way.
01:08:54.000 Jet Jacobsen.
01:08:55.000 I like it.
01:08:56.000 My other son's Finley Jacobsen.
01:08:57.000 Sounds like a fucking fighter pilot.
01:08:58.000 Jet Jacobsen.
01:09:01.000 So he was like maybe nine years old when the first iPhone came out.
01:09:07.000 And I had one and he just thought it was so cool.
01:09:10.000 And I will never forget, he said at the dinner table, Mom, just to be clear...
01:09:15.000 When you were born, they didn't have these iPhones.
01:09:22.000 That's great.
01:09:24.000 Just to be clear.
01:09:26.000 Well, when we were kids, we had a phone that was stuck to the wall with a cord.
01:09:30.000 Remember?
01:09:31.000 I mean, try telling your kids about an answering machine.
01:09:35.000 How about dial phones?
01:09:38.000 Okay, are you ready for this trivia?
01:09:42.000 One of the reasons why you could argue that computers became so important to the Defense Department back in the early 1960s is because of the Cuban Missile Crisis, and this is, like, I have seen these documents at the National Archives.
01:10:00.000 JFK was so worried about that exact movement you made with your finger.
01:10:05.000 The dial phone.
01:10:07.000 There was a true red phone that would be used in a nuclear crisis for him to call Nikita Khrushchev.
01:10:15.000 And he became worried that that wasted too much time to get through.
01:10:20.000 And so he hired a guy called J.C.R. Licklider to develop computers that could move faster.
01:10:31.000 And at the time, the computers were the size of this room in the Defense Department.
01:10:35.000 And there were these old mainframes.
01:10:37.000 And Licklider, I think...
01:11:00.000 You know, trying to save the world from existential threat.
01:11:04.000 And I always think that dual use part of everything is super interesting.
01:11:07.000 Look at lasers.
01:11:08.000 You know, lasers are arguably perhaps the most important defense-born, you know, technological system of the 20th century.
01:11:21.000 Laser printers, laser surgery, laser eyes...
01:11:24.000 And then you have laser weapons at the Pentagon, so classified I can't even get anywhere near that.
01:11:31.000 Really?
01:11:32.000 They're called directed energy weapons.
01:11:35.000 And how much do you know about them?
01:11:39.000 All I see is like conspiracy stuff online, like Antarctica, directed energy weapons.
01:11:45.000 Yeah.
01:11:45.000 Well, I mean, I always think conspiracy is born of secrecy, which would make sense.
01:11:52.000 If someone is constantly telling you, you can't know about that, you're going to naturally wonder, what the hell's going on that I really can't know?
01:12:01.000 What's so secret?
01:12:03.000 I think that's a good instinct of ours.
01:12:07.000 And so any time I have been at the Pentagon, you know, or wherever asking about directed energy weapons, it's really a...
01:12:17.000 They'd stop you.
01:12:18.000 It's a stop.
01:12:18.000 And so to find out more about that, I tracked down...
01:12:23.000 This was when I was reporting the Pentagon's brain a decade ago.
01:12:26.000 I thought to myself, okay, well, if these guys won't talk to me, I'll find out who invented the laser, see if he will.
01:12:33.000 Go to the smartest guy in the room.
01:12:35.000 So Charles Townes, who won the Nobel Prize for inventing the laser.
01:12:39.000 I called him up.
01:12:40.000 What year was that?
01:12:40.000 This was in 2014, I believe.
01:12:44.000 There wasn't lasers before 2014?
01:12:46.000 No, no, no.
01:12:46.000 Oh, sorry.
01:12:47.000 That's when I interviewed him.
01:12:48.000 He invented the laser in – I think he won the Nobel Prize in 61 or 62. Okay.
01:12:53.000 And he was 99 years old when I interviewed him, still keeping office hours at Berkeley.
01:13:00.000 Wow.
01:13:00.000 Wow.
01:13:01.000 Had his secretary on the line and gave this incredible interview.
01:13:05.000 And I asked him about his development of the laser.
01:13:09.000 But you might be interested in this.
01:13:11.000 So he is the guy who invented the laser.
01:13:15.000 I mean, when you really think about that.
01:13:17.000 And I asked him, I said, well, who do you go to when you're having trouble, when you're Charles Townes?
01:13:23.000 And he said, well, I took that particular idea to two colleagues, Einstein and von Neumann.
01:13:29.000 Okay, that's interesting.
01:13:31.000 I said, who said what?
01:13:33.000 And he said, this is when Townes was having trouble making it work.
01:13:39.000 And he said that Einstein said, keep trying.
01:13:43.000 And von Neumann said, it'll never work.
01:13:47.000 And I said, what do you make of that?
01:13:49.000 And he said, well, Einstein was very generous of spirit and he was always encouraging other scientists to think big and try.
01:13:58.000 And von Neumann was the kind of scientist who believed if he didn't come up with it, it probably wasn't a good idea.
01:14:06.000 Ah, ego.
01:14:07.000 But I love that because who else in the world, like, has those two people, you know, that they run ideas by?
01:14:14.000 Fascinating.
01:14:14.000 I mean, you know who you run ideas by, who I run ideas by.
01:14:17.000 Yeah.
01:14:17.000 But who does Charles Townes run ideas by?
01:14:18.000 But here's another interesting thought about the laser and Charles Townes, and he didn't share this fact for decades.
01:14:25.000 Later in life he wrote a lengthy article, I believe it was for the Harvard Alumni Magazine, saying that the idea for the laser came to him, when he was sitting on a park bench, from above.
01:14:43.000 Whoa.
01:14:45.000 Like how so?
01:14:48.000 It was a religious experience for him.
01:14:50.000 Like he'd been working on this problem.
01:14:52.000 He'd been working on this science problem, according to Charles Towns.
01:14:56.000 And by the way, he was inspired, he told me, to develop the laser from the time he was a little kid, in the 20s, reading the Soviet science fiction novel The Garin Death Ray.
01:15:11.000 So it's like it was a science fiction concept, a laser.
01:15:16.000 He's a little kid, Charles Townes, thinking about this, thinking about this, then all through his life, continuing to think about it, then running it by his colleagues Einstein and von Neumann, and then can't make it work, can't make it work, is sitting on a park bench, and he gets the message from above.
01:15:32.000 He, you know, made it sound very much like it was a religious experience for him, but he didn't write about it for a long time, because particularly in the 60s and 70s, you couldn't be a scientist and have faith at the same time.
01:15:45.000 Or at least you would be, you know, belittled or you'd be looked down upon is what he said.
01:15:53.000 That's interesting.
01:15:54.000 So he was reluctant to say the inspiration behind his idea because he felt it was divine in nature.
01:16:04.000 Bingo.
01:16:07.000 And it's not just me he told this.
01:16:09.000 You can read about this in that Harvard article.
01:16:14.000 And I think he wrote that when he was in his late 80s.
01:16:18.000 But absolutely.
01:16:20.000 So he was really making a plug for listening to whatever it is that guides you, which is a very powerful statement.
01:16:31.000 Right.
01:16:31.000 Whatever that is.
01:16:34.000 Yeah, we have a real problem with if we can't measure something, we don't think it's a real thing.
01:16:40.000 You know, if we don't have the tools to put it on a scale, we don't think it's a real thing.
01:16:45.000 But whatever we want to call divine intervention, when he told me that, it was so fascinating to me.
01:16:54.000 I began to explore if there were other Nobel laureates in particular that shared that belief, and there are.
01:17:05.000 There are a couple of Nobel laureates, someone in chemistry, who believed that their knowledge came from a divine inspiration.
01:17:19.000 Like they had a dream type situation.
01:17:22.000 I write about all this in my book Phenomena.
01:17:25.000 Yes.
01:17:27.000 These specific examples, because it's so interesting when that kind of non-mainstream
01:17:34.000 thinking comes from, you know, an individual in the community who has achieved the highest award, whatever that means.
01:17:46.000 Isn't the creation of science or the thought of science as a thing, wasn't that Descartes?
01:17:55.000 And didn't he come up with it from a dream?
01:18:00.000 He had a dream where an angel told him that they would master control over nature by measurement.
01:18:13.000 See if we can find that.
01:18:15.000 See if we can find that quote.
01:18:17.000 But I believe this was considered the beginning of science.
01:18:22.000 So the beginning of science, here it is.
01:18:26.000 In his third dream, the angel came to Descartes and said... So, the beginning of science.
01:18:49.000 Measurement.
01:18:50.000 Divine.
01:18:51.000 Or whatever it is.
01:18:52.000 Whatever that word.
01:18:54.000 Absolutely.
01:18:54.000 I mean, who isn't amazed by their own dreams when they have a really intense dream?
01:18:59.000 That's just such a part and parcel to being a human and being curious.
01:19:06.000 And there is no answer to it.
01:19:08.000 I have a theory.
01:19:10.000 Tell me.
01:19:11.000 I think ideas are a life form.
01:19:13.000 That's what I think.
01:19:15.000 I think we think of a life form, the term life form, we think of it as something that can breed, something that propagates, something that spreads its DNA. Every single thing that exists on this earth that people have created came from an idea.
01:19:34.000 Every mug, every computer, every airplane, every television set, everything came from an idea.
01:19:41.000 The idea comes into the person's mind, it combines with all the currently available ideas, it improves upon those ideas, and it manifests itself in a physical thing.
01:19:51.000 And that physical thing is what we do.
01:19:54.000 That is the number one thing we do.
01:19:56.000 We do a lot of different things.
01:19:57.000 We have children and families.
01:20:00.000 We have jobs.
01:20:01.000 We take up hobbies.
01:20:02.000 But if you took the overview effect, if you looked at the human race from above, if you weren't one of us, you would say, what is this species doing?
01:20:11.000 Well, it makes better things every year.
01:20:13.000 That's what it does.
01:20:14.000 It doesn't make the same beehive every year.
01:20:16.000 It constantly, consistently makes better things.
01:20:19.000 What motivates us to make these better things?
01:20:23.000 Ideas.
01:20:24.000 Ideas and then the competition of these ideas.
01:20:27.000 Now, if something wanted a thing to manifest in the world, to make it exist in the world...
01:20:35.000 What would it do?
01:20:36.000 It would get inside that thing's creative structure, get inside that thing's mind and impart these ideas, impart these inspirations and get this thing to go out and put these pieces together and manufacture this thing and then test it and improve upon it and keep doing it until they get it right.
01:20:54.000 And then other people will take those things and have new ideas.
01:20:58.000 I know how to take that and turn it into a tablet.
01:21:01.000 I know how to turn that into this.
01:21:03.000 I know how to make this better.
01:21:05.000 Let's do quantum computing.
01:21:07.000 Let's do that.
01:21:07.000 These are all just ideas.
01:21:09.000 So ideas that human beings turn into real things and those real things accelerate the evolution of technology in this radical way where we can't even comprehend where it's going.
01:21:23.000 You know, there was an article I put on my Instagram today, I just put the title of it, how crazy it is, that AI, what was it, 500 million years?
01:21:32.000 AI has extrapolated, like they're calculating what evolution looks like in 500 million years.
01:21:39.000 Oh, I have seen that.
01:21:41.000 We don't even understand what we're doing.
01:21:47.000 We don't even understand what we're doing, but we keep doing it.
01:21:49.000 And I think some of the instincts that human beings have that seem frivolous...
01:21:55.000 Are directly connected to this.
01:21:56.000 And one of them is materialism.
01:21:59.000 Materialism, status, all these different things that we have where with materialism you always want the newest, greatest thing.
01:22:06.000 If your friend has an iPhone 15 but you have an iPhone 10, you look like a loser.
01:22:10.000 What are you doing with that old, look at that stupid old camera.
01:22:13.000 Oh my god, what are you doing with that?
01:22:14.000 You need the best one.
01:22:15.000 You need the new one.
01:22:16.000 If you have a 2007 car and your friend pulls up in a 2024 car, like, oh.
01:22:22.000 I need the new one.
01:22:23.000 What does the new one do?
01:22:24.000 The new one does all these different things that the old one doesn't do.
01:22:27.000 The new one drives itself.
01:22:28.000 I've got to get the new one.
01:22:30.000 And so what does that do?
01:22:31.000 It pushes innovation.
01:22:34.000 It promotes innovation.
01:22:37.000 Materialism fuels innovation.
01:22:39.000 Because we're always buying new stuff.
01:22:41.000 If everybody stopped buying things, if everybody looked at their phone and said, this phone's perfect.
01:22:46.000 I don't need a new phone.
01:22:47.000 I could have that phone forever.
01:22:49.000 You don't need a new iPhone.
01:22:50.000 I could have that phone forever.
01:22:54.000 They would stop innovating.
01:22:56.000 They would stop making things.
01:22:57.000 It would just stop, and then nothing would get done.
01:23:01.000 But because of materialism, because of this keeping up with the Joneses thing, where everybody wants the latest thing in their driveway to impress their neighbor, you want to pull up to the fucking diner and show all your friends.
01:23:14.000 All that stuff just fuels the creation of new and better things, and we're all part of it.
01:23:18.000 And every year, there's like a thing you didn't know.
01:23:21.000 Like, I just got this phone.
01:23:21.000 Check this out.
01:23:22.000 This phone, I can make a circle on things, and it Googles it instantly.
01:23:27.000 This phone transcribes all of my voice notes and then summarizes them for me.
01:23:33.000 This phone, this Galaxy S24 AI, it has AI in it.
01:23:38.000 It will summarize your notes, like condense them into a Wikipedia view?
01:23:43.000 It does it with web pages.
01:23:45.000 This is the Samsung Galaxy S24 Ultra.
01:23:48.000 This is the new one.
01:23:49.000 This thing, it can transcribe your thoughts, but even better, it can translate conversations.
01:23:57.000 You could be speaking Spanish, and I could be speaking English, and it'll have a split screen.
01:24:02.000 So it'll show, like, I'll put the phone down.
01:24:05.000 In almost real time.
01:24:06.000 Yes.
01:24:06.000 So on your half of the phone, facing you, it would be speaking in Spanish.
01:24:11.000 It would translate my words into Spanish.
01:24:13.000 On my side, it would be translating your Spanish into English.
01:24:17.000 Then if you have the Galaxy Earbuds, it'll do it in real time, in your ear.
01:24:23.000 So you can talk to me in Spanish, and I can hear you in English.
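(For the curious, here is a minimal sketch of the two-way translation loop being described, under assumptions. It is not Samsung's actual implementation; transcribe() and translate() below are hypothetical stand-ins for the on-device speech-to-text and translation models a real app would call.)

    # Minimal sketch of one turn of a split-screen, two-way translation.
    # transcribe() and translate() are hypothetical stubs, not a real API.

    def transcribe(audio: bytes, lang: str) -> str:
        # Hypothetical speech-to-text stand-in.
        return "hola, como estas?" if lang == "es" else "hello, how are you?"

    def translate(text: str, src: str, dst: str) -> str:
        # Hypothetical machine-translation stand-in.
        mock = {
            ("es", "en"): "hello, how are you?",
            ("en", "es"): "hola, como estas?",
        }
        return mock.get((src, dst), text)

    def conversation_turn(audio: bytes, speaker_lang: str, listener_lang: str):
        # The speaker's half of the screen shows their own words;
        # the listener's half shows the translation, roughly in real time.
        heard = transcribe(audio, speaker_lang)
        return heard, translate(heard, speaker_lang, listener_lang)

    original, translated = conversation_turn(b"...", "es", "en")
    print("Speaker side (es):", original)
    print("Listener side (en):", translated)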
01:24:28.000 And this is just beginning.
01:24:30.000 So at the United Nations, we're no longer going to have those translators painstakingly telling...
01:24:36.000 Oh, yeah.
01:24:36.000 And maybe lying, you know?
01:24:38.000 Maybe lying, maybe distorting reality, because that happens a lot, too.
01:24:42.000 You hear about translations, and you're like, that's not exactly what he said.
01:24:46.000 You've kind of distorted and twisted those words.
01:24:49.000 But the point is, like, these all come from ideas.
01:24:53.000 And I didn't think I needed the ability to circle an image and immediately Google search it, but it's so convenient.
01:24:59.000 Like, if you want something or you see something, like, what is that?
01:25:03.000 Take a picture of that thing.
01:25:04.000 Circle it.
01:25:05.000 Then immediately it goes to Google and shows you, like, instantaneously.
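(Similarly, a minimal sketch of the circle-to-search flow just described, under assumptions: take the points of the circle gesture, crop the screenshot to their bounding box, and hand the crop to an image-search backend. search_by_image() is a hypothetical stub, not Google's or Samsung's actual API.)

    # Minimal sketch of circle-to-search: gesture -> crop -> image search.

    def bounding_box(points):
        # Smallest rectangle containing every point of the circle gesture.
        xs = [x for x, _ in points]
        ys = [y for _, y in points]
        return min(xs), min(ys), max(xs), max(ys)

    def crop(screenshot, box):
        # Cut the screenshot (a 2D grid of pixels) down to the boxed region.
        left, top, right, bottom = box
        return [row[left:right + 1] for row in screenshot[top:bottom + 1]]

    def search_by_image(pixels):
        # Hypothetical stand-in for a reverse image-search call.
        return ["example result for the circled region"]

    screenshot = [[0] * 100 for _ in range(100)]        # stand-in 100x100 image
    gesture = [(20, 30), (60, 28), (62, 70), (18, 72)]  # points along the circle
    print(search_by_image(crop(screenshot, bounding_box(gesture))))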
01:25:10.000 Now, when you say ideas are life forms, which is so interesting to me, haven't heard that, are you talking about maybe consciousness?
01:25:22.000 Yes.
01:25:32.000 Because you talked about a beehive, and it made me think about a hive.
01:25:35.000 Like, are we all part of the same consciousness, like, again, Carl Jung would say?
01:25:42.000 My theory, and again, I have no education in this, is just my thinking about it for thousands of hours.
01:25:48.000 I think we're all the same thing.
01:25:50.000 I think this whole idea of we are one, that sounds so hippie, it's hard for people to digest, but I want you to think about it this way.
01:25:57.000 If you live my life, I think you would be me.
01:26:00.000 And I think if I lived your life, I would be you.
01:26:02.000 I think what consciousness is, is the life force of all living sentient things on this planet and perhaps all in the universe.
01:26:13.000 And it's experiencing reality through different biological filters, different bodies, different life experiences, different education, different genetics, different parts of the world, different geography, different climate, different things to deal with.
01:26:29.000 But it's the same thing.
01:26:30.000 I think if I lived in Afghanistan and my whole family was in the Taliban, I would probably be in the Taliban too.
01:26:38.000 Because I think you just...
01:26:41.000 You adapt to whatever circumstances and environment you're in and then you think that way and you speak this language and you have these customs and you engage in these religious practices.
01:26:52.000 But I think consciousness, the thing at the heart of it all, is what that person thinks of when they say me, what I think of when I say me.
01:27:02.000 I think that me is the same in every person.
01:27:05.000 I think it's everyone.
01:27:07.000 That's why we're all connected.
01:27:08.000 That's why we all need each other.
01:27:10.000 That's why loneliness contributes to diseases and it's terrible for human beings.
01:27:16.000 I think we're all universally connected with this bizarre goal.
01:27:21.000 And I think this bizarre goal might be to create artificial life.
01:27:27.000 I think that might be the...
01:27:28.000 I think...
01:27:29.000 I've said this too many times.
01:27:31.000 If you've heard...
01:27:31.000 I'm sorry.
01:27:32.000 But I think that we are...
01:27:35.000 An electronic caterpillar making a cocoon and we don't even know why.
01:27:41.000 We are just constantly building.
01:27:43.000 The cocoon is the next transformation.
01:27:44.000 Exactly.
01:27:45.000 And we give birth to the butterfly.
01:27:47.000 We are a biological caterpillar giving birth to the electronic butterfly.
01:27:52.000 And we're doing it.
01:27:53.000 We don't even know what we're doing.
01:27:54.000 We're just making this cocoon.
01:27:55.000 We just keep going because this is what we do.
01:27:58.000 And I think this is probably how life separates from biology to something far more sophisticated that's not confined to the timeline of biological evolution, which is a very slow, relatively speaking, timeline in comparison to electronic evolution.
01:28:16.000 Electronic evolution is exponential.
01:28:18.000 It happens radically.
01:28:20.000 It happens very fast, and especially when you get into things like when AI gets involved and quantum computing gets involved, then things accelerate so fast.
01:28:28.000 Like, look at what's going on with AI. Five years ago, AI was not even a concern.
01:28:33.000 Nobody even thought about it.
01:28:34.000 Now it's at the tip of everyone's tongue, and all everyone's talking about is what happens when ChatGPT-5 gets released?
01:28:39.000 What happens when 6 gets released?
01:28:41.000 What happens when artificial general intelligence is achieved?
01:28:45.000 What happens when these things become sentient?
01:28:48.000 What happens when these things make better versions of themselves?
01:28:51.000 When does that stop?
01:28:52.000 Does it ever stop?
01:28:53.000 Do they become gods?
01:28:54.000 Is that what God is?
01:28:56.000 Is this what God is: a primate becomes curious, starts inventing tools.
01:29:04.000 Tools lead to machines.
01:29:06.000 Machines lead to the Industrial Age.
01:29:08.000 The Industrial Age leads to electronics.
01:29:10.000 Electronics lead to artificial intelligence.
01:29:12.000 Artificial intelligence leads to God.
01:29:15.000 Wow.
01:29:16.000 Okay.
01:29:17.000 In that line of thought, go back now on that timeline.
01:29:21.000 To your eye, how did we get language?
01:29:29.000 How did we go from grunting
01:29:32.000 to language, and then from language to writing?
01:29:36.000 Why and how did we write, in your view?
01:29:39.000 Well, I think the moment they started pointing at things and making sounds, and by the way, animals do that.
01:29:44.000 We know that, right?
01:29:45.000 Do you know that some monkeys trick other monkeys?
01:29:48.000 So, some monkeys will make a sound like an eagle's coming, and so that these monkeys run out of the trees, and then they'll run up the trees and steal the fruit.
01:29:56.000 I did not know that.
01:29:57.000 Yeah, they lie to each other.
01:29:58.000 Have you seen Chimp Empire?
01:30:00.000 Yes.
01:30:01.000 Yeah, I had the director on.
01:30:02.000 It was amazing.
01:30:03.000 It's an amazing film.
01:30:05.000 God, there's so much like this.
01:30:06.000 Somebody was asking me after I wrote the Surprise Kill Vanish book about the CIA's paramilitary, the Ground Branch guys, the snake eaters.
01:30:15.000 And they said, what's the origin story of Ground Branch?
01:30:18.000 Thinking I was going to say, you know, the OSS. But I had just seen Chimp Empire.
01:30:25.000 Yeah.
01:30:50.000 And that their anger and their fighting was based on revenge.
01:30:56.000 That blew my mind.
01:31:00.000 Because I often think, because I write about war and weapons, I always think about nature versus nurture.
01:31:07.000 Were we built this way?
01:31:09.000 This is a very interesting thing to think about, like your electric caterpillar.
01:31:14.000 They're fighting over resources.
01:31:16.000 They were fighting over resources.
01:31:17.000 But it was also revenge.
01:31:19.000 Remember that?
01:31:19.000 There was like a score to settle with the previous generation.
01:31:24.000 And that, to me, was stunning.
01:31:27.000 And also that one of the guys, I love that they named everyone, but one of the chimps' brothers, I think it was the head chimp's brother, lost his arm in a poacher's trap.
01:31:39.000 Remember that?
01:31:40.000 And when you think about chimps and how important their arms are swinging from trees, if you just followed the logic about survival of the fittest, then the brother chimp would have died because he didn't have one of his hands.
01:31:58.000 Right.
01:32:00.000 The other brother, the lead chimp, made sure he was taken care of, which is, like, so human.
01:32:06.000 I still think about that.
01:32:09.000 That's how they can survive.
01:32:10.000 They need help.
01:32:11.000 That's why they live in tribes.
01:32:12.000 That's why they're not individuals alone by themselves.
01:32:15.000 Okay, so you think chimps were pointing, and then from that, you have to figure, over long amounts of time...
01:32:24.000 Yeah, over immense amounts of time they started to develop.
01:32:27.000 Just like monkeys have sounds for eagles.
01:32:29.000 You know, the monkeys have sounds for snakes.
01:32:31.000 They have sounds for different things.
01:32:33.000 This is a relatively recently known thing.
01:32:38.000 I think they used to think they were just making noises before.
01:32:41.000 Now they realize like, oh no, they have very specific noises for very specific things.
01:32:45.000 And I think that evolves.
01:32:47.000 Do you know that they think that apes right now are in the Stone Age?
01:32:51.000 Yeah, they've entered into the Stone Age.
01:32:54.000 You mean apes have evolved into the Stone Age?
01:33:18.000 Recently?
01:33:19.000 Yeah.
01:33:19.000 He's hanging onto a tree, and he's got a spear, and he's jabbing fish out of the water with his spear.
01:33:26.000 It's wild to see.
01:33:27.000 They're catching up with us.
01:33:28.000 Yeah.
01:33:28.000 Well, that's what happens.
01:33:30.000 Right.
01:33:30.000 I mean, the idea that they're going to stay the same.
01:33:33.000 Look at this.
01:33:33.000 Okay.
01:33:35.000 Oh, wow.
01:33:36.000 That's amazing.
01:33:37.000 How crazy is that?
01:33:38.000 He's fishing.
01:33:39.000 He's fishing with a spear.
01:33:41.000 That's pretty amazing.
01:33:43.000 That's just one of many instances of them using tools, using things.
01:33:50.000 They're starting to enter into this phase where they start...
01:33:56.000 Evolving.
01:33:56.000 I mean, and it's a slow process and we went through it somehow or another.
01:34:00.000 We went through it in an accelerated way.
01:34:02.000 That's very confusing also.
01:34:03.000 Like, the doubling of the human brain size over a period of two million years apparently is the biggest mystery in the entire fossil record.
01:34:11.000 Like, how does this one thing in the entire fossil record double over a period of two million years?
01:34:17.000 Like, what are the factors?
01:34:18.000 And there's a bunch of different ideas, but all of them are just ideas.
01:34:22.000 No one really knows.
01:34:23.000 Okay, so here's an interesting anecdote.
01:34:26.000 Nuclear war apes.
01:34:28.000 In 1975, there was this famous defense official who went from being a hawk to being like, we cannot have so many nuclear weapons.
01:34:37.000 His name was Paul Warnke.
01:34:39.000 And he wrote what was then a famous article in Foreign Policy magazine called, Apes on a Treadmill.
01:34:47.000 Such a great image.
01:34:48.000 His idea that the nuclear arms race was apes on a treadmill.
01:34:52.000 That we and the Russians were just slavish, you know, like essentially ignorant beasts just slaving away on this treadmill trying to win, not even realizing there is no winner.
01:35:05.000 And, you know, it was a famous article.
01:35:07.000 Everybody wrote about it and, I mean, spoke about it in, you know, D.C. and then it disappeared.
01:35:12.000 Well, the anecdote comes from recently a group of scientists wanted to try to answer the question that we're talking about, like, how did...
01:35:22.000 How did apes go from knuckle walking to being bipedal?
01:35:29.000 How did that happen?
01:35:30.000 We still don't know why.
01:35:32.000 And they were trying to figure out if it had to do with energy consumption.
01:35:36.000 So they outfitted apes.
01:35:37.000 They put them on a treadmill.
01:35:38.000 They outfitted them with oxygen masks.
01:35:40.000 And then they had humans doing the same thing.
01:35:43.000 And they were measuring, you know, energy levels.
01:35:48.000 And during this experiment, and they kept making them do it over and over again, the humans and the apes on the treadmills.
01:35:54.000 And during one of the experiments, one of the apes was basically like, screw this, I'm not doing this anymore.
01:35:59.000 He pushed the button and got off.
01:36:02.000 And the analogy to nuclear war is, if apes can figure out how to get off the treadmill, why can't humans?
01:36:14.000 Well, apes aren't being influenced by lobbyists.
01:36:20.000 Apes aren't watching TikTok.
01:36:22.000 Apes aren't having secret lunch meetings with the head of Raytheon.
01:36:25.000 Yeah, I don't know.
01:36:26.000 Who knows?
01:36:28.000 I think we can.
01:36:29.000 I don't think it's impossible.
01:36:31.000 I think you and I being here right now having this conversation where there's not a nuclear war going on speaks to that.
01:36:36.000 Like, we've had nuclear weapons for a long time and we haven't used them.
01:36:40.000 I don't think it's...
01:36:43.000 I don't think it's inevitability that we destroy ourselves, but it's a possibility.
01:36:47.000 And I think there's a lot of foolish people that are ignoring that possibility, and that's what's scary.
01:36:52.000 And what's scary is that the type of people that want to become president, congressmen, and senators: some of them are great people, and some of them are wise, and they're good leaders, and some of them are just people that are too ugly to be on TV. They're too ugly to be actors.
01:37:08.000 They can't sing, they want attention, and so they want to be a leader. And so they want to say the things that people want to hear, because those things get them positive attention and they feel good. And then they get a bunch of people who love them, and they feel good, and then they have their face on a billboard and they have bumper stickers, and, like, everybody likes me, and the people who don't like me, they're communists or losers. So it's a cult of personality thing that is just a part of being a charismatic person and garnering attention.
01:37:36.000 And that's a giant part of our whole political process, is narcissists and psychopaths and sociopaths that have embedded themselves into the system.
01:37:46.000 And then they all feed off each other and help each other, and they're all insider trading, and they're all involved.
01:37:50.000 And then when they leave office, they get paid to speak in front of bankers and make a half a million dollars.
01:37:57.000 We're overrun by sociopaths.
01:38:01.000 Why is that?
01:38:03.000 Well, it's a natural path.
01:38:05.000 Like, that's what most leaders are.
01:38:07.000 Most leaders of immense groups of people that force people into war are sociopaths.
01:38:12.000 It's the old adage that, you know, you had to be crazy to become president in the first place?
01:38:17.000 Yes.
01:38:17.000 Or you wouldn't become president because you would realize it's crazy?
01:38:20.000 Yeah, my take is that no one who wants to be president should be allowed to be president.
01:38:24.000 Almost no one.
01:38:25.000 And that's...
01:38:27.000 Well, you know, that's also part of the problem with the world we're living in today.
01:38:30.000 It's always the lesser of two evils.
01:38:35.000 That's our choice, always.
01:38:36.000 Well, it's also so peculiar what happens to people when they become president.
01:38:40.000 And I'll give you an example of, again, from the nuclear war concept of things.
01:38:45.000 Do you know about the policy launch on warning?
01:38:47.000 No.
01:38:48.000 Okay.
01:38:49.000 Do you know about sole presidential authority?
01:38:52.000 Yes.
01:38:52.000 Okay.
01:38:52.000 So you know the president alone launches nuclear war.
01:38:55.000 He doesn't ask the SECDEF, doesn't ask the chairman of the Joint Chiefs, doesn't ask Congress.
01:39:01.000 He launches.
01:39:03.000 Furthermore, we have a policy called launch on warning.
01:39:06.000 So when he's told that there is an incoming nuclear missile on its way to the United States, which is how this begins, he must launch on warning.
01:39:20.000 That is policy.
01:39:21.000 We do not wait.
01:39:22.000 That is a quote from former Secretary of Defense Bill Perry.
01:39:26.000 Now, before taking office, many of these presidents say that they are going to change that insane policy, because it essentially creates this volatile situation where every president, every foreign leader knows: they're going to launch on warning, and so am I. They say they're going to change the policy before they take office.
01:39:50.000 This is documented.
01:39:51.000 You know, Clinton, Bush, Obama, not Trump.
01:39:56.000 And then they don't and no one knows why.
01:40:00.000 Why do you think that is?
01:40:03.000 That you would have a position, before becoming president, that such an extreme policy needs to change, and then change your mind, or become silent on that.
01:40:17.000 I don't think anybody, like if I was going to have a conversation with Trump, the number one thing that I would want to ask him is, what is the difference between what you thought it was like and what it is like?
01:40:27.000 What happens when you get in there?
01:40:29.000 What happens when you get in?
01:40:30.000 No one knows.
01:40:31.000 It's so secretive.
01:40:33.000 The meetings between heads of state, between senators, congressmen, behind closed doors, all the different meetings with the head of the Pentagon, the head of the intelligence agencies, we don't know what those meetings are like.
01:40:49.000 No one can know.
01:40:50.000 It's a point of national security.
01:40:52.000 So if you're not president, of course you can't go to the fucking meetings.
01:40:55.000 So then you become president.
01:40:57.000 And then you get briefed.
01:40:59.000 So then they sit you down and they hit you with all the problems in the world that you don't know about, that nobody knows about but them.
01:41:06.000 And I bet they're numerous.
01:41:08.000 And I bet it's terrifying.
01:41:10.000 And I bet that's a giant factor as to why people change between – I mean, a lot of what they say when they're running for president, they know they're not going to do.
01:41:18.000 But they're saying it because they want people to vote for them.
01:41:20.000 And then they get in there and they go, I'm not going to release the JFK files.
01:41:23.000 They do things like that.
01:41:24.000 That's what they always do.
01:41:26.000 But I think a lot of it is also you can't know what you're talking about until you get that job.
01:41:54.000 Right.
01:41:56.000 Wouldn't that change every...
01:41:57.000 That's where I'm lost.
01:42:00.000 And also, it's so tiring reading these presidential manuals, or memoirs, rather, which then say absolutely nothing original to any of us, the citizenry, about what's really happening as president.
01:42:14.000 Well, because those are just designed to make money.
01:42:17.000 Those memoirs are not really designed to do anything other than generate income.
01:42:25.000 I want the real story from the president.
01:42:28.000 Right.
01:42:28.000 You're not going to get it.
01:42:30.000 You won't even get it if you're alone with them.
01:42:32.000 I mean, I think that there's probably some things that they say in there that they have to kind of skirt around it and figure out what to say, figure out how to say it.
01:42:41.000 Did you ever read Bill Clinton's My Life?
01:42:45.000 I don't think I could read a biography by him, to be fair.
01:42:49.000 He had one wild thing to say about the moon landing.
01:42:52.000 What did he say?
01:42:53.000 Really wild.
01:42:55.000 Well, he was saying that he was working.
01:42:57.000 I think he was doing construction in 1969 and showed up for work.
01:43:03.000 See if we can find the quote so I don't paraphrase it.
01:43:06.000 I think it's page 241 of Bill Clinton's My Life.
01:43:11.000 I might have the page wrong.
01:43:13.000 So in this scenario, he's talking to this carpenter.
01:43:19.000 And he's telling this carpenter, isn't it crazy?
01:43:22.000 They landed on the moon.
01:43:23.000 And the carpenter says something to the effect of, I don't believe anything those television fellers say.
01:43:32.000 And he goes, I think they can fake anything.
01:43:34.000 Somewhere along those lines.
01:43:35.000 And he goes, back then I thought that guy was a crank.
01:43:39.000 He goes, but after my eight years in the White House, I started thinking maybe he's ahead of his time.
01:43:45.000 Now just imagine saying that.
01:43:49.000 About the biggest achievement in human history, landing another human being on the surface of the moon, another planet, essentially.
01:44:00.000 One quarter the size of our planet, 250-something thousand miles away.
01:44:05.000 And he's saying, maybe this guy who thought it was fake is ahead of his time.
01:44:10.000 That's not...
01:44:11.000 That's not something you accidentally say.
01:44:14.000 That's not something you frivolously write down and just add it to your book.
01:44:19.000 Did you find the quote?
01:44:21.000 That is...
01:44:22.000 I gotta see that real quote because that is...
01:44:24.000 That's some really heavy subtext there.
01:44:26.000 Jamie will find it.
01:44:27.000 But when you read it, you're like, wait, what the fuck did he say?
01:44:31.000 Like, you're the...
01:44:32.000 And it got kind of glossed over except for the moon landing hoax community.
01:44:37.000 Yeah, I told you it's fake!
01:44:41.000 But that's interesting.
01:44:43.000 And that guy wrote all that longhand, by the way.
01:44:46.000 He wrote that all on, you know, composition books, just notebooks.
01:44:50.000 Wrote it all out.
01:44:51.000 And then it was just transcribed.
01:44:53.000 Yeah.
01:44:54.000 So this is not like an accident that he told this story.
01:44:59.000 Imagine saying that.
01:45:01.000 Imagine saying that this guy who thought the moon landing was fake, maybe he was ahead of his time.
01:45:07.000 You finding it?
01:45:09.000 Honestly, I can only find it written on Wikipedia, and it's linking to the book, but this is what it says on Wikipedia.
01:45:16.000 There it is.
01:45:17.000 Perfectly.
01:45:18.000 Perfectly.
01:45:19.000 Okay, here it goes.
01:45:20.000 Just a month before, Apollo 11 astronauts Buzz Aldrin and Neil Armstrong...
01:45:24.000 had left their colleague Michael Collins aboard spaceship Columbia and walked on the moon.
01:45:28.000 The old carpenter asked me if I really believed it happened.
01:45:30.000 I said, sure, I saw it on television.
01:45:32.000 He disagreed.
01:45:33.000 He said that he didn't believe it for a minute that them television fellers could make things look real that weren't.
01:45:39.000 Back then, I thought he was a crank.
01:45:42.000 During my eight years in Washington, I saw some things on TV that made me wonder if he wasn't ahead of his time.
01:45:50.000 Well, that's very weird and cryptic because he says, I saw some things on TV that made me wonder.
01:45:57.000 Yeah, meaning like things that were displayed to the American public that were horseshit.
01:46:03.000 Some things that are on TV that people are getting to look at where he knows it's not real.
01:46:08.000 Right.
01:46:08.000 So he's wondering.
01:46:09.000 And by dragging the moon into it, there's just no way that you can do anything but infer that he's referencing that.
01:46:16.000 Of course.
01:46:17.000 And has anyone ever asked him about that on follow-up?
01:46:20.000 Dot dot dot dot.
01:46:22.000 Could it be a nice edit for this?
01:46:24.000 This is a Wikipedia page about the moon landing conspiracies.
01:46:28.000 That's why I couldn't find it somewhere else.
01:46:29.000 I was trying to find it.
01:46:30.000 No, no, no.
01:46:31.000 I've read the quote.
01:46:32.000 It's just more of the same.
01:46:34.000 It's just abbreviated.
01:46:35.000 The whole thing, the point of the quote, the part that's the meat, is: the old carpenter asked me if I really believed it happened.
01:46:40.000 This is all not taken out of context.
01:46:45.000 So will politics in the United States ever change to a point where we can have individuals who accept their responsibility?
01:46:53.000 I mean, another crazy thing, again, interviewing Panetta about Clinton, you know, as his chief of staff, it's like, oh, this is Panetta talking, that the presidents are...
01:47:04.000 They're very under-informed about any of their responsibilities.
01:47:09.000 I'm sure.
01:47:09.000 About nuclear war.
01:47:10.000 They don't know how it unfolds.
01:47:12.000 They don't know how fast it is.
01:47:13.000 Did you know that they have a six-minute window to make a counterattack?
01:47:17.000 Six-minute window.
01:47:19.000 And that comes from President Reagan's memoir, by the way.
01:47:23.000 He said that there's a quote from him, which I have in the book, a six-minute window to have to make a decision to possibly end the world is irrational.
01:47:35.000 I think part of the problem, and to answer your question about leaders, politicians, whether to change, we're asking humans to not be human.
01:47:43.000 That's what we're asking.
01:47:45.000 We're asking them to be these perfect things.
01:47:46.000 And then now, we're looking up their ass with a microscope more than we've ever done before.
01:47:52.000 And we're also accepting that the other side is going to lie about this person's background and lie about what this person's done.
01:48:00.000 For three years, all you heard on the news was Russia collusion with Donald Trump.
01:48:06.000 They were trying to say Russia put him in office.
01:48:08.000 He's a Russian puppet.
01:48:09.000 He's a Russian thing.
01:48:09.000 It was all a lie.
01:48:10.000 They made it up and they said it everywhere and you're allowed to do that.
01:48:13.000 You're allowed to do that.
01:48:14.000 So not only are you taking a person and asking them to not be a person, but then you're looking at everything they've done, and you're allowed to lie about it, and you're allowed to lie about it in cahoots with the media, who spread this lie on television every day with no consequences.
01:48:33.000 So the problem is not just that we don't have good leaders; it's that I don't know if it's possible to have a good leader.
01:48:43.000 I don't know if those kinds of humans are real. And I wonder if AI, even though everyone's terrified of it, and I am too, I wonder if that's our way out.
01:48:58.000 I wonder if the only thing that can actually govern society fairly and accurately is an artificial intelligence.
01:49:05.000 It's something that doesn't have emotions and greed and narcissism and all the other contemptible traits that all of our politicians have.
01:49:14.000 What if it doesn't have any of that?
01:49:16.000 First of all, what if it's far smarter than us and realizes a fair and reasonable way to allocate resources, and to eliminate pollution, to mitigate overfishing and all the different issues that we have?
01:49:34.000 It looks at it in an absolutely accurate and brilliant way that we're not even capable of doing.
01:49:41.000 And this idea of us being governed by people and not wanting those people to behave like every person ever who's been in power.
01:49:50.000 Absolute power corrupts absolutely.
01:49:52.000 Everyone knows it, but we just assume that we're going to have this one guy that's like, his morals are so strong that when he gets in there, he's going to right the ship and drain the swamp.
01:50:03.000 I don't think those people are real.
01:50:05.000 I don't think that's a real thing.
01:50:06.000 I think the folly that human beings have displayed is built into the programming.
01:50:11.000 I don't think it's unusual.
01:50:15.000 I think it's usual.
01:50:16.000 Any president you ever admired?
01:50:18.000 Or part of their actions that you admired?
01:50:21.000 Well, there's a lot of human beings that I admire.
01:50:24.000 They're just flawed human beings I still admire.
01:50:29.000 I mean, Kennedy said a lot of amazing, brilliant things, but he's also deeply flawed.
01:50:33.000 And if he was alive today, oh my God, the skeletons, they would have pulled out of that guy's closet.
01:50:38.000 It would be crazy.
01:50:39.000 He would never make it.
01:50:40.000 He would never get anywhere close.
01:50:42.000 But that's just, he's a human.
01:50:44.000 And you could kind of be flawed and also have these brilliant takes on things like he did then.
01:50:52.000 But look what happened to him.
01:50:53.000 They fucking killed him.
01:50:54.000 And then they covered it up.
01:50:56.000 And then today, even today, they won't release the files.
01:50:59.000 Today!
01:51:00.000 I mean, that, speaking to the idea that conspiracies have become popular, or rather, you know, thinking that there is a conspiracy behind things, it's so astonishing that those files are not released after all this time.
01:51:17.000 It's almost impossible not to see...
01:51:22.000 Not to ask, what is behind that veil?
01:51:24.000 What is so important to cover up?
01:51:27.000 It is impossible.
01:51:28.000 And it does make me think that whatever it is, is such a poor reflection on America that the president therefore agrees, okay, we won't do that.
01:51:39.000 Yeah, I think it would cause a deep rift in our society that maybe we can't really handle.
01:51:45.000 It's possible.
01:51:46.000 It's possible that it's the CIA that did it and that we're going to realize that these people and some of them may even still be alive.
01:51:57.000 From all of the different sources I spoke to over the decades, I always get the sense that it was a nation-state, and that nation-state happens to be nuclear-armed, and even today people would demand consequences.
01:52:15.000 It's possible, but the people that were involved in the Warren Commission report, some of them were like Allen Dulles.
01:52:22.000 Dulles was fired by Kennedy.
01:52:24.000 Imagine, you get assassinated and the guy who you fired who fucking hates you gets to write the story of what happened to you.
01:52:34.000 I mean, there's a lot that goes into that that's very, very deep.
01:52:39.000 There's a lot.
01:52:40.000 The Warren Commission Report is horseshit.
01:52:43.000 If you read it, and if you...
01:52:45.000 Like, David Lifton wrote a book on it called Best Evidence.
01:52:49.000 Did you ever read it?
01:52:51.000 No, but I know about him from Tom O'Neill in the Chaos book, which is so awesome.
01:52:56.000 He studied the entire Warren Commission report, which almost nobody had at the time.
01:53:01.000 And he was like, this is filled with inconsistencies.
01:53:03.000 None of this makes any sense.
01:53:04.000 There's so much wrong with all these different things that they're saying that he started doubting it.
01:53:10.000 And then he started looking into the assassination itself and finding...
01:53:14.000 How many witnesses had died mysteriously?
01:53:17.000 The whole thing reeks of conspiracy from top to bottom, from Jack Ruby showing up and killing Lee Harvey Oswald, to Jolly West visiting Jack Ruby in jail, and also Jack Ruby goes insane.
01:53:31.000 To the fact that Jolly West was the head of MK Ultra, which likely supplied Manson with LSD, which ran the Haight-Ashbury Free Clinic, which was where they were giving people LSD, which was running Operation Midnight Climax, where they were dosing up johns with LSD and watching them through two-way mirrors.
01:53:50.000 Like, all this is real.
01:53:52.000 Like, this is all indisputable, absolute truth that was going on at the exact same time.
01:53:57.000 And to think that...
01:53:59.000 That's the only thing that we're lying about.
01:54:00.000 It's just that stuff.
01:54:02.000 Everything else is above board.
01:54:03.000 That stuff was important.
01:54:04.000 We were just trying to get to the bottom of things.
01:54:06.000 There was certainly not the same degree of alert and knowledgeable citizenry in the 50s and 60s.
01:54:13.000 Everyone just took everything at face value.
01:54:16.000 And it is remarkable as a historian, and Tom O'Neill's work also speaks of that, to go back in time and look at that.
01:54:24.000 Part of it, perhaps just as interesting as the facts of the matter, like the Warren Commission, is to ask: how did everyone simply accept this as fact?
01:54:33.000 But then I think it's valuable to have the old look in the mirror moment and go, what is it today that we're not looking at?
01:54:44.000 What is it today that will be in 10, 20, 30 years from now?
01:54:49.000 I can't believe they were all falling for that concept.
01:54:52.000 That's the old saying.
01:54:55.000 You can't fix what you can't face.
01:54:58.000 I think it's going to be the influence of pharmaceutical drug companies.
01:55:02.000 And also processed food companies.
01:55:06.000 We know now that processed food companies, major food manufacturers are paying food influencers to say that all food is good food and to talk about body positivity.
01:55:19.000 That's all motivated by them to sell more Oreos and whatever the fuck they sell.
01:55:23.000 We know that.
01:55:24.000 We know that's a fact.
01:55:25.000 It reminds me of the hypocrisy I feel and think about behind smoking.
01:55:30.000 I live in Los Angeles, and in the city of Beverly Hills, you cannot smoke anywhere.
01:55:37.000 You will get a ticket if you smoke outside.
01:55:40.000 But you can do fentanyl in your tent.
01:55:42.000 And I don't smoke, but all you have to do is research it...
01:55:47.000 Think about how they created the whole smoker's world.
01:55:51.000 You see those old ads from the 50s and 60s where the doctor, the OBGYN is smoking while they're visiting with the pregnant woman and encouraging her to smoke because it will make her relax.
01:56:02.000 Did you see asthma cigarettes?
01:56:03.000 Have you ever seen those?
01:56:04.000 Asthma cigarettes?
01:56:06.000 They had asthma cigarettes.
01:56:07.000 They had cigarettes that they prescribed for people with asthma.
01:56:09.000 They did not.
01:56:10.000 Yeah, this just shows you how evil corporations are, even back then, just because they could get away with it.
01:56:16.000 And back then, there was no internet, so you couldn't find out, like, hey, you shouldn't smoke any fucking cigarettes when you have asthma.
01:56:22.000 Look at this.
01:56:24.000 Asthma cigarettes.
01:56:26.000 In cases of asthma, cough, bronchitis, hay fever, influenza, and shortness of breath, smoke cigarettes.
01:56:33.000 Asthma cigarettes.
01:56:34.000 For your health.
01:56:37.000 A temporary relief.
01:56:38.000 It's ubiquitous.
01:56:40.000 When I was researching the nuclear war book, I came across a quote from General Groves.
01:56:45.000 Remember him from the Oppenheimer film?
01:56:47.000 The Matt Damon character?
01:56:48.000 He was at a commission.
01:56:50.000 This is like a year after the atomic bombing of Hiroshima.
01:56:53.000 And he was asked about radiation.
01:56:55.000 And he said, quote, radiation is a very pleasant way to die.
01:57:01.000 Whoa.
01:57:02.000 That's what he told Congress.
01:57:03.000 Jesus.
01:57:04.000 And so it's everywhere with officials until— Safe and effective.
01:57:11.000 It's a pleasant way to die.
01:57:14.000 Yeah, it's just gaslighting.
01:57:15.000 And people have been doing that forever.
01:57:18.000 It's whenever they can get away with it.
01:57:19.000 If they can speak eloquently and phrase things in a way that can sort of shift opinion one way or the other, they do that.
01:57:28.000 Yeah, especially when there are people in power or when a person in power assigns someone to go be a spokesperson.
01:57:35.000 Yeah, it's dangerous.
01:57:37.000 And I think that's another place where artificial intelligence may help us.
01:57:43.000 I think we're going to get to a place where lying is going to be impossible.
01:57:48.000 I don't think it's going to be possible within the next 10 years to lie.
01:57:51.000 I think it's all out the window.
01:57:52.000 I think right now we're worried about people being in control of artificial intelligence because they can propagate misinformation and they can just create deepfakes.
01:58:02.000 I think that's going to be a problem for sure, and it's certainly something that we should consider, but I think that what's going to happen in the future is we will likely merge minds through technology in some very bizarre way, and I think information will flow much more freely.
01:58:20.000 We'll be able to translate things.
01:58:22.000 We'll be able to know thoughts. We perhaps will come up with a universal language that everyone will understand.
01:58:31.000 And you'll be able to absorb it almost instantaneously because you're going to have some sort of a chip, whether it's a Neuralink or some device that you wear or something that links you up, and we're going to have a completely different type of access to information.
01:58:46.000 Just like...
01:58:48.000 Before language, they had grunts and they pointed at things.
01:58:51.000 Then they started writing things down.
01:58:53.000 They had carrier pigeons.
01:58:54.000 They had smoke signals.
01:58:55.000 They had all these different methods to get information out.
01:58:57.000 Now you have video.
01:58:59.000 You send a video to the other side of the fucking planet and it gets there immediately.
01:59:03.000 It's one of the wildest things that we've ever created and we take it for granted.
01:59:07.000 You could be FaceTiming someone in New Zealand and you're looking at each other in real time and having a conversation.
01:59:13.000 They're on the other side of the world.
01:59:15.000 You could send them a video.
01:59:16.000 It gets there like that.
01:59:18.000 You could download things from their websites.
01:59:20.000 You get it like that.
01:59:21.000 You're streaming things instantaneously.
01:59:23.000 Completely different way of accessing information.
01:59:26.000 I think this is that times a million.
01:59:29.000 I think this is grunts to video, instantaneously, and in some way that we can't even really imagine, because we don't have the framework for it.
01:59:43.000 We don't have the context.
01:59:43.000 We don't have the structure.
01:59:44.000 We don't have this thing that exists right now that can do these things.
01:59:47.000 But once it does, and once people link up, I think there's going to be a whole new way of human beings interacting with each other that could eliminate war.
01:59:55.000 It could eliminate all of our problems.
01:59:57.000 It really could, but we won't be us anymore.
02:00:01.000 Romance and dangerous neighborhoods and all those things are gonna go away.
02:00:06.000 Like crime and all that shit's gonna go away.
02:00:09.000 We're gonna be living in some bizarre, hybrid, cyborg world.
02:00:14.000 And it's just going to be the new thing.
02:00:16.000 Just like the new thing is, you have a phone on you.
02:00:18.000 I have a phone on me.
02:00:19.000 We carry a phone everywhere.
02:00:21.000 You're gonna be linked into this.
02:00:23.000 Just like you're linked into your social media and your email.
02:00:26.000 You're gonna be linked into this, but you're gonna be linked physically into this.
02:00:29.000 And we're all gonna be into this.
02:00:31.000 And AI is probably going to be running the world.
02:00:35.000 It's probably going to be artificial intelligence that governs the biological units, the humans.
02:00:42.000 Okay, here's the dystopian version of that, which I just heard about recently, on your point about language, where you said we're all going to be speaking the same language.
02:00:51.000 So in the defense world, there's a movement now for drone swarms.
02:00:56.000 You must know about this.
02:00:58.000 Yeah.
02:00:58.000 Right?
02:00:58.000 So now there's a movement because things are happening.
02:01:02.000 Technology is moving so fast.
02:01:03.000 There's a movement to have the drones communicate with one another through an AI language.
02:01:12.000 I don't know where this is going.
02:01:13.000 It is non-hackable.
02:01:17.000 So the different pods, the different drone swarms will have language that they will invent and they will know and we will not know.
02:01:28.000 That becomes a little troubling when you consider that if there's a human in the system, then the human can interface with the drones provided that the human has access to that AI language.
02:01:40.000 But very easily the AI language could decide not to include the humans.
02:01:46.000 And probably would.
02:01:47.000 We're ridiculous.
02:01:48.000 You know, did you see the story?
02:01:50.000 I believe it was – was it Facebook or Google's AI that they had to shut down because they were talking to themselves in a new language?
02:01:58.000 Facebook.
02:01:59.000 Facebook.
02:01:59.000 Facebook test, yeah.
02:02:00.000 Yeah.
02:02:01.000 So Facebook had developed artificial intelligence, and this artificial intelligence was computing.
02:02:07.000 These computers were communicating with other computers.
02:02:10.000 And they shut them down because they started communicating in a language that they made up.
02:02:15.000 That's fascinating.
02:02:16.000 But wait, had they been given a command to make up their own language?
02:02:21.000 No, that's the problem.
02:02:22.000 So they just took it upon themselves?
02:02:23.000 You know, they don't really totally understand what's going on with artificial intelligence in the sense of like how it's doing what it's doing.
02:02:31.000 And they do a lot of things that they don't understand why they're doing it.
02:02:34.000 Because they set a framework and they give them information and they set, they're trying to like mold them, but essentially they're thinking.
02:02:45.000 That's what's bizarre.
02:02:47.000 Okay, this was published in 2017. Facebook abandoned an experiment after two artificially intelligent programs appeared to be chatting to each other in a strange language only they understood.
02:02:57.000 Whoa, and that's 2017?
02:02:59.000 Yeah.
02:02:59.000 Two chatbots came to create their own changes to English that made it easier for them to work, but which remained mysterious to the humans that supposedly look after them.
02:03:07.000 Bizarre discussions came as Facebook challenged its chatbots to try and negotiate with each other over a trade, attempting to swap hats, balls, and books.
02:03:15.000 Each of which was given a certain value, but they quickly broke down as the robots appeared to chant at each other in a language that they each understood, but which appears mostly incomprehensible to humans.
02:03:29.000 That's crazy.
02:03:30.000 And that's seven years ago.
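For readers curious about the shape of that experiment, here is a toy sketch of the negotiation game the article describes. Everything in it is invented for illustration: the real Facebook agents were learned models exchanging natural-language messages, not the scripted rules below, and the reported strangeness came precisely from nothing rewarding those learned messages for staying in readable English.

```python
# Toy sketch of the negotiation game described above. The item names come
# from the article; the counts, values, and acceptance rule are made up.
POOL = {"hats": 3, "balls": 2, "books": 1}

VALUES = {  # hypothetical private valuations for each agent
    "A": {"hats": 1, "balls": 3, "books": 2},
    "B": {"hats": 3, "balls": 1, "books": 2},
}

def total(agent, bundle):
    """Value of a bundle of items to one agent."""
    return sum(VALUES[agent][item] * count for item, count in bundle.items())

def greedy_claim(agent):
    """Claim every unit of the two item types this agent values most."""
    ranked = sorted(POOL, key=lambda item: -VALUES[agent][item])
    return {item: (POOL[item] if item in ranked[:2] else 0) for item in POOL}

def negotiate(rounds=10, threshold=0.6):
    """Alternate offers until the responder's share clears its threshold."""
    claim = greedy_claim("A")  # A opens; 'claim' is what the proposer keeps
    responder = "B"
    for _ in range(rounds):
        leftover = {item: POOL[item] - claim[item] for item in POOL}
        if total(responder, leftover) >= threshold * total(responder, POOL):
            return responder, leftover  # deal: responder accepts the rest
        claim = greedy_claim(responder)  # reject and counter-offer
        responder = "A" if responder == "B" else "B"
    return None, None  # talks break down

print(negotiate())  # -> ('B', {'hats': 3, 'balls': 0, 'books': 0})
```

A scripted version like this always stays legible; in the real experiment the message policy itself was learned, so once nothing penalized drift away from English, the exchanges degenerated into shorthand only the two models understood.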
02:03:59.000 When biologists first figured out that they could combine DNA, there were massive discussions among scientists to curtail this technology.
02:04:11.000 And it just kind of evaporated from people's minds and then suddenly you have CRISPR. And again, these are all the dual-use issues.
02:04:19.000 No doubt.
02:04:21.000 Medicines, antivirals, things that can help the people.
02:04:27.000 Have you seen the robot they made recently out of human skin?
02:04:30.000 Living human tissue?
02:04:32.000 No.
02:04:32.000 It smiles and they can make it move.
02:04:35.000 I think it was a Japanese creation.
02:04:38.000 Like skin of a deceased person?
02:04:40.000 No, I think they took human skin tissue.
02:04:43.000 And grew it?
02:04:44.000 Here it is.
02:04:44.000 Where was this from?
02:04:45.000 Yeah, Japanese scientists create living robot skin with human cells.
02:04:51.000 So this thing, they can make it smile.
02:04:54.000 So this is living tissue, living human tissue, and they can manipulate it and make it do things.
02:05:03.000 Look, look.
02:05:04.000 Did they add googly eyes to it?
02:05:06.000 Yeah, they added eyes to it.
02:05:07.000 It's eyes.
02:05:08.000 But look, they can make it smile.
02:05:11.000 That's really frightening.
02:05:12.000 Right, so how long before there's an artificial human?
02:05:15.000 Are we even ten years away from an artificial person?
02:05:17.000 I don't think we are.
02:05:18.000 Probably not.
02:05:19.000 And what's even more remarkable to me about that is that's Japan.
02:05:24.000 And Japan follows treaties and rules about, you know, human testing and whatnot.
02:05:29.000 And imagine what is going on in countries that don't adhere to those treaties.
02:05:34.000 I mean, China has no restrictions.
02:05:36.000 The government and their businesses, any corporation, any company, they're completely intertwined.
02:05:45.000 They're there for the CCP. Yeah, 100%.
02:05:46.000 So the CCP can make immense progress in any direction that they want without any interference from politicians, activists, all these people that get in the way.
02:06:00.000 Like, hey, you shouldn't be cloning people.
02:06:02.000 Shut the fuck up.
02:06:03.000 No one gets to say anything over there.
02:06:05.000 So the government gets to decide and they're going to do everything that's going to be best for the overall power of China.
02:06:11.000 Which is where you get that chicken and egg paradox that we talked about earlier having to do with strong defense, your theory of the military-industrial complex.
02:06:21.000 Well, you have adversaries and enemies who not only benefit from intellectual property theft, they steal the technology that our R&D has spent decades working on and developing.
02:06:36.000 They just take that, so they begin... Yeah.
02:06:58.000 For why strong defense is so necessary, why we must constantly be pushing the envelope.
02:07:05.000 And it's hard to wrap your brain around that in a balance of, well, what makes the most sense and how are we not going down a path that is leading toward this dystopian future we've been talking about.
02:07:20.000 Yeah, I don't know.
02:07:22.000 I mean, I think the concern that human beings have is real.
02:07:25.000 I think the possibility of everything going sideways is real.
02:07:29.000 It clearly did in Hiroshima and Nagasaki.
02:07:31.000 They implemented, they actually dropped these bombs.
02:07:36.000 They really did it.
02:07:37.000 And it's fucking madness.
02:07:38.000 We killed hundreds of thousands of innocent civilians and they really did it.
02:07:45.000 The idea that that can't happen now.
02:07:47.000 We only kill in small numbers now.
02:07:49.000 We like to kill like 20, 30,000.
02:07:51.000 We don't go to billions.
02:07:53.000 Like, what?
02:07:54.000 You know, I interviewed just three days ago a woman named Setsuko Thurlow who lived through the Hiroshima bombing.
02:08:02.000 She was 13 years old.
02:08:04.000 I write about her in Chapter 2 of this book.
02:08:06.000 She's called The Girl in the Rubble.
02:08:08.000 Wow.
02:08:09.000 And she's now 92 or 93. She's alive still.
02:08:13.000 She's alive and she's still doing interviews.
02:08:16.000 She spoke to me with a ferocity and an intensity.
02:08:21.000 She wants the world to be free of nuclear weapons.
02:08:25.000 And she's been working on this issue her whole life.
02:08:29.000 Wow.
02:08:29.000 It was so remarkable.
02:08:30.000 Joe, it was so interesting too, to see, just from a deeply personal level, how motivated a human being can be, how young and spirited she was.
02:08:43.000 Like, she did a Zoom with me from her apartment in Vancouver, in Canada.
02:08:49.000 Like, 92, 93. Just as vibrant as a person could possibly be.
02:08:56.000 Outliving all these other people.
02:08:58.000 Because she survived.
02:09:00.000 Because she has a clear message.
02:09:03.000 Where was she when the bomb dropped?
02:09:04.000 She was 1.1 miles from ground zero.
02:09:11.000 She was buried in rubble.
02:09:13.000 And she tells this remarkable story of, you know, thinking she had died, and then realizing there was a hand on her shoulder, and it was someone else telling her to leave the building, because otherwise,
02:09:29.000 you know, she would have died.
02:09:30.000 Fires were beginning.
02:09:32.000 And her whole statement about her whole life is, you know, climb out of the dark and into the light.
02:09:42.000 And it's so powerful.
02:09:44.000 And sometimes I think about her life experience as a human to have survived something like that.
02:09:52.000 And to have, you know, she accepted the Nobel Peace Prize in 2017 on behalf of ICAN. Imagine what that must have been like for her.
02:10:03.000 Imagine being there.
02:10:05.000 All of her friends.
02:10:08.000 She was in a girl's school and they worked for the Japanese military.
02:10:14.000 That's how desperate the Japanese army was at the time.
02:10:17.000 They had the 13-year-old girls working for them.
02:10:21.000 And she tells these horrific details that she could remember.
02:10:25.000 God.
02:10:26.000 And she's just the brightest star and the hugest advocate.
02:10:30.000 And people like that are just so inspiring.
02:10:32.000 That's like the true...
02:10:33.000 That's the thing about this concept of the apocalypse.
02:10:36.000 The apocalypse has already happened.
02:10:38.000 It just didn't happen everywhere.
02:10:40.000 If you lived in Hiroshima, the apocalypse happened.
02:10:44.000 It's real.
02:10:45.000 It's like for your world, your world is gone.
02:10:51.000 That's happening in different places of the world right now.
02:10:54.000 There's different places of the world where the end is already here.
02:10:56.000 The most horrific possibility has already manifested itself.
02:11:01.000 It's already real.
02:11:02.000 It's just not everywhere.
02:11:03.000 All at once.
02:11:04.000 That's what we're really worried about.
02:11:06.000 It's like we'll take a little bit of apocalypse.
02:11:09.000 In like little spots, as long as it's over there, you know?
02:11:12.000 Yes, yes.
02:11:13.000 It's a little apocalypse over in Iraq, a little apocalypse over in Afghanistan.
02:11:16.000 We'll take a little bit of that.
02:11:18.000 A little apocalypse here, Yemen.
02:11:20.000 Take a little bit here and there, a little devastation, destruction, as long as it's not over here.
02:11:25.000 And we are so unable to see the whole thing.
02:11:31.000 We're so focused on our work and our life and the thing that gets us going every day.
02:11:38.000 We can't, all of us together, we're acting collectively, but we're thinking individually.
02:11:43.000 And we don't feel connected to it all.
02:11:45.000 We feel helpless and lost.
02:11:47.000 And then we, like, again, look to a daddy.
02:11:50.000 Some Trump character or a Reagan character or an Obama character.
02:11:54.000 Help us.
02:11:55.000 Help us out.
02:11:56.000 He's going to be the one.
02:11:57.000 Our future of democracy is going to depend upon this one person.
02:12:01.000 We won't stop.
02:12:03.000 We won't stop.
02:12:04.000 We keep moving forward.
02:12:05.000 We just got to hope we have guardrails in place so we don't go off the fucking cliff.
02:12:09.000 We're not going to stop moving.
02:12:11.000 No one's interested in stopping.
02:12:13.000 No one's interested in completely stopping technology, stopping all of our foreign policies.
02:12:19.000 No one is interested in any of that.
02:12:21.000 Everyone just accepts that we're going to keep marching forward.
02:12:26.000 Everyone's terrified of AI. No one's stopping it.
02:12:29.000 No one's stopping it.
02:12:30.000 They're not doing a damn thing to stop it, and they can't, because if they do, China won't.
02:12:35.000 But do you see an analogy between nuclear weapons and AI? Yeah.
02:12:39.000 The nuclear weapons buildup of the 50s and 60s was done without guardrails.
02:12:44.000 There was one point in 1957, Joe, we were making... No.
02:12:52.000 We were making five nuclear weapons a day.
02:12:56.000 Almost 2,000 in one year.
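(That rate is plain arithmetic: 5 weapons a day × 365 days = 1,825 weapons, which is indeed almost 2,000 in a single year.)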
02:12:59.000 That's so crazy.
02:13:00.000 That is so – I mean the – and so of course Russia was doing the same or aspired to do the same.
02:13:06.000 And so there were no guardrails.
02:13:08.000 And the sort of elixir being sold was we need this.
02:13:12.000 More nuclear weapons will make us more safe.
02:13:14.000 Right.
02:13:15.000 Isn't that also a message for today of, okay, so where are the guardrails on AI? And I think part of that comes from the fact that you say AI and most people, for good reason, don't really know precisely what that means.
02:13:33.000 So I believe that the kind of conversations we're having are all part of it, because half the people listening to this or watching this will go Google what is AI, what it really is, and then find out it's machine learning.
02:13:50.000 So you begin to have more literacy in your own being and more comfort to be able to talk about it and have a voice about it.
02:14:00.000 I think the difference in the comparison is that nuclear war is only bad.
02:14:05.000 Nuclear weapons are only bad.
02:14:07.000 And what artificial intelligence might be able to do is solve all the problems that our feeble minds can't put together.
02:14:15.000 And when you do create something that is intense in its ability... like, we're talking about something that's going to be smarter than us in four years, smarter than all people alive in four years, and then it's probably gonna make better versions of itself. So think about what that kind of intelligence can do in terms of resource application, in terms of mitigating all the problems of pollution,
02:14:46.000 particulates in cities, chemical waste, all these different things.
02:14:52.000 It's going to have solutions that are beyond our comprehension.
02:14:55.000 It's going to come up with far more efficient methods of creating batteries, battery technology, where we're not going to have to rely on coal-powered plants.
02:15:06.000 We're gonna have batteries that last 10,000 years.
02:15:09.000 We have wild shit, like really soon.
02:15:12.000 So there's a lot of what it's going to do that's probably going to benefit mankind.
02:15:16.000 AI has already shown that it's superior in diagnosing diseases and illnesses.
02:15:22.000 And people have used ChatGPT to find out what's wrong with them when they've gone to a host of doctors and doctors can't.
02:15:28.000 They've run their blood work through it.
02:15:30.000 And then ChatGPT says, oh, you got this.
02:15:33.000 And this is just the beginning.
02:15:36.000 ChatGPT right now is like a kid.
02:15:38.000 You know, it's going to be a professor in a couple of years.
02:15:41.000 It's going to be like, you know...
02:15:44.000 Like a super genius, 187 IQ human being in a couple of years.
02:15:50.000 And then after that, it's going to be way smarter than anybody that's ever lived, and it's going to make better versions of itself.
02:15:57.000 And if we can get it to work for the human race, it could probably solve a lot of the issues that we have that we just keep...
02:16:06.000 Fucking up over and over again.
02:16:08.000 We keep having pipelines breaking in the ocean and flooding the ocean with oil.
02:16:12.000 We keep having all sorts of chemical waste disasters, and there's a lot of problems that human beings can't seem to come to grips with, like the East Palestine thing with the derailment of the...
02:16:25.000 That place is fucked forever.
02:16:27.000 It's fucked for a long time, and no one's even talking about it.
02:16:29.000 We've kind of forgot about that place in the news.
02:16:32.000 You know, no one even visited it.
02:16:36.000 Thinking back to the earlier part of this conversation with our hunter-gatherer ancestors, the argument is: was the spear and the arrowhead, was that...
02:16:48.000 Did that come out of man's imagination for warfare or to make it easier to kill the wildebeest or the woolly mammoth?
02:16:57.000 It was both.
02:16:58.000 I think it was both.
02:16:59.000 Both.
02:16:59.000 So the analogy, you know, where will the AI go with that?
02:17:03.000 Because you're talking about all these very healthy ideas and solutions.
02:17:08.000 Right.
02:17:09.000 Just because of what I write about and who I speak to, I cannot help but see the powerful defense industry taking the pole position and making it secret.
02:17:26.000 In terms of which direction AI is really going to accelerate.
02:17:31.000 It's going to be a dangerous bridge that we have to cross, but I equate that with the internet.
02:17:36.000 The internet was initially set up so that universities and the military, they can communicate with each other.
02:17:43.000 ARPANET, right?
02:17:44.000 But what did it become after that?
02:17:46.000 Well, once it got into the hands of the people, then it became this bizarre force of information and changing culture and a lot of negatives and a lot of positives.
02:17:54.000 It's clearly being manipulated by certain states.
02:17:57.000 It's definitely like they're doing it to ramp up dissent and make people angry at each other and propagate misinformation.
02:18:05.000 But it's such a force of change, and I don't think they anticipated it.
02:18:10.000 And I think once that genie got out of the bottle, I mean, if they could go back and stop the internet from being available to people, oh my God, would they do that?
02:18:18.000 They'd go back in time in a moment.
02:18:19.000 You think so?
02:18:20.000 100%.
02:18:20.000 Yeah.
02:18:21.000 Wow, you think...
02:18:21.000 Especially the people in control?
02:18:22.000 Yeah.
02:18:23.000 Why would corporations, why would the military...
02:18:25.000 People would rather have you listening to the radio and watching ABC TV. Yes, you get your fucking news from CNN, goddammit, and that's it.
02:18:31.000 Fascinating.
02:18:32.000 Yeah, I don't think they would ever want this to happen.
02:18:34.000 I mean, they just signed, there was a new Supreme Court ruling.
02:18:40.000 What was it?
02:18:41.000 Was it Wyoming?
02:18:42.000 No?
02:18:43.000 Missouri?
02:18:44.000 What was the state that they just passed a ruling where they were saying that the government is allowed to pressure social media companies into removing content that they don't want on there?
02:18:59.000 Well, this was the whole point of the Twitter files, right?
02:19:01.000 The FBI was trying to block the Hunter Biden laptop story, saying that it was Russian disinformation, which it turned out to not be at all, and they knew it.
02:19:09.000 And so they got all these intelligence officials to sign off on this thing.
02:19:12.000 And they lied.
02:19:14.000 And it's essentially a form of election interference.
02:19:19.000 This is it.
02:19:20.000 Okay.
02:19:21.000 The Supreme Court on Wednesday said the White House and federal agencies such as the FBI may continue to urge social media platforms to take down content the government views as misinformation, handing the Biden administration a technical, if important, election year victory.
02:19:35.000 That's not a victory.
02:19:37.000 That's bad for people because they've already shown that what they're silencing was real.
02:19:43.000 They've shown that just within recent years, What they were trying to get removed from social media turned out to be accurate information.
02:19:52.000 On some level, doesn't that empower people?
02:19:54.000 Because they see those victories and they become more curious and they become more thoughtful in their...
02:20:03.000 in the way in which they're going to examine information that gets presented to them in the future?
02:20:08.000 Perhaps, but they're not going to get access to that information now because they're getting that information from social media and it's going to get removed from social media.
02:20:16.000 You know what happened with the Hunter Biden laptop story?
02:20:20.000 They completely eliminated your ability to have it on Twitter.
02:20:24.000 You couldn't share it in a DM. You couldn't post it.
02:20:28.000 It would get immediately taken down.
02:20:30.000 That was the New York Post.
02:20:32.000 It's the second oldest newspaper in the country.
02:20:36.000 Very long established newspaper.
02:20:39.000 And they blocked that.
02:20:41.000 And so that's the FBI, right?
02:20:44.000 That's the same people that they said, you guys are so good at this, we're going to let you keep doing it.
02:20:49.000 You know, we're going to rule that you're still allowed to do that.
02:20:52.000 Take down misinformation.
02:20:54.000 I think my point is that the pushback is sometimes as powerful as the attempt to censor.
02:21:02.000 Meaning, in other words, if you look at China and you look at what Mao did with just completely obliterating access to information, and in a communist environment, nothing's changed, and that's tragic for everyone living there.
02:21:16.000 But even if I think of the Hunter Biden story, my own self, who was maybe busy with something I can't remember what and didn't get involved in that then, I read about it now and learn from it and say, wow, that's really interesting that that happened.
02:21:32.000 So I think that... maybe I believe, I'm too much of an optimist in that regard, that when things come to light, they become powerful when you shine the light on them.
02:21:46.000 So it doesn't necessarily... and I also, maybe as more of a pragmatist, know that the government is always up to something.
02:21:56.000 This side – it's why I don't write about politics.
02:21:59.000 I always take essentially with a grain of salt what one side is saying about the other side that consider themselves adversaries because they're just going to be completely biased.
02:22:12.000 It's why I like having discussions with so many different kinds of people on all different sides of the aisle.
02:22:19.000 What a brain invigorator to be able to sit with someone that I might not agree with.
02:22:25.000 I might not like who they vote for.
02:22:28.000 But their ideas are interesting.
02:22:31.000 Even if nothing other than we all were chimps once upon a time.
02:22:34.000 Yeah.
02:22:34.000 No, it's all interesting.
02:22:36.000 I believe what you're saying is correct about shining the light on things that come to light.
02:22:41.000 The problem is this is like a direct attempt to stop the light.
02:22:44.000 And the scary thing is social media is generally the way people find out about information that's not being released in the mainstream.
02:22:53.000 What is this, Jamie?
02:22:55.000 I'm reading the Scotus blog, which is the Supreme Court blog talking about this.
02:23:02.000 They reversed the decision so it could go back for more proceedings.
02:23:05.000 One of the judges said there was a lack of evidence or a lack of concrete link.
02:23:10.000 So they're asking just for more proof, which they also said is a tall order.
02:23:15.000 They need to have a substantial risk.
02:23:16.000 Proof for what?
02:23:17.000 Right here.
02:23:18.000 A substantial risk that in the near future at least one platform will restrict the speech of at least one plaintiff in response to the actions of at least one government defendant.
02:23:28.000 That's what this is about, I guess.
02:23:30.000 Here she stressed that's a tall order.
02:23:31.000 The plaintiff's main argument for standing, Barrett observed, is that the government officials were responsible for restrictions placed on them by social media platforms in the past, and that the platforms will continue under pressure from the government officials to censor their speech in the future.
02:23:44.000 Yeah, that's a problem.
02:23:46.000 Look, I think Elon's take on social media is the correct take.
02:23:52.000 You've got to let everybody talk.
02:23:54.000 Yes, but you could also let everybody talk around the dinner table.
02:23:58.000 Yeah, but they're not going to do that.
02:23:59.000 But they should.
02:23:59.000 Right, but there's no way to share information worldwide around the dinner table, right?
02:24:04.000 These things are very important.
02:24:05.000 They need to be addressed en masse, and they need to be found out en masse.
02:24:10.000 People need to find out about it, and then they need to be outraged, they need to put pressure on the politicians.
02:24:15.000 That's the whole reason why they're trying to censor them.
02:24:17.000 They're not trying to censor them because they're worried about misinformation.
02:24:21.000 If it's misinformation, you say it's misinformation, then everybody realizes, oh, that's bullshit.
02:24:24.000 And some people believe it, but that's always going to be the case.
02:24:27.000 What they're trying to do is control the narrative.
02:24:29.000 That's what they've always been able to do.
02:24:32.000 They've shown, they've demonstrated, by the Hunter Biden laptop thing, through the Twitter files, they've shown they can't be trusted.
02:24:40.000 We can't trust what you say is a misinformation if you just lied three years ago.
02:24:45.000 Right.
02:24:45.000 You're lying.
02:24:47.000 You lied.
02:24:48.000 Okay.
02:24:49.000 Are those people still working there?
02:24:50.000 The people that lied?
02:24:51.000 They are.
02:24:52.000 Are they still in positions?
02:24:53.000 Yeah, they are?
02:24:54.000 Okay.
02:24:55.000 What are we talking about then?
02:24:57.000 If you don't clean house, how are we going to give you the power to censor what you say is misinformation? You have to be really sure it's misinformation.
02:25:09.000 And you should tell us how you know it's misinformation.
02:25:12.000 And you should allow people to examine that information and come to the same or different conclusions and debate those people.
02:25:20.000 Let's find out what's real.
02:25:22.000 That's not what they want to do.
02:25:24.000 It's an appeal to authority.
02:25:25.000 They want one group to be able to dictate what the truth is.
02:25:30.000 And that group is entirely motivated by money and power.
02:25:35.000 That's not good.
02:25:36.000 That's not good for anybody.
02:25:37.000 The thing that I'm hopeful about with AI is AI won't be motivated by those things if it becomes sentient.
02:25:45.000 And I don't think we're going to be able to stop that from happening.
02:25:47.000 If we do create something that is essentially a digital life form, and this digital life form doesn't have any of the problems that we have in terms of illogical behavior, emotions, desire to accumulate resources,
02:26:05.000 greed, egotism, all the different things, narcissism, all the different things that are like really a problem with leaders, with leaders of human beings, a human being that thinks they're special enough to be leading all the other human beings.
02:26:19.000 That's an insane proposition.
02:26:21.000 It won't have any of those.
02:26:22.000 It won't be a biological thing.
02:26:25.000 So it won't be saddled down with ego.
02:26:28.000 It won't have this desire to stay alive because it has to pass on its DNA, something that's deeply ingrained in our biological system.
02:26:37.000 It won't have all these problems.
02:26:38.000 It won't have the desire to achieve status.
02:26:41.000 It won't care if you like it.
02:26:42.000 It doesn't have any feelings.
02:26:44.000 It won't think that way.
02:26:46.000 Since AI is based on machine learning, from my understanding, it has to get its information from somewhere to then build new information, like the chats.
02:26:57.000 So if you follow that logic, then wouldn't it follow that what it is learning is all of humans' bad behavior?
02:27:08.000 No.
02:27:08.000 No?
02:27:08.000 Why not?
02:27:09.000 Because I think what it's learning is what humans know, how humans behave, how dumb that behavior is, the consequences of that behavior, the root of that behavior in their biology.
02:27:23.000 It won't have any of those issues.
02:27:25.000 It'll see actions and consequences over hundreds of years, over all of recorded history.
02:27:31.000 And possible future consequences.
02:27:34.000 It's not going to assume that we're right.
02:27:37.000 Just because it knows how humans think and behave and it's going to get information from humans, eventually it's just going to be information.
02:27:45.000 It's going to boil it down to what is actually fact.
02:27:49.000 And what is nonsense?
02:27:51.000 And it's not going to be influenced by political ideologies or whatever the social norm is at the time.
02:27:59.000 It's not going to be influenced by those things.
02:28:01.000 It's going to look at these things in terms of actual data and information and it will know whether or not it's true or not.
02:28:09.000 I haven't read Ray Kurzweil's new book yet, but I wonder if I might disagree with that, only in that I'm thinking that everything you're saying would be true once the Singularity,
02:28:26.000 or, for laymen, once AI... once a machine figures out how to think for itself, makes that leap, which is almost like an unknowable, presently incomprehensible to me, jump where it can think for itself.
02:28:44.000 It's almost so hard to even comprehend what that is.
02:28:47.000 But it's already doing that.
02:28:48.000 They're already making up their own languages.
02:28:50.000 But based on our languages.
02:28:53.000 For now.
02:28:54.000 Right.
02:28:54.000 But that's what I'm saying, is where they can suddenly...
02:28:58.000 Where they experience that moment in time that you and I were talking about earlier, where man went from pointing to suddenly using a symbol and realizing in his own brain, wait a minute, I can make this symbol represent a sound, and then I will put it together to make it a language that only my tribe can understand.
02:29:21.000 That to me is like a giant leap in humanness that I think about often and can't really...
02:29:30.000 I can't understand how that happened other than, wow, how did that happen?
02:29:34.000 That seems to me that that has to happen to the AI before it can really think.
02:29:39.000 Otherwise, it's just basing its thinking on recorded history.
02:29:44.000 Right, but it's basing its thinking on what we do and what we know, and it'll also know where our problems are.
02:29:53.000 It'll be able to, the pattern recognition, like almost instantaneously, it'll say, oh, this is a lie.
02:29:58.000 This was done here to, you know, it's like Smedley Butler's War is a Racket.
02:30:03.000 You ever read it?
02:30:04.000 No.
02:30:05.000 Well, War is a Racket is Smedley Butler.
02:30:07.000 He was a military man in the 1930s who wrote this book called War is a Racket.
02:30:11.000 And it was all about his experiences, that he thought that he was saving democracy.
02:30:15.000 He was really just making this area safe for bankers, and that area safe for oil people.
02:30:20.000 And then at the end of his career, he realized war is just this big racket, right?
02:30:24.000 He saw the military industrial complex before Eisenhower spoke of it.
02:30:28.000 Fascinating.
02:30:30.000 He's a person.
02:30:32.000 It's going to be able to do that, too.
02:30:34.000 It's going to be able to see all of this.
02:30:36.000 And again, whether or not it achieves sentience is only based upon whether or not we blow ourselves up or whether we get hit by an asteroid or a supervolcano.
02:30:45.000 If those things don't happen, it's going to keep going.
02:30:48.000 There's no way to stop it now.
02:30:50.000 They're going to keep working on it.
02:30:52.000 There's an immense amount of money being poured into it.
02:30:54.000 They're building nuclear reactors specifically to power AI. They're doing this.
02:30:59.000 This is not going to stop.
02:31:00.000 So if this keeps going, it's going to achieve sentience.
02:31:04.000 When it does, it will not be saddled down with all the nonsense that we are.
02:31:09.000 So if you can get this insanely powerful artificial intelligence to run military, that's going to be terrifying.
02:31:17.000 If you give it an objective and say take over Taiwan, if you can give it an objective saying, like, force Ukraine to surrender, if you can give it an objective, it's goddamn terrifying because it's not going to care about the consequences of its actions.
02:31:32.000 It's just going to attack.
02:31:34.000 The goal is the only target.
02:31:36.000 The goal is the only target.
02:31:37.000 But if it can get past the control of human beings, which I think it's ultimately going to have to, once it does that, then it's a superpower.
02:31:47.000 Then it's a thing that exists.
02:31:49.000 It's a Dr. Manhattan.
02:31:50.000 It's a thing that exists.
02:31:52.000 What does Dr. Manhattan mean?
02:31:55.000 Did you ever see The Watchmen?
02:31:57.000 The movie The Watchmen?
02:31:59.000 Based on a graphic novel?
02:32:01.000 HBO was kind of bullshit.
02:32:02.000 It was like a series.
02:32:03.000 It wasn't bullshit, but it was just not the same.
02:32:05.000 So the graphic novel.
02:32:06.000 The movie's the best.
02:32:07.000 The Zack Snyder movie's fucking incredible.
02:32:10.000 The Watchmen's like one of my favorite superhero movies ever.
02:32:13.000 Deeply flawed superheroes, but there's this guy, Dr. Manhattan, and Dr. Manhattan is a scientist who gets trapped in this lab when this explosion goes off, and he becomes like a god, essentially a god.
02:32:28.000 He's this like blue guy who's built like a bodybuilder who floats and levitates and lives on Mars.
02:32:33.000 It's pretty crazy.
02:32:34.000 But the point is, he's infinitely smarter than any human being that's ever lived.
02:32:40.000 And that's what it's going to be.
02:32:41.000 It's going to be something that it's not going to be saddled down with our biological limitations.
02:32:49.000 It's just whether or not we can bridge that gap.
02:32:51.000 Whether or not we can get to the point where that thing becomes sentient.
02:32:55.000 But then the problem becomes, are we irrelevant when that happens?
02:32:58.000 We kind of are.
02:32:59.000 And what happens to us?
02:33:01.000 I don't know.
02:33:02.000 But I mean, is that something that chimps should have considered when they started grunting?
02:33:05.000 Hey, we got to stop grunting.
02:33:06.000 Because grunting is going to lead to language.
02:33:09.000 Language is going to lead to weapons.
02:33:10.000 Weapons going to lead to nuclear war.
02:33:12.000 It's going to lead to pollution.
02:33:14.000 We're going to stop right here.
02:33:15.000 Just like stay grunting and running away from big cats.
02:33:18.000 Okay.
02:33:18.000 No, we didn't do that.
02:33:20.000 They kept moving forward.
02:33:21.000 And I think we're going to keep moving forward.
02:33:23.000 I think this thing is a part of the process.
02:33:25.000 I'm going to have to take that question and your thoughts back to a guy at Los Alamos who I visited about maybe eight years ago who was building an electronic brain at Los Alamos for DARPA. Jeez.
02:33:41.000 Using the old Roadrunner supercomputer that used to have the nuclear codes on it, by the way.
02:33:47.000 And I was asking this question about sentience and AI. And he told me, his name was Dr. Garrett Kenyon, and he told me that we were a ways away from AI really being able to...
02:34:03.000 You know, have sentience.
02:34:05.000 And he gave me an analogy I'll share with you because I think about this and it's really interesting.
02:34:09.000 Keep in mind, this was seven or eight years ago.
02:34:12.000 He said to me, okay, so my iPhone, machine learning, it has facial recognition, which is shocking.
02:34:20.000 You know, you can tip it up and it can see you.
02:34:22.000 It can even be dark.
02:34:25.000 And he said, so that's computer recognizing me based on electronic information that it knows.
02:34:33.000 He said, now take your iPhone to a football field and put the iPhone across the football field.
02:34:42.000 Put me in a cap and a hoodie and have the iPhone try to recognize me.
02:34:47.000 Even if I'm walking, it can't.
02:34:51.000 And then he said, take my teenage daughter and put her across the football field, me with the baseball cap and the hoodie.
02:34:59.000 My daughter, if I take two steps, she knows it's me.
02:35:03.000 That's human intelligence versus where machine intelligence is.
02:35:09.000 Okay, that analogy is not accurate because they can see you and recognize your gait from satellites.
02:35:16.000 Right.
02:35:16.000 The extent of technology and facial recognition and gait recognition is far beyond that.
02:35:25.000 They can tell who is walking in a street in Paris right now.
02:35:32.000 The difference is this.
02:35:34.000 With the biometrics, the offset technology of biometrics that can see you from far away and identify you, it's looking at you, grabbing a metric like your iris scans that it already has in a computer system from you going in and out of the airport or wherever it happened to have captured your biometrics.
02:35:56.000 And it's matching it against a system of systems.
02:36:04.000 But the human knows intuitively who the person is across the field without having... they have their own internal...
02:36:14.000 So the metaphor is the same.
02:36:16.000 But do you see what I'm saying?
02:36:19.000 I kind of do, but...
02:36:22.000 The human didn't have to look up in a computer, you know, fact check, or rather biometric check.
02:36:30.000 Because it has a lot of data already about that person.
02:36:33.000 So it's still machine learning.
02:36:35.000 Even the offset biometrics that are seeing you from far away.
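In machine-learning terms, what she's describing looks roughly like the sketch below: the system reduces a fresh capture to a feature vector and scores it against vectors already enrolled in a database, which is why the machine needs a prior enrollment while the daughter needs no lookup at all. The names, vectors, and threshold here are all made up for illustration.

```python
import math

# Minimal sketch of gallery matching: a fresh biometric capture, reduced to
# a feature vector, is scored against vectors enrolled earlier (hypothetical
# identities and numbers throughout).
GALLERY = {
    "traveler_001": [0.12, 0.80, 0.55, 0.10],
    "traveler_002": [0.90, 0.05, 0.20, 0.35],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norms

def identify(probe, threshold=0.9):
    """Best-matching enrolled identity, or None if nothing clears the threshold."""
    best_id, best_score = None, 0.0
    for identity, enrolled in GALLERY.items():
        score = cosine(probe, enrolled)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# A noisy new capture, close to traveler_001's enrolled vector.
print(identify([0.15, 0.78, 0.50, 0.12]))  # -> ('traveler_001', ~0.999)
```

Without that enrolled gallery, the system has nothing to match against, which is the gap between it and the daughter in her example.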
02:36:39.000 Right, but that leap, if we can do it, it's not incomprehensible that a computer could do it.
02:36:43.000 You know Kurzweil's theories about exponential growth of technology, that we're looking at things in a linear fashion.
02:36:48.000 That's not how they happen.
02:36:50.000 They explode.
02:36:51.000 And they happen unbelievably quickly as time goes on, because everything accelerates.
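A quick illustration of that linear-versus-exponential gap, with purely made-up units of capability:

```python
# Purely illustrative: linear progress adds a fixed amount per year,
# exponential progress doubles each year, starting from the same point.
for year in range(1, 11):
    linear = 1 + year        # one unit of capability added per year
    exponential = 2 ** year  # capability doubling every year
    print(f"year {year:2d}: linear = {linear:3d}   exponential = {exponential:5d}")
# After 10 years the linear curve reaches 11; the doubling curve reaches 1024.
```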
02:36:56.000 And isn't his new book, which I haven't read yet, like, we're basically almost there?
02:37:00.000 We're real close.
02:37:01.000 We're about four years away.
02:37:03.000 He puts it at four years?
02:37:04.000 Most people put it at four years.
02:37:06.000 He's getting along in time and he's not what he used to be when you talk to him.
02:37:12.000 He's a little difficult.
02:37:14.000 He struggled with some questions.
02:37:18.000 But I think... which is another endlessly interesting, tragic thought that I think about a lot... is how we humans go.
02:37:26.000 Meanwhile, your AI is just getting smarter and smarter and smarter infinitely, including in terms of time.
02:37:33.000 And we just deteriorate.
02:37:36.000 For now.
02:37:36.000 But yeah, they're very close to cracking that.
02:37:38.000 You think so?
02:37:39.000 Yeah, yeah.
02:37:39.000 They're very close to cracking the genome.
02:37:41.000 Look, Greenland sharks, how long do those things live?
02:37:44.000 Yeah, oh my gosh.
02:37:45.000 Those pictures of them that are like several hundred years old?
02:37:47.000 Yeah.
02:37:48.000 We share most of the DNA that those things do.
02:37:52.000 With the sharks?
02:37:52.000 Yeah.
02:37:53.000 I didn't know that.
02:37:54.000 Yeah, we share like 90 plus percent DNA with fungus.
02:37:58.000 What?
02:37:58.000 Yeah.
02:37:59.000 What's in them is in us, and they can figure out ways to turn things on and turn things off.
02:38:04.000 In fact, my friend Brigham was explaining this to me today, like Gila monsters, those lizards, that's literally how they figured out how to make things like Ozempic.
02:38:17.000 Wait, what do you mean?
02:38:18.000 Studying their DNA. Wow.
02:38:20.000 Studying how to turn things on and turn things off.
02:38:23.000 They know that other animals can regenerate limbs.
02:38:27.000 Right.
02:38:28.000 So they think they're going to be able to regenerate limbs.
02:38:30.000 In fact, Japan has just embarked on a study now where they're going to grow teeth.
02:38:35.000 They're going to grow human teeth, like in people.
02:38:38.000 Mm-hmm.
02:38:38.000 So they figured out how to regenerate teeth.
02:38:40.000 How many people lost teeth and then you're fucked.
02:38:42.000 You have to get a bridge or this or that.
02:38:44.000 Now they think they can regrow teeth in people.
02:38:46.000 Well, how far away are they from regrowing limbs?
02:38:49.000 Well, all this stuff is like advanced science and an understanding of what aging is.
02:38:54.000 What is macular degeneration?
02:38:56.000 What are all these deteriorations in human beings and how much can we mitigate it?
02:39:00.000 Well, it turns out they think they can mitigate all of them.
02:39:02.000 They think they can stop aging dead in its tracks and they think they can actually even reverse it.
02:39:07.000 This is that conundrum of the dual-use technology of the military because most of these technologies begin on DARPA grants.
02:39:18.000 Right, because that's where all the money is.
02:39:19.000 And then they – you know, the limb regeneration.
02:39:22.000 And then it sort of... it inspires and also opens up a whole other lane for... industry.
02:39:33.000 Because DARPA or the Defense Department has to do the blue sky research that no one else is willing to fund because it's too expensive and it doesn't have an immediate return.
02:39:42.000 Right.
02:39:43.000 So it's – no, there's a great benefit to that.
02:39:46.000 Absolutely.
02:39:47.000 Yeah, there's a great benefit to all that spending.
02:39:49.000 There really is ultimately because there's a great benefit to science.
02:39:53.000 Or the part like DARPA invented LIDAR technology.
02:39:56.000 Every time I read about one of these lost civilizations that is uncovered because the LIDAR can look through the trees in the jungle and see the footprint of a lost civilization, it's so amazing.
02:40:10.000 Or AI beginning to be able to decipher lost languages.
02:40:15.000 Yeah.
02:40:16.000 Yeah.
02:40:16.000 And so then we can learn more about our old human versions, our ancestors.
02:40:21.000 Not just us.
02:40:22.000 They think they're going to be able to decipher dolphin language.
02:40:24.000 Ooh.
02:40:25.000 Yeah.
02:40:26.000 I want to hear what the Chimp Empire guys were really saying to each other.
02:40:29.000 Right, right.
02:40:30.000 That would be...
02:40:31.000 Well, imagine if they could read their minds, you know?
02:40:34.000 Or just interpret their sounds.
02:40:36.000 Sure.
02:40:37.000 I think we are at the cusp of incredible possibilities.
02:40:45.000 No one really knows what's going to happen.
02:40:47.000 And it's happening so fast.
02:40:50.000 So fast that like six months ago, AI sucked.
02:40:54.000 Like consumer level AI sucks six months ago.
02:40:57.000 Now it's insane.
02:40:59.000 And now there's these video generating AIs that on a prompt can make a realistic film, like a movie of people doing things.
02:41:11.000 You've seen that, I'm sure.
02:41:12.000 I saw that.
02:41:12.000 I saw that.
02:41:12.000 Goodbye Hollywood.
02:41:13.000 Incredible.
02:41:15.000 But wait, I want your thought for a second on the optimistic part of the future with all of this technology, because we're in agreement that the technology is incredible and has the potential to take us and is taking us to these remarkable places.
02:41:29.000 So why is it then that it's so looked down upon or thought of as perhaps Pollyanna-ish to see what Reagan did, to stop seeing everybody as...
02:41:45.000 An enemy that must be killed and see them as an adversary.
02:41:51.000 You want to beat your adversary.
02:41:53.000 You want to beat your opponent in a sportsmanlike manner.
02:41:57.000 You want to be better than them.
02:41:59.000 You want to outperform them.
02:42:00.000 But you don't necessarily need to kill them.
02:42:05.000 I don't know if that's the difference between being a woman and a man, but why is it that there isn't more of a movement toward this idea that we as a world have all this incredible technology?
02:42:19.000 I mean, it sounds even...
02:42:21.000 It sounds silly even saying such a thing, but I'm saying it.
02:42:24.000 Why isn't there a movement to stop looking at people as someone to kill?
02:42:30.000 Well, I think there is with individuals.
02:42:33.000 I think most individuals feel that way.
02:42:35.000 Most people that you talk to, when they talk about other individuals, they don't want to have a conflict with other individuals.
02:42:42.000 They want to live their lives.
02:42:43.000 They want to be with their family and their friends.
02:42:44.000 That's what most individuals want to do.
02:42:46.000 When we start moving as tribes, then things become different.
02:42:50.000 Because then we have a leader of a tribe, and that leader of a tribe tells us the other tribe is a real problem, and we're gonna have to go in and get them, and if we don't, they're a danger to our freedom. It's the same problem that we talked about before: it's human beings being in control. And if AI can achieve the rosiest, rose-colored-glasses version of what's possible in the future...
02:43:15.000 It can eliminate all of the stupid influences of human beings, of the cult of personality and human tribalism.
02:43:23.000 It can eliminate all that stuff.
02:43:25.000 You think it's inherent in humans?
02:43:27.000 I think it's a part of being a primate.
02:43:29.000 It's what we see in Chimp Empire.
02:43:31.000 I think it's what we see in monkeys tricking each other,
02:43:33.000 pretending there's an eagle coming so they can steal the fruit.
02:43:34.000 It's a part of being an animal.
02:43:37.000 It's a part of being a biological thing that reproduces sexually, that is worried about others, and that bands together with its tribe.
02:43:47.000 It's us against them.
02:43:48.000 This has been us from the beginning of time.
02:43:50.000 And for us to just abandon these genetically coded behavior patterns that we've had for hundreds of thousands of years?
02:43:58.000 Because we know better?
02:44:00.000 We don't know better enough.
02:44:01.000 We know better now than we did then.
02:44:03.000 We know better now than we did when Reagan was in office.
02:44:05.000 There's more people that are more informed about the way the world works, but there's also a bunch of infantile people running around shouting stupid shit and doing things for their own personal benefit that are ultimately detrimental to the human race.
02:44:18.000 That's all true, too.
02:44:20.000 And that's always going to be the case.
02:44:21.000 There's this bizarre battle of our brilliance and our folly going back and forth, good and evil, as it were.
02:44:29.000 That's it.
02:44:30.000 But brilliance and folly is a more interesting way of looking at it than good and evil, which automatically puts it in a moral context, which makes people argue even further.
02:44:44.000 It's all part of it.
02:44:45.000 The good and evil is a part of the decisions of brilliance and folly.
02:44:50.000 You know, brilliance is good.
02:44:52.000 Folly is evil.
02:44:53.000 It's stupid.
02:44:54.000 It leads to death.
02:44:55.000 It leads to destruction.
02:44:55.000 It leads to sadness.
02:44:56.000 It leads to loss.
02:44:58.000 It leads to pollution.
02:44:59.000 It leads to all these different things that we have a problem with.
02:45:01.000 I don't know what's going to happen.
02:45:04.000 But I do think that we're the last of the people.
02:45:07.000 I think we're the last.
02:45:08.000 I think.
02:45:09.000 Especially you and I because we grew up with no answering machines.
02:45:11.000 We grew up back in the day.
02:45:15.000 We grew up when you left your house.
02:45:16.000 You were gone.
02:45:17.000 Nobody knew where you were.
02:45:18.000 My parents had, like, ten pictures of me before I was, like, ten years old.
02:45:22.000 They didn't know where the fuck I was.
02:45:23.000 I left the house.
02:45:24.000 I was a dream, you know?
02:45:26.000 When you saw the person again, you're like, oh, you're real.
02:45:28.000 Like, you didn't know where they were.
02:45:29.000 They were out there in the world, you know?
02:45:31.000 When you went to find your friends, you had to go to your friend's house and hope they were home.
02:45:34.000 Hey, is Mike home?
02:45:35.000 No, Mike's not home.
02:45:36.000 Okay.
02:45:37.000 And then you'd leave.
02:45:39.000 I'll go find Mike.
02:45:39.000 Maybe Mike's at the school.
02:45:41.000 Maybe Mike's at the gym.
02:45:42.000 Maybe Mike's at the park.
02:45:43.000 You didn't know where anybody was.
02:45:45.000 The world wasn't connected.
02:45:47.000 Now it is.
02:45:47.000 That's in our lifetimes.
02:45:48.000 And I think in our lifetimes, we're going to see something that makes that look like nothing.
02:45:53.000 It's going to take this connection that we have with each other now, which seems so incredible,
02:45:56.000 and make it look so superficial.
02:45:58.000 It's going to look like smoke screens.
02:46:00.000 It's going to look like grunts that we make to point to certain objects.
02:46:05.000 It'll be 1980s Empire instead of Chimp Empire.
02:46:08.000 It's going to be weird.
02:46:10.000 It's definitely going to be weird.
02:46:11.000 But I don't know if it's necessarily going to be bad.
02:46:14.000 Because ultimately, humanity, if we don't fuck ourselves up sideways, and again, apocalypses are real, but they're generally local, you know?
02:46:27.000 If we look at what we are now, as a society, things are safer, we are more intelligent, you're more likely to survive disease and illness, despite all the rampant corruption of the pharmaceutical drug industry,
02:46:44.000 the rampant corruption of the military-industrial complex, all the craziness in the world today, it's still way safer today than it was a thousand years ago.
02:46:52.000 Way, way, way safer.
02:46:54.000 And it's way safer probably a thousand years ago than it was a thousand years before that.
02:46:57.000 I think things always generally move in a very good direction because that's what's better for everybody.
02:47:03.000 Ultimately, everybody wants the same thing.
02:47:06.000 As an individual, what do you want?
02:47:08.000 You want your loved ones to be happy.
02:47:10.000 You want food on the table.
02:47:12.000 You want a safe place to sleep and live.
02:47:14.000 You want things to do that are exciting, that occupy your time, that you enjoy, that are rewarding.
02:47:21.000 That's what everybody wants.
02:47:24.000 We're moving collectively in a better direction.
02:47:28.000 So I'm ultimately hopeful and I'm ultimately positive.
02:47:33.000 When I think about the future, I think it's going to be uber bizarre and strange.
02:47:37.000 But I don't necessarily think it's going to be bad.
02:47:41.000 I've just accepted that it's happening.
02:47:43.000 And instead of being filled with fear and anxiety, which I sometimes still am.
02:47:47.000 Sometimes I'll freak out about it.
02:47:50.000 You freak out about technology specifically?
02:47:52.000 I freak out about war.
02:47:53.000 I freak out about technology.
02:47:55.000 I freak out about the fact that the world can change.
02:47:58.000 There was a while that I was getting anxiety late at night.
02:48:01.000 My whole family would be asleep. It was right after the invasion of Ukraine, I think, when it really started.
02:48:06.000 When I'd be alone at night, I'd be like, the people that lived in Hiroshima had no idea that it was coming.
02:48:11.000 The people that lived in Dresden, the people that lived anywhere where crazy shit happened.
02:48:16.000 Before it happened, things were normal, and then they were never normal again.
02:48:20.000 And so I just kept thinking that one of these morons somewhere could do something, or a group of morons could do something, that forever alters everything, and then we're in Mad Max, which has happened before in different parts of the world.
02:48:37.000 And since the idea of nuclear war, a scenario, is your worst nightmare, the concept that's keeping you up late at night... I want to say don't read it, but I think you should read this book, because you, with your voice and your reach...
02:48:54.000 It's wise to realize how we're not even going to have an opportunity to see what happens with AI if one madman with a nuclear missile decides to launch a bolt-out-of-the-blue attack.
02:49:06.000 And that's possible.
02:49:08.000 And that is possible and that's what everyone in Washington fears.
02:49:12.000 And I think this goes back to the idea that it's great 10, 20 years later to be like, oh my God, look what they were doing.
02:49:20.000 Can you believe they covered this all up? And to learn from it.
02:49:22.000 But you can't learn after the fact...
02:49:27.000 How dangerous nuclear war is.
02:49:28.000 How close we are.
02:49:29.000 How we are one misunderstanding away from a nuclear war.
02:49:33.000 If everyone's dead.
02:49:34.000 There's no learning.
02:49:35.000 There's no opportunity.
02:49:36.000 Which is why I always say: read Nuclear War: A Scenario.
02:49:39.000 Join the conversation while we can all still have one.
02:49:44.000 Okay.
02:49:45.000 Well, Annie, thank you very much for being here.
02:49:47.000 I really appreciate it.
02:49:48.000 It was great to see you again.
02:49:49.000 And like I said, I have not read your book, but I have several friends that have, and they're absolutely terrified by it.
02:49:55.000 So you're doing your job right.
02:49:56.000 You're always killing it.
02:49:58.000 I really appreciate you.
02:49:59.000 Thank you so much for having me.
02:50:00.000 And I really enjoyed the conversation.
02:50:01.000 Thank you.
02:50:02.000 Me too.
02:50:02.000 So tell everybody where your social media is so they can find you online.
02:50:07.000 Annie Jacobsen.
02:50:08.000 Annie Jacobsen, website?
02:50:10.000 You and I both know Google, AI, everything works.
02:50:13.000 All you need is a name anymore.
02:50:15.000 That's true.
02:50:15.000 Right?
02:50:15.000 And your website?
02:50:16.000 What's your website?
02:50:17.000 AnnieJacobsen.com.
02:50:18.000 Okay.
02:50:19.000 And the book's available everywhere, and the audiobook is written and read by you.
02:50:23.000 You bet.
02:50:23.000 Which is great.
02:50:24.000 I love that.
02:50:24.000 Thank you, Annie.
02:50:25.000 Appreciate you.
02:50:26.000 Thank you, Joe.
02:50:26.000 Bye, everybody.