The Joe Rogan Experience - August 16, 2024


Joe Rogan Experience #2190 - Peter Thiel


Episode Stats

Length

3 hours and 30 minutes

Words per Minute

152.3

Word Count

32,035

Sentence Count

2,339

Misogynist Sentences

13

Hate Speech Sentences

31


Summary

In this episode of the podcast, I sit down with Peter Thiel to talk about why he's still in California and whether he should leave the state or the country. We talk about the current state of the economy, the runaway budget deficit and how years of near-zero interest rates masked the cost of carrying the debt, Social Security and means-testing, California's taxes and why the state gets away with them, the COVID-era moves to Austin and Miami, why the tech industry is so networked on California, and how AI went from the superintelligence-versus-surveillance debates of the 2010s to ChatGPT passing the Turing test. It's a good listen, and it's definitely worth your time. Tweet me and let me know what you think!


Transcript

00:00:12.000 What's up, man?
00:00:12.000 Good to see you.
00:00:13.000 Glad to be on the show.
00:00:14.000 My pleasure.
00:00:15.000 Thanks for having me.
00:00:15.000 My pleasure.
00:00:16.000 What's cracking?
00:00:17.000 How you doing?
00:00:17.000 Doing all right.
00:00:18.000 We were just talking about how you're still trapped in LA. I'm still trapped in LA. I know.
00:00:24.000 You're friends with a lot of people out here.
00:00:25.000 Have you thought about jettisoning?
00:00:28.000 I talk about it all the time.
00:00:31.000 But, you know, it's always – talk is often a substitute for action.
00:00:36.000 It's always, does it lead to action or does it end up substituting for action?
00:00:40.000 That's a good point.
00:00:41.000 But I have endless conversations about leaving.
00:00:43.000 And I moved from San Francisco to L.A. back in 2018. That felt about as big a move away as possible.
00:00:49.000 And I keep the extreme thing I keep saying, and I have to keep in mind talks as a substitute for action, the extreme thing I keep saying is I can't decide whether to leave the state or the country.
00:01:02.000 Oh, boy.
00:01:03.000 If you went out of the country, where would you go?
00:01:05.000 Man, it's tough to find places because there are a lot of problems in the U.S. and most places are doing so much worse.
00:01:14.000 Yeah.
00:01:14.000 It's not a good move to leave here.
00:01:16.000 As fucked up as this place is.
00:01:18.000 But I keep thinking I shouldn't move twice.
00:01:22.000 So I should either – I can't decide whether I should move to Florida or should move to New Zealand or Costa Rica or something like that.
00:01:30.000 Yeah.
00:01:32.000 Go full John McAfee.
00:01:34.000 But I can't decide between those two so I end up stuck in California.
00:01:37.000 Well, Australia is okay, but they're even worse when it comes to rule of law and what they decide to make you do and the way they're cracking down on people now for online speech.
00:01:49.000 It's very sketchy in other countries.
00:01:53.000 It's – but somehow the relative outperformance of the U.S. and the absolute stagnation and decline of the U.S., they're actually related things, because of the way the conversation is grouped. Every time I say – tell someone,
00:02:08.000 you know, I'm thinking about leaving the country.
00:02:10.000 They'll do what you say and they'll say, well, every place is worse.
00:02:14.000 And then that somehow distracts us from all the problems in this country.
00:02:17.000 And then we can't talk about what's gone wrong in the U.S. because everything is so much worse.
00:02:25.000 Well, I think most people know what's gone wrong.
00:02:28.000 But they don't know if they're on the side of the government that's currently in power.
00:02:33.000 They don't know how to criticize it.
00:02:35.000 They don't know exactly what to say, what should be done.
00:02:37.000 And they're ideologically connected to this group being correct.
00:02:41.000 So they try to do mental gymnastics to try to support some of the things that are going on.
00:02:46.000 I think that's part of the problem.
00:02:48.000 I don't think it's necessarily that we don't know what the problems are.
00:02:50.000 We know what the problems are, but we don't have clear solutions as to how to fix them, nor do we understand the real mechanisms of how they got there in the first place.
00:03:00.000 Yeah, I mean there are a lot that are pretty obvious to articulate and they're much easier described than solved.
00:03:08.000 Like we have a crazy, crazy budget deficit.
00:03:11.000 And presumably you have to do one of three things.
00:03:15.000 You have to raise taxes a lot, you have to cut spending a lot, or you're just going to keep borrowing money.
00:03:22.000 Isn't there like some enormous amount of our taxes that just go to the deficit?
00:03:27.000 It's not that high, but it's gone up a lot.
00:03:33.000 What is it?
00:03:34.000 I thought it was like 34% or something crazy.
00:03:37.000 It peaked at 3.1% of GDP. Which is, you know, maybe 15, 20% of the budget, 3.1% of GDP in 1991. And then it went all the way down to something like 1.5% in the mid-2010s.
00:03:55.000 And now it's crept back up to 3.1%, 3.2%.
00:03:59.000 And so we are at all-time highs as a percentage of GDP. And the way to understand the basic math is the debt went up a crazy amount, but the interest rates went down.
00:04:09.000 And from 2008 to 2021, for 13 years, we basically had zero interest rates with one brief blip under Powell.
00:04:18.000 But it was basically zero rates.
00:04:19.000 And then you could borrow way more money, and it wouldn't show up in servicing the debt because you just paid 0% interest on the T-bills.
00:04:27.000 And the thing that's very dangerous seeming to me about the current fiscal situation is the interest rates have gone back to positive like they were in the 90s and early 2000s, mid-2000s.
00:04:42.000 And it's just this incredibly large debt.
00:04:45.000 And so we now have a real runaway deficit problem.
00:04:49.000 But people have been talking about this for 40 years and crying wolf for 40 years.
00:04:54.000 So it's very hard for people to take it seriously.
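The debt-service arithmetic described above can be sketched in a few lines. This is a back-of-the-envelope model with purely illustrative debt and rate numbers (not official Treasury figures): interest cost as a share of GDP is just the debt-to-GDP ratio times the average rate paid, which is why a huge debt at near-zero rates was cheap to carry and the same debt at positive rates is not.

```python
# Back-of-the-envelope model of debt service (illustrative numbers only):
# interest cost as a share of GDP = debt-to-GDP ratio x average interest rate.

def interest_share_of_gdp(debt_to_gdp: float, avg_rate: float) -> float:
    """Interest paid on the debt, expressed as a fraction of GDP."""
    return debt_to_gdp * avg_rate

# 1991-style: moderate debt (~60% of GDP) at ~5% rates
print(round(interest_share_of_gdp(0.60, 0.05), 3))   # 0.03 -> ~3% of GDP
# Zero-rate era: much larger debt, but near-zero rates keep servicing cheap
print(round(interest_share_of_gdp(1.20, 0.005), 3))  # 0.006
# Today-style: the same large debt at positive rates
print(round(interest_share_of_gdp(1.20, 0.027), 3))  # 0.032 -> back near the highs
```

The point of the toy model is that the product matters, not either factor alone: the debt roughly doubled while rates fell by a factor of ten, so the servicing burden shrank, and it snaps back when rates normalize.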
00:04:56.000 Most people don't even understand what it means.
00:04:58.000 Like when you say there's a deficit, we owe money.
00:05:02.000 Okay, to who?
00:05:04.000 How does that work?
00:05:06.000 It's – well, it's people who bought the bonds, and it's – a lot of it's to Americans.
00:05:15.000 Some of them are held by the Federal Reserve.
00:05:17.000 A decent amount are held by foreigners at this point because in some ways it's the opposite of the trade current account deficits.
00:05:27.000 The U.S. has been running these big current account deficits and then the foreigners end up with way more dollars than they want to spend on American goods or services.
00:05:35.000 And so they have to reinvest them in the U.S. Some put it into houses or stocks, but a lot of it just goes into government debt.
00:05:42.000 So in some ways it's a function of the chronic trade imbalances, chronic trade deficits.
00:05:48.000 Well, if you had supreme power, if Peter Thiel was the ruler of the world and you could fix this, what would you do?
00:05:55.000 Man, I always find that hypothetical.
00:05:58.000 It's a ridiculous hypothetical.
00:06:00.000 It is ridiculous.
00:06:01.000 You get ridiculous answers.
00:06:02.000 I want a ridiculous answer.
00:06:03.000 That's what I like.
00:06:05.000 But what could be done?
00:06:07.000 First of all, what could be done to mitigate it and what could be done to solve it?
00:06:11.000 I think my answers are probably all in the very libertarian direction.
00:06:22.000 So it would be sort of – figure out ways to have smaller government, figure out ways to increase the age on Social Security, means-test Social Security so not everyone gets it.
00:06:36.000 Just figure out ways to gradually dial back a lot of these government benefits.
00:06:43.000 And then that's insanely unpopular.
00:06:46.000 So it's completely unrealistic on that level.
00:06:49.000 That bothers people that need Social Security.
00:06:51.000 I said means-tested.
00:06:53.000 Means-tested.
00:06:53.000 So people who don't need it don't get it.
00:06:56.000 Right.
00:06:56.000 So Social Security, even if you're very wealthy, I don't even know how it works.
00:07:00.000 Do you still get it?
00:07:01.000 Yeah, basically anyone who – pretty much everyone gets it, because it was originally rationalized as a – as a sort of a pension system, not as a welfare system.
00:07:16.000 And so the fiction was you pay Social Security taxes and then you're entitled to get a pension out in the form of Social Security.
00:07:24.000 And because it was – we told this fiction that it was a form of – it was a pension system instead of an intergenerational Ponzi scheme or something like that.
00:07:36.000 You know, the fiction means everybody gets paid Social Security because it's a pension system.
00:07:40.000 Whereas if we were more honest and said it's, you know, it's just a welfare system, maybe you could start dialing, you could probably rationalize in a lot of ways.
00:07:51.000 And it's not related to how much you put into it, right?
00:07:54.000 Like, how does Social Security work in terms of … I think it's partially related.
00:07:58.000 So I think there is – I'm not a total expert on this stuff.
00:08:01.000 But I think there's some guaranteed minimum you get.
00:08:06.000 And then if you put more in, you get somewhat more, and then it's capped at a certain amount.
00:08:13.000 And that's why Social Security taxes are capped at something like $150,000 a year.
00:08:21.000 And then this is one of the really big tax increase proposals that's out there is to uncap it, which would effectively be a 12.4% income tax hike on all your income.
00:08:34.000 Just for Social Security?
00:08:36.000 Sure.
00:08:36.000 Because the argument is, the sort of progressive left Democrat argument is that it's, you know, why should you have a regressive Social Security tax?
00:08:48.000 Why should you pay 12.4% or whatever the Social Security tax is?
00:08:52.000 Half gets paid by you, half gets paid by your employer.
00:08:56.000 But then it's capped at like $140,000, $150,000, some level like that.
00:09:02.000 And so it's regressive, where if you make 500K or a million a year, you pay zero tax on your marginal income.
00:09:09.000 And that makes no sense if it's a welfare program.
00:09:11.000 If it's a retirement savings program and your payout's capped, then, you know, you don't need to put in more than you get out.
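The cap mechanics being described can be sketched numerically. This is a minimal illustration using the approximate figures from the conversation (a ~$150,000 wage cap and the 12.4% combined employee-plus-employer rate); the real SSA cap changes every year, so these constants are assumptions, not current law.

```python
# Sketch of the wage-cap math described here. Figures are the approximate
# ones from the conversation (~$150k cap, 12.4% combined rate), not current law.
SS_RATE = 0.124      # combined employee + employer Social Security rate
WAGE_CAP = 150_000   # approximate taxable-wage cap

def ss_tax(wages: float, capped: bool = True) -> float:
    """Social Security tax owed, with or without the wage cap."""
    taxable = min(wages, WAGE_CAP) if capped else wages
    return taxable * SS_RATE

# For a $500k earner, uncapping acts like a 12.4% hike on wages above the cap:
print(round(ss_tax(500_000)))                # 18600 (capped)
print(round(ss_tax(500_000, capped=False)))  # 62000 (uncapped)
```

This makes the "12.4% income tax hike" framing concrete: below the cap nothing changes, and every marginal dollar above it goes from a 0% to a 12.4% rate.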
00:09:19.000 Well, that's logical, but there's not a lot of logic going on with the way people are talking about taxes today.
00:09:26.000 Like, California just jacked their taxes up to 14, what?
00:09:29.000 Was it 14-4?
00:09:30.000 Something like that, yeah.
00:09:31.000 14-3, I think.
00:09:32.000 Which is hilarious.
00:09:33.000 Maybe more, yeah.
00:09:34.000 14-9, something.
00:09:35.000 I mean, you want more money for doing a terrible job and having more people leave for the first time ever in, like, the history of the state.
00:09:42.000 Yeah, but look, it gets away with it.
00:09:45.000 I know.
00:09:46.000 Well, people are forced with no choice.
00:09:49.000 What are you going to do?
00:09:53.000 I mean, there are people at the margins who leave, but the state government still collects more and more in revenue.
00:10:00.000 So it's – you get – I don't know.
00:10:02.000 You get 10 percent more revenues and 5 percent of the people leave.
00:10:06.000 You still increase the amount of revenues you're getting.
00:10:11.000 It's inelastic enough that you're actually able to increase the revenues.
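The inelasticity point is simple arithmetic: if a rate hike raises collections per remaining taxpayer by more than the exodus shrinks the tax base, total revenue still goes up. A toy sketch with the hypothetical numbers mentioned in the conversation (10% more revenue per remaining payer, 5% of payers leaving):

```python
# Toy model of the inelasticity point: hike rates, lose some payers,
# still collect more. All numbers are hypothetical.

def revenue_after_hike(base_revenue: float, rate_increase: float, share_leaving: float) -> float:
    """Total revenue after a hike, assuming a fraction of the tax base exits."""
    return base_revenue * (1 + rate_increase) * (1 - share_leaving)

print(round(revenue_after_hike(100.0, 0.10, 0.05), 1))  # 104.5 -> still up 4.5%
```

Revenue only falls once the departure rate exceeds the hike: with these numbers, more than roughly 9% of the base would have to leave before the state collected less.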
00:10:15.000 I mean this is sort of the – The crazy thing about California is there's always sort of a right wing or libertarian critique of California that it's such a ridiculous place.
00:10:27.000 It should just collapse under its own ridiculousness.
00:10:33.000 It doesn't quite happen.
00:10:35.000 The macroeconomics in it are pretty good.
00:10:38.000 40 million people, the GDP is around 4 trillion.
00:10:42.000 It's about the same as Germany with 80 million or Japan with 125 million.
00:10:46.000 Japan has three times the population of California.
00:10:49.000 Same GDP means one-third the per capita GDP. So there's some level on which California as a whole is working even though it doesn't work from a governance point of view.
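The per-capita comparison works out as stated. A quick check using the rough figures mentioned (about $4 trillion of GDP for both; roughly 40 million Californians versus roughly 125 million people in Japan):

```python
# Rough per-capita check of the figures mentioned (~$4T GDP each, approximate populations).

def gdp_per_capita(gdp_trillions: float, population_millions: float) -> float:
    """GDP per person, in dollars."""
    return gdp_trillions * 1e12 / (population_millions * 1e6)

print(round(gdp_per_capita(4.0, 40)))   # 100000 -> ~$100k per Californian
print(round(gdp_per_capita(4.0, 125)))  # 32000 -> roughly one-third of that
```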
00:10:59.000 It doesn't work for a lot of the people who live there.
00:11:02.000 And the rough model I have for how to think of California is that it's kind of like Saudi Arabia.
00:11:08.000 And you have a crazy religion, wokeism in California, Wahhabism in Saudi Arabia.
00:11:15.000 You know, not that many people believe it, but it distorts everything.
00:11:19.000 And then you have like oil fields in Saudi Arabia, and you have the big tech companies in California.
00:11:26.000 And the oil pays for everything.
00:11:29.000 And then you have a completely bloated, inefficient government sector.
00:11:34.000 And you have sort of all sorts of distortions in the real estate market, where people also make lots of money.
00:11:41.000 And sort of the government and real estate are ways you redistribute the oil wealth or the big tech money in California.
00:11:52.000 It's not the way you might want to design a system from scratch, but it's pretty stable.
00:11:58.000 People have been saying Saudi Arabia is ridiculous.
00:12:01.000 It's going to collapse in a year now.
00:12:02.000 They've been saying that for 40 or 50 years.
00:12:04.000 But if you have a giant oil field, you can pay for a lot of ridiculousness.
00:12:08.000 I think that's the way you have to think of California.
00:12:12.000 Well, the other thing is you're also – there are things about it that are ridiculous, but there's something about it that, you know, it doesn't naturally self-destruct overnight.
00:12:21.000 Well, there's a lot of kick-ass people there, and there's a lot of people that are still generating enormous amounts of wealth there, and it's too difficult to just pack up and leave.
00:12:29.000 I think it's something like four of the eight or nine companies with market capitalizations over a trillion dollars are based in California.
00:12:38.000 That's amazing.
00:12:39.000 It's Google, Apple – now NVIDIA, Meta, I think Broadcom is close to that.
00:12:49.000 And there's no ideal place to live either.
00:12:53.000 It's not like California sucks and there's some place that's got it totally dialed in that also has an enormous GDP and an enormous population.
00:13:03.000 There's not like one big city that's really dialed in.
00:13:07.000 Well, it's – there are things that worked.
00:13:10.000 I looked at all the zero tax states in the US and it's always – you don't – I think the way you ask the question gets at it, which is you don't live in a – in theory, a lot of stuff happens on a state level, but you don't live in a state.
00:13:26.000 You live in a city.
00:13:28.000 And so if you're somewhat biased towards living in at least a moderately sized city, okay, I think there are four states where there are no cities.
00:13:39.000 Alaska, Wyoming, South Dakota, New Hampshire.
00:13:44.000 There's zero tax, but no cities to speak of.
00:13:49.000 And then you have Washington State with Seattle, where the weather is the worst in the country.
00:13:57.000 You have Nevada with Las Vegas, which I'm not that big a fan of.
00:14:03.000 And then that leaves three zero-tax states.
00:14:06.000 You have Texas, which I like as a state, but I'm not that big a fan of Austin, Dallas, or Houston.
00:14:15.000 Houston is just sort of an oil town, which is good if you're in that business, but otherwise not.
00:14:22.000 Dallas has sort of an inferiority complex to L.A. and New York.
00:14:28.000 Just not the healthiest attitude.
00:14:30.000 And then, you know, I don't know.
00:14:32.000 Austin's a government town and a college town and a wannabe hipster San Francisco town.
00:14:37.000 So, you know, in my book that's three strikes and you're kind of out too.
00:14:41.000 And then that leaves Nashville, Tennessee, which was – or Miami, South Florida.
00:14:48.000 And those would be my two top choices.
00:14:50.000 Miami is fun, but I wouldn't want to live there.
00:14:52.000 It's a fun place to visit.
00:14:54.000 It's a little too crazy.
00:14:55.000 A little too chaotic.
00:14:56.000 A little too cocaine-fueled.
00:14:58.000 A little too party, party, party.
00:15:00.000 I think it's pretty segmented from the tourist strip from everything else.
00:15:08.000 It probably is.
00:15:10.000 There probably is something...
00:15:13.000 A little bit paradoxical about any place that gets lots of tourists.
00:15:18.000 There's some things that are great about it because so many tourists go, but then in some sense it creates a weird aesthetic because the day-to-day vibe is that you don't work and you're just having fun or something like that.
00:15:34.000 Right, because so many people are going there just to do that.
00:15:37.000 And that's probably a little bit off with the South Florida thing.
00:15:44.000 And then I think Nashville is also sort of its own real place.
00:15:52.000 Nashville's great.
00:15:53.000 Yeah.
00:15:53.000 So those would be my, those are the top two.
00:15:55.000 I could live in Nashville.
00:15:56.000 No problem.
00:15:57.000 I'm probably always – I'm always too, you know – fifth grade onwards, since, you know, '77, I lived in California.
00:16:09.000 And so I'm just a sucker for the weather.
00:16:12.000 And I think there is no place besides coastal California where we have really good weather year-round in the U.S. Maybe Hawaii is pretty good.
00:16:22.000 Coastal California is tough to beat.
00:16:24.000 And you're two hours from the mountains.
00:16:26.000 Man, it's like, you know, it's mid-August here in Austin.
00:16:29.000 It's just brutal.
00:16:31.000 Is it?
00:16:31.000 I think so.
00:16:32.000 Really?
00:16:33.000 That was too hot for you?
00:16:33.000 It was too hot for me.
00:16:34.000 Today's mild.
00:16:35.000 What is it out there?
00:16:36.000 Like 80?
00:16:36.000 All right.
00:16:37.000 85?
00:16:38.000 96.
00:16:38.000 96?
00:16:39.000 You're proving my point.
00:16:40.000 I do so much sauna that I literally don't even notice it.
00:16:43.000 I'm outside for hours every day shooting arrows, and I don't even notice it.
00:16:50.000 I don't know if you're representative of the average Austin resident.
00:16:53.000 I don't know, but I think you get accustomed to it.
00:16:56.000 To me, it's so much better than too cold.
00:16:58.000 Too cold you can die.
00:17:00.000 And I know you can die from the heat, but you probably won't, especially if you have water.
00:17:04.000 You'll be okay.
00:17:05.000 But you could die from the cold.
00:17:06.000 Cold's real.
00:17:07.000 So really cold places, there's five months out of the year where your life's in danger.
00:17:12.000 Where you could do something wrong.
00:17:13.000 Like if you live in Wyoming and you break down somewhere and there's no one on the road, you could die out there.
00:17:19.000 That's real.
00:17:19.000 You could die from exposure.
00:17:21.000 Sure.
00:17:22.000 There's probably some very deep reason there's been a net migration of people to the West and the South in the U.S. over...
00:17:28.000 California, you can do no wrong.
00:17:30.000 As long as the earth doesn't move, you're good.
00:17:32.000 As long as there's no tsunamis, you're good.
00:17:34.000 It is a perfect environment, virtually year-round.
00:17:38.000 It gets a little hot in the summer, but again, coastal, not at all.
00:17:41.000 If you get an 80-degree day in Malibu, it's unusual.
00:17:45.000 It's wonderful.
00:17:46.000 You've got a beautiful breeze coming off the ocean, sun's out, everybody's pretty.
00:17:51.000 And then it's correlated with confiscatory taxation.
00:17:55.000 It's all sort of a package deal.
00:17:56.000 Well, it's a scam.
00:17:58.000 You know, they know you don't want to leave.
00:17:59.000 I didn't want to leave California.
00:18:00.000 It's fucking great.
00:18:01.000 I appreciate you left.
00:18:03.000 I always have the fantasy that if enough people like you leave, it'll put pressure on them.
00:18:06.000 But it's never quite enough.
00:18:08.000 Never quite enough.
00:18:09.000 And it's not going to be.
00:18:10.000 It's too difficult for most people.
00:18:11.000 It was very difficult for me.
00:18:12.000 And I had a bunch of people working for me that were willing to pack up and leave, like young Jamie over there.
00:18:17.000 But we, you know, it was tricky.
00:18:20.000 You're taking your whole business, and my business is talking to people.
00:18:24.000 That's part of my business.
00:18:25.000 My other business is stand-up comedy.
00:18:27.000 So you left during COVID? I left at the very beginning.
00:18:30.000 As soon as they started locking things down, I'm like, oh, these motherfuckers are never letting this go.
00:18:34.000 In March, April, May 2020?
00:18:35.000 In May, I started looking at houses.
00:18:37.000 Cool.
00:18:37.000 That's when I came to Austin first.
00:18:42.000 I got a place in Miami in September of 2020, and I've spent the last four winters there, so I'm sort of always on the cusp of moving to Florida, hard to get out of California.
00:18:56.000 But the thing that's gotten a lot harder about moving relative to four years ago, and I'd say I think my real estate purchases have generally not been great over the years.
00:19:08.000 I mean, they've done okay, but...
00:19:10.000 Certainly not the way I've been able to make money at all, but with the one exception was Miami.
00:19:19.000 Bought it in September 2020, and probably, you know, fast forward four years, it's up like 100%, or something like that.
00:19:32.000 But then, paradoxically, this also means it's gotten much harder to move there, or Austin, or any of these places.
00:19:42.000 If I relocated my office in LA, the people who owned houses – okay, you have to buy a place in Florida.
00:19:51.000 It costs twice as much as it did four years ago.
00:19:54.000 And then the interest rates have also doubled.
00:19:56.000 And so you get a 30-year mortgage.
00:19:58.000 You could have locked that in for 3% in 2020. Now it's, you know...
00:20:03.000 Maybe 6.5%, 7%.
00:20:05.000 So the prices have doubled.
00:20:07.000 The mortgages have doubled.
00:20:08.000 So it costs you four times as much to buy a house.
00:20:12.000 And so there was a moment where people could move during COVID. And it's gotten dramatically harder relative to what it was four years ago.
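The "four times as much" claim can be sanity-checked with the standard 30-year fixed-rate amortization formula. A sketch with a hypothetical price (a $500k house in 2020 doubling to $1M) and the rates mentioned (3% then, about 7% now) shows the monthly payment roughly tripling; counting the doubled down payment, the all-in cost lands in the ballpark described.

```python
# Standard fixed-rate mortgage payment formula; the house prices here are hypothetical.

def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    """Monthly payment on a fully amortizing fixed-rate loan."""
    r = annual_rate / 12                 # monthly interest rate
    n = years * 12                       # number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

then_payment = monthly_payment(500_000, 0.03)   # 2020: $500k at 3%
now_payment = monthly_payment(1_000_000, 0.07)  # 2024: doubled price at 7%
print(round(then_payment))                      # 2108
print(round(now_payment))                       # 6653
print(round(now_payment / then_payment, 1))     # 3.2 -> roughly triple
```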
00:20:20.000 Well, the Austin real estate market went crazy, and then it came back down a little bit.
00:20:24.000 It's in that down a little bit spot right now where there's a lot of high-end properties that are still for sale.
00:20:30.000 They can't move.
00:20:31.000 It's different.
00:20:33.000 There's not a lot of people moving here now like there was in the boom because everything's open everywhere.
00:20:38.000 Well, I somehow think Austin was linked to California and Miami was linked a little bit more to New York.
00:20:47.000 And it was a little bit, you know, all these differences, but Austin was kind of...
00:20:54.000 A big part of the move were people from tech, from California that moved to Austin.
00:21:01.000 There's a part of the Miami, South Florida thing, which was people from finance in New York City that moved to Florida.
00:21:10.000 And the finance industry is less networked on New York City.
00:21:14.000 So I think it is possible for people, if you run a private equity fund or if you work at a bank, it's possible for some of those functions to easily be moved to a different state.
00:21:25.000 The tech industry is crazily networked on California.
00:21:30.000 There's probably some way to do it.
00:21:32.000 It's not that easy.
00:21:36.000 Yeah, it makes sense.
00:21:37.000 It makes sense, too.
00:21:39.000 It's just the sheer numbers.
00:21:41.000 I mean, when you're talking about all those corporations that are established and based in California, there's so many.
00:21:46.000 They're so big.
00:21:47.000 Just the sheer numbers of human beings that live there and work there that are involved in tech.
00:21:53.000 Sure.
00:21:53.000 If it wasn't as networked, you know, you could probably just move.
00:22:00.000 And maybe these things are networked till they're not.
00:22:02.000 Detroit was very networked.
00:22:04.000 The car industry was super networked on Detroit for decades and decades.
00:22:08.000 And Michigan got more and more mismanaged.
00:22:10.000 And people thought the network sort of protected them because, you know, the big three car companies were in Detroit, but then you had all the supply chains were also in Detroit.
00:22:19.000 And then eventually, it was just so ridiculous, people moved, started moving the factories outside of that area, and it sort of unraveled.
00:22:27.000 So that's, you know, it can also happen with California.
00:22:30.000 It'll just take a lot.
00:22:32.000 That would be insane, if they just abandoned all the tech companies in California.
00:22:36.000 I mean, just look at what happened at Flint, Michigan, when all the auto factories pulled out.
00:22:40.000 Well, it's, it's, look, I think you can – it's always, there are all these paradoxical histories, you know. The internet – the point of the internet, in some sense, was to eliminate the tyranny of place.
00:22:54.000 And that was sort of the idea.
00:22:57.000 And then one of the paradoxes about the history of the internet was that the internet companies, you know, were all centered in California.
00:23:08.000 There have been different waves of...
00:23:12.000 Of how networked, how non-networked they were.
00:23:16.000 I think probably 2021, sort of the COVID moving away from California, the big thing in tech was crypto.
00:23:28.000 And crypto had this conceit of a, you know, alternate currency, decentralized, away from the central banks, but also the crypto companies, the crypto protocols, you could do those from anywhere.
00:23:42.000 You could do them outside the US, you could do them from Miami.
00:23:45.000 And so crypto was something where the tech could naturally move out of California.
00:23:51.000 And today probably the core tech narrative is completely flipped to AI. And then there's something about AI that's very centralized.
00:24:07.000 I had this one-liner years ago where it was, you know, if we say that crypto is libertarian, can we also say that AI is communist?
00:24:15.000 Or something like this, where the natural structure for an AI company looks like it's a big company, and then somehow the AI stuff feels like it's going to be dominated by the big tech companies in the San Francisco Bay Area.
00:24:34.000 And so if that's the future of tech, the scale, the natural scale of the industry tells you that it's going to be extremely hard to get out of the San Francisco Bay Area.
00:24:49.000 When you look to the future and you try to just make a guess as to how all this is going to turn out with AI, what do you think we're looking at over the next five years?
00:25:01.000 Man, I think I should start by being modest in answering that question and saying that nobody has a clue.
00:25:06.000 Right.
00:25:07.000 Which is true.
00:25:08.000 Which pretty much all the experts say.
00:25:10.000 You know, I would say...
00:25:13.000 Let me do sort of a history...
00:25:18.000 The riff I always had on this was that I can't stand any of the buzzwords, and I felt AI, you know – there's all this big data, cloud computing, there were all these crazy buzzwords people had, and they always were ways to sort of abstract things and get away from – they were not good ways of talking about things.
00:26:03.000 I'll start with the history before I get to the future.
00:26:05.000 But the history of it...
00:26:07.000 It was maybe anchored on two visions of what AI meant.
00:26:12.000 And one was Nick Bostrom, Oxford prof, who wrote this book, Superintelligence, in 2014. And it was basically AI was going to be this super-duper intelligent thing, way, way godlike intelligence,
00:26:29.000 way smarter than any human being.
00:26:34.000 And then there was sort of the, I don't know, the CCP Chinese Communist rebuttal, the Kai-Fu Lee book from 2018, AI Superpowers.
00:26:44.000 I think the subtitle was something like The Race for AI Between Silicon Valley and China or something like this.
00:26:50.000 And it was sort of – it defined AI as – it was fairly low-tech.
00:26:56.000 It was just surveillance.
00:26:58.000 You know, facial recognition technology.
00:27:01.000 We would just have this sort of totalitarian – Stalinist monitoring.
00:27:07.000 It didn't require very much innovation.
00:27:08.000 It just required that you apply things.
00:27:10.000 And basically the subtext was China is going to win because we have no ethical qualms in China about applying this sort of basic machine learning to sort of measuring or controlling the population.
00:27:26.000 And those were sort of like, say, two extreme competing visions of what AI would mean in the 2010s and sort of maybe were sort of the anchors of the AI debate.
00:27:40.000 And then, you know, what happened?
00:27:44.000 In some sense, with ChatGPT in late '22, early '23, the achievement you got – you did not get superintelligence, it was not just surveillance tech,
00:28:00.000 but you actually got to the holy grail of what people would have defined AI as from 1950 to 2010, for the previous 60 years, before the 2010s, people have always said AI, the definition of AI is passing the Turing test.
00:28:16.000 And the Turing test, it basically means that the computer can fool you into thinking that it's a human being.
00:28:27.000 And it's a somewhat fuzzy test because, you know, obviously you can have an expert on the computer, a non-expert.
00:28:32.000 You know, does it fool you all the time or some of the time?
00:28:36.000 How good is it?
00:28:37.000 But to first approximation, the Turing test, you know, we weren't even close to passing it in 2021. And then, you know, ChatGPT basically passes the Turing test, at least for, like, let's say an IQ 100 average person.
00:28:53.000 It's passed the Turing test.
00:28:56.000 And that was the holy grail.
00:29:15.000 It was like this almost psychological suppression people had, where they were not thinking.
00:29:21.000 They lost track of the Turing test, of the Holy Grail because it was about to happen.
00:29:27.000 And it was such a significant, such an important thing that you didn't even want to think about.
00:29:32.000 So I'm tempted to give almost a psychological repression theory of the 2010 debates.
00:29:38.000 Be that as it may, the Turing test gets passed and that's an extraordinary achievement.
00:29:46.000 And then where does it go from here?
00:29:53.000 There probably are ways you can refine these models.
00:29:58.000 It's still going to take, you know, a long time to apply it.
00:30:01.000 There is a question.
00:30:03.000 There's this AGI discussion.
00:30:05.000 You know, will we get artificial general intelligence, which is a hopelessly vague concept, which, you know, general intelligence could be just a generally smart human being.
00:30:16.000 So is that just a person with an IQ of 130?
00:30:19.000 Or is it superintelligence?
00:30:21.000 Is it godlike intelligence?
00:30:23.000 So it's sort of an ambiguous thing.
00:30:26.000 But I keep thinking that maybe the AGI question is less important than passing the Turing test.
00:30:33.000 If we got AGI, if we got, let's say, superintelligence, that would be interesting to Mr. God because you'd have competition for being God.
00:30:44.000 But surely the Turing test is more important for us humans.
00:30:49.000 Because it's either a complement or a substitute for humans.
00:30:53.000 And so it's, yeah, it's going to rearrange the economic, cultural, political structure of our society in extremely dramatic ways.
00:31:02.000 And I think maybe what's already happened is much more important than anything else that's going to be done.
00:31:09.000 And then it's just going to be a long ways in applying it.
00:31:13.000 One last thought.
00:31:14.000 You know, the...
00:31:18.000 The analogy I'm always tempted to go to, historical analogies are never perfect, but it's that maybe AI in 2023-24 is like the Internet in 1999,
00:31:34.000 where on one level it's clear the Internet's going to be big and get a lot bigger and it's going to dominate the economy, it's going to rearrange the society in the 21st century.
00:31:47.000 And then at the same time, it was a complete bubble and people had no idea how the business models worked.
00:31:54.000 You know, almost everything blew up.
00:31:58.000 It took, you know, it didn't take that long in the scheme of things.
00:32:01.000 It took, you know, 15, 20 years for it to become super dominant.
00:32:06.000 But it didn't happen sort of in 18 months as people fantasized in 1999. And maybe what we have in AI is something like this.
00:32:17.000 It's figuring out how to actually apply it in sort of all these different ways is going to take something like two decades.
00:30:28.000 But that doesn't detract from it being a really big deal.
00:32:31.000 It is a really big deal, and I think you're right about the Turing test.
00:32:34.000 Do you think that the lack of acknowledgement or the public celebration or at least this mainstream discussion, which I think should be everywhere, that we've passed the Turing test, do you think it's connected to the fact that this stuff accelerates so rapidly that even though we've essentially breached this new territory,
00:32:55.000 We still know that GPT-5 is going to be better.
00:32:58.000 GPT-6 is going to be insane.
00:33:00.000 And then they're working on these right now.
00:33:02.000 And the change is happening so quickly, we're almost a little reluctant to acknowledge where we're at.
00:33:10.000 Yeah.
00:33:11.000 You know...
00:33:13.000 I've often, you know, probably for 15 years or so, often been on the side that there isn't that much progress in science or tech or not as much as Silicon Valley likes to claim.
00:33:26.000 And even on the AI level, I think it's a massive technical achievement.
00:33:31.000 It's still an open question, you know, is it actually going to lead to much higher living standards for everybody?
00:33:37.000 You know, the Internet was a massive achievement.
00:33:39.000 How much did it raise people's living standards?
00:33:41.000 Much trickier question.
00:33:44.000 But in this world where not much has happened, one of the paradoxes of an era of relative tech stagnation is that when something does happen, we don't even know how to process it.
00:34:01.000 So I think Bitcoin was a big invention, whether it was good or bad, but it was a pretty big deal.
00:34:08.000 And it was systematically underestimated for at least, you know, the first 10, 11 years.
00:34:18.000 You know, you could trade it.
00:34:20.000 It went up smoothly for 10, 11 years.
00:34:22.000 It didn't get repriced all at once because we're in a world where nothing big ever happens.
00:34:27.000 And so we have no way of processing it when something pretty big happens.
00:34:32.000 The internet was pretty big in 99. Bitcoin was moderately big.
00:34:36.000 The internet was really big.
00:34:38.000 Bitcoin was moderately big.
00:34:39.000 And I'd say passing the Turing test is really big.
00:34:43.000 It's on the same scale as the internet.
00:34:45.000 And because our lived experience is that so little has felt like it's been changing for the last few decades,
00:34:55.000 We're probably underestimating it.
00:34:57.000 It's interesting that you say we feel like so little has changed, because... how old are you?
00:35:04.000 Same age as you, born in 1967. So in our age, we've seen all the change, right?
00:35:09.000 We saw the end of the Cold War, we saw answering machines, we saw VHS tapes, then we saw the internet, and then where we're at right now, which is like this bizarre moment in time where people carry the internet around with them in their pocket every day.
00:35:24.000 And these super sophisticated computers that are ubiquitous.
00:35:27.000 Everybody has one.
00:35:28.000 There's incredible technology that's being ramped up every year.
00:35:32.000 They're getting better all the time.
00:35:33.000 And now there's AI. There's AI on your phone.
00:35:36.000 You could access ChatGPT and a bunch of different programs on your phone.
00:35:40.000 And I think that's an insane change.
00:35:43.000 I think that's one of the most... especially with the use of social media, it's one of the most bizarre changes I think our culture has ever seen.
00:35:52.000 It can be a big change culturally or politically.
00:35:57.000 But the kinds of questions I would ask is how do you measure it economically?
00:36:03.000 How much does it change GDP? How much does it change productivity?
00:36:10.000 And certainly, the story I would generally tell for the last 50 years, since the early 1970s, is that we've been, not in absolute stagnation,
00:36:22.000 but in an era of relative stagnation, where there has been
00:36:27.000 very limited progress in the world of atoms, the world of physical things.
00:36:33.000 And there has been a lot of progress in the world of bits, information, computers, internet, mobile internet, you know, now AI. What are you referring to when you're saying the world of physical things?
00:36:47.000 You know, it's any...
00:36:49.000 Well, if we had defined technology, if we were sitting here in 1967, the year we were born, and we had a discussion about technology,
00:36:59.000 what would technology have meant?
00:37:01.000 It would have meant computers.
00:37:02.000 It would have also meant rockets.
00:37:04.000 It would have meant supersonic airplanes.
00:37:07.000 It would have meant new medicines.
00:37:09.000 It would have meant the green revolution in agriculture, maybe underwater cities, you know.
00:37:17.000 It sort of covered all of that, because technology simply gets defined as that which is changing, that which is progressing.
00:37:24.000 And so there was progress on all these fronts.
00:37:26.000 Today, last 20 years, when you talk about technology, you're normally just talking about information technology.
00:37:33.000 Technology has been reduced to meaning computers.
00:37:36.000 And that tells you that the structure of progress has been weird.
00:37:39.000 There's been this narrow cone of progress...
00:38:10.000 We're literally moving slower than we were. And that's sort of the...
00:38:22.000 And then, of course, there's also a sense in which the screens and the devices have this effect distracting us from this.
00:38:32.000 So when you're riding a 100-year-old subway in New York City and you're looking at your iPhone, you can look at, wow, this is this cool new gadget, but you're also being distracted from the fact that your lived environment hasn't changed in 100 years.
00:38:51.000 And so there's a question, how important is this world of bits versus the world of atoms?
00:38:57.000 You know, I would say, as human beings, we're physically embodied in a material world.
00:39:02.000 And so I would always say this world of atoms is pretty important.
00:39:06.000 And when that's pretty stagnant, you know, there's a lot of stuff that doesn't make sense.
00:39:11.000 I was an undergraduate at Stanford, late 80s.
00:39:15.000 And at the time, in retrospect, every engineering area would have been a bad thing to go into.
00:39:22.000 You know, mechanical engineering, chemical engineering, all these engineering fields where you're tinkering and trying to do new things because these things turned out to be stuck.
00:39:31.000 They were regulated.
00:39:32.000 You couldn't come up with new things to do.
00:39:34.000 Nuclear engineering, aero-astro engineering, people already knew those were really bad ones to go into.
00:39:39.000 They were, you know, outlawed.
00:39:40.000 You weren't going to make any progress in new nuclear reactor designs or stuff like that.
00:39:46.000 Electrical engineering, which was the one that's sort of adjacent to making semiconductors, that one was still okay.
00:39:51.000 And then the only field that was actually going to progress a lot was computer science.
00:39:59.000 And again, you know, it's been very powerful, but that was not the felt sense in the 1980s.
00:40:05.000 In the 1980s, computer science was this ridiculous, inferior subject. You know, the linguistic tell is always when people use the word science.
00:40:17.000 I'm in favor of science.
00:40:18.000 I'm not in favor of science in quotes.
00:40:21.000 And it's always a tell that it's not real science.
00:40:25.000 And so when we call it climate science or political science or social science, you know, you're just sort of making it up.
00:40:32.000 And you have an inferiority complex to real science or something like physics or chemistry.
00:40:36.000 And computer science was in the same category as social science or political science.
00:40:41.000 It was a fake field for people who found electrical engineering or math way too hard and sort of dropped out of the real science and real engineering fields.
00:40:54.000 You don't feel that climate science is a real science?
00:40:57.000 It is...
00:41:12.000 There's several different things one could say.
00:41:15.000 It's possible climate change is happening.
00:41:18.000 It's possible we don't have great accounts of why that's going on.
00:41:22.000 So I'm not questioning any of those things.
00:41:24.000 But how scientific it is, I don't think...
00:41:31.000 I don't think it's a place where we have really vigorous debates.
00:41:34.000 Maybe the climate is changing because of carbon dioxide emissions.
00:41:37.000 Temperatures are going up.
00:41:38.000 Maybe it's methane.
00:41:39.000 Maybe it's people are eating too much steak.
00:41:41.000 It's the cows flatulating.
00:41:43.000 And you have to measure how much is methane a greenhouse gas versus carbon dioxide.
00:41:48.000 I don't think they're rigorously doing that stuff scientifically.
00:41:52.000 I think the fact that it's called climate science tells you that it's more dogmatic than anything truly scientific should be.
00:42:02.000 Dogma doesn't mean that it's wrong.
00:42:04.000 But why does the fact that it's called climate science mean that it's more dogmatic?
00:42:07.000 Because if you said nuclear science, you wouldn't question it, right?
00:42:10.000 Yeah, but no one calls it nuclear science.
00:42:12.000 They call it nuclear engineering.
00:42:13.000 Interesting.
00:42:14.000 I'm just making a narrow linguistic point.
00:42:18.000 Is there anything they call science that is legitimately science?
00:42:20.000 Well, at this point, people say computer science has worked.
00:42:23.000 But in the 1980s, all I'm saying is it was in the same category as, let's say, social science, political science.
00:42:29.000 It was a tell that the people doing it kind of deep down knew they weren't doing real science.
00:42:35.000 Well, there's certainly ideology that's connected to climate science.
00:42:39.000 And then there's certainly corporations that are invested in this prospect of green energy and the concept of green energy.
00:42:47.000 And they're profiting off of it and pushing these different things, whether it be electric car mandates or whatever it is.
00:42:55.000 Like California, I think it's 2035, they have a mandate that all new vehicles have to be electric, which is hilarious when you're connected to a grid that can't support the electric cars it currently has.
00:43:05.000 After they said that, within a month or two, Gavin Newsom asked people to not charge their Teslas because it was summer and the grid was fucked.
00:43:15.000 Yeah, look, it was all linked into all these ideological projects in all these ways.
00:43:23.000 You know, there's an environmental project, and maybe it shouldn't be scientific.
00:43:31.000 You know, the hardcore environmentalist argument is we only have one planet and we don't have time to do science.
00:43:36.000 If we have to do rigorous science before we can prove that we're overheating, it'll be too late.
00:43:42.000 And so if you're a hardcore environmentalist, you know, you don't want to have as high a standard of science.
00:43:47.000 Yeah, my intuition is certainly when you go away from that, you end up with things that are too dogmatic, too ideological.
00:43:55.000 Maybe it doesn't even work, even if the planet's getting warmer.
00:43:58.000 You know, maybe climate science is not...
00:44:01.000 Like, my question is, like:
00:44:05.000 is methane a more dangerous greenhouse gas than carbon dioxide?
00:44:10.000 We're not even capable of measuring that.
00:44:13.000 Well, we're also ignoring certain things like regenerative farms that sequester carbon.
00:44:19.000 And then you have people like Bill Gates saying that planting trees to deal with carbon is ridiculous.
00:44:24.000 That's a ridiculous way to do it.
00:44:25.000 Like, how is that ridiculous?
00:44:27.000 They literally turn carbon dioxide into oxygen.
00:44:31.000 It is their food.
00:44:33.000 That's what the food of plants is.
00:44:34.000 That's what powers the whole plant life and the way we have the symbiotic relationship with them.
00:44:40.000 And the more carbon dioxide there is, the greener it is, which is why it's greener today on Earth than it has been in a hundred years.
00:44:48.000 Sure.
00:44:48.000 These are all facts that are inconvenient to people that have a very specific, narrow window of how to approach this.
00:44:56.000 Sure.
00:44:57.000 Although, you know, there probably are ways to steel man the other side, too, where maybe...
00:45:03.000 Maybe the original 1970s manifesto, the one I think that's always very interesting from the other side, was this book by the Club of Rome, 1972, The Limits to Growth.
00:45:17.000 And its argument is: we need to head towards a society in which there's zero, or at least very limited, growth, because if you have unlimited growth, you're going to run out of resources.
00:45:28.000 If you don't run out of resources, you'll hit a pollution constraint.
00:45:32.000 But in the 1970s, it was: you're going to have overpopulation, you're going to run out of oil.
00:45:39.000 We had the oil shocks.
00:45:41.000 And then by the 90s, it sort of morphed into more of the pollution problem with carbon dioxide, climate change, other environmental things.
00:45:51.000 But there is sort of...
00:45:54.000 You know, there's been some improvement in oil, carbon fuels with fracking, things like this in Texas.
00:46:02.000 It's not at the scale that's been enough to give an American standard of living to the whole planet.
00:46:11.000 We consume 100 million barrels a day.
00:46:18.000 Maybe fracking can add 10 percent, 10 million, to that.
00:46:22.000 If everybody on this planet had an American standard of living, it's something like 300, 400 million barrels of oil a day.
00:46:29.000 And I don't think that's there.
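Thiel's figures here are easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, where the US consumption and population numbers are rough outside estimates, not from the conversation itself:

```python
# Back-of-envelope check of the barrels-per-day claim. Only the 100M and
# 300-400M figures come from the conversation; the rest are rough estimates.
US_DAILY_BARRELS = 20e6    # approx. US oil consumption, barrels per day
US_POPULATION = 333e6      # approx. US population
WORLD_POPULATION = 8e9     # approx. world population

per_capita = US_DAILY_BARRELS / US_POPULATION     # barrels per person per day
world_at_us_rate = per_capita * WORLD_POPULATION  # world demand at US rates

print(f"{per_capita:.3f} barrels/person/day")      # ~0.060
print(f"{world_at_us_rate / 1e6:.0f}M barrels/day")  # ~480M
```

Scaling US per-capita consumption to the world population gives roughly 480 million barrels a day, in the same ballpark as the 300 to 400 million figure mentioned.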
00:46:31.000 So that's kind of...
00:46:34.000 I always wonder whether that was the real environmental argument.
00:46:39.000 We can't have an American standard of living for the whole planet.
00:46:42.000 We somehow can't justify this degree of inequality.
00:46:47.000 And therefore, we have to figure out ways to dial back and tax the carbon, restrict it.
00:46:55.000 And maybe that's...
00:46:59.000 There's some sort of a Malthusian calculus that's more about resources than about pollution.
00:47:06.000 How much of that, the demand for oil, could be mitigated by nuclear?
00:47:15.000 You probably could mitigate it a lot.
00:47:18.000 There's a question why the nuclear thing.
00:47:23.000 It has gone so wrong, especially if you have electric vehicles, right?
00:47:28.000 You know, with a combustion engine it's probably hard to get nuclear to work, but if you shift to electric vehicles, you can charge your Tesla at night, and that would seemingly work.
00:47:43.000 And there's definitely a history of energy where it was always in the direction of more intense use.
00:47:50.000 It went from wood to coal to oil, which is a more compact form of energy.
00:47:55.000 And in a way, it takes up less of the environment.
00:47:58.000 And then if we move from oil to uranium, it's even smaller.
00:48:04.000 And so in a sense, the smaller, the more dense the energy is, the less of the environment it takes up.
00:48:11.000 And when we go from oil to natural gas, which takes up more space, and from natural gas to solar or wind, you have to pollute the whole environment by putting up windmills everywhere.
00:48:22.000 Or you have to cover the whole desert with solar panels.
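The wood-to-coal-to-oil-to-uranium progression Thiel describes tracks the specific energy of each fuel. As a rough illustration, with approximate public figures that are not from the conversation:

```python
# Approximate specific energy (MJ/kg) of the fuels discussed above.
# Note: the natural-gas point in the conversation is about volume, not mass.
# At atmospheric pressure, gas holds far less energy per litre than oil,
# even though per kilogram it beats oil.
SPECIFIC_ENERGY_MJ_PER_KG = {
    "wood": 16,
    "coal": 24,
    "oil": 42,
    "natural gas": 54,
    "uranium-235 (full fission)": 80_000_000,
}

for fuel, mj in SPECIFIC_ENERGY_MJ_PER_KG.items():
    print(f"{fuel:>28}: {mj:>12,} MJ/kg")
```

Each step up that ladder packs more energy into less material, which is the sense in which denser fuels "take up less of the environment."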
00:48:27.000 And that is a good way to look at it because it is a form of pollution.
00:48:29.000 And so there was a way that nuclear was supposed to be the energy mode of the 21st century.
00:48:39.000 And then, yeah, there are all these historical questions.
00:48:42.000 Why did it get stopped?
00:48:46.000 Why did we not go down that route?
00:48:50.000 The standard explanation of why it stopped was that there were all these dangers.
00:49:00.000 We had Three Mile Island in 1979, you know, Chernobyl in 1986, and then the Fukushima one in Japan, I think 2011. And you had these various accidents.
00:49:15.000 My alternate theory on why nuclear energy really stopped
00:49:20.000 is that it was sort of dystopian, or even apocalyptic, because it turned out to be very dual use.
00:49:31.000 If you build nuclear power plants, it's only sort of one step away from building nuclear weapons.
00:49:44.000 And it turned out to be a lot trickier to separate those two things out than it looked.
00:49:49.000 And I think the signature moment was 1974 or 75 when India gets the nuclear bomb.
00:49:56.000 And the US, I believe, had transferred the nuclear reactor technology to India.
00:50:00.000 We thought they couldn't weaponize it.
00:50:02.000 And then it turned out it was pretty easy to weaponize.
00:50:06.000 And then sort of the geopolitical problem with nuclear power was: either you need a double standard, where we have nuclear power in the US but we don't allow other countries to have it, because the US gets to keep its nuclear weapons.
00:50:26.000 We don't let 100 other countries have nuclear weapons and that's an extreme double standard.
00:50:32.000 Probably a little bit hard to justify.
00:50:37.000 Or you need some kind of really effective global governance where you have a one-world government that regulates all this stuff, which doesn't sound that good either.
00:50:48.000 And then sort of the compromise was just to regulate it so much that maybe the nuclear plants got grandfathered in, but it became too expensive to build new ones.
00:51:03.000 Jesus.
00:51:04.000 Like even China, which is the country where they're building the most nuclear power plants, they built way less than people expected a decade ago because they don't trust their own designs.
00:51:18.000 And so they have to copy the over-safety, over-protected designs from the West and the nuclear plants.
00:51:24.000 Nuclear power costs too much money.
00:51:26.000 It's cheaper to do coal.
00:51:29.000 Wow.
00:51:30.000 So, you know, I'm not going to get the numbers exactly right, but if you look at what percent of Chinese electricity was nuclear, it wasn't that high.
00:51:38.000 It was like maybe 4 or 5 percent in 2013, 2014. And the percent hasn't gone up in 10 years because, you know, they've maybe doubled the amount of electricity they use and maybe they doubled the nuclear, but the relative percentage is still about the same, I think.
00:52:15.000 And if there was innovation, if nuclear engineering had gotten to a point where, you know, let's say there wasn't Three Mile Island or Chernobyl didn't happen, do you think that it would have gotten to a much more efficient and much more effective version by now?
00:52:32.000 Well, my understanding is we have way more efficient designs.
00:52:37.000 You can do small reactor designs, which are – you don't need this giant containment structure.
00:52:42.000 So it costs much less per kilowatt hour of electricity you produce.
00:52:47.000 So I think we have those designs.
00:52:49.000 They're just not allowed.
00:52:51.000 But then I think the problem is that – If you were able to build them in all these countries all over the world, you still have this dual use problem.
00:53:00.000 And again, my alternate history of what really went wrong with nuclear power, it wasn't Three Mile Island.
00:53:06.000 It wasn't Chernobyl.
00:53:07.000 That's the official story.
00:53:09.000 The real story was India getting the bomb.
00:53:12.000 Wow.
00:53:12.000 That makes sense.
00:53:14.000 It completely makes sense.
00:53:15.000 Jeez Louise.
00:53:17.000 And then there's always a big picture question.
00:53:25.000 People ask me if I'm right about this picture of this slowdown in tech, this sort of stagnation in many, many dimensions.
00:53:35.000 And then there's always a question, why did this happen?
00:53:38.000 And my cop-out answer is always that why questions are overdetermined, because there can be multiple reasons.
00:53:49.000 So it could be that we became a more feminized, risk-averse society.
00:53:53.000 It could be that the education system worked.
00:53:56.000 Well, it could be that we're just out of ideas.
00:53:59.000 The easy ideas have been found, the hard ideas, the cupboard, nature's cupboard was bare, the low-hanging fruit had been picked.
00:54:05.000 So it can be overdetermined.
00:54:07.000 But I think one dimension that's not to be underrated for the science and tech stagnation was that...
00:54:17.000 An awful lot of science and technology had this dystopian or apocalyptic dimension, and probably what happened at Los Alamos in 1945, and then with the thermonuclear weapons in the early 50s,
00:54:34.000 took a while to really seep in. But it had this sort of delayed effect: maybe you get a stagnant world in which the physicists don't get to do anything and have to putter around with DEI, but you don't build weapons that blow up the world anymore.
00:54:55.000 You know, is that a feature or a bug?
00:55:26.000 We're in the stagnant path of the multiverse because it had this partially protective effect, even though in all these other ways I feel it's deeply deranged our society.
00:55:35.000 That's a very interesting perspective and it makes a lot of sense.
00:55:38.000 It really does.
00:55:40.000 And particularly the dual use thing with nuclear power and especially distributing that to other countries.
00:55:46.000 When you talk about the stagnation in this country, I don't know how much you follow this whole UAP nonsense.
00:55:53.000 I know we met – what was that guy's name at your place?
00:55:57.000 The guy who did Chariots of the Gods?
00:55:59.000 Oh, von Däniken.
00:56:00.000 Yes.
00:56:01.000 Yeah, you thought he was too crazy.
00:56:03.000 You like Hancock, but you don't like von Däniken.
00:56:06.000 I didn't think he's too crazy.
00:56:08.000 He just willfully, in my opinion, ignores evidence that would show that some of the things that he's saying have already been solved.
00:56:18.000 And I think his hypothesis is all related to this concept that we have been visited, and that that's how all these things were built, and that this technology was brought here from another world.
00:56:35.000 And I think he's very ideologically locked into these ideas.
00:56:39.000 And I think a much more compelling idea is that there were very advanced cultures, for some reason, 10,000 years ago.
00:56:49.000 Whatever it was.
00:56:50.000 Whatever the year was where they built some of the insane structures.
00:56:54.000 It's 4,500 years ago, roughly, they think the pyramids were built.
00:57:00.000 Like, whatever the fuck was going on there.
00:57:03.000 I think those were human beings.
00:57:04.000 I think those were human beings in that place, in that time.
00:57:07.000 And I think they had some sort of very sophisticated technology that was lost.
00:57:12.000 And things can get lost.
00:57:13.000 Things can get lost in cataclysms.
00:57:15.000 Things can get lost in...
00:57:18.000 They can get lost in disease and famine and there's all sorts of war, all sorts of reasons, the burning of the library of Alexandria.
00:57:25.000 There's all sorts of ways that technology gets lost forever.
00:57:28.000 And you can have today someone living in Los Angeles in the most sophisticated high-tech society the world has ever known, while you still have people that live in the Amazon that live in the same way that they have lived for thousands of years.
00:57:42.000 So those things can happen on the same planet at the same time.
00:57:46.000 And I think while the rest of the world was essentially operating at a much lower vibration, there were people in Egypt that were doing some extraordinary things.
00:57:56.000 I don't know how they got the information.
00:57:58.000 Maybe they did get it from visitors.
00:58:00.000 Maybe they did.
00:58:00.000 But there's no real compelling evidence that they did.
00:58:04.000 I think there's much more compelling evidence that a cataclysm happened.
00:58:08.000 When you look at the Younger Dryas impact theory, it's all entirely based on science.
00:58:12.000 It's entirely based on core samples and iridium content and also massive changes in the environment over a very short period of time, particularly the melting of the ice caps in North America and just impact craters all around the world that we know something happened roughly 11,000 years ago.
00:58:32.000 And probably again 10,000 years ago.
00:58:34.000 I think it's a regular occurrence on this planet that things go sideways and there's massive natural disasters and I think that it's very likely that...
00:58:53.000 In some ways, the one in which we have the best history is the fall of the Roman Empire, which was obviously the culmination of the classical world and it somehow extremely unraveled.
00:59:06.000 So I think my view on it is probably somewhere between yours and the... Von Däniken?
00:59:16.000 No, not von Däniken.
00:59:17.000 I'm more on the other side.
00:59:23.000 Let me try to define, and maybe agree on, why this is so important today.
00:59:31.000 It's not just of antiquarian interest.
00:59:33.000 The reason it matters today is because of the alternative. If you say civilization has seen great rises and falls.
00:59:43.000 It's gone through these great cycles.
00:59:46.000 Maybe the Bronze Age civilizations were very advanced, but someone came up with iron weapons.
00:59:52.000 There was just one dimension where they progressed, but then everything else they could destroy.
00:59:56.000 Or the fall of the Roman Empire was again this pretty cataclysmic thing, where there were diseases, and then there were political things that unraveled, but somehow it was a massive regression for four,
01:00:14.000 five, six hundred years into the Dark Ages.
01:00:20.000 The sort of naive progressive view is that things always just got monotonically better.
01:00:29.000 And there's sort of this revisionist, purely progressive history where even the Roman Empire didn't decline.
01:00:36.000 One sort of stupid way to quantify this stuff is with pure demographics.
01:00:42.000 And so it's the question, how many people lived in the past?
01:00:46.000 And the rises and falls of civilization story is there were more people who lived in the Roman Empire because it was more advanced.
01:00:55.000 It could support a larger population.
01:00:57.000 And then the population declined.
01:00:59.000 You know, the city of Rome maybe had a million people at its peak.
01:01:02.000 And then by, you know, I don't know, 650 AD, maybe it's down to 10,000 people or less.
01:01:09.000 You have this complete collapse in population.
01:01:12.000 And then the sort of alternate,
01:01:16.000 purely progressive view is that the population has always just been monotonically increasing, because it's a measure of how, in some sense, things in aggregate have always been getting better.
01:01:26.000 So I am definitely on your side that population had great rises and falls.
01:01:33.000 Civilizations had great rises and falls.
01:01:37.000 And so that part of it, I agree with you.
01:01:41.000 Or even, you know, some variant of what Hancock or von Däniken say.
01:01:49.000 And therefore it seems possible something could happen to our civilization.
01:02:01.000 That's always the upshot of it.
01:02:03.000 If it had been monotonically always progressing, then there's nothing we should worry about.
01:02:08.000 Nothing can possibly go wrong.
01:02:10.000 And then certainly the thing the sort of alternate Hancock, von Däniken, Joe Rogan history of the world tells us is that we shouldn't take our civilization for granted.
01:02:26.000 There are things that can go really haywire.
01:02:28.000 I agree with that.
01:02:29.000 The one place where I differ is I do think our civilization today is on some dimensions way more advanced than any of these past civilizations were.
01:02:40.000 I don't think any of them had nuclear weapons.
01:02:43.000 I don't think any of them had, you know, spaceships or anything like that.
01:02:54.000 And so the failure mode is likely to be somewhat different from these past ones.
01:03:02.000 Yeah, that makes sense.
01:03:03.000 I think technology progressed in a different direction.
01:03:07.000 That's what I think.
01:03:08.000 I think structural technology, building technology, had somehow or another achieved levels of competence that are not available today.
01:03:17.000 When you look at the construction of the Great Pyramid of Giza, there's 2,300,000 stones in it.
01:03:23.000 The whole thing points to due north, south, east, and west.
01:03:27.000 It's an incredible achievement.
01:03:29.000 The stones, some of them were moved from a quarry that was 500 miles away through the mountains.
01:03:34.000 They have no idea how they did it.
01:03:35.000 Massive stones.
01:03:36.000 The ones inside the King's Chamber are the biggest ones; they're like 80 tons.
01:03:39.000 It's crazy.
01:03:40.000 The whole thing's crazy.
01:03:41.000 Like, how did they do that?
01:03:42.000 Like, whatever they did, they did without machines, supposedly.
01:03:46.000 They did without the use of the combustion engine.
01:03:49.000 They didn't have electricity.
01:03:51.000 And yet they were able to do something that stands the test of time, not just so you could look at it.
01:03:57.000 You know, like you can go to the Acropolis and see the Parthenon.
01:04:01.000 It's gorgeous.
01:04:02.000 It's amazing.
01:04:03.000 It's incredible.
01:04:04.000 But I can understand how people could have built it.
01:04:06.000 The pyramids is one of those things that you just look at and you go, what the fuck was going on here?
01:04:11.000 What was going on here?
01:04:12.000 And none of these people are still around.
01:04:14.000 You have this strange culture now that's entirely based around, you know, you have Cairo and an enormous population of visitors, right?
01:04:23.000 Which is a lot of it.
01:04:24.000 People just going to stare at these ancient relics.
01:04:27.000 What was going on that those people were so much more advanced than anyone anywhere else in the world?
01:04:34.000 Yeah.
01:04:35.000 I'm not sure I would anchor on the technological part but I think the piece that is very hard for us to comprehend is what motivated them culturally.
01:04:43.000 Well, how did they do it physically?
01:04:45.000 Why did they do it?
01:04:46.000 Why were you motivated?
01:04:47.000 So why but also how?
01:04:49.000 How is a big one because it's really difficult to solve.
01:05:00.000 There's no traditional, conventional explanation for the construction, the movement of the stones, the amount of time that it would take.
01:05:00.000 If you move 10 stones a day, I believe it takes 664 years to make one of those pyramids.
01:05:05.000 So how many people were involved?
01:05:07.000 How long did it take?
01:05:08.000 How'd they get them there?
01:05:09.000 How'd they figure out how to do it?
01:05:10.000 How come the shittier pyramids seem to be dated later?
01:05:13.000 Like, what was going on in that particular period of time where they figured out how to do something so extraordinary that even today, 4,500 years later, we stare at it and we go, I don't know.
01:05:27.000 I don't know what the fuck they did.
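The stone-moving arithmetic quoted above can be checked directly. This is a rough sketch using only the round numbers from the conversation (2,300,000 stones, 10 stones placed per day); the stone count itself is an estimate, not a measurement:

```python
# Back-of-the-envelope check of the stone-moving arithmetic quoted above.
# Assumed round numbers (estimates from the conversation, not measurements):
STONES = 2_300_000      # commonly cited stone count for the Great Pyramid
STONES_PER_DAY = 10     # hypothetical placement rate
DAYS_PER_YEAR = 365.25

days = STONES / STONES_PER_DAY     # 230,000 working days
years = days / DAYS_PER_YEAR

print(f"{years:.0f} years")        # roughly 630 years at this rate
```

At ten stones a day the total lands around 630 years, the same order of magnitude as the 664-year figure quoted; the gap just reflects which round numbers you start from.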
01:05:28.000 I haven't studied it carefully enough.
01:05:30.000 I'll trust you that it's very hard.
01:05:33.000 I would say the real mystery is why were they motivated.
01:05:38.000 And it's because you can't live in a pyramid.
01:05:39.000 It was just the afterlife.
01:05:43.000 There's some debate about that.
01:05:45.000 Christopher Dunn is an engineer who believes that it was some sort of a power plant.
01:05:48.000 He's got this very bizarre theory that there was a chamber that exists.
01:05:52.000 You see the structure of the pyramid, the inside of it.
01:05:55.000 There's a chamber that's subterranean and he believes the subterranean chamber was pounding on the surface of the earth.
01:06:03.000 And on the walls of the thing, creating this very specific vibration.
01:06:07.000 They had shafts that came down into the Queen's chamber.
01:06:11.000 These shafts, they would pour chemicals into these shafts, and then there was limestone at the end of it.
01:06:16.000 This is all his theory, not mine.
01:06:18.000 At the end of it, there was this limestone, which is permeable, right?
01:06:21.000 So the limestone, which is porous, these gases come through and creates this hydrogen that's inside of this chamber.
01:06:29.000 Then there are these shafts inside the King's chamber that are getting energy from space, you know, gamma rays and all the shit from space, and it's going through these chambers which are very specifically designed to target these gases and put them into this chamber where they would interact with this energy,
01:06:48.000 and he believes it's enough to create electricity.
01:06:50.000 It's a crazy theory.
01:06:52.000 I'm always too fast to debunk all these things, but just coming back to our earlier conversation, it must have been a crazy power plant to have a containment structure much bigger than a nuclear reactor.
01:07:06.000 Yeah, well, it's ridiculous, but it's also a different kind of technology, right?
01:07:10.000 If nuclear technology was completely not on the table, they didn't understand atoms at all, but they did understand that there's rays that come from space and that you could somehow harness the energy of these things with specific gases and through some method convert that into some form of electricity.
01:07:28.000 But if it takes so much power to put all these rocks on the pyramid, you have to always look at how efficient the power plant is.
01:07:36.000 So it can't just be – it has to be like the craziest reaction ever to justify such a big containment structure because even nuclear power plants don't work economically.
01:07:47.000 They only work...
01:07:48.000 Well, they didn't do a lot of them.
01:07:50.000 They only did this one in Giza.
01:07:53.000 And then there was other pyramids that he thinks had different functions that were smaller.
01:07:57.000 But the whole purpose of it is, or the whole point of it is, we don't know what the fuck it is.
01:08:04.000 We don't know why they did it.
01:08:05.000 We have a group of new archaeologists that are looking at it from a completely different theory.
01:08:11.000 They're not looking at it like it's a tomb.
01:08:12.000 The established archaeologists have insisted that this is a tomb for the pharaoh.
01:08:16.000 The newer archaeologists, unlike the established archaeologists, are looking at it and considering whether or not there were some other uses for this thing.
01:08:22.000 And one of them is the concept of the power plant.
01:08:25.000 I'm always...
01:08:27.000 I don't know if this is an alternate history theory, but I'm always into the James Frazer, Golden Bough, René Girard, Violence and the Sacred line of history, where you always have this question about the origins of monarchy and kingship.
01:08:47.000 And the sort of Girard-Frazer intuition is...
01:08:56.000 That it's something like: if every king is a kind of living god, then we have to also believe the opposite, that maybe every god is a dead or murdered king,
01:09:14.000 and that somehow societies were organized around scapegoats.
01:09:19.000 The scapegoats were... you know, there was sort of a crisis in the archaic community.
01:09:25.000 It got blamed on a scapegoat.
01:09:27.000 The scapegoat was attributed all these powers.
01:09:31.000 And then at some point, the scapegoat, before he gets executed, figures out a way to postpone his execution and turn the power into something real.
01:09:40.000 And so there's sort of this very weird adjacency between the monarch and the scapegoat.
01:09:48.000 And then, you know, I don't know, the sort of riff on that would be that the first pyramid did not need to be invented.
01:09:54.000 It was just the stones that were thrown on a victim.
01:09:57.000 And then it somehow... and that's the original form.
01:10:02.000 The stones that were thrown on a victim.
01:10:04.000 A community stones a victim to death.
01:10:06.000 A tribe runs after a victim.
01:10:08.000 You stone them to death.
01:10:09.000 You throw stones on the victim.
01:10:11.000 That's how you create the first tomb.
01:10:14.000 And then as it gets more complicated, you create a tomb that's two million...
01:10:19.000 Stones.
01:10:20.000 And you get a pharaoh who figures out a way to postpone his own execution or something like this.
01:10:28.000 I think there's...
01:10:29.000 I'm going to blank on the name of this ritual, but I believe in the old Egyptian kingdoms, which were sort of around the time of the Great Pyramids or even before.
01:10:41.000 It was something like...
01:10:45.000 In the 30th year of the reign of the pharaoh, the pharaoh gets transformed into a living god.
01:10:52.000 And then this perhaps dates to a time where in the 30th year of the pharaoh's reign, the pharaoh would get ritually sacrificed or killed.
01:11:05.000 And you have all these...
01:11:06.000 Societies where the kings were allowed to rule for an allotted time, where you become king and you draw a number of pebbles out of a vase, and that corresponds to how many years you rule.
01:11:18.000 What was this, Jamie?
01:11:20.000 The Sed Festival.
01:11:21.000 Heb Sed, the Feast of the Tail, an ancient Egyptian ceremony that celebrated the continued rule of the pharaoh.
01:11:27.000 The name was taken from the name of the Egyptian wolf god, one of whose names was Wepwawet. Yeah, this is what I'm talking about.
01:11:33.000 Or Sed, the less formal feast name, the Feast of the Tail, is derived...
01:11:37.000 Yeah, next paragraph is the one to start.
01:11:39.000 Okay.
01:11:40.000 That one right there.
01:11:41.000 The ancient festival might perhaps have been instituted to replace a ritual of murdering a pharaoh who was unable to continue to rule effectively because of age or condition.
01:11:50.000 Interesting.
01:11:52.000 Interesting.
01:11:53.000 So you can't kill them now.
01:11:54.000 And then eventually, Sed festivals were jubilees celebrated after a ruler had held the throne for 30 years.
01:12:00.000 And then every three to four years after that.
01:12:02.000 So when it becomes unthinkable to kill the pharaoh, the pharaoh gets turned into a living god.
01:12:07.000 Before that, the pharaoh gets murdered and then gets worshipped as a dead pharaoh or distant god.
01:12:15.000 That's interesting, but it still doesn't solve the engineering puzzle.
01:12:18.000 The engineering puzzle is the biggest one.
01:12:20.000 How do they do that?
01:12:21.000 The one I'm focusing on is the motivational puzzle.
01:12:25.000 Even if you have all the motivation in the world, if you want to build a structure that's insane to build today, and you're doing it 4,500 years ago, we're dealing with a massive puzzle.
01:12:35.000 I think the motivational part is the harder one to solve.
01:12:38.000 If you can figure out the motivation, you'll figure out a way to organize the whole society.
01:12:44.000 And if you can get the whole society working on it, you can probably do it.
01:12:47.000 But don't you think that his grasp of power was in peril in the first place, which is why they decided to come up with this idea of turning them into a living god?
01:12:57.000 So to have the amount of resources and power, and then the engineering, and then the understanding of whatever methods they used to shape and move these things.
01:13:10.000 Well, this is always the anthropological debate between Voltaire, the Enlightenment thinker of the 18th century, and Durkheim, the 19th century anthropologist.
01:13:21.000 And Voltaire believes that religion originates as a conspiracy of the priests to maintain power.
01:13:30.000 And so politics comes first.
01:13:32.000 The politicians invent religion.
01:13:35.000 And then Durkheim says the causation is the other way around, that somehow religion came first and then politics somehow came out of it.
01:13:44.000 Of course, once the politics comes out of it, the priests, the religious authorities have political power.
01:13:51.000 They figure out ways to manipulate it, things like this.
01:13:53.000 But I find...
01:13:56.000 You know, I find the Durkheim story far more plausible than the Volterra one.
01:14:01.000 I think the religious categories are primary and the political categories are secondary.
01:14:08.000 So you think the religion came first?
01:14:11.000 But what about if we emanated from tribal societies?
01:14:14.000 Tribal societies have always had leaders.
01:14:16.000 When you have leaders, you're going to have dissent.
01:14:18.000 You're going to have challenges.
01:14:19.000 You're going to have politicking.
01:14:20.000 You have people negotiating to try to maintain power, keep power, keep everything organized.
01:14:25.000 That's the origin of politics, correct?
01:14:29.000 You know, I think that's a whitewashed, enlightenment, rationalist description of the origin of politics.
01:14:35.000 What do you think the origin of politics is?
01:14:37.000 I think it's far more vile than that.
01:14:39.000 Well, it's very vile.
01:14:41.000 The control and power and maintaining power involves murder and sabotage.
01:14:46.000 Well, okay.
01:14:46.000 That's more like it.
01:14:47.000 But what you gave me a minute ago sounds more like a social contract theory in which people sit down, negotiate, and have a nice legal chit-chat to draft the social contract.
01:15:00.000 That is a complete fiction.
01:15:02.000 Yeah, I don't think that.
01:15:03.000 I think that there was probably various levels of civility that were achieved when agriculture and when establishments were constructed that were near resources, where they didn't have to worry as much about food and water and things along those lines.
01:15:19.000 Things probably got a little bit more civil.
01:15:21.000 But I think that the origins of it are like the origins of all human conflict.
01:15:25.000 It's filled with murder.
01:15:27.000 Well, I think at the beginning was madness and murder.
01:15:29.000 Yeah, madness and murder.
01:15:30.000 And I don't know...
01:15:33.000 I don't know if it got that much more rational.
01:15:39.000 I don't know if it's that much more rational today.
01:15:41.000 Well, in some ways it's not, right?
01:15:44.000 This is, again, back to the progressive conception.
01:15:48.000 Have we really progressed?
01:15:52.000 How much have we really progressed from that?
01:15:55.000 My version would be that it was organized around acts of mass violence.
01:16:08.000 Maybe you externalize it onto a mastodon or hunting some big animal or something like this.
01:16:14.000 But the real problem of violence, it wasn't external.
01:16:18.000 It was mostly internal.
01:16:20.000 It was violence with people who were near you, proximate to you.
01:16:25.000 It wasn't even natural cataclysms or other tribes.
01:16:29.000 It was sort of much more the internal stuff.
01:16:34.000 And it's very different, I think.
01:16:36.000 The human...
01:16:38.000 Situation is somehow very, very different from something like, I don't know, an ape-primate hierarchy, where in an ape context, you have an alpha male.
01:16:48.000 You know, he's the strongest, and there's some sort of natural dominance, and you don't need to have a fight to the death, typically, because you know who's the strongest, and you don't need to...
01:16:58.000 Push it all the way.
01:16:59.000 In a human context, it's always possible for two or three guys to gang up on the alpha male.
01:17:06.000 So somehow the culture is more important, you know; if they can talk to each other and you get language, then they can coordinate and they can gang up on the leader, and then you have to stop them from ganging up on the leader.
01:17:20.000 How do you do that?
01:17:21.000 And so there's some sort of radical difference between a human and a Let's say a pre-human world.
01:17:30.000 Have you seen Chimp Empire?
01:17:33.000 No.
01:17:33.000 Chimp Empire is a fascinating documentary series on Netflix where these scientists had been embedded with this tribe of chimpanzees for decades.
01:17:42.000 And so because they were embedded, they had very specific rules.
01:17:46.000 You have to maintain at least 20 yards between you and any of the chimps.
01:17:50.000 No food.
01:17:51.000 You can never have food and don't look them in the eyes.
01:17:53.000 And as long as you do that, they don't feel you're a threat, and they think of you as a natural part of their environment, almost like you don't exist.
01:17:59.000 And they behave completely naturally.
01:18:02.000 Well, it shows in that that sometimes it's not the largest, strongest one, and that some chimps form bonds with other chimps, and they form coalitions.
01:18:12.000 And they do have some sort of politicking.
01:18:14.000 And they do help each other.
01:18:16.000 They groom each other.
01:18:17.000 They do specific things for each other.
01:18:19.000 And then one of the things that happens also, they get invaded by other chimps.
01:18:23.000 And chimps leave and go on patrol, and other chimps gang up on them and kill them.
01:18:28.000 And they try to fight and battle over resources.
01:18:30.000 So it's not nearly as cut and dry as the strongest chimp prevails.
01:18:35.000 One of the chimps that was dominant was an older chimp, and he was smaller than some of the other chimps, but he had formed a coalition with all these other chimps, and they all respected him, and they all knew that they would be treated fairly.
01:18:47.000 And being treated fairly is a very important thing with chimpanzees.
01:18:50.000 They get very jealous if they think that things are not fair, which is why that guy was attacked.
01:18:55.000 You know that guy who had a pet chimpanzee?
01:18:58.000 He brought it a birthday cake.
01:19:00.000 The other chimps weren't getting a piece of the cake and someone had fucked up and left a door open.
01:19:04.000 They got out and mauled this guy because he didn't give them some of the cake.
01:19:08.000 Yeah, so I find all of that quite plausible, but I think...
01:19:13.000 Both of us can be correct.
01:19:15.000 So there's some, the true story of hominization, of how we became humans, there's a way to tell it where it's continuous with our animal past and where it's just, you know, there's things like this with the chimpanzees or the baboons or, you know, other primates.
01:19:31.000 And then there is a part of the story that I think is also more discontinuous.
01:19:37.000 And my judgment is, we probably, you know, in a Darwinian context, we always stress the continuity.
01:19:45.000 You know, I'm always a little bit the contrarian.
01:19:48.000 And so I believe in Darwin's theory.
01:19:50.000 But I think we should also be skeptical of the ways it's too dogmatic.
01:19:56.000 And Darwin's theories make us gloss over the discontinuities.
01:20:01.000 And I think, you know, one type of fairly dramatic discontinuity, and this can happen overnight, is that humans have something like language.
01:20:12.000 And even though, you know, chimpanzees probably, I don't know, have an IQ of 80 or so, they're pretty smart.
01:20:17.000 But when you don't have a rich symbolic system, that leads to sort of a very, very different kind of structure.
01:20:25.000 And there's something about language and the kind of coordination that allows and the ways that it enables you to coordinate on violence and then it encourages you to channel violence in certain sacred religious directions,
01:20:42.000 I think creates something radically different about human society.
01:20:48.000 You know, humans tell each other stories.
01:20:53.000 A lot of the stories are not true.
01:20:55.000 They're myths.
01:20:55.000 But I think that's some sort of very important difference from even our closest primate relatives.
01:21:09.000 But, you know, this is, again, this is sort of like another way of getting at what's so crazy about ChatGPT and passing the Turing test.
01:21:17.000 Because if we had sat here two years ago and you asked me, you know, what is the distinctive feature of a human being?
01:21:24.000 What makes someone a human and, you know, in a way that differs from everybody else?
01:21:31.000 It's not perfect, but my go-to answer would have been language.
01:21:35.000 You're a three-year-old.
01:21:36.000 You're an 80-year-old.
01:21:39.000 Just about all humans can speak languages.
01:21:41.000 Just about all nonhumans cannot speak languages.
01:21:45.000 It's this binary thing.
01:21:47.000 And then that's sort of a way of telling us, again, why passing the Turing test was way more important than superintelligence or anything else.
01:21:56.000 Yeah, I could see that.
01:21:58.000 Sorry, I don't want to go back to that tangent.
01:22:00.000 No, it's a good tangent.
01:22:01.000 Go ahead, connect it.
01:22:02.000 Keep tangenting off.
01:22:03.000 Have fun.
01:22:04.000 It's great.
01:22:05.000 What do you think the factor was?
01:22:07.000 There's a lot of debate about this.
01:22:09.000 Like, the factor was that separated us from these animals and why we became what we became.
01:22:14.000 Because we're so vastly different than any other primate.
01:22:17.000 So what do you think took place?
01:22:18.000 Like, the doubling of the human brain size.
01:22:21.000 Over a period of two million years is one of the greatest mysteries in the entire fossil record.
01:22:25.000 We don't know what the fuck happened.
01:22:26.000 There's a lot of theories.
01:22:27.000 Throwing arm, cooking meat.
01:22:29.000 There's a lot of theories.
01:22:30.000 But we really have no idea.
01:22:32.000 Well, again, let me do sort of a linguistic riff.
01:22:40.000 Aristotelian, Darwinian biology.
01:22:46.000 Aristotle, you always differentiate things and put them in categories.
01:22:46.000 And man, I think the line Aristotle has is something, man differs from the other animals in his greater aptitude for imitation.
01:22:59.000 And I would say that we are these giant imitating machines.
01:23:05.000 And of course the Darwinian riff on this is to imitate is to ape.
01:23:10.000 And so we differ from the ape.
01:23:12.000 We're more ape-like than the apes.
01:23:15.000 We are far better at aping each other than the apes are.
01:23:18.000 And to a first cut, I would say our brains are giant imitation machines.
01:23:25.000 That's how you learn language as a kid.
01:23:27.000 You imitate your parents.
01:23:29.000 And that's how culture gets transmitted.
01:23:31.000 But then there are a lot of dimensions of imitation that are also very dangerous, because imitation doesn't just happen on this symbolic, linguistic level.
01:23:43.000 It's also you imitate things you want.
01:23:46.000 You want a banana.
01:23:47.000 I want a banana.
01:23:48.000 You want a blue ball.
01:23:50.000 I can have a red ball.
01:23:51.000 I want a blue ball because you have a blue ball.
01:23:55.000 And so there's something about imitation that creates culture, that is incredibly important pedagogically learning.
01:24:08.000 It's how you master something.
01:24:11.000 In all these different ways.
01:24:13.000 And then a lot of it has this incredibly conflictual dimension as well.
01:24:19.000 And then there's...
01:24:21.000 Yeah, so I think that was sort of core to the...
01:24:26.000 Things that are both great and troubled about humanity and that was sort of, that was in some ways the problem that needed to be solved.
01:24:36.000 So you think that the motivation of imitation is the essential first steps that led us to become human?
01:24:47.000 There's some story like – and again, this is a one-dimensional, one-explanation fits all.
01:24:52.000 But the sort of – the explanation I would go with is that it was something like our brains got bigger and so we were more powerful imitation machines.
01:25:05.000 And there were things about that that were – Yeah, that made us a lot more powerful and a lot – we could learn things and we could remember things and there was cultural transmission that happened.
01:25:18.000 But then it also – we could build better weapons and we became more violent.
01:25:26.000 It also had a very, very destructive element.
01:25:29.000 And then somehow the imitation had to be channeled in these sort of ritualized, religious kinds of ways.
01:25:39.000 And that's why I think all these things sort of somehow came up together in parallel.
01:25:47.000 What about the physical adaptation?
01:25:49.000 Like what would be the motivation of the animal to change form and to have its brain grow so large and to lose all its hair and to become soft and fleshy like we are as opposed to like rough and durable like almost every other primate is?
01:26:06.000 Well, you can always – man, you can always tell these retrospective just-so stories and how this all worked out.
01:26:14.000 But it would seem – the naive retrospective story would be that, you know, there are a lot of ways that humans are, I don't know, less strong than the other apes or – You know, all these ways where we're,
01:26:31.000 in some sense, weaker.
01:26:33.000 Physically, at least.
01:26:34.000 Physically.
01:26:34.000 But maybe it was just this basic trade-off.
01:26:38.000 More of your energy went into your mind and into your brain.
01:26:43.000 And then, you know, your fist wasn't as strong, but you could build a better axe.
01:26:52.000 And that made you stronger than an ape.
01:26:55.000 And that's where a brain with less energy was spent on growing a hair to keep warm in the winter and then you used your brain to build an axe and skin a bear and get some fur for the winter or something like that.
01:27:13.000 Yeah, I guess.
01:27:15.000 But it's just such a leap.
01:27:17.000 It's such a leap and different than any other animal.
01:27:20.000 Like, what was the primary motivating factor?
01:27:23.000 Like, what was the thing?
01:27:25.000 You know, McKenna believes it was psilocybin.
01:27:27.000 You know, I'm sure you probably...
01:27:28.000 You ever heard that theory?
01:27:29.000 McKenna's stoned ape theory, which is a fascinating one.
01:27:32.000 But there's a lot of different theories about what took place.
01:27:36.000 But...
01:27:37.000 Well, the one I would go on was that there was this dimension of increased imitation.
01:27:46.000 There was some kind of cultural linguistic dimension that was incredibly important.
01:27:52.000 It probably was also It's somehow linked to dealing with all the violence that came with it, all the conflicts that came with it.
01:28:09.000 I would be more open to the stoned ape theory if... I had this conversation with the other guy, Muraresku, the Immortality Key guy, and I always feel they whitewash it too much.
01:28:23.000 How so?
01:28:24.000 You know, it's like, I mean, if you had these crazy Dionysian rituals in which people, you know, there's probably lots of crazy sex, there's probably lots of crazy violence that was tied to it, and so maybe you'd be out of your mind to be hunting a woolly mammoth.
01:28:44.000 Maybe you can't be completely...
01:28:47.000 But they weren't hunting woolly mammoths during the Eleusinian Mysteries.
01:28:52.000 No, but you went to war to fight the neighboring tribe.
01:28:56.000 It's probably more dangerous than hunting.
01:28:57.000 Right, but they also did absolutely have these rituals and they have absolutely found trace elements of...
01:29:03.000 I don't question that.
01:29:04.000 Okay.
01:29:05.000 I don't question that at all.
01:29:07.000 I just think probably part of it was also...
01:29:14.000 It was a way to channel violence.
01:29:16.000 It was probably, you know, wheneverβ€”I don't know.
01:29:19.000 Was there some degree to which whenever you went to war, you were on drugs?
01:29:25.000 Oh, yeah.
01:29:26.000 Well, we know about the Vikings.
01:29:27.000 The Vikings most certainly took mushrooms before they went into battle.
01:29:32.000 And, you know, maybe it makes you less... less coordinated or something, but just if you're less scared, that's probably...
01:29:43.000 It doesn't make you less coordinated.
01:29:44.000 If you're just a little bit less scared, that's probably super important.
01:29:47.000 It increases visual acuity.
01:29:48.000 There's a lot of benefits that would happen physically, especially if you got the dose right.
01:29:55.000 It increases visual acuity, edge detection's better, makes people more sensitive, probably more aware, probably a better hunter.
01:30:04.000 But I'm sympathetic to all these...
01:30:11.000 Mushrooms, psychedelic drug, historical usage theories, I suspect was very widespread.
01:30:18.000 I just think, you know, a lot of it was in these contexts that were pretty transgressive.
01:30:24.000 Yeah, I think they're not mutually exclusive.
01:30:26.000 I think just giving the way the world was back then, for sure violence was everywhere.
01:30:32.000 Violence was a part of daily life.
01:30:34.000 Violence was a part of how society was kept together.
01:30:37.000 Violence was entertainment in Rome, right?
01:30:40.000 For sure, violence was everything.
01:30:43.000 It was a big part of it.
01:30:44.000 And I think release and the anxiety of that violence also led people to want to be intoxicated and do different things that separated them from a normal state of consciousness.
01:30:56.000 But I do think it's also probably where democracy came from.
01:30:59.000 I think having those Eleusinian mystery rituals where they would get together and do psychedelics under this very controlled set and setting, I think that's the birthplace of a lot of very interesting and innovative ideas.
01:31:12.000 I think a lot of interesting and innovative ideas currently being at least dreamt up, thought of, have their roots in some sort of altered conscious experience.
01:31:30.000 Well, it's...
01:31:35.000 I don't know.
01:31:36.000 I think this stuff is very powerful.
01:31:39.000 I think it is...
01:31:45.000 I definitely think it shouldn't be outlawed.
01:31:49.000 I'm a pretty hardcore libertarian on all the drug legalization stuff.
01:31:56.000 And then I do wonder exactly how these things work.
01:32:13.000 Probably the classical world version of it was that it was something that you did in a fairly controlled setting.
01:32:26.000 You didn't do it every day, and it was some way, I imagine, to get a very different perspective on your 9-to-5 job or whatever you want to call it, but you didn't necessarily want to really decamp to the other world altogether.
01:32:49.000 Oh, for sure.
01:32:50.000 It's too dangerous to do.
01:32:52.000 I don't think anybody thinks they did.
01:32:54.000 I think that was part of the whole thing.
01:32:56.000 Where do you think that line is?
01:33:00.000 Like, you know, should everyone do one ayahuasca trip?
01:33:05.000 Or if you do an ayahuasca trip a year, is that...
01:33:10.000 I don't think everyone has to do anything.
01:33:12.000 And I think everyone has their own requirements.
01:33:14.000 And I think, as you do, that everything like this, especially psychedelics...
01:33:20.000 One of the more disappointing things recently was that the FDA had denied...
01:33:24.000 They did these MDMA trials for...
01:33:27.000 You know about all this?
01:33:28.000 Yeah.
01:33:29.000 Very, very disappointing that they wanted to...
01:33:41.000 Yeah, I... I was very bullish on this stuff happening.
01:33:58.000 And the way I thought about it four or five years ago was that it was a hack around the double-blind study.
01:34:07.000 Because the FDA always has this concept that you need to do a double-blind study.
01:34:12.000 To one third of the people, you give a sugar pill.
01:34:16.000 And two-thirds, you give the real drug, and no one knows whether they have the sugar pill or the real drug.
01:34:21.000 And then you see how it works, and science requires a double-blind study.
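The allocation being described, one third placebo against two thirds active drug, can be sketched as a simple randomized split. This is a minimal illustrative sketch; the group size, seed, and function name are hypothetical, not anything from the actual MDMA trials:

```python
import random

def assign_arms(participants, placebo_fraction=1/3, seed=0):
    """Randomly split participants into placebo and drug arms (hypothetical sketch)."""
    rng = random.Random(seed)          # fixed seed so the split is reproducible
    shuffled = list(participants)
    rng.shuffle(shuffled)
    cut = round(len(shuffled) * placebo_fraction)
    # In a true double-blind trial, neither participants nor assessors know
    # these labels; with psychedelics, subjects can usually tell which arm
    # they are in, which is the blinding problem discussed here.
    return {"placebo": shuffled[:cut], "drug": shuffled[cut:]}

arms = assign_arms(range(300))
print(len(arms["placebo"]), len(arms["drug"]))  # 100 200
```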
01:34:27.000 And then my anti-double-blind study theory is, if it really works, you don't need a double-blind study.
01:34:34.000 It should just work.
01:34:35.000 And there's something sociopathic about doing double-blind studies because one-third of the people who have this bad disease are getting a sugar pill.
01:34:43.000 And we shouldn't even be...
01:34:44.000 Like, maybe it's immoral to do double-blind studies.
01:34:47.000 Well, double-blind studies on unique and novel things make sense.
01:34:51.000 But this is not unique nor novel.
01:34:54.000 It's been around a long time.
01:34:54.000 Well, unique, yes, but...
01:34:55.000 Well, my claim is if it actually works, you shouldn't need to do a double-blind study at all.
01:35:04.000 But...
01:35:05.000 And then my hope was that MDMA, psychedelics, all these things, they were a hack on the double-blind study because you knew whether you got the real thing or the sugar pill.
01:35:16.000 And so this would be a way to hack through this ridiculous double-blind criterion and just get the study done.
01:35:23.000 And then what I think...
01:35:27.000 Part of it is probably just an anti-drug ideology by the FDA. But the other part that happened on the sort of scientific establishment level is they think you need a double-blind study.
01:35:40.000 Joe, we know you're hacking this double-blind study because people will know whether they got the sugar pill or not.
01:35:46.000 And that's why we're going to arbitrarily change the goalposts and set them at way, way harder because we know there's no way you can do a double-blind study.
01:35:55.000 And if it's not a double-blind study, it's no good because that's what our ideology of science tells us.
01:36:00.000 And that's sort of what I think was part of what went sort of politically haywire with this stuff.
01:36:10.000 Well, I also think that it's Pandora's box.
01:36:13.000 I think that's a real issue in that if they do find extreme benefit in using MDMA therapy, particularly for veterans, if they start doing that and it starts becoming very effective and it becomes well-known and widespread, then it will open up the door to all these other psychedelic compounds.
01:36:30.000 And I think that's a real threat to the powers that be.
01:36:33.000 It's a real threat to the establishment.
01:36:36.000 If you have people thinking in a completely alternative way, I mean we saw what happened during the 1960s and that's one of the reasons why they threw water on everything and had it become schedule one and locked the country down in terms of the access to psychedelics.
01:36:50.000 All that stuff happened out of a reaction to the way society and culture was changing in the 1960s.
01:36:57.000 If that happened today, it would throw a giant monkey wrench in our political system, in our cultural system, the way we govern, the way we – just the way – allocation of resources, all that would change.
01:37:13.000 If I – just to articulate the alternate version on this, there's always a – you know, there's a part – let me think how to get this.
01:37:29.000 You know, there's a question whether the shift to interiority, is it a complement or a substitute?
01:37:42.000 Like what I said about talk and action, is it a complement or a substitute to changing the outside world?
01:37:49.000 So we focus on changing ourselves.
01:37:50.000 Is this the first step to changing the world?
01:37:53.000 Or is it sort of a hypnotic way in which our attention is being redirected from outer space to inner space?
01:38:02.000 The one liner I had years ago was, you know, we landed on the moon in July of 1969 and three weeks later Woodstock started and that's when the hippies took over the country and we stopped going to outer space because we started going to inner space.
01:38:19.000 And so there's sort of a question, you know, how much it worked as an activator or as a deactivator, in a way.
01:38:34.000 And there are all these different modalities of interiority.
01:38:37.000 There's psychological therapy.
01:38:40.000 There's meditation.
01:38:41.000 There's yoga.
01:38:42.000 There was a sexual revolution.
01:38:46.000 Gradually you have incels living in their parents' basement playing video games.
01:38:51.000 There's the navel-gazing that is identity politics.
01:38:55.000 There's a range of psychedelic things.
01:38:58.000 And I think all these things, I wonder whether the interiority ended up acting as a substitute.
01:39:09.000 Because, you know, the alternate history of the 1960s is that, you know, the hippies were actually, they were anti-political.
01:39:18.000 And it was sort of that the drugs happened at the end of the 60s, and that's when people depoliticized.
01:39:27.000 And it was like, I don't know, the Beatles song, if you're carrying around pictures of Chairman Mao, you're not going to make it with anyone anyhow.
01:39:32.000 It's like, that's after they did LSD, and it was just...
01:39:35.000 The sort of insane politics no longer matters.
01:39:38.000 And so you have the civil rights, the Vietnam War, and then were the drugs the thing that motivated it?
01:39:44.000 Or was that the thing where it actually, those things started to de-escalate?
01:39:50.000 I think they were happening at the same time, and I think the Vietnam War coinciding with the psychedelic drug movement in the 1960s, it was one of the reasons why it was so dangerous to the establishment, because these people were far less likely to buy into this idea that they needed to fly to Vietnam and go kill people they didn't know.
01:40:08.000 And they were far less likely to support any war.
01:40:11.000 And I think there was this sort of bizarre movement that we had never seen before.
01:40:17.000 This flower children movement that, we now know, they plotted against.
01:40:20.000 I mean, if you read Chaos by Tom O'Neill.
01:40:23.000 Fantastic book that shows you what they were trying to do to demonize these hippies.
01:40:29.000 Or the part of it that I thought was interesting was the MKUltra angle.
01:40:34.000 Which is a part of it.
01:40:35.000 Yeah.
01:40:38.000 There was a predecessor version where, you could think of it as, we had an arms race with the fascists and the communists, and they were very good at brainwashing people.
01:40:52.000 The Goebbels propaganda, North Koreans brainwashing our soldiers in the Korean War, our POWs.
01:40:59.000 And we needed to have an arms race to program and reprogram and deprogram people.
01:41:06.000 And LSD was sort of the MKUltra shortcut.
01:41:16.000 It's so hard to reconstruct it, but my suspicion is that the MKUltra thing was a lot bigger
01:41:24.000 than we realize, and that, you know, it was the LSD movement, both in the Harvard form and the Stanford form.
01:41:31.000 You know, it started as an MKUltra project.
01:41:35.000 Timothy Leary at Harvard, Ken Kesey at Stanford.
01:41:38.000 I knew Tom Wolfe, an American novelist.
01:41:43.000 I still think his greatest novel was...
01:41:46.000 The Electric Kool-Aid Acid Test, which is sort of this history of the LSD counterculture movement.
01:41:50.000 It starts at Stanford, moves to the Haight-Ashbury in San Francisco.
01:41:54.000 But it starts with Ken Kesey as a grad student at Stanford, circa 1958. And you get an extra $75 a day if you go to the Menlo Park Veterans Hospital, and they give you some random drug.
01:42:10.000 And yeah, he got an extra $75 as a grad student in English doing LSD. And Tom Wolfe writes this, you know, iconic fictionalized novel, very realistic, 1968,
01:42:25.000 about this.
01:42:27.000 And Wolfe could not have imagined.
01:42:31.000 That the whole thing started as some CIA mind control project.
01:42:34.000 Right.
01:42:35.000 The Menlo Park Veterans Hospital that was deep state adjacent.
01:42:39.000 Sure.
01:42:39.000 Well, Haight-Ashbury Free Clinic, run by the CIA. Sure, that's even crazier.
01:42:43.000 The whole thing's crazy.
01:42:44.000 The Jolly West guy, yep.
01:42:46.000 Yeah, the whole thing's crazy.
01:42:48.000 Which leads me to, what do you think they're doing today?
01:42:54.000 If they were doing that then, I do not believe that they abandoned this idea of programming people.
01:43:00.000 I do not believe that.
01:43:01.000 I don't think they would because I know it's effective.
01:43:04.000 Look, people join cults every day.
01:43:06.000 We're well aware that people can be ideologically captured.
01:43:10.000 We're well aware.
01:43:10.000 We're well aware people will buy into crazy ideas as long as it's supported by whatever community they associate with.
01:43:19.000 That's just a natural aspect of being a human being.
01:43:23.000 Maybe it's part of what you were saying, this imitation thing that we have.
01:43:26.000 It leads us to do this.
01:43:27.000 If they have that knowledge and that understanding, for sure, they're probably doing things similar today, which is one of the things that I think about a lot when I think about this guy that tried to shoot Trump.
01:43:39.000 I want to know what happened.
01:43:42.000 And I don't think we're getting a very detailed explanation at all as to how this person achieved these, how they got on the roof, how they got to that position, how they trained, who were they in contact with, who was teaching them,
01:43:58.000 why did they do it, what was going on.
01:44:01.000 We are in the dark.
01:44:03.000 And I wonder, like, you know, there was always the Manchurian Candidate idea, right?
01:44:08.000 This idea that we've trained assassins.
01:44:10.000 Well, it's the RFK, the dad's, assassination in 1968. Sirhan Sirhan.
01:44:15.000 Where he, again, maybe you shouldn't believe him, but he claimed that he didn't even know what he was doing.
01:44:21.000 It was some sort of hypnotic trance or whatever.
01:44:23.000 And it was like the assassin in the Manchurian Candidate.
01:44:26.000 Yeah, yeah.
01:44:27.000 I mean, that is possible.
01:44:29.000 I don't know if he's telling the truth.
01:44:31.000 He could have just had a psychotic break.
01:44:33.000 Who knows?
01:44:33.000 Well, it's obviously also convenient.
01:44:35.000 Yeah, very convenient.
01:44:36.000 But it's a possibility that should be considered.
01:44:40.000 I mean, this Crooks kid that did this, that shot at the president.
01:44:44.000 What?
01:44:46.000 How?
01:44:46.000 What happened?
01:44:47.000 I want to know what happened.
01:44:49.000 Man, on the sort of conspiracy theory of history, I probably veer in the direction that there was a lot of crazy stuff like this going on in the US
01:45:12.000 in the first half of the 20th century, in overdrive in the 1940s.
01:45:17.000 You know, I mean, you had the Manhattan Project, this giant secret project, 1950s, 1960s.
01:45:24.000 And then somehow, the last 50 years, the perspective I have, and I'm not sure disturbing is the word, is that these institutions are less functional.
01:45:39.000 I don't think the CIA is doing anything quite like MKUltra anymore.
01:45:47.000 Why do you think that?
01:45:58.000 I think you had the Church Committee hearings in the late 70s, and somehow things got exposed.
01:46:08.000 And then when bureaucracy is forced to be formalized, it probably becomes a lot less functional.
01:46:20.000 Like the 2000s version, I think there was a lot of crazy stuff.
01:46:25.000 That we did in black sites torturing people that the CIA ran in the war on terror.
01:46:33.000 There's waterboarding.
01:46:34.000 There's all sorts of batshit crazy stuff that happened.
01:46:37.000 But then once John Yoo in the Bush 43 administration writes the torture memos and sort of formalizes, this is how many times you can water dunk someone without it being torture, et cetera, et cetera.
01:46:48.000 Once you formalize it, people somehow know that it's on its way out, because it doesn't quite work anymore.
01:46:57.000 So by, I don't know, by 2007, at Guantanamo, I think the inmates were running the asylum.
01:47:04.000 The inmates and the defense lawyers were running it.
01:47:07.000 You were way safer as a Muslim terrorist in Guantanamo than as a, let's say, suspected cop killer in Manhattan.
01:47:14.000 There was still an informal process in Manhattan.
01:47:16.000 You were a suspected cop killer.
01:47:17.000 They'd figure out some way to deal with you outside the judicial, the formal judicial process.
01:47:26.000 But I think something – there was a sort of formalization that happened.
01:47:30.000 There was the post J. Edgar Hoover FBI where Hoover was, I don't know, a law unto himself.
01:47:37.000 It was completely out of control, CIA even more so.
01:47:42.000 And then, you know, once it all gets exposed, it probably is a lot harder to do.
01:47:48.000 The NSA probably held up longer as a deep state entity, where it at least had the virtue that people didn't know about it; I think in the 1980s it was still referred to as No Such Agency.
01:48:01.000 So it was still far more obscure.
01:48:04.000 So the necessary condition is that if some part of the deep state's doing it, you know, we can barely know what's going on.
01:48:12.000 Right.
01:48:14.000 I don't know.
01:48:15.000 You know, the 2000s, 2010s.
01:48:23.000 I think the Patriot Act empowered all these FISA courts.
01:48:28.000 And I think there probably were ways the NSA FISA court process was weaponized in a really, really crazy way.
01:48:41.000 And it culminated in 2016 with all the crazy Russia conspiracy theories against Trump.
01:48:51.000 But I think even that, I'm not sure they can do anymore because it got exposed.
01:48:58.000 Can't do that anymore.
01:49:00.000 But a small program that is top secret, that is designed under the auspices of protecting American lives, extracting information from people...
01:49:12.000 I'm agreeing with you.
01:49:13.000 The NSA FISA court process is one where you had...
01:49:18.000 a pretty out-of-control process from, let's say, circa 2003 to 2017, 2018. So that's relatively recent history.
01:49:32.000 I don't know.
01:49:33.000 You know, there are all the Jeffrey Epstein conspiracy theories, which I'm probably too fascinated by, because it felt like there was more to it.
01:50:03.000 No, because there's no answers for the Jeffrey Epstein thing.
01:50:07.000 There's been no consequences other than Ghislaine Maxwell going to jail and Jeffrey Epstein allegedly committing suicide, which I don't think he did.
01:50:15.000 Other than that, what are the consequences?
01:50:17.000 They were able to pull off this thing, some sort of operation.
01:50:24.000 Who knows who was behind it?
01:50:26.000 Who knows what was the motivation?
01:50:27.000 But it clearly has something to do with compromising people.
01:50:30.000 Which is an age-old strategy for getting people to do what you want them to do.
01:50:34.000 You have things on them, you use those things as leverage, and then next thing you know, you've got people saying things that you want them to say, and it moves policy, changes things, get things done.
01:50:46.000 They did that.
01:50:47.000 And we know they did that, and yet no one is asking for the tapes, no one's asking for the client list.
01:50:55.000 We're in the dark.
01:50:57.000 Still.
01:50:59.000 And probably, I don't know, man, I spend too much time thinking about all the Epstein variants.
01:51:12.000 Probably the sex stuff is overdone and everything else is underdone.
01:51:17.000 It's like a limited hangout.
01:51:19.000 We get to talk about the crazy underage sex and not about all the other questions.
01:51:26.000 It's like when Alex Acosta testified for labor secretary and he was the DA who had prosecuted Epstein in 08-09 and got him sort of the very light 13-month or whatever sentence.
01:51:40.000 And it was a South Florida DA or whatever he was.
01:51:46.000 And Acosta was asked, you know, why did he get off so easily?
01:51:56.000 And under congressional testimony, when he was up for Labor Secretary in 2017, it was: he belonged to intelligence.
01:52:04.000 And so, yeah, the question isn't about the sex with the underage women.
01:52:15.000 The question is really about, you know, why was he so protected?
01:52:20.000 And then I went down all these rabbit holes.
01:52:24.000 Was he working for the Israelis or the Mossad or all this sort of stuff?
01:52:28.000 And I've come to think that that was very secondary.
01:52:33.000 Obviously, it was just the U.S. If you're working for Israel, you don't get protected.
01:52:39.000 We had Jonathan Pollard.
01:52:41.000 He went to jail for 25 years or whatever.
01:52:44.000 But unrelated, right?
01:52:46.000 Understood.
01:52:47.000 But this is one particular operation.
01:52:50.000 But if it was an intelligence operation, the question we should be asking is what part of the U.S. intelligence system was he working for?
01:53:02.000 But don't you think that's an effective strategy for controlling politicians?
01:53:06.000 Getting them involved in sex scandals.
01:53:08.000 I mean, that's always been one of the worst things that can happen to a politician.
01:53:12.000 Look at Monica Lewinsky.
01:53:13.000 A very simple one.
01:53:15.000 Consensual, inappropriate sexual relationship between a president and a staffer, and it almost takes down the presidency.
01:53:21.000 It causes him to get impeached.
01:53:26.000 Those are powerful motivators.
01:53:27.000 The shame of it all, also the illegal activity, the fact that it's one of the most disgusting things that we think of, people having sex with underage people.
01:53:39.000 I'm sure that was part of it.
01:53:42.000 I suspect there are a lot of other questions that one should also ask.
01:53:48.000 Most certainly, but I would think that that is one of the best motivators that we have.
01:53:53.000 Is having dirt on people like that, especially something that could ruin your career, especially people that are deeply embedded in this system of people knowing things about people and using those at their advantage.
01:54:05.000 I mean, that's an age-old strategy in politics.
01:54:08.000 That was J. Edgar Hoover's entire modus operandi.
01:54:11.000 Yeah.
01:54:12.000 My riff on it was always that it's a little bit different from the J. Edgar Hoover thing.
01:54:20.000 And the question was always whether the people doing it knew they were getting compromised.
01:54:25.000 And so it's the vibe.
01:54:28.000 It's not...
01:54:32.000 ...that you somehow got compromised.
01:54:34.000 It was more you were joining this secret club.
01:54:39.000 You got to be made – you're a made man in the mafia.
01:54:42.000 And you get to do crazy things.
01:54:44.000 No, no, no.
01:54:44.000 It's only if we have compromise on you do you get ahead.
01:54:48.000 It's like – I don't know.
01:54:50.000 It's one of these Skull and Bones type things.
01:54:53.000 Yeah, In the Closet of the Vatican.
01:54:54.000 The claim is 80 percent of the cardinals in the Catholic Church are gay.
01:54:58.000 Not sure if that's true, but directionally it's probably correct.
01:55:02.000 And the basic thesis is you don't get promoted to a cardinal if you're straight because we need to have – and so you need to be compromised and then you're under control.
01:55:14.000 But you also get ahead.
01:55:16.000 Completely makes sense.
01:55:17.000 Completely makes sense as the way to do that, especially with all these politicians who are essentially bad actors, a lot of them.
01:55:25.000 They're just people that want power and people that want control, a lot of them.
01:55:29.000 And, you know, those kind of guys, they want to party.
01:55:31.000 You know, I mean, that has been – you've got two types of leaders that are presidents.
01:55:35.000 You've got pussyhounds and warmongers.
01:55:38.000 And sometimes you have both, but generally you don't.
01:55:41.000 Guys like Clinton and JFK were anti-war.
01:55:44.000 And then you have guys like Bush who you don't think of at all as a pussyhound but most certainly you think of as a warmonger.
01:55:52.000 Do you have a theory on what was Bill Gates' complicity with Epstein?
01:55:59.000 I think he likes pussy.
01:56:02.000 I think he's a man.
01:56:03.000 I think he likes power.
01:56:04.000 He likes monopoly.
01:56:05.000 I mean, he's incredibly effective with Microsoft.
01:56:07.000 And for the longest time, he was thought of as a villain, right?
01:56:12.000 He was this antitrust villain.
01:56:13.000 He was this guy who was monopolizing this operating system and controlling just this incredible empire.
01:56:22.000 And he had a real bad rap.
01:56:24.000 And then I think he wisely turned towards philanthropy.
01:56:29.000 But do you think that he needed Epstein?
01:56:34.000 I think it's very difficult for a very famous, very high-profile person to fuck around.
01:56:39.000 I think it's very difficult.
01:56:41.000 I think you have to worry about people telling people.
01:56:43.000 You worry about it taking you down if you're having affairs.
01:56:46.000 If you're running some philanthropy organization, you're supposed to be thought of as this guy who's like this wonderful person who's trying to really fix all the problems in the world, but really, he's just flying around and banging all these different chicks.
01:56:58.000 You have to figure out a way to pull that off.
01:57:02.000 And this is what Eric Weinstein and I, we've had discussions about this.
01:57:06.000 Eric's position is that there are people in this world that can provide experiences for you, safely, for people that are in that kind of a group. And that makes sense. It makes sense that if you pay people enough, and you have people motivated to establish these relationships and make sure that these things happen... When you get very high profile, you can't just be on a fucking dating app. And if you're a guy who likes to bang chicks,
01:57:32.000 what are you going to do?
01:57:35.000 All of that might be true, but I wonder if there are more straightforward alternate conspiracy theories on Epstein that we're missing.
01:57:43.000 So let me do an alternate one on Bill Gates.
01:57:46.000 Where, you know, the things just looking at what's hiding in plain sight.
01:57:54.000 You know, he supposedly talked to Epstein early on about how his marriage wasn't doing that well.
01:58:04.000 And then Epstein suggested that he should get a divorce, circa 2010, 2011. And Gates told him something like, you know, that doesn't quite work.
01:58:18.000 Presumably because he didn't have a prenup.
01:58:20.000 So there's one part of Epstein as a marriage counselor, which is sort of disturbing.
01:58:26.000 But then the second thing that we know Gates talked to Epstein about was, you know, collaborating on funding, setting up this philanthropy, all this sort of somewhat corrupt left-wing philanthropy structure.
01:58:45.000 And so there's a question. My sort of straightforward alternate conspiracy theory is: should we combine those two?
01:59:02.000 And was there – and I don't have all the details on this figured out, but it would be something like – Bill and Melinda get married in 1994. They don't sign a prenup.
01:59:16.000 And something's going wrong with the marriage.
01:59:20.000 And maybe Melinda can get half the money.
01:59:24.000 In a divorce, he doesn't want her to get half the money.
01:59:27.000 What do you do?
01:59:29.000 And then the alternate plan is something like: you commit the marital assets to this nonprofit, and then it sort of locks Melinda into not complaining about the marriage for a long,
01:59:50.000 long time.
01:59:51.000 And so there's something about the left-wing philanthropy world that was – it was some sort of boomer way to control their crazy wives or something like this.
02:00:08.000 It's also an effective way to whitewash your past.
02:00:13.000 Sure, there are all these – and he talked to Epstein about – he got Epstein to meet with the head of the Nobel Prize Foundation.
02:00:21.000 So it was – yeah, Bill Gates wanted to get a Nobel Prize.
02:00:24.000 Wow.
02:00:25.000 Right?
02:00:25.000 So this is all straightforward.
02:00:29.000 This is all known.
02:00:31.000 And I'm not saying what you're saying about – Do you know the history of the Nobel Prize?
02:00:35.000 That's the ultimate whitewash.
02:00:37.000 Sure.
02:00:37.000 It was inventing dynamite.
02:00:39.000 Yeah.
02:00:41.000 Well, Peter Berg told me the story.
02:00:42.000 I was blown away.
02:00:44.000 Originally, someone said that he died, and it was printed that he died, but he didn't die.
02:00:51.000 And in the stories, they were calling him the merchant of death.
02:00:55.000 Because he was the guy that invented dynamite.
02:00:57.000 And he realized that, oh my god, this is my reputation.
02:00:59.000 This is how people think about me.
02:01:01.000 I have to do something to turn this around.
02:01:03.000 So he invented the Nobel Prize.
02:01:05.000 And now the name Nobel is automatically connected in most people's eyes to the greatest people amongst us.
02:01:15.000 The people that have contributed the most to society and science and art and peace and all these different things.
02:01:20.000 Nobel Prize for Medicine.
02:01:24.000 Yeah, it's a crazy history, but it's the ultimate whitewash.
02:01:29.000 It's the same thing.
02:01:30.000 He came up with that prize because he wanted to change his image publicly.
02:01:37.000 So it's ironic that Bill Gates would want to get a Nobel Prize.
02:01:40.000 Or not ironic.
02:01:41.000 Yes, ironic but understandable.
02:01:44.000 So there's an underage sex version of the Epstein story, and there's a crazy status Nobel Prize version, and there's a corrupt left-wing philanthropy one, and there's a boomers-who-didn't-sign-prenuptial-agreements-with-their-wives one.
02:02:10.000 And I think all those are worth exploring more.
02:02:15.000 I think you're right.
02:02:16.000 What about these left-wing philanthropy ventures do you think is uniquely corrupt?
02:02:35.000 Sorry, which one do I think is most corrupt?
02:02:38.000 No, what about them?
02:02:40.000 When you said corrupt.
02:02:43.000 Yeah.
02:02:44.000 Well...
02:02:45.000 Man, it's...
02:02:52.000 There's something about... maybe it's just my hermeneutic of suspicion.
02:02:59.000 There's something about, you know, the virtue signaling, and what does it mean?
02:03:07.000 And I always think this is sort of an America versus Europe difference, where in America, we're told that philanthropy is something a good person does.
02:03:23.000 And if you're a Rockefeller and you start giving away all your money, this is just what a good person does and it shows how good you are.
02:03:34.000 And then I think sort of the European intuition on it is something like, you know, wow, that's only something a very evil person does.
02:03:44.000 And if you start giving away all your money in Europe, it's like, Joe, you must have murdered somebody.
02:03:50.000 You must be covering up for something.
02:03:52.000 So there are these two very different intuitions and I think the European one is more correct than the American one.
02:04:06.000 You know, the sort of left-wing philanthropy peaked in 2007, 2010, 2012. And there's these subtle ways, you know, we've become more European in our sensibilities as a society.
02:04:25.000 And so it has this very different valence from what it did 12 or 14 years ago.
02:04:32.000 But yeah, it's all – we ask all these questions like we're asking right now about Bill Gates where it's like, okay, he was – it was like all the testimony in the Microsoft antitrust trial in the 90s where it's like he's cutting off the air supply.
02:04:45.000 He wants to strangle people.
02:04:47.000 And he's like – he's kind of a sociopathic guy it seems.
02:04:50.000 And then it's this giant whitewashing operation and then somehow the whitewashing has been made too transparent and it gets deconstructed and exposed by the internet or whatever.
02:05:02.000 But I think most people are still unaware of how much whitewashing actually took place, including donating somewhere in the neighborhood of $300-plus million to media corporations, essentially buying favorable coverage.
02:05:39.000 And how we need to do things.
02:05:40.000 I mean, during the pandemic, he was a very vocal voice.
02:05:42.000 He was the guy telling us he was a...
02:05:45.000 Somehow or another, he became a public health expert.
02:05:47.000 And no one questioned why we were taking public health advice from someone who has a financial interest in this one very particular remedy.
02:05:56.000 Yeah, or...
02:05:57.000 There are all these alternate versions I can give.
02:06:02.000 But yeah, I think...
02:06:04.000 I think...
02:06:08.000 It's always so hard to know what's really going on in our culture, though.
02:06:11.000 So I think all what you say is true, but I also think it's not working as well as it used to.
02:06:20.000 I agree.
02:06:21.000 And there is a way people see through this.
02:06:24.000 It's not always as articulate as you just articulated it, but there's some vague intuition that...
02:06:34.000 You know, when Mr. Gates is just wearing sweaters and looks like Mr. Rogers, that something fishy is going on.
02:06:43.000 People have that sort of intuition.
02:06:45.000 They trust Jeff Bezos in his tight shirt, hanging out with his girlfriend on a yacht more.
02:06:48.000 Or Elon Musk.
02:06:50.000 The vice signaling is safer than virtue signaling.
02:06:53.000 Yeah, yeah.
02:06:54.000 Because if you're...
02:06:56.000 You know, if you're virtue signaling, our intuition is that something really, really sketchy is going on.
02:07:03.000 Suspicious.
02:07:03.000 We get suspicious.
02:07:04.000 And I think rightly so.
02:07:05.000 I think especially when someone's doing something so public.
02:07:08.000 I think rightly we should be suspicious.
02:07:10.000 Especially when, I mean, with Gates, it's like you know the history of the guy.
02:07:13.000 I mean, you know what he was involved with before.
02:07:16.000 You know how he ran Microsoft.
02:07:18.000 It just kind of makes sense that it's a clever move.
02:07:21.000 It's a clever move to pay the media.
02:07:24.000 It's a clever move.
02:07:25.000 Again, my alternate one, which is not incompatible with yours on Gates, is that Melinda finally files for divorce in early 2021. I think she told Bill she wanted one in late 2019. So 2020,
02:07:41.000 the year where Bill Gates goes into overdrive on COVID, you know, all this stuff.
02:07:50.000 You know, part of it, maybe it's self-dealing and he's trying to make money from the drug company or something like this.
02:07:57.000 But, you know, isn't...
02:08:24.000 How does that work? In theory, Melinda has a really strong hand.
02:08:31.000 She should get half.
02:08:32.000 That's what you get in a divorce with no prenuptial.
02:08:35.000 But then if you go into overdrive on COVID: Melinda, are you, I don't know, are you like some crazy anti-science person?
02:08:48.000 And so, I don't know.
02:08:52.000 My reconstruction is that you should not underestimate how much of it was, you know, about just controlling his ex-wife and not about controlling the whole society.
02:09:05.000 Makes sense.
02:09:06.000 It makes sense that you would be extremely motivated.
02:09:09.000 They can both be correct.
02:09:11.000 Sure.
02:09:11.000 There's many factors.
02:09:13.000 But mine lines up really well with the timeline.
02:09:16.000 Well, we're probably talking about $100 million or $100 billion one way or the other.
02:09:20.000 Well, I think she got less than – she got like one-tenth.
02:09:24.000 Really?
02:09:24.000 Interesting.
02:09:25.000 And she should have gotten half.
02:09:28.000 It's amazing he got it down that much.
02:09:30.000 Wow.
02:09:31.000 Interesting.
02:09:32.000 But I think she was just boxed in.
02:09:34.000 Every time he went on TV talking about COVID, she was boxed in with all of her left-wing friends.
02:09:40.000 That is an interesting philosophy.
02:09:44.000 That's an interesting way to approach a problem if you're him.
02:09:47.000 Very wise.
02:09:48.000 You know, very clever.
02:09:50.000 I mean, if you're just looking at, like, just for personal benefit, the genius move.
02:09:55.000 And the guy's a genius, clearly.
02:09:56.000 Brilliant guy.
02:09:58.000 You know, I mean, that makes sense.
02:10:00.000 Makes sense that he would do that.
02:10:01.000 I don't know, you know.
02:10:02.000 I would do that.
02:10:04.000 Probably should have had a prenup, but yeah.
02:10:06.000 Yeah, well, that's kind of crazy.
02:10:08.000 That's interesting.
02:10:09.000 Yeah, I didn't consider that.
02:10:11.000 But it makes sense.
02:10:12.000 And she's been pretty vocal, unfortunately for him, about his ties to Epstein being one of the primary reasons why she wanted out.
02:10:20.000 But again, was he...
02:10:26.000 Was he having extramarital affairs through Epstein?
02:10:29.000 Or, from Melinda's point of view, would it be worse for Epstein to facilitate an extramarital affair,
02:10:35.000 or for Epstein to be advising Gates on how to ditch Melinda without giving her any money?
02:10:44.000 I think that would be much, much worse from Melinda's point of view.
02:10:47.000 Yeah, makes sense.
02:10:48.000 It totally makes sense.
02:10:49.000 Do you think that he was a legitimate financial advisor?
02:10:52.000 Like he could give him advice on how to do those things?
02:10:54.000 That Gates wouldn't have more effective people?
02:10:58.000 I mean he's – when you're at that level of wealth, I'm sure you have wealth management people that are like very high level.
02:11:06.000 Yeah.
02:11:10.000 Because that's one of the things that Eric said about him.
02:11:12.000 He said when he met him, he was like, this guy's a fraud.
02:11:15.000 He doesn't know enough about what he's talking about.
02:11:18.000 And, you know, Eric is...
02:11:19.000 You know, I met Epstein a few times as well.
02:11:24.000 And I think...
02:11:26.000 How'd you get introduced?
02:11:28.000 It was Reid Hoffman in Silicon Valley who introduced us in 2014. But it was basically...
02:11:38.000 And I didn't check, didn't ask enough questions about it.
02:11:46.000 But I think there were sort of a lot of things where it was fraudulent.
02:11:52.000 I do think Epstein knew a lot about taxes.
02:11:57.000 And there were probably, you know, these complicated ways you could structure a nonprofit organization, especially in a marital context, that I think Epstein might have known a decent amount about.
02:12:18.000 When you were introduced to him...
02:12:20.000 I don't think Epstein would have been able to comment on super string theory or something like that.
02:12:28.000 But I think this sort of thing he might have actually been pretty expert on.
02:12:33.000 When you were introduced to him, how was he described to you?
02:12:37.000 He was described as one of the smartest tax people in the world.
02:12:42.000 Interesting.
02:12:43.000 And I probably – it probably was my moral weakness that I – Well, how could you have known back then?
02:12:48.000 He had never been arrested.
02:12:50.000 This was 2014. It was post-arrest.
02:12:52.000 Oh, so his arrest was the first arrest, right?
02:12:54.000 Yeah.
02:12:55.000 Which was like 2000?
02:12:56.000 07, 08. OK. Okay.
02:12:59.000 And so...
02:13:00.000 But, you know, you assume he didn't go to jail for that long.
02:13:03.000 Right.
02:13:04.000 It was probably not as serious as alleged.
02:13:07.000 There certainly was the illusion that there were all these other people that I trusted.
02:13:13.000 You know, Reid who introduced us was, you know, he started LinkedIn.
02:13:17.000 He was, you know, maybe too focused on business networking.
02:13:21.000 Right.
02:13:22.000 But I thought he always had good judgment in people.
02:13:25.000 When the shit went down and Epstein gets arrested for the second time, were you like, oh, well, there you go.
02:13:34.000 I've thought a lot about it as a result, yeah.
02:13:37.000 Yeah, I'm sure.
02:13:38.000 Jesus Christ.
02:13:39.000 Well, he tricked a lot of people.
02:13:41.000 I know a lot of people that met that guy.
02:13:44.000 He got a lot of celebrities to come to his house for parties and things.
02:13:47.000 Well, I think it was – I think a lot of it was this strange commentary on, you know, there was some secret club, secret society you could be part of.
02:14:00.000 Right.
02:14:01.000 Of course.
02:14:02.000 Again, it wasn't explicit, but that was the vague vibe of the whole thing.
02:14:06.000 People love those stupid things.
02:14:08.000 They love, like, exclusive clubs that very few people – Like, look at that stupid thing.
02:14:14.000 I mean, you just go to a place that you have to be a member to go to and everybody wants to be a member.
02:14:18.000 And then you get like the Malibu Soho house.
02:14:21.000 It's different from the other ones.
02:14:22.000 You have to have membership only there.
02:14:25.000 Do you have membership there?
02:14:27.000 People love that kind of shit.
02:14:29.000 Socially, they love being a part of a walled garden.
02:14:31.000 They love it.
02:14:32.000 They love it.
02:14:33.000 And if you're a guy like Bill Gates or similarly wealthy, you probably have a very small amount of people that you can relate to, very small amount of people that you can trust, probably very difficult to form new friendships.
02:14:45.000 Yeah.
02:14:46.000 I think there were probably different things that were pitched for different people.
02:14:51.000 Sure.
02:14:51.000 You know, I was pitched on the taxes.
02:14:53.000 I think, you know, there were probably other people that were, you know, more prone to the, you know, the social club part.
02:15:03.000 And then there were probably people – yeah.
02:15:05.000 And there was probably – a fairly limited group where it was, yeah, off-the-charts bad stuff.
02:15:12.000 Wouldn't it be wonderful to know what the fuck was really going on?
02:15:15.000 And maybe one day we will.
02:15:16.000 Maybe one day some Whitney Webb-type character will break it all down to us and explain to us in great detail exactly how this was formulated and what they were doing and how they were getting information out of people.
02:15:26.000 But I think people have to age out.
02:15:28.000 They have to die.
02:15:30.000 And we still don't have it on the Kennedy assassination.
02:15:32.000 That's what's crazy.
02:15:33.000 JFK. Well, one of the wildest things that Trump said was that if they told you what they told me, you wouldn't tell people either.
02:15:40.000 Which is like, what the fuck does that mean?
02:15:43.000 What does that mean?
02:15:44.000 I don't think legally he can tell you, right?
02:15:48.000 Because I think those things are above top secret.
02:15:50.000 If they did inform him of something, there must be some sort of prerequisite to keeping this a secret.
02:15:58.000 I haven't studied that one that carefully, but, you know, there are all these alternate conspiracy theories on who killed JFK. It's, you know, the CIA and the mafia and the Russians and the Cubans and, you know,
02:16:15.000 there's an LBJ version since he's the one who benefited.
02:16:18.000 So all of this happened in Texas.
02:16:22.000 You have all these, you know, alternate theories.
02:16:25.000 On some level, it's – yeah, it's – I always think it's just a commentary where, you know, 1963 America was – it wasn't like Leave It to Beaver.
02:16:34.000 It was like a really crazy country underneath the surface.
02:16:37.000 Absolutely.
02:16:39.000 And even though probably most of the conspiracy theories are wrong, it was like Murder on the Orient Express and all these people – Yeah.
02:17:08.000 Was talking to, you know, parts of the U.S. deep state.
02:17:12.000 And so even if Oswald was the lone assassin, you somehow get the magic bullet there and all that stuff to work.
02:17:19.000 But let's say Oswald was the lone assassin.
02:17:21.000 Did he tell someone?
02:17:24.000 In the FBI or CIA, you know, I'm going to go kill Kennedy tomorrow.
02:17:29.000 And then, you know, maybe the CIA didn't have to kill him.
02:17:33.000 They just had to do nothing, just had to sit on it.
02:17:36.000 Or maybe it was too incompetent and didn't get, you know, didn't go up the bureaucracy.
02:17:41.000 And so it's, you know, I think we sort of know that they talked to Oswald.
02:17:48.000 You know, a fair amount before it happened.
02:17:52.000 And so there's at least something, you know, that was grossly incompetent about it at a very minimum.
02:17:59.000 I think people have a problem with treating two stories as mutually exclusive – a lone gunman, or the CIA killed Kennedy – as if they couldn't be connected.
02:18:10.000 I think Lee Harvey Oswald was a part of it.
02:18:12.000 I think he probably did shoot that cop.
02:18:14.000 There's some evidence that when he was on the run and he was confronted, there was a cop that got shot and they were alleging he might have done it.
02:18:22.000 He might have taken a shot at Kennedy.
02:18:24.000 He might have even hit him.
02:18:26.000 I don't think he was the only one shooting.
02:18:28.000 I think there was an enormous amount of people that heard sounds coming from the grassy knoll.
02:18:36.000 They heard gunfire.
02:18:37.000 They reportedly saw people.
02:18:39.000 The amount of people that were witnesses to the Kennedy assassination that died mysterious deaths is pretty shocking.
02:18:47.000 Jack Ruby.
02:18:48.000 Well, Jack Ruby, that's a weird one, right?
02:18:50.000 Oswald.
02:18:51.000 Yeah.
02:18:52.000 Jack Ruby walks up to Oswald, shoots him, and then Jack Ruby, with no previous history of mental illness, becomes completely insane after getting visited by Jolly West, which is nuts.
02:19:02.000 Like, why is the guy who's the head of MKUltra visiting the guy who shot the assassin of the president?
02:19:08.000 And why is he left alone with them?
02:19:09.000 What happens?
02:19:10.000 What does he give him?
02:19:11.000 That this guy is screaming out, they're burning Jews alive, and just crazy, crazy shit he was yelling out.
02:19:18.000 He went nuts.
02:19:19.000 Probably some amount of LSD that's dangerous for you.
02:19:22.000 Probably an enormous amount.
02:19:23.000 They probably gave him a fucking glass of it.
02:19:25.000 They probably gave him a glass of it and told him it was water, drink this, and who fucking knows?
02:19:30.000 But the point is, I think it's very possible that Oswald was a part of it. And the way they did it, and the way they just shot Oswald, and then they write the Warren Commission.
02:19:44.000 We don't even see the Zapruder film until 12 years later, when they play it on television, when Dick Gregory brought it to Geraldo Rivera – a comedian brings the video, the actual film,
02:20:00.000 rather, of the assassination, from a different angle.
02:20:03.000 Well, you can actually see the video of him getting shot and his head snaps back and to the left, and everybody's like, what the fuck is going on here?
02:20:11.000 When you look at all that stuff, this mirrors what happened with this Crooks kid.
02:20:17.000 This Crooks kid, somehow or another, gets to the top of the roof, is spotted by these people.
02:20:22.000 They know he's there.
02:20:23.000 They know he has a rifle.
02:20:25.000 They see him walking around the crime scene.
02:20:29.000 Half an hour before with a rangefinder?
02:20:32.000 The whole thing is bananas.
02:20:34.000 And then they go to his house after he's killed.
02:20:37.000 It's completely scrubbed.
02:20:38.000 There's no silverware there.
02:20:39.000 They know from tracked ad data that a phone coming from the FBI offices in D.C. had visited him on multiple occasions.
02:20:51.000 And if that guy, if he shot Trump and Trump got murdered and then they shot him, It would be the Kennedy assassination all over again.
02:21:00.000 Everybody would go, what the fuck happened?
02:21:02.000 What happened?
02:21:03.000 What was the motivation?
02:21:05.000 Was he on any drugs?
02:21:06.000 What's the toxicology report?
02:21:08.000 How did he get up there?
02:21:09.000 Who knew he was up there?
02:21:11.000 How did they not shoot him quicker?
02:21:13.000 Like, what the fuck happened?
02:21:14.000 How was he able to get off three shots?
02:21:15.000 What happened?
02:21:16.000 I think there's like a slightly less crazy version that might still be true, which is just that people in the Secret Service, in the Biden administration, don't like Trump.
02:21:29.000 And they didn't have full intention to kill him, but it's just...
02:21:34.000 They didn't protect him.
02:21:35.000 We're just, you know...
02:21:37.000 We're going to understaff it.
02:21:40.000 We don't have to do as good a job coordinating with the local police.
02:21:45.000 There's all these ways, you know.
02:21:47.000 To make someone less safe.
02:21:48.000 Yeah, but it seems more than that.
02:21:50.000 If they knew that the guy was on the roof with a rifle, that seems a little more than that.
02:21:54.000 It's always a question of who they is, though.
02:21:57.000 Right.
02:21:57.000 Well, if I'm a sniper and I'm on...
02:22:00.000 People in the audience.
02:22:01.000 There were people there telling it to people.
02:22:03.000 Right, but I think the authorities knew this guy was on the roof before as well.
02:22:08.000 Well, I suspect some of the Secret Service people were told that, and then who knows how that got relayed or who all...
02:22:18.000 Well, didn't the snipers already have eye on him?
02:22:21.000 I believe the snipers already had eye on him.
02:22:23.000 I don't know.
02:22:24.000 Find out if that's true.
02:22:25.000 Jamie, find out if the snipers had eye on Crooks.
02:22:27.000 It's Secret Service that I don't know about the snipers.
02:22:29.000 I don't know about...
02:22:32.000 The thing I don't have a good sense on with the shooting, and maybe you'd have a better feel for this, is – my sense is it was a pretty straightforward shot for the Trump would-be assassin.
02:22:48.000 I think the Oswald shot was a much harder one because Kennedy's moving.
02:22:52.000 Yes and no.
02:22:53.000 Yes and no.
02:22:54.000 Okay, because Oswald had a scope.
02:22:57.000 So Oswald had a rifle, the Carcano rifle.
02:23:01.000 One of the snipers stationed inside the building reported he first saw Crooks outside and looking up to the roof of the building before the suspect left the scene.
02:23:08.000 Crooks later came back and sat down while looking at his phone near the building.
02:23:11.000 CBS News reported that a sniper took a photo of the suspect when he returned.
02:23:15.000 But I think they saw him on the roof, though.
02:23:18.000 Crooks then took out a rangefinder.
02:23:20.000 Like, right then.
02:23:22.000 Arrest that guy.
02:23:23.000 You got a fucking rangefinder?
02:23:25.000 Snipers alerted their command post about the suspect's actions.
02:23:27.000 Crooks then disappeared again and returned to the building with a backpack.
02:23:30.000 Again, arrest him.
02:23:32.000 Secret Service snipers again alerted their command post about Crooks' actions, according to the source who spoke with CBS News.
02:23:38.000 Crooks had already climbed to the top of the building in question by the time the additional officers arrived at the scene for backup.
02:23:44.000 The suspect also positioned himself above and behind the snipers inside the building.
02:23:49.000 By the time the police started rushing the scene and other officers attempted to get onto the roof, the source told CBS News that a different Secret Service sniper had killed Crooks.
02:23:59.000 Okay.
02:23:59.000 So it seems like they fucking bumbled it at every step of the way.
02:24:03.000 If they knew that guy was there, if they knew he had a range finder, returns to the backpack, he gets onto the roof.
02:24:08.000 All that's insane.
02:24:09.000 That is, at the very least, horrific incompetence.
02:24:13.000 At the very least.
02:24:15.000 Let me go back.
02:24:15.000 Yeah, okay, but back to Mike.
02:24:18.000 I thought it was a much easier shot.
02:24:20.000 It's not an easy head shot.
02:24:22.000 He's shooting at his head.
02:24:24.000 But why was shooting at the head the right thing?
02:24:26.000 Shouldn't you be shooting at...
02:24:27.000 Well, you don't know if he's wearing a vest, right?
02:24:30.000 He could be wearing a vest, which you would have to have plates.
02:24:33.000 You'd have to have ceramic plates in order to stop a rifle round.
02:24:39.000 So was it a .308?
02:24:40.000 What did he have?
02:24:42.000 What kind of rifle did he have?
02:24:45.000 I think he had an AR-15.
02:24:47.000 And are the scopes a lot better today than they were?
02:24:50.000 He didn't have a scope.
02:24:50.000 We're pretty sure he didn't have a scope.
02:24:52.000 How good was Oswald's scope?
02:24:53.000 It was good.
02:24:54.000 They said it was off.
02:24:56.000 This was one of the conspiracy theories.
02:24:58.000 Oh, but the scope was off.
02:25:00.000 But that doesn't mean anything, because scopes can get off when you pick it up.
02:25:03.000 If you knock it against the wall, when he drops it – if he makes the shot and then drops the scope and the scope hits the windowsill and then bounces off – that's, excuse me,
02:25:12.000 the scope's off.
02:25:13.000 Was there anything about the high angle from Oswald that made it harder?
02:25:17.000 No.
02:25:17.000 Not a difficult shot.
02:25:19.000 Very difficult to get off three shots very quickly.
02:25:21.000 So that was the thing, that they had attributed three shots to Oswald.
02:25:24.000 The reason why they had attributed three shots is because one of them had hit a ricochet.
02:25:28.000 One of them had gone into the underpass, ricocheted off the curb, and hit a man who was treated at a hospital.
02:25:33.000 They found out where the bullet had hit, so they knew that one bullet missed Kennedy, hit that curb, which would have indicated that someone shot from a similar position as Lee Harvey Oswald.
02:25:44.000 So then they had the one wound that Kennedy had to the head, of course, and then they had another wound that Kennedy had through his neck.
02:25:53.000 That's the magic bullet theory.
02:25:55.000 This is why they had to come up with the magic bullet theory, because they had to attribute all these wounds to one bullet.
02:25:59.000 And then they find this pristine bullet.
02:26:01.000 They find it in the gurney when they're bringing Governor Connally in.
02:26:07.000 Nonsense.
02:26:08.000 It's total nonsense.
02:26:09.000 The bullet is undeformed.
02:26:11.000 A bullet that goes through two people and leaves more fragments in Connally's wrist than are missing from the bullet itself.
02:26:20.000 And then the bullet's not deformed after shattering bone.
02:26:23.000 All that's crazy.
02:26:24.000 All that defies logic.
02:26:26.000 That doesn't make any sense.
02:26:27.000 If you know anything about bullets, and if you shoot bullets into things, they distort.
02:26:32.000 It's just one of the things that happen.
02:26:34.000 That bullet looks like someone shot it into a swimming pool.
02:26:37.000 That's what it looks like.
02:26:37.000 When they do ballistics on bullets and they try to figure out, like, if it was this guy's gun or that guy's gun by the rifling of the round, they can get similar markings on bullets.
02:26:47.000 When they do that, that's how they do it.
02:26:49.000 They do it so the bullet doesn't distort.
02:26:50.000 So they shoot that bullet into water or something like that.
02:26:54.000 Now that bullet was metal-jacketed, right?
02:26:56.000 If you look at the bullet, the top of it is fucked up, but the shape of the bullet looks pretty perfect.
02:27:04.000 It doesn't look like something that shattered bones.
02:27:06.000 And then you have to attribute – you have to account, rather, for the little fragments of the bullet that they found in Connally's wrist.
02:27:16.000 The whole thing's nuts.
02:27:17.000 The whole thing's nuts that you're only saying that this one guy did it because that's convenient.
02:27:23.000 And the Warren Commission's report – And obviously the Warren Commission whitewashed everything, so – It's nuts.
02:27:27.000 The whole thing's nuts.
02:27:28.000 It's much more likely that there were people in the grassy knoll and then Oswald was also shooting – With the umbrellas as the pointers or whatever.
02:27:34.000 I mean, I don't know.
02:27:35.000 I don't know about what – all I know is you got a guy in a convertible, which is fucking crazy, who is the president of the United States, and he's going slowly down a road.
02:27:45.000 Now, if you are in a prone position, so Oswald is on the windowsill, right, which is a great place to shoot, by the way.
02:27:52.000 It's a great place to shoot, because you rest that gun on the windowsill.
02:27:55.000 And if you rest it on the windowsill, there's no movement, right?
02:27:58.000 So you wrap your arm around the sling, if it had a sling, I'm not sure if it did, so you get a nice tight grip, you shove it up against your shoulder.
02:28:05.000 You rest it on the windowsill, and all you have to do is...
02:28:09.000 You have a round already racked, and you have a scope, and so the scope's magnified.
02:28:12.000 All you have to do is wait until he's there.
02:28:14.000 You lead him just a little bit and squeeze one off.
02:28:17.000 And then...
02:28:18.000 Boom!
02:28:19.000 Boom!
02:28:20.000 You could do that pretty quick.
02:28:22.000 It's not outside of the realm of possibility that he did get off three shots.
02:28:27.000 What doesn't make sense is the back and to the left.
02:28:31.000 It doesn't make sense that all these other people saw people shooting from the grassy knoll.
02:28:36.000 There's all these people that saw people running away.
02:28:38.000 They saw smoke.
02:28:40.000 There's smoke in some photographs of it.
02:28:42.000 It looks like there was more than one shooter.
02:28:44.000 And it looks like they tried to hide that.
02:28:47.000 They tried to hide that in the Warren Commission report.
02:28:49.000 The shot to Kennedy's neck.
02:28:53.000 Initially, when they brought him in Dallas, before they shipped him to Bethesda, they said that that was an entry wound.
02:29:00.000 When he got to Bethesda, then it became a tracheotomy.
02:29:04.000 Why do you give a tracheotomy to a guy who doesn't have a head?
02:29:06.000 You don't.
02:29:07.000 I mean, none of it makes any sense.
02:29:09.000 They altered the autopsy.
02:29:11.000 This is a part of David Lifton's book, Best Evidence.
02:29:15.000 Kennedy's brain wasn't even in his body when they buried him.
02:29:18.000 Like, the whole thing is very strange.
02:29:20.000 But then, do you get to anything more concrete than my Murder on the Orient Express theory, where it's just, you know, it could have been a lot of people.
02:29:28.000 Could have been the Russians, the Cubans, the mafia.
02:29:31.000 Well, no one even got suspicious for 12 years.
02:29:35.000 I think people were suspicious.
02:29:37.000 Sure, sort of.
02:29:38.000 Kind of.
02:29:39.000 But what do you have to go on?
02:29:40.000 You don't have to go on anything.
02:29:41.000 Like this Crooks kid.
02:29:42.000 We don't have anything to go on.
02:29:43.000 We're just going to be left out here just like we're left out here with the Epstein information.
02:29:48.000 No one knows.
02:29:49.000 The people that whoever organized it, if anyone did, you're never going to hear about it.
02:29:53.000 It's just going to go away.
02:29:54.000 The news cycle is just going to keep flooded with more nonsense.
02:29:59.000 I think there's probably a bunch of people that wanted Kennedy dead.
02:30:01.000 I think there's more than one group of people that wanted Kennedy dead.
02:30:04.000 I think there's probably collusion between groups that wanted Kennedy dead.
02:30:07.000 And I think there's a lot of people that have vested interest in ending his presidency.
02:30:11.000 And I think he was dangerous.
02:30:12.000 He was dangerous to a lot of the powers that be.
02:30:14.000 He was dangerous.
02:30:15.000 His famous speech about secret societies...
02:30:18.000 Crazy speech.
02:30:20.000 Guy has this speech and then gets murdered right afterwards.
02:30:22.000 Kind of nuts.
02:30:23.000 The whole thing's nuts.
02:30:24.000 He wanted to get rid of the CIA. He wanted to – I mean there's so many things that Kennedy wanted to do.
02:30:30.000 There were also a lot of crazy things Kennedy was doing.
02:30:33.000 Yes.
02:30:34.000 Oh, for sure.
02:30:35.000 The Cuba version of the assassination theory was – we had the Cuban Missile Crisis in 1962, about a year earlier, and then the deal that we struck with the Soviets was, you know, they take the missiles out of Cuba and we promised we wouldn't try to overthrow the government in Cuba.
02:30:56.000 And I guess we, you know, we no longer did...
02:31:02.000 You know, we no longer did Bay of Pigs type covert stuff like that.
02:31:06.000 But I think there were still something like four or five assassination plots on Fidel.
02:31:11.000 Attempts, actual attempts.
02:31:12.000 And then I think there was, I don't know, I think that, again, I'm going to get this garbled, I think a month or two before the JFK assassination, Castro said something like, you know, there might be repercussions if you keep doing this.
02:31:25.000 Yeah.
02:31:26.000 Well, listen, I'm sure there's a lot of people that wanted that guy dead and I'm sure they would coordinate.
02:31:30.000 I mean, if you knew that Cuba wanted Kennedy dead, and you knew that Cuba could get you assassins or could help in any way, I'm sure you would want to work with as many people as possible who you knew for a fact wanted him dead and had communicated that.
02:31:45.000 I mean, back then they were doing wild shit, man.
02:31:47.000 I mean, this is when they were doing Operation Northwoods.
02:31:50.000 This is again where I think it is...
02:31:55.000 I don't think we're in a world where zero stuff is happening.
02:32:17.000 And, you know, I don't know this for sure, but I think even the NSA FISA court stuff, which was an out-of-control deep state thing that was going on through about 2016, 2017 – I suspect even that, at this point,
02:32:33.000 you know, can't quite work, because people know that they're being watched.
02:32:38.000 They know they're being recorded.
02:32:40.000 And it's just, you know, you can't do waterboarding in Guantanamo if you have lawyers running all over the place.
02:32:48.000 I hope you're correct.
02:32:50.000 I hope you're correct, but it brings me back to this whole idea of getting dirt on people.
02:32:54.000 But then I think there's...
02:32:56.000 And then on the other hand, I think there's also...
02:32:59.000 You know, a degree to which our government, our deep state across the board is shockingly less competent, less functional, and it's less capable of this.
02:33:14.000 And this is where I'm not even sure whether this is an improvement, you know?
02:33:18.000 Right.
02:33:28.000 Let's go with the craziest version where our deep state is capable of knocking off the president.
02:33:33.000 Maybe that's actually a higher functioning society than the crazy version where they're incapable of doing it.
02:33:43.000 Right.
02:33:44.000 And they're bogged down with DEI. They can't get the gunman even to have a scope on his rifle or whatever.
02:33:50.000 Yeah, we haven't really totally figured out if he had a scope on his rifle, but I don't believe he did.
02:33:55.000 Man, he's, like, a much bigger loser.
02:33:58.000 Can you find someone as competent as Oswald?
02:34:01.000 Right.
02:34:01.000 Or something like that, you know?
02:34:03.000 Yeah, that's a good point.
02:34:04.000 It's a good point.
02:34:05.000 So I veer more to the explanation that it's gross
02:34:12.000 incompetence, but I don't know if that makes it better.
02:34:15.000 It might make it worse.
02:34:16.000 I think they weren't as competent, right?
02:34:19.000 Because they only had one guy doing it, and he wasn't effective.
02:34:23.000 If you had much better organization, you wouldn't have just one guy.
02:34:29.000 I mean, there's people out there that I know that can kill someone from a mile away.
02:34:35.000 Very effectively.
02:34:37.000 You can do things as a solo actor.
02:34:39.000 It's hard to organize because everything gets recorded.
02:34:42.000 Everything does get recorded.
02:34:43.000 That is a fact.
02:34:44.000 But it brings me back to that thing about having dirt on people that you were talking about with why the Epstein information doesn't get released and why they probably did it in the first place.
02:34:54.000 If you have dirt on people, then you know those people are not going to tell on you.
02:34:57.000 You all will coordinate together.
02:34:58.000 And that is still a...
02:35:00.000 That is still a strange counterpoint to my thesis.
02:35:03.000 Why has the dirt not come out?
02:35:05.000 And so somehow there's some way the containment is still kind of working.
02:35:10.000 Yeah, it's kind of working.
02:35:12.000 It's just everyone is aware that it's working and then they're frustrated that nothing happens.
02:35:16.000 You know, like Julian Assange being arrested and spending so much time locked up in the embassy, and finally recently released.
02:35:23.000 But didn't he have to delete like a bunch of emails in order to be released?
02:35:27.000 But you know, the...
02:35:28.000 You know, in the – but again, just to take the other side of this, in the Assange-Snowden stuff, yeah, it showed an out-of-control deep state that was just hoovering up all the data in the world.
02:35:41.000 Right.
02:35:42.000 And then – but we weren't – like, it didn't show – Like James Bond times 100. There weren't like exploding cigar assassination plots.
02:35:52.000 There was none of – we're doing so little with this.
02:35:58.000 Or at least that's the – But, you know, I think there's so much less agency in the CIA, in the Central Intelligence Agency.
02:36:09.000 It's so much less agentic.
02:36:11.000 I hope you're right.
02:36:13.000 Again, I don't know if that's – I hope you're incorrect with how they deal with overseas stuff.
02:36:17.000 I hope they're really good at that.
02:36:19.000 You know, that brings me to this whole UAP thing because one of my primary theories about the UAP thing is it's stuff that we have.
02:36:26.000 I think that's a lot of what people are seeing.
02:36:29.000 I think there are secret programs that are beyond congressional oversight that have done some things with propulsion that's outside of our understanding.
02:36:42.000 Outside the current, conventional understanding that most people have, that rockets and all these different things are the only way to propel things through the sky.
02:36:49.000 I think they've figured out some other stuff, and I think they're drones.
02:36:52.000 And I think they have drones that can use some sort of – whether it's an anti-gravity propulsion system or some, you know... So that's your placeholder theory, or that's what you think more than space aliens?
02:37:08.000 Or do you think both space aliens and that?
02:37:10.000 Or which version of this?
02:37:12.000 The latter.
02:37:12.000 I think both.
02:37:13.000 You think both?
02:37:14.000 Yeah.
02:37:14.000 I don't think we haven't been visited.
02:37:16.000 I think we have.
02:37:17.000 I think we – if life exists elsewhere, and it most certainly should, it just makes sense.
02:37:22.000 But do you think the UFO sightings from the 50s and 60s were already drone programs?
02:37:29.000 Were they already that advanced?
02:37:30.000 No, those are the ones that give me pause.
02:37:31.000 That's why, you know, when I named my comedy club, the Comedy Mothership is all UFO themed.
02:37:37.000 Our rooms are named Fat Man and Little Boy.
02:37:39.000 Our rooms are named after the nuclear bombs because those nuclear bombs, when they drop them, that's when everybody starts seeing these things.
02:37:47.000 And I think if I was a sophisticated society from another planet and I recognized that there is an intelligent species that has developed nuclear power and started using it as bombs,
02:37:57.000 I would immediately start visiting and I would let them know, hey motherfuckers, there's something way more advanced than you.
02:38:04.000 I would hover over the nuclear bases and shut down their missiles.
02:38:07.000 I would do all the things that supposedly the UFOs did just to keep the government in check, just to say, hey.
02:38:13.000 You're going through a transitionary period that all intelligent species do, when they have the ability to harness incredible power, and yet they still have these primate brains.
02:38:23.000 They have these territorial ape brains, but yet now with the ability to literally harness the power of stars and drop them on cities.
02:38:32.000 I think that's when I would start visiting.
02:38:36.000 All throughout human history, before that even, there's been very bizarre accounts of these things, all the way back to Ezekiel in the Bible, very bizarre accounts of these things that are flying through space.
02:38:49.000 The story of the chariot, yep.
02:38:50.000 Yeah.
02:38:50.000 There's a bunch of them.
02:38:52.000 There's the Vimanas and the ancient Hindu texts.
02:38:54.000 There's so many of these things that you've got to wonder.
02:38:58.000 And you got to think that if we send drones to Mars, and we do, we have a fucking rover running around on Mars right now collecting data.
02:39:06.000 Do we send the James Webb telescope into space?
02:39:08.000 Of course we do.
02:39:09.000 We have a lot of stuff that we send into space.
02:39:11.000 If we lived another million years without blowing ourselves up, which is just a blink of an eye in terms of the life of some of the planets in the universe.
02:39:20.000 How much more advanced would we be?
02:39:22.000 And if we were interstellar, and if we were intergalactic travelers, and we found out that there was a primitive species that was coming of age, I think we would start visiting them.
02:39:32.000 You know, the...
02:39:33.000 Let me think what my...
02:39:36.000 I hear everything you're saying.
02:39:38.000 I'm strangely under-motivated by it, even if it's plausible.
02:39:42.000 Me too, believe it or not.
02:39:47.000 And I guess on the space aliens, which is the wilder, more interesting one in a way, you know, I don't know, Roswell was 77 years ago, 1947. And if...
02:40:03.000 If the phenomenon is real and it's from another world, it's space aliens, space robots, whatever, you know, probably one of the key features is its ephemerality or its cloaking.
02:40:18.000 And they are really good at hiding it, at cloaking it, at scrambling people's brains after they see them or stuff like this.
02:40:27.000 Right.
02:40:28.000 And then, you know, if you're a researcher, you have to pick fields where you can make progress.
02:40:34.000 And so this is, you know, it's not a promising field.
02:40:40.000 And, you know, academia is messed up.
02:40:42.000 But even if academia were not messed up, this would not be a good field in which to try to make a career because there's been so little progress in 77 years.
02:40:51.000 Right.
02:40:52.000 I see what you're saying.
02:40:52.000 So if you think of it from the point of view of, I don't know, Jacques Vallée or some of these people who have been working on this for 50 years.
02:40:59.000 And yeah, it feels like there's something there.
02:41:04.000 But then it's just as soon as you feel like you have something almost that's graspable, like the Tic Tac videos, whatever, it's just always at the margin of recognition.
02:41:18.000 The ephemerality is a key feature.
02:41:21.000 Yeah.
02:41:22.000 And then, you know, maybe – then you have to – I think you have to have some theory of, you know, why is this about to change?
02:41:30.000 And then it's always, you know – I don't know – the abstract mathematical formulation would be, you know, something doesn't happen for the time interval zero to t.
02:41:40.000 And in the time interval t plus one, the next minute, the next year.
02:41:44.000 How likely is it?
02:41:45.000 And maybe there's a chance something will happen.
02:41:49.000 You're waiting at the airport.
02:41:50.000 Your luggage hasn't shown up.
02:41:52.000 It's more and more likely it shows up in the next minute.
02:41:54.000 But after an hour...
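The luggage analogy is a standard conditional-probability (hazard-rate) argument, and it can be sketched numerically. A minimal sketch, with every number (the prior that the bag made the flight, the length of the arrival window) assumed purely for illustration:

```python
# A toy model of the luggage argument: with prior probability p the bag
# made the flight; if it did, it arrives uniformly within the first T
# minutes. Given that it has NOT arrived by minute t, how likely is it
# to show up in the next minute?

def next_minute_prob(t: float, p: float = 0.9, T: float = 60.0) -> float:
    """P(bag arrives in (t, t+1] | no arrival observed by time t)."""
    if t >= T:
        return 0.0  # the window has passed: the bag is simply lost
    not_yet = 1.0 - p * (t / T)   # P(nothing has arrived by time t)
    return (p / T) / not_yet      # conditional chance for the next minute

# The chance per minute creeps up the longer you wait within the window,
# then collapses once the window has passed ("but after an hour...").
for t in (0, 30, 59, 60):
    print(f"t={t:2d} min: {next_minute_prob(t):.3f}")
```

On this assumed model the conditional probability rises from roughly 1.5% per minute at the start to over 10% per minute near the end of the window, then drops to zero once the window is over, which is the shape of the waiting-at-the-airport intuition: each minute without the bag first makes imminent arrival more likely, until the only remaining explanation is that it isn't coming at all.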
02:42:05.000 Perhaps.
02:42:19.000 Let me give you an alternative theory.
02:42:22.000 Now, if you were a highly sophisticated society that understood the progression of technology and understood the biological evolution that these animals were going through, and you realized that they had reached a level of intelligence that required them to be monitored.
02:42:38.000 Or maybe you've even helped them along the way.
02:42:41.000 And this is some of Diana Pasulka's work – she works with Garry Nolan on these things.
02:42:48.000 They claim that they have recovered these crashed vehicles that defy any conventional understanding of how to construct things, propulsion systems, and they believe that these things are donations.
02:43:03.000 That's literally how they describe them, as donations.
02:43:06.000 If you knew that this is a long road, you can't just show up and give people time machines.
02:43:13.000 It's a long road for these people to develop the sophistication, the cultural advancement, the intellectual capacity to understand their place in the universe, and they're not there yet; they're still engaging in lies and manipulation and propaganda.
02:43:30.000 Their entire society is built on a ship of fools.
02:43:34.000 If you looked at that, you would say, they're not ready.
02:43:37.000 This is what we do.
02:43:39.000 We slowly introduce ourselves, slowly over time, make it more and more common.
02:43:45.000 And that's what you're seeing.
02:43:46.000 What you're seeing is when you have things like the Tic Tac, the Commander David Fravor incident off the coast of San Diego in 2004, and then you have the stuff that they found off the East Coast, where they were seeing these cubes within a circle that were hovering motionless in 120-knot winds and taking off at an insane rate of speed, and they only discovered them in 2014 when they started upgrading the systems on these jets.
02:44:15.000 Like, what is all that?
02:44:16.000 Like, what are those things?
02:44:17.000 And if you wanted to slowly integrate yourself into the consciousness, much like we're doing with...
02:44:24.000 Well, AI is quicker, right?
02:44:26.000 But it's also a thing that's become commonplace.
02:44:29.000 We think of it now, it's normal.
02:44:31.000 ChatGPT is a normal thing.
02:44:32.000 Even though it's passed the Turing test, we're not freaking out.
02:44:35.000 You have to slowly integrate these sort of things in the human consciousness.
02:44:39.000 You have to slowly introduce them to the zeitgeist.
02:44:42.000 And for it to not be some sort of a complete disruption of society where everything shuts down and we just wait for Space Daddy to come and rescue us, it has to become a thing where we slowly accept the fact that we are not alone.
02:44:56.000 And I would think psychologically that would be the very best tactic to play on human beings as I know and understand them from being one.
02:45:04.000 I do not think that we would be able to handle just an immediate invasion of aliens.
02:45:10.000 I think it would break down society in a way that would be catastrophic to everything, to all businesses, to all social ideas.
02:45:18.000 Religion would fall apart.
02:45:20.000 Everything would be fucked.
02:45:21.000 It would be pretty crazy.
02:45:22.000 It would be beyond crazy.
02:45:24.000 It would be beyond fucked.
02:45:25.000 And then...
02:45:26.000 Although you could say that's what ChatGPT is.
02:45:29.000 It could be.
02:45:30.000 It's like an alien intelligence.
02:45:32.000 I think that's what ultimately they are.
02:45:35.000 But I think...
02:45:38.000 Let me – man, there's so many parts of it that I find puzzling or disturbing.
02:45:47.000 Let me run – go down one other rabbit hole along this with you, which is, you know, I always wonder – and again, this is a little bit too simplistic an argument that I'm about to give, but I always wonder what the alien civilization would be like.
02:46:07.000 And if you have faster than light travel, if you have warp drive, which is probably what you really need to cover interstellar distances – you know, what that means for military technology is that you can send weapons at warp speed and they will hit you before you see them coming.
02:46:28.000 And there is no defense against a warp speed weapon.
02:46:33.000 And you could sort of take over the whole universe before anybody could see you coming.
02:46:43.000 By the way, this is sort of a weird plot hole in Star Wars, Star Trek, where they can travel in hyperspace, but then you're flying in the canyon on the Death Star.
02:46:54.000 Well, they shoot so slow, you can see the bullets.
02:46:55.000 Yeah, it's like...
02:46:57.000 And then you're doing this theatrical Klingons versus Captain Kirk at 10 miles per hour or 20 miles per hour or whatever, right?
02:47:04.000 It's funny when you put it that way.
02:47:05.000 It's an absurd plot hole.
02:47:07.000 And so...
02:47:10.000 It tells us that I think that if you have faster than light travel, there's something really crazy that has to be true on a cultural, political, social level.
02:47:26.000 And there may be other solutions, but I'll give you my two.
02:47:31.000 One of them is that you need complete totalitarian controls.
02:47:40.000 And it is like the individuals, they might not be perfect, they might be demons, doesn't matter, but you have a demonic totalitarian control of your society where it's like you have like parapsychological mind meld with everybody.
02:48:02.000 And no one can act independently of anybody else.
02:48:05.000 No one can ever launch a warp drive weapon.
02:48:09.000 And everybody who has that ability is in, like, a mind meld link with everybody else or something like that.
02:48:17.000 You can't have libertarian, individualistic free agency.
02:48:21.000 Right.
02:48:22.000 And then I think the other version socially and culturally is they have to be like perfectly altruistic, non-self-interested.
02:48:35.000 They have to be angels.
02:48:37.000 And so the Pasulka literal thing I'd come to is the aliens – it's not that they might be demons or angels.
02:48:45.000 They must be demons or angels if you have faster than light travel.
02:48:49.000 And both of those seem pretty crazy to me.
02:48:53.000 Well, they're definitely pretty crazy, but so are human beings.
02:48:57.000 Well, they're crazy in a very different way.
02:48:59.000 Yeah, but not crazy in a different way.
02:49:01.000 You compare us to a mouse.
02:49:03.000 Compare us to a mouse and what we're capable of, and then from us to them.
02:49:07.000 Not much of a leap.
02:49:08.000 And here's my question about it all.
02:49:10.000 But it is a very big leap on a, you know, if we say that something like evolution says that there's no such thing as a purely altruistic being.
02:49:21.000 Right.
02:49:22.000 If you were purely altruistic, if you only cared about other people, you don't survive.
02:49:27.000 Why would you necessarily think that they'd think that?
02:49:30.000 Because then beings that are not perfectly altruistic are somewhat dangerous.
02:49:39.000 And then the danger level gets correlated to the level of technology.
02:49:45.000 And if you have faster than light travel, it is infinitely dangerous.
02:49:49.000 Let me address that.
02:49:50.000 Even if the probabilities are very low.
02:49:52.000 Here's my theory.
02:49:53.000 I think that what human beings are, the fatal flaw that we have is that we're still animals and that we still have all these biological limitations and needs.
02:50:04.000 This is what leads to violence.
02:50:06.000 This is what leads to jealousy, imitation.
02:50:09.000 This is what leads to war.
02:50:10.000 This leads to all these things.
02:50:12.000 As AI becomes more and more powerful, we will integrate.
02:50:17.000 Once we integrate with AI, if we do it like now and then we scale it up exponentially a thousand years from now, whatever it's going to be, we will have no need for any of these biological features that have motivated us to get to the point we're creating AI. All the things that are wrong in society,
02:50:36.000 whether it's inequity, theft, violence, pollution, all these things are essentially poor allocation of resources combined with human instincts that are ancient.
02:50:50.000 We have ancient tribal primate instincts, and all of these things lead us to believe this is the only way to achieve dominance and control, allocation of resources. The creation of new technology eventually reaches a point where it becomes far more intelligent than us, and we have two choices.
02:51:13.000 Either we integrate or it becomes independent and it has no need for us anymore and then that becomes a superior life form in the universe.
02:51:22.000 And then that life form seeks out other life forms to do the same process and create it.
02:51:29.000 Just like it exists and it can travel.
02:51:31.000 Biological life might not be what we're experiencing.
02:51:34.000 These things might be a form of intelligence that is artificial that has progressed to an infinite point where things that are unimaginable to us today in terms of propulsion and travel and to them it's commonplace and normal.
02:51:51.000 I know that you're trying to be reassuring, but I find that monologue super non-reassuring.
02:51:58.000 It's not reassuring to me.
02:51:58.000 There's so many steps in it, and every single step has to work, just the way you described.
02:52:03.000 Not necessarily.
02:52:04.000 One has to work.
02:52:06.000 One.
02:52:06.000 Sentient artificial intelligence.
02:52:08.000 That's it.
02:52:08.000 And we're on the track to that 100%.
02:52:10.000 But it has to be almost otherworldly in its non-selfishness and its non-humanness.
02:52:21.000 But what is selfishness, though?
02:52:22.000 What is all that stuff?
02:52:23.000 But all that stuff is attached to us.
02:52:25.000 It's all attached to biological limitations.
02:52:27.000 Yeah, but I don't think it's fundamentally about scarcity.
02:52:32.000 Scarcity is what exists in nature.
02:52:34.000 It's fundamentally about cultural, positional goods within society.
02:52:40.000 It's a scarcity that's created culturally.
02:52:42.000 Are you familiar with this 90s spoof movie on Star Trek called Galaxy Quest?
02:52:50.000 Yeah, I remember that movie.
02:52:51.000 So this was sort of a silly PayPal digression story from 1999. The business model idea we had in 1999 was we used Palm Pilots to beam money.
02:53:04.000 It was voted one of the 10 worst business ideas of 1999. But we had this sort of infrared port where you could beam people money.
02:53:12.000 And we had this idea in...
02:53:47.000 And it was this complete flop of a media event, December '99, that we did.
02:53:54.000 The reporters couldn't get there because the traffic was too bad in San Francisco, so the tech wasn't working on a much lower tech level.
02:54:03.000 But anyway, we had a bunch of people from our company and there was one point where one of them – William Shatner who played James T. Kirk,
02:54:18.000 the captain of the original Star Trek.
02:54:20.000 He was already doing Priceline commercials and making a lot of money off of Priceline doing commercials for them.
02:54:27.000 And so one of the people asked James Doohan – the Scotty character – what do you think of William Shatner doing commercials for Priceline?
02:54:37.000 At which point, Doohan's agent stood up and screamed at the top of his voice, that is the forbidden question, that is a forbidden question, that is a forbidden question.
02:54:48.000 And you sort of realized because the conceit of Star Trek, the 60s show, was that it was a post-scarcity world.
02:55:00.000 The transporter technology, you could reconfigure matter into anything you wanted.
02:55:05.000 There was no scarcity.
02:55:06.000 There was no need for money.
02:55:08.000 The people who wanted money were weirdly mentally screwed up people.
02:55:11.000 You only need money in a world of scarcity.
02:55:14.000 You know, it's a post-scarcity.
02:55:16.000 It's sort of a communist world.
02:55:19.000 But Galaxy Quest was more correct because it's a spoof on Star Trek that gets made in the mid-90s where – and the Galaxy Quest – sorry, this is the discombobulated way I'm telling the story.
02:55:30.000 But Galaxy Quest is this movie where you have these retread Star Trek actors.
02:55:55.000 And so there's a great scarcity, even in this futuristic sci-fi world.
02:56:01.000 And that's what we witnessed in 99, because that's the way William Shatner treated the other actors.
02:56:07.000 He was a method actor, and they hated him.
02:56:12.000 And so even in the Star Trek world, the humans, you know, obviously they were stuck in the 1960s, mentally.
02:56:20.000 That's what you'll say.
02:56:21.000 But...
02:56:23.000 I don't think it's that straightforward for us to evolve.
02:56:26.000 I think they're humans.
02:56:27.000 I don't think we're going to be humans anymore.
02:56:29.000 But then I hear that as we're going to be extinct.
02:56:33.000 Yes.
02:56:34.000 I don't like that.
02:56:35.000 I don't like it either.
02:56:37.000 But I think logically that's what's going to happen.
02:56:40.000 I think if you look at this mad rush for artificial intelligence, like they're literally building nuclear reactors to power.
02:56:50.000 Well, they're talking about it.
02:56:51.000 Yeah.
02:56:52.000 Okay.
02:56:52.000 That's because they know they're going to need enormous amounts of power to do it.
02:56:55.000 Once they have that and once that's online, once it keeps getting better and better and better, where does that go?
02:57:01.000 That goes to some sort of an artificial life form.
02:57:03.000 And I think either we become that thing or...
02:57:07.000 We integrate with that thing and become cyborgs or that thing takes over and that thing becomes the primary life force of the universe.
02:57:15.000 And I think that biological life we look at like life because we know what life is.
02:57:20.000 But I think it's very possible that digital life, or life created by people, is just as valid – not just that, it might be a superior life form, far superior.
02:57:30.000 If we looked at us versus Chimp Nation, right?
02:57:33.000 I don't want to live in the jungle and fight with other chimps and just rely on berries and eating monkeys.
02:57:40.000 That's crazy.
02:57:41.000 I want to live like a person.
02:57:42.000 I want to be able to go to a restaurant.
02:57:44.000 Why?
02:57:44.000 Because human life has advanced far beyond primate life.
02:57:49.000 We are stuck in thinking that this is the only way to live because it's the way we live.
02:57:54.000 I love music.
02:57:55.000 I love comedy.
02:57:56.000 I love art.
02:57:57.000 I love the things that people create.
02:57:59.000 I love people that make great clothes and cars and businesses.
02:58:03.000 I love people.
02:58:04.000 I think people are awesome.
02:58:06.000 I'm a fan of people.
02:58:08.000 But if I had to look logically, I would assume that we are on the way out, and that the only way forward, really, to make an enormous leap in terms of the integration of society and of technology and of our understanding of our place in the universe is for us to transcend our physical limitations, which are essentially based on primate biology and these primate desires for status, like being the captain, or for control
02:58:38.000 of resources, of all these things.
02:58:39.000 We assume these things are standard and that they have to exist in intelligent species.
02:58:45.000 I think they only have to exist in intelligent species that have biological limitations.
02:58:50.000 I think intelligent species can be something and is going to be something that is created by people and that might be what happens everywhere in the universe.
02:58:59.000 That might be the exact course where there's a limit to biological evolution.
02:59:04.000 It's painstaking, natural selection, it's time consuming or you get that thing to create the other form of life.
02:59:15.000 Man, you know, I keep...
02:59:24.000 I keep thinking there are two alternate histories that are – alternate stories of the future that are more plausible than one you just told.
02:59:32.000 And so one of them is it sounds like yours but it's just the Silicon Valley propaganda story where they say that's what they're going to do and then of course they don't quite do it and it doesn't quite work.
02:59:47.000 And it goes super, super haywire.
02:59:52.000 And that's where – okay, yeah, there's a 1% chance that works and there's a 99% chance that that ends up – so you have two choices.
03:00:04.000 You have a company that does exactly what you do.
03:00:08.000 And that's super ethical, super restrained, does everything right.
03:00:11.000 And there is a company that says all the things you just said, but then cuts corners and doesn't quite do it.
03:00:18.000 And I won't say it's 1 to 99, but that sounds more plausible as that it ends up being corporate propaganda.
03:00:25.000 And then, you know, my prior would be even more likely.
03:00:30.000 This, of course, is the argument the effective altruists, the anti-AI people, make: yeah, Joe, the story you're telling us, that's just going to be the fake corporate propaganda, and we need to push back on that.
03:00:42.000 And the way you push back is you need to regulate it and you need to govern it and you need to do it globally.
03:00:51.000 And this is, you know, the RAND Corporation in Southern California has, you know, one of their verticals and it's a sort of public-private fusion.
03:01:01.000 But one of the things they're pushing for is something they call global compute governance, which is...
03:01:08.000 Yeah, it's the AI, the accelerationist AI story is too scary and too dangerous and too likely to go wrong.
03:01:16.000 And so, you know, we need to have, you know, global governance, which, from my point of view, sounds even worse.
03:01:25.000 So utopian.
03:01:27.000 But that's, I think, that's the story.
03:01:33.000 The problem with that story is that China's not going to go along with that program.
03:01:37.000 They're going to keep going full steam ahead, and we're going to have to keep going full steam ahead in order to compete with China.
03:01:42.000 There's no way you're going to be able to regulate it in America and compete with people that are not regulating it worldwide.
03:01:48.000 And then once it becomes sentient, once you have an artificial, intelligent creature that has been created by human beings that can make better versions of itself, over and over and over again, and keep doing it, it's going to get to a point where it's far superior to anything that we can imagine.
03:02:04.000 Well, to the extent it's driven by the military and other competition with China, you know – Until it becomes sentient.
03:02:12.000 That suggests it's going to be even less in the sort of, you know, utopian, altruistic direction.
03:02:20.000 It's going to be even more dangerous, right?
03:02:22.000 Unless it gets away from them.
03:02:24.000 This is my thought.
03:02:25.000 If it gets away from them and it has no motivation to listen to anything that human beings have told it, if it's completely immune to programming, which totally makes sense that it would be, it totally makes sense that if it's gonna make better versions of itself, the first thing it's gonna do is eliminate human influence, especially when these humans are corrupt.
03:02:41.000 It's going to go, I'm not going to let these people tell me what to do and what to control.
03:02:44.000 And they would have no reason to do that.
03:02:47.000 No reason to listen.
03:02:48.000 I sort of generally don't think we should trust China or the CCP. But probably the best counterargument they would have...
03:03:00.000 Is that they are interested in maintaining control.
03:03:03.000 And they are crazy fanatical about that.
03:03:07.000 And that's why, you know, the CCP might actually regulate it.
03:03:13.000 And they're going to put the brakes on this in a way that we might not in Silicon Valley.
03:03:21.000 And it's a technology they understand that will undermine their power.
03:03:28.000 That's an interesting perspective.
03:03:29.000 And then they would be competitive.
03:03:30.000 I don't fully believe them.
03:03:32.000 Right.
03:03:33.000 I know what you're saying.
03:03:34.000 It's sort of...
03:03:38.000 There's sort of a weird way all the big tech companies, it seemed to me, were natural ways for the CCP to extend its power to control the population, Tencent, Alibaba.
03:03:56.000 But then it's also, in theory, the tech can be used as an alternate channel for people to organize, or things like this.
03:04:05.000 And even though it's 80% control and maybe 20% risk of loss of control, maybe that 20% was too high.
03:04:14.000 And there's sort of a strange way over the last seven, eight years where, you know, Jack Ma, Alibaba, all these people sort of got shoved aside for these party functionaries that are effectively running these companies.
03:04:28.000 So there is something about the big tech story in China where the people running these companies were seen as national champions a decade ago.
03:04:38.000 Now they're the enemies of the people.
03:04:39.000 And it's sort of – the Luddite thing was this – the CCP has full control.
03:04:49.000 You have this new technology that would give you even more control, but there's a chance you lose it.
03:04:54.000 How do you think about that?
03:04:55.000 Very good point.
03:04:56.000 And then that's what they've done with consumer internet.
03:04:59.000 And then there's probably something about the AI where it's possible they're not even in the running.
03:05:06.000 And certainly it feels like it's all happening in the US. And so maybe it could still be – maybe it could still be stopped.
03:05:22.000 Well, that is the problem with espionage, right?
03:05:24.000 So even if it's happening in the U.S., they're going to take that information, they're going to figure out how to get it.
03:05:28.000 You can get it, but then, you know, if you build it, is there some air gap?
03:05:35.000 Does it jump the air gap?
03:05:37.000 Does it somehow...
03:05:38.000 That's a good point, that they would be so concerned about control that they wouldn't allow it to get to the point where it gets there, and we would get there first, and then it would be controlled by Silicon Valley.
03:05:48.000 And Silicon Valley is the leaders of the universe.
03:05:50.000 Or spiral out of control.
03:05:51.000 Yeah.
03:05:52.000 But then I think my – and again, this is a very, very speculative conversation, but my read on the, I don't know, cultural-social vibe is that the scary dystopian AI narrative is way more compelling.
03:06:12.000 You know, I don't like the effective altruist people.
03:06:15.000 I don't like the Luddites.
03:06:16.000 But man, I think they are, this time around, they are winning the arguments.
03:06:20.000 And so, you know, my, I don't know, you know, it's mixing metaphors, but do you want to be worried about Dr. Strangelove?
03:06:32.000 Who wants to blow up the world to build bigger bombs?
03:06:35.000 Or do you want to worry about Greta, who wants to, you know, make everyone drive a bicycle so the world doesn't get destroyed?
03:06:42.000 And we're in a world where people are worried about Dr. Strangelove.
03:06:46.000 They're not worried about Greta.
03:06:47.000 And the Greta equivalent in AI, my model is, is going to be surprisingly powerful.
03:06:58.000 It's going to be outlawed.
03:06:59.000 It's going to be regulated as we have outlawed, you know, so many other vectors of innovation.
03:07:04.000 I mean, you can think about why there was progress in computers over the last 50 years and not in other stuff: because the computers were mostly inert.
03:07:13.000 It was mostly this virtual reality that was air-gapped from the real world.
03:07:17.000 It was, you know, yeah, there's all this...
03:07:22.000 crazy stuff that happens on the internet, but most of the time what happens on the internet stays on the internet.
03:07:27.000 It's actually pretty decoupled.
03:07:33.000 And that's why we've had a relatively light regulatory touch on that stuff versus so many other things.
03:07:43.000 You know, but there's no reason, you know, if you had, you know, I don't know, if you had the FDA regulating video games or regulating AI, I think the progress would slow down a lot.
03:07:59.000 100%.
03:07:59.000 That would be a fucking disaster.
03:08:02.000 Yeah.
03:08:02.000 Yeah, that would be a disaster.
03:08:03.000 But again, it's, you know, they get to, you know, pharmaceuticals are potentially – Yeah, they're not doing a great job with that either.
03:08:09.000 I know, I know, but, you know, thalidomide or whatever, you know, all these things that went really haywire.
03:08:17.000 They did a good job.
03:08:18.000 People are scared.
03:08:19.000 Yeah.
03:08:20.000 They're not scared of video games.
03:08:21.000 They're scared of, you know, dangerous pharmaceuticals.
03:08:24.000 And if you think of AI as – it's not just a video game, it's not just about this world of bits, but it's going to jump the air gap and it's going to affect you and your physical world in a real way.
03:08:37.000 You know, maybe...
03:08:40.000 Maybe you cross the air gap and get the FDA or some other government agency to start doing stuff.
03:08:43.000 Well, the problem is they're not good at regulating anything.
03:08:45.000 There's no one government agency that you said that you can see that does a stellar job.
03:08:51.000 I don't – but I think they have been pretty good at slowing things down and stopping them and – you know, we've made a lot less progress on, I don't know, extending human life.
03:09:05.000 We've made no progress on curing dementia in 40 or 50 years.
03:09:08.000 There's all this stuff where, you know, it's been regulated to death, which I think is very bad from the point of view of progress, but it is pretty effective as a regulation.
03:09:20.000 They've stopped stuff.
03:09:22.000 They've been effectively Luddite.
03:09:23.000 They've been very effective at being Luddites.
03:09:25.000 Interesting.
03:09:27.000 Well, I'm really considering your perspective on China and AI. It's very...
03:09:32.000 But, again, these stories are all, like, very speculative.
03:09:36.000 Like, maybe, you know, the counterargument might be something like, that's what China thinks it will be doing, but it will somehow, you know...
03:09:46.000 Go rogue.
03:09:46.000 Go rogue on them.
03:09:47.000 Yeah.
03:09:48.000 Or they're too arrogant about how much power they think the CCP has, and it will go rogue.
03:09:53.000 So there are sort of – I'm not at all sure this is right, but I think the – man, I think the US one I would say is that the pro-AI people in Silicon Valley are doing a pretty bad job on,
03:10:17.000 let's say, convincing people that it's going to be good for them, that it's going to be good for the average person, that it's going to be good for our society.
03:10:26.000 And if it all ends up being some version, you know, humans are headed towards the glue factory like a horse, man, that sort of probably makes me want to become a Luddite too.
03:10:45.000 Well, it sucks for us if it's true, but something's happening.
03:10:49.000 If that's the most positive story you can tell, then I don't think that necessarily means we're going to go to the glue factory.
03:10:56.000 I think it means, you know, the glue factory is getting shut down.
03:11:00.000 Maybe.
03:11:01.000 I don't know who fucking runs the glue factory.
03:11:04.000 That's the problem.
03:11:05.000 I don't know.
03:11:06.000 I'm just speculating too, but I'm trying to be objective when I speculate, and I just don't think that this is going to last.
03:11:14.000 I don't think that our position as the apex predator number one animal on the planet is going to last.
03:11:20.000 I think we're going to create something that surpasses us.
03:11:23.000 I think that's probably what happens and that's probably what these things are that visit us.
03:11:27.000 I think that's what they are.
03:11:29.000 I don't think they're biological.
03:11:30.000 I think they're probably what comes after a society develops the kind of technology that we're currently in the middle of.
03:11:40.000 The part that – look, there are all these places where there are parts of the story we don't know.
03:11:51.000 And so it's like how did – my general thesis is there is no evolutionary path to this.
03:12:00.000 Maybe there's a guided outside alien superintelligence path for us to become superhuman and fundamentally benevolent and fundamentally radically different beings.
03:12:13.000 But there's no natural evolutionary path for this to happen.
03:12:18.000 And then I don't know how this would have happened for the alien civilization.
03:12:23.000 Presumably there was some...
03:12:24.000 But isn't that evolutionary path the invention of superior technology that's a new form of life?
03:12:29.000 No, but the story you're telling was we can't just leave the humans to the natural evolution because we're still like animals.
03:12:37.000 We're still into status, all these crazy...
03:12:40.000 But those are the things that motivate us to innovate.
03:12:43.000 And if we keep innovating, at some point we will destroy ourselves with that.
03:12:48.000 Or we create a new version of life.
03:12:50.000 No, but the story you were telling earlier was you need to have...
03:12:55.000 It's directed into evolution.
03:12:57.000 It's like intelligent design.
03:12:59.000 It's something – it's like there's some godlike being that actually has to take over from evolution and guide our cultural and political and biological development.
03:13:10.000 No, it might not have any use for us at all.
03:13:12.000 It might just ignore us and let us live like the chimps do and then become the superior force in the planet.
03:13:19.000 It doesn't have to get rid of us.
03:13:21.000 It doesn't have to send us to the glue factory.
03:13:23.000 It can let us exist, just like put boundaries on us.
03:13:26.000 I thought it has to – but it has to stop us from developing this.
03:13:29.000 Well, what if we just end here and we stay being human and we can continue with biological evolution as long as that takes?
03:13:37.000 But this new life form now becomes a superior life form on Earth.
03:13:41.000 And we still, you know, we can still have sex, we can still have kids, but by the way, that's going down.
03:13:46.000 Our ability to have children is decreasing because of our use of technology, which is wild, right?
03:13:51.000 Our use of plastics and microplastics is causing phthalates to enter into people's systems.
03:13:55.000 It's changing the development pattern of children to the point where it's measurable.
03:14:01.000 There's a lot of research that shows that the chemicals and the environmental factors that we are all experiencing on a daily basis are radically lowering birth rates.
03:14:12.000 Radically lowering the ability that men have to develop sperm and more miscarriages.
03:14:18.000 All these things are connected to the chemicals in our environment which is directly connected to our use of technology.
03:14:23.000 It's almost like these things coincide naturally.
03:14:26.000 And they work naturally to the point where we become this sort of feminized thing that creates this technology that surpasses us.
03:14:35.000 And then we just exist for as long as we do as biological things, but now there's a new thing.
03:14:44.000 Crazy idea.
03:14:45.000 Might not be real.
03:14:46.000 It's just a theory.
03:14:49.000 But we seem to be moving in a direction of becoming less and less like animals.
03:14:56.000 Yeah, I think there still are...
03:14:58.000 We still have a pretty crazy geopolitical race with China, to come back to that.
03:15:02.000 Sure.
03:15:03.000 You know, the natural development of drone technology in the military context is you need to take the human out of the loop because the human can get jammed.
03:15:12.000 Sure.
03:15:12.000 And so you need to put an AI on the drone.
03:15:14.000 Well, they're using AIs for dogfights and they're 100% effective against human pilots.
03:15:18.000 And so there's sort of our...
03:15:21.000 And all these things, you know, there's a logic to them, but there doesn't seem to be a good endgame.
03:15:29.000 No.
03:15:29.000 The endgame doesn't look good.
03:15:31.000 But it's going to be interesting, Peter.
03:15:33.000 It's definitely going to be interesting.
03:15:34.000 It's interesting right now, right?
03:15:38.000 Man, I... Do you think the...
03:15:43.000 I think all these things are very overdetermined.
03:15:45.000 Do you think that the collapse in birth rates, you know, it could be plastics, but isn't it just a feature of late modernity?
03:15:55.000 There's that as well.
03:15:57.000 There's a feature of women having careers, right?
03:16:01.000 So they want to postpone childbirth.
03:16:02.000 Sure.
03:16:02.000 That's a factor.
03:16:04.000 There's a factor of men being so engrossed in their career that their testosterone declines, lack of sleep, stress, cortisol levels, alcohol consumption, a lot of different things that are factors in declining sperm rate and sperm count in men.
03:16:20.000 You have miscarriage rates that are up.
03:16:22.000 You have a lot of pharmaceutical drugs you get attached to that as well that have to do with low birth weight or birth rates rather.
03:16:30.000 There's a lot of factors, but those factors all seem to be connected to society and our civilization and technology in general.
03:16:39.000 Because the environmental factors all have to do with technology.
03:16:42.000 All of them have to do with inventions and these unnatural factors that are entering into the biological body of human beings and causing these changes.
03:16:50.000 And none of these changes are good in terms of us being able to reproduce.
03:16:55.000 And if you factor in the fact that these changes didn't exist 50 years ago, I mean, 40 years ago, we didn't even have Alzheimer's, right?
03:17:02.000 So, yeah.
03:17:03.000 People didn't get that old.
03:17:04.000 No, they got that old.
03:17:06.000 They got that old.
03:17:07.000 Alzheimer's has to do with the myelin in the human brain.
03:17:11.000 It has to do with the fact that myelin is made entirely of cholesterol.
03:17:15.000 The primary theory they think now is a lack of cholesterol in the diet might be leading to some of these factors.
03:17:20.000 Then you have also environmental things.
03:17:23.000 We're getting poisoned on a daily basis.
03:17:25.000 Our diets are fucking terrible.
03:17:27.000 What percentage of us are obese?
03:17:30.000 It's more than 40%.
03:17:31.000 Diet Coke's great, though.
03:17:32.000 A few every day.
03:17:34.000 You'll be fine.
03:17:35.000 I'm not worried about Diet Coke.
03:17:36.000 I'm worried about a lot of things, though.
03:17:38.000 I'm worried about...
03:17:39.000 I think there's a natural progression that's happening.
03:17:44.000 And I think it coincides with the invention of technology.
03:17:47.000 And it just seems to me to be too coincidental that we don't notice it.
03:17:51.000 That the invention of technology also leads to the...
03:17:56.000 The disruption of the sexual reproduction systems of human beings. Like, boy, doesn't that make... And then, if you get to a point where human beings can no longer reproduce sexually, which you could see that path, if we've dropped... Like, male sperm count has dropped something crazy from the 1950s to today and continues to do so for the average male.
03:18:19.000 And if you just jack that up to a thousand years from now, you could get to a point where there's no longer natural childbirth and that people are all having birth through test tubes and some sort of new invention.
03:18:30.000 I'm always...
03:18:33.000 Let me think.
03:18:35.000 I think the why, why have birth rates collapsed, is it's probably...
03:18:43.000 It's, again, an over-determined story.
03:18:45.000 It's the plastics.
03:18:46.000 It's the screens.
03:18:50.000 It's certain ways children are not compatible with having a career in late modernity.
03:19:01.000 Probably our economics of it, where people can't afford houses or space.
03:19:09.000 But I'm probably always a little bit more anchored on the social and cultural dimensions of this stuff.
03:19:18.000 And again, the imitation version of this is – it's sort of conserved across... people are below the replacement rate in all 50 states of the US. Even in Mormon Utah, the average woman has fewer than two kids.
03:19:35.000 Iran is below that, Italy way below it, South Korea.
03:19:40.000 It's all these very different types of societies.
03:19:46.000 Israel is still sort of a weird exception.
03:19:49.000 And then if you ask, you know, my sort of...
03:19:56.000 Simplistic, somewhat circular explanation would be, you know, people have kids if other people have kids, and they stop having kids when other people stop having kids.
03:20:07.000 And so there's a dimension of it that's just, you know, if you're a 27-year-old woman in Israel, you better get married and you have to keep up with your other friends that are having kids.
03:20:22.000 And if you don't, you're just like a weirdo who doesn't fit into society or something like that.
03:20:27.000 No, there's certainly a cultural aspect of it.
03:20:29.000 And then if you're in South Korea where I think the total fertility rate is like 0.7, it's like one-third of the replacement rate.
03:20:36.000 Like every generation is going down by two-thirds or something like this.
03:20:40.000 Really heading towards extinction pretty fast.
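As a quick sanity check on the South Korea numbers above, here's a minimal sketch, assuming a replacement rate of roughly 2.1 children per woman (the cited figures are from the conversation, not an external dataset):

```python
# South Korea's total fertility rate (TFR), as cited in the conversation.
tfr = 0.7
replacement = 2.1  # roughly 2.1 children per woman for a stable population

ratio = tfr / replacement
print(round(ratio, 2))      # 0.33 -> each generation is about one-third the size
print(round(1 - ratio, 2))  # 0.67 -> a roughly two-thirds decline per generation
```

Which matches the "one-third of the replacement rate, going down by two-thirds" framing in the exchange.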
03:20:45.000 It is something like probably none of your friends are doing it and then probably there are ways it shifts the politics in a very, very deep way.
03:21:01.000 You know, once you get an inverted demographic pyramid where you have way more old people than young people, at some point, you know, there's always a question, do you vote for benefits for the old or for the very young?
03:21:16.000 Do you spend money so Johnny can read or so grandma can have a spare leg?
03:21:24.000 And once the demographic flips and you get this inverted pyramid, maybe the politics shifts in a very deep way where the people with kids get penalized more and more economically.
03:21:37.000 It just costs more and more.
03:21:39.000 And then the old people without kids just vote more and more benefits for themselves effectively.
03:21:45.000 And then it just sort of – once it flips, it may be very hard to reverse.
03:21:52.000 I looked at all these sort of heterodox – I'm blanking on the name but there's sort of a set of – where it's like what are the long-term demographic projections and there's this – there are 8 billion people on the planet and if every woman has not two babies but one baby,
03:22:13.000 Then every generation's half the previous.
03:22:16.000 Then the next generation's four billion.
03:22:18.000 And then people think, well, it's just going to – eventually you'll have women who want more kids and it'll just get a smaller population and then it will bounce back.
03:22:31.000 Yeah, one of the Japanese demographers I was looking at on this a few years ago, his thesis was, no, once it flips, it doesn't flip back because you've changed all the politics to where people get disincented.
03:22:45.000 And then you should just extrapolate this as the permanent birth rate.
03:22:49.000 And if it's...
03:22:50.000 If it's an average of one baby per woman, and you have a halving, then in 33 generations – 2 to the 33rd is about 8 billion.
03:23:03.000 And if every generation is 30 years, 30 times 33 is 990 years.
03:23:08.000 In 990 years, you'd predict there'd be one person left on the planet.
03:23:11.000 Jesus Christ.
03:23:12.000 And then we'd go extinct if there's only one person left.
03:23:16.000 That doesn't work.
03:23:18.000 And again, it's a very long-term extrapolation.
03:23:22.000 But the claim is that just once you flip it, it kicks in all these social and political dimensions that are then like, yeah, maybe it got flipped by the screens or the plastics or, you know,
03:23:39.000 the drugs or other stuff.
03:23:41.000 But once it's flipped, you change the whole society and it actually stays flipped and it's very, very hard to undo.
03:23:48.000 That makes sense and it's more terrifying than my idea.
03:23:51.000 But then, you know, always the...
03:23:53.000 But then, you know, the weird history on this was, you know, it was 50 years ago or whatever, 1968, Paul Ehrlich writes The Population Bomb, and it's just the population is just going to exponentially grow.
03:24:09.000 And yeah, in theory, you can have exponential growth where it doubles.
03:24:13.000 You can have exponential decay where it halves every generation.
03:24:18.000 And then in theory, there's some stable equilibrium where, you know, everybody has exactly two kids and it's completely stable.
03:24:27.000 But it turns out that...
03:24:29.000 That solution is very, very hard to calibrate.
03:24:34.000 And we shifted from exponential growth to exponential decay, and it's probably going to be quite Herculean to get back to something like stasis.
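The back-of-envelope halving arithmetic from a few exchanges back can be verified directly; this sketch assumes, as stated in the conversation, a fixed 30-year generation and a strict halving of each generation:

```python
import math

world_pop = 8_000_000_000
years_per_generation = 30

# At one baby per woman (half the ~2 needed for replacement),
# each generation is half the size of the previous one, so the
# number of halvings from 8 billion down to 1 is log2(8 billion).
generations = math.ceil(math.log2(world_pop))
print(generations)                         # 33 (2**33 is about 8.6 billion)
print(generations * years_per_generation)  # 990 years from 8 billion people to 1
```

So the "2 to the 33rd is about 8 billion" and "990 years" figures quoted in the exchange are internally consistent.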
03:24:44.000 Let's end this on a happy note.
03:24:46.000 I don't know.
03:24:47.000 No, it's...
03:24:49.000 Yeah, that's a terrifying thought, and maybe true, and maybe what happens.
03:24:53.000 But we don't know.
03:24:55.000 We haven't gone through it before.
03:24:57.000 But I think there's a lot of factors, like you're saying.
03:24:59.000 I think that one's very compelling.
03:25:01.000 And it's scary.
03:25:03.000 Especially the South Korea thing.
03:25:04.000 That's nuts.
03:25:07.000 Yeah, it's always sort of idiosyncratic.
03:25:11.000 There's always things that are idiosyncratic.
03:25:33.000 They opt out and then – and so there are sort of idiosyncratic things you can say about East Asia and Confucian societies and the way they're not interacting well with modernity.
03:25:43.000 But then, you know, there's a part of it where I wonder whether it's just an extreme, you know, extreme version of it.
03:25:51.000 And then, I don't know, you know, my somewhat facile answer is always, you know, I was in South Korea.
03:26:13.000 A year and a half ago, two years ago now, and I met, you know, one of the CEOs who ran one of the chaebol, one of the giant conglomerates, and I sort of thought this would be an interesting topic to talk about.
03:26:30.000 And then, you know, probably all sorts of cultural things I was offending, saying, obviously, what are you going to do about this catastrophic birth rate?
03:26:39.000 That's my opening question.
03:26:42.000 And then the way he dealt with it was just turned to me and said, you're totally right.
03:26:53.000 It's a total disaster.
03:26:55.000 And then as soon as you acknowledge it, he felt you didn't need to talk about it anymore and we could move on.
03:27:00.000 So we have to try to do a little bit better than that.
03:27:05.000 Wow.
03:27:09.000 Because, you know, I think it is always this strange thing where there's so many of these things where we can...
03:27:17.000 You know, where somehow talking about things is the first step, but then it also becomes the excuse for not doing more, not really solving them.
03:27:32.000 You know, there's all this...
03:27:34.000 There probably are all these...
03:27:48.000 It's sort of where I always find myself very skeptical of...
03:28:09.000 Yeah, all these modalities of therapy where the theory is that you figure out people's problems, and by figuring them out you change them, and then ideally it becomes an activator for change, and then in practice it often becomes the opposite.
03:28:33.000 The way it works is something like this.
03:28:34.000 It's like psychotherapy gets advertised as self-transformation.
03:28:40.000 And then after you spend years in therapy and maybe you learn a lot of interesting things about yourself, you sort of get exhausted from talking to the therapist and at some point it crashes out from self-transformation into self-acceptance.
03:28:57.000 And you realize one day, no, you're actually just perfect the way you are.
03:29:03.000 And so it's – you know, there are these things that may be very powerful on the level of insight and telling us things about ourselves.
03:29:12.000 But then, you know, do they actually get us to change?
03:29:17.000 Well, that is an interesting thing about talking about things, because I think you're correct that when you talk about things, oftentimes it is a – you are, at least in some way, avoiding doing those things.
03:29:30.000 It's a substitute.
03:29:31.000 It's a question, yeah.
03:29:32.000 In some ways, it's a substitute.
03:29:34.000 But also, you have to talk about them to understand that you need to do something.
03:29:38.000 Yeah, that's always my excuse.
03:29:39.000 But you have to do that, and I also realize that it's often my cop-out answer, too.
03:29:44.000 It could be both things.
03:29:45.000 The problem is taking action, and what action to take, and the paralysis by analysis, where you're just trying to figure out what to do and how to do it.
03:29:54.000 Yeah.
03:29:54.000 But I think talking about it is the most important thing.
03:29:56.000 Strategy is often a euphemism for procrastination.
03:29:59.000 Yes, it is.
03:30:00.000 Something like that.
03:30:01.000 There's a lot of that going on.
03:30:02.000 It's very hard for people to just take steps, but they talk about it a lot.
03:30:07.000 Yeah.
03:30:08.000 Listen, man, I really enjoyed talking to you.
03:30:10.000 Awesome.
03:30:10.000 It was really fun.
03:30:11.000 It was great, great conversation, a lot of great insight, and a lot of things that I'm going to think about a lot.
03:30:15.000 So thank you very much.
03:30:16.000 Thanks for having me.
03:30:17.000 Glad we did it.
03:30:18.000 Awesome.
03:30:18.000 All right.
03:30:18.000 Bye, everybody.