The Joe Rogan Experience - February 06, 2025


Joe Rogan Experience #2269 - Bret Weinstein


Episode Stats

Length

2 hours and 37 minutes

Words per Minute

153.9

Word Count

24,265

Sentence Count

1,761

Misogynist Sentences

7

Hate Speech Sentences

16


Summary

USAID has a lot of money, and it seems to be using it to fund a bunch of things that don't make sense in light of our constitutional structure. Joe Rogan takes a look at it, and is shocked at how much money USAID has been funneling around the world.


Transcript

00:00:01.000 Joe Rogan Podcast.
00:00:03.000 Check it out.
00:00:03.000 The Joe Rogan Experience.
00:00:06.000 Train by day.
00:00:07.000 Joe Rogan Podcast by night.
00:00:08.000 All day.
00:00:12.000 What's up?
00:00:13.000 Good to see you, my friend.
00:00:13.000 Great to see you, Joe.
00:00:15.000 Wild times.
00:00:17.000 Almost unbelievable.
00:00:18.000 Yeah.
00:00:19.000 The last time you were here, we were really worried about what was going to happen.
00:00:24.000 And now it seems like we're in a completely different timeline.
00:00:26.000 Yeah, I have to say, in addition to being just overarchingly worried about what was going to happen to the republic and to the globe, I was personally worried about what would happen to people like you and me if we lost.
00:00:42.000 Yeah, probably wouldn't be so good for business.
00:00:45.000 They probably would have cracked down.
00:00:47.000 There's that.
00:00:47.000 But I must say, on my darker days, I had concerns even beyond that.
00:00:53.000 You probably should.
00:00:54.000 Yeah.
00:00:55.000 Yeah, in light of what we now know.
00:00:57.000 This USAID thing that's going on, Mike Benz has been on that like a pit bull, and I've been following him on X, and he's going to come back on here and kind of explain everything.
00:01:08.000 He explained it the last time he was here, and I don't think I really grasped it until Elon's...
00:01:16.000 Six wizards.
00:01:17.000 They brought in some young wizards to go in there and go over the books.
00:01:21.000 And they are just finding crazy shit.
00:01:24.000 It's great.
00:01:25.000 And it's so interesting.
00:01:26.000 I was listening to a left-wing podcast today.
00:01:29.000 I like to mix it up.
00:01:30.000 You know, I listen to all kinds of different stuff.
00:01:32.000 And it was like I was listening to a different world.
00:01:34.000 Like they weren't even talking about all this corruption and all this obvious buying of influence. Instead, they were talking about aid overseas and how people are gonna starve.
00:01:46.000 It's mind-boggling and there's also I have to say I'm just...
00:01:53.000 I'm upset at the general pattern of a failure to recognize how right those of us were who hypothesized that there was a racket that had overtaken our entire governance structure.
00:02:06.000 We turn out to be...
00:02:08.000 Absolutely right about this and no one's going to mention it?
00:02:11.000 That's mind-blowing.
00:02:12.000 It's very strange that the media is ignoring it, especially the left-wing media.
00:02:15.000 It's just too big of a win for the right, and so they're just ignoring it.
00:02:19.000 And then they're just highlighting the good things that USAID did, which I'm sure it probably did, probably had to do some good things to at least justify its existence.
00:02:28.000 As a cover story?
00:02:30.000 I'm not even sure.
00:02:32.000 Maybe.
00:02:33.000 It doesn't change anything.
00:02:35.000 Obviously, this was a mechanism used to funnel money to all sorts of things that we didn't vote on that don't make sense in light of our constitutional structure.
00:02:45.000 And I'm, you know, I obviously have concerns like everybody else about where this train takes us.
00:02:54.000 But seeing that structure broken up is...
00:02:59.000 It's a huge relief.
00:03:01.000 They gave $27 million to the George Soros prosecutor fund.
00:03:06.000 So our own government is funding this left-wing lunatic who is hiring the most insane prosecutors who are letting people out of jail.
00:03:19.000 And that's exactly how this racket worked, is that the ability to tax the American public and then effectively get us to pay for being propagandized, for being surveilled, that's the game.
00:03:38.000 And I don't know what era we currently live in.
00:03:42.000 Obviously, there's a lot that's confusing about what the Trump administration is up to, but...
00:03:47.000 I don't think any reasonable person could be unhappy that we are exiting that era.
00:03:53.000 I'm going to read off some of the things that this guy Ken Akota the Great on Twitter listed.
00:03:58.000 And this is off the Jesse Watters show.
00:04:01.000 USAID, $20 million for Iraqi Sesame Street, $2 million for Moroccan pottery classes, $11 million to tell Vietnam to stop burning trash, $27 million to give gift bags to illegals.
00:04:17.000 $27 million.
00:04:20.000 $330 million to help Afghanis grow crops.
00:04:23.000 Crops.
00:04:24.000 I wonder what those crops are.
00:04:26.000 What's their biggest crop, Brett?
00:04:28.000 It's going to be the poppy seeds for bagels, I'm thinking.
00:04:31.000 Opium!
00:04:32.000 $200 million on an unused Afghani dam.
00:04:38.000 $250 million on an unused Afghani road.
00:04:42.000 This is wild.
00:04:46.000 I mean, some of this stuff is really, really crazy.
00:04:51.000 Well, yes, and USAID is, of course, riddled through whatever international madness it is that caused us to open our southern border and facilitate an invasion through the Darien Gap.
00:05:04.000 So, you know, seeing that structure laid bare is...
00:05:10.000 It almost feels like it can't be real.
00:05:13.000 It can't have been this close to the surface, and yet here we are.
00:05:17.000 They were spending...
00:05:18.000 Is this number correct?
00:05:19.000 I think the number that I read was $600 million every two months to ship in illegals?
00:05:27.000 Sounds right.
00:05:28.000 I don't know the number offhand.
00:05:31.000 What the fuck?
00:05:33.000 Well, you have to realize that basically we had a shadow apparatus.
00:05:41.000 Functioning.
00:05:41.000 And it involves all kinds of things.
00:05:44.000 It involves payoffs to people who didn't deserve them.
00:05:47.000 It involves contracting to entities that were necessary to get the work done.
00:05:53.000 So I don't think we can properly understand what these numbers mean and what they're actually being used for.
00:06:01.000 But it was a racket.
00:06:03.000 Well, we were always wondering, like, why is our debt so high?
00:06:06.000 Why is the national debt?
00:06:08.000 So high.
00:06:09.000 Like, why is our deficit so insane?
00:06:12.000 Well, this is it.
00:06:13.000 I mean, how about the one where they paid $236 billion, like, for chargers?
00:06:23.000 Do you know that they were trying to set up chargers?
00:06:26.000 You mean car chargers?
00:06:27.000 Car chargers.
00:06:27.000 And they only built a couple of them?
00:06:28.000 Oh, excuse me.
00:06:29.000 $40 billion for electric car ports.
00:06:33.000 Eight ports have been built.
00:06:35.000 You know how crazy that is?
00:06:37.000 $40 billion for carports.
00:06:40.000 But I have to say, as much as this is shocking, I wasn't surprised.
00:06:46.000 I thought that effectively our entire system had been turned into a racket and that we were basically being fed a cover story from it.
00:06:56.000 And it's weird to now have the evidence of this.
00:07:01.000 But I think it was apparent that whatever had taken over our system wasn't interested in the well-being of average people, that it was interested in the power of the state to take people's resources and redistribute them.
00:07:17.000 And that that really is what's been going on for most of our adult lives.
00:07:22.000 And it's also important to know that this progressive left-leaning, like radical left arm of the government, of the country, was manufactured.
00:07:34.000 Yes.
00:07:34.000 It's all manufactured.
00:07:36.000 It's all manufactured and supported.
00:07:37.000 It's not organic, which is really fascinating about the other side, because the other side, the reaction to it is organic.
00:07:44.000 Like, say what you want about the, you know, the Trump administration and what you think about him.
00:07:51.000 That was an organic shift where people were like, fuck, enough.
00:07:55.000 Enough.
00:07:56.000 Yes, it was an overdue reaction.
00:08:01.000 The cover story, that what we were up to was righting past wrongs, was so pernicious and pervasive that it was hard to get our footing to challenge it.
00:08:15.000 But it shouldn't really be surprising that that movement wasn't organic.
00:08:20.000 Of course it was induced.
00:08:22.000 It was a cover story for theft.
00:08:26.000 We're going to be waking up to the magnitude of that theft for quite some time.
00:08:32.000 Have you ever heard of the audience effect?
00:08:34.000 It is a psychological theory that our behavior changes when we know we're being watched.
00:08:39.000 And here's the thing.
00:08:40.000 We are being watched.
00:08:42.000 When you use the Internet, data brokers watch and record everything you do online, even if you're using a private browser.
00:08:50.000 But you don't have to become a slave to the digital surveillance state.
00:08:53.000 You can free yourself with ExpressVPN.
00:08:56.000 With ExpressVPN, 100% of your online activity is rerouted through secure, encrypted servers that hide your IP address.
00:09:05.000 That means you can get to use the Internet with real freedom and privacy.
00:09:09.000 ExpressVPN also just rolled out a new feature for U.S. customers called Identity Defender.
00:09:15.000 It can remove your data from data brokers' files and monitor the dark web. And the best part?
00:09:40.000 Podcast listeners can get four extra months of ExpressVPN for free at expressvpn.com slash rogan or by tapping the banner.
00:09:49.000 That's expressvpn.com slash rogan or tap the banner.
00:09:54.000 If you're watching on YouTube, you can get four free months by scanning the QR code on screen or by clicking the link in the description.
00:10:03.000 I think this is going to take years.
00:10:05.000 Chamath said that it's going to be like Iran-Contra on steroids.
00:10:09.000 That's what he said.
00:10:10.000 He said when you get to the bottom of all this, it's going to be insane because they haven't even got to the Medicaid yet.
00:10:15.000 They haven't even got to the medical stuff.
00:10:17.000 There's so much they haven't even tapped into where they think the real motherlode of fraud is.
00:10:22.000 Yes, and I must say that there is also another aspect to this which we have to be careful about, which is that the justifiable anger at discovering what it is that we've been dragged into as a nation is going to make it hard to see where the limits are. In other words, at the moment, I'm cheering for the wrecking ball.
00:10:48.000 Right.
00:10:49.000 Break this stuff up.
00:10:50.000 Yeah.
00:10:51.000 Never again.
00:10:52.000 But there are.
00:10:54.000 What's that, Jamie?
00:10:55.000 I was reading an article about the spending on the chargers.
00:10:58.000 They said that they haven't actually, according to this, they haven't actually spent all that money yet.
00:11:02.000 What do they do with it?
00:11:04.000 They've spent some of it to make some of those things, but it hasn't been allocated yet.
00:11:09.000 It's a long article going through all the spending that's been done.
00:11:13.000 It's on factcheck.org.
00:11:16.000 Factcheck.org?
00:11:17.000 Who runs that?
00:11:18.000 I don't know.
00:11:19.000 Some of the chargers have been made.
00:11:20.000 Some of them are on the way to being made.
00:11:22.000 They've built 61 at 15 stations since mid-August or through mid-August.
00:11:26.000 14,900 more are currently in some stage of development.
00:11:29.000 But that's where it goes into where they are, what has to be done, and who's getting the money; all of that has to go through a long process in each state.
00:11:37.000 Yeah.
00:11:38.000 The question is, how can we get a proper accounting as you point out?
00:11:42.000 Who the hell is factcheck.org?
00:11:44.000 That's the problem with fact checking organizations.
00:11:46.000 That should really be illegal.
00:11:48.000 Like I think if you're a fact checking organization, we should have stringent rules on what influence is being peddled.
00:11:56.000 Like who's paying for these fact checkers?
00:11:58.000 Who's behind the scenes?
00:12:00.000 It should be very transparent.
00:12:02.000 How did you determine whether or not this was true or false?
00:12:06.000 You know, because there are a lot of things that get said...
00:12:10.000 Like, I don't know if you saw this, but Elizabeth Warren got confronted, and it's on Twitter this morning.
00:12:14.000 She got confronted about the amount of money that she's received from pharmaceutical drug companies.
00:12:20.000 She said she's never received any money from pharmaceutical drug companies and never received any monies from any PACs.
00:12:26.000 And then, of course, underneath it, community notes strikes again.
00:12:29.000 And, of course, she received millions.
00:12:31.000 She's a fucking liar.
00:12:33.000 Well, and, you know, it's an arms race.
00:12:35.000 You know, how can pharma cloak the money that it's giving so that there's plausible deniability at the point that Elizabeth Warren is confronted, or Bernie Sanders? And it was hilarious: only one point five billion... only 1.5 million out of 200 million.
00:12:49.000 Only 1.5.
00:12:50.000 Yep, that's what I saw as well.
00:12:53.000 The hardworking people, the hardworking people in this country gave me money.
00:12:57.000 Well, I don't think the Democrats understand that it's over.
00:13:05.000 And that there was a vast infrastructure that made their feeble arguments viable.
00:13:10.000 And that infrastructure is now collapsing.
00:13:13.000 People are far more aware and their lives aren't going to function anymore.
00:13:18.000 Well, it makes sense now that we're seeing these numbers because, okay, this was what was funding.
00:13:22.000 The infrastructure.
00:13:23.000 Now we get it.
00:13:24.000 Because it wasn't – otherwise, it's organic.
00:13:26.000 This is the will of the people.
00:13:28.000 This is how people are moving.
00:13:29.000 It's not.
00:13:29.000 It wasn't that at all.
00:13:30.000 This was all organic.
00:13:32.000 And it was really about control and money.
00:13:34.000 It had nothing to do with helping people, making people better, giving aid to foreign countries.
00:13:39.000 That's all a cloak and dagger bullshit show.
00:13:42.000 The reality was it's about money.
00:13:44.000 Yeah, it's – Money and influence.
00:13:46.000 It's always about power and limited resources.
00:13:48.000 And this was a new game taking place at a level that was hard to believe, and therefore many of us couldn't see it.
00:13:55.000 Did you see how they used software to map out 55,000 different NGOs that were used as a branch of the democratic system?
00:14:04.000 No, I didn't catch that.
00:14:05.000 I could send it to you.
00:14:06.000 I think I sent it to you, Jamie, right?
00:14:07.000 We went over it on the podcast before.
00:14:09.000 It's so nutty.
00:14:11.000 That this was all kind of hidden until they started using software to try to figure out and map out where all the influence goes.
00:14:19.000 And the crazy thing about the NGOs, and this is one of the things that Mike Benz has gone so deep into, it's essentially like they contribute to the Democratic Party, the government pays them.
00:14:29.000 It's all this weird sort of circular money transfer thing that's out in the open.
00:14:35.000 No, it's a positive feedback.
00:14:38.000 The whole idea is power is utilized to free resources that garner more power.
00:14:44.000 And it is the exact inverse of the system that we are supposed to have.
00:14:50.000 It's very interesting.
00:14:51.000 Where we're headed, that's a harder question.
00:14:54.000 Where we're headed is...
00:14:56.000 We're going to own Gaza.
00:14:58.000 Somehow.
00:14:59.000 This is it.
00:15:00.000 So fractal technology maps previously hidden connections between 55,000 liberal NGOs revealing how tax dollars allegedly flowed through major institutions like Vanguard and Morgan Stanley to groups like the Chinese Progressive Association.
00:15:13.000 This breakthrough tracking system can now monitor every dollar going to every NGO, exposing intricate funding webs that traditional tech couldn't detect.
00:15:22.000 So, an example, Black Voters Matter Fund's $4 million distribution network was invisible until quantum mapping revealed dozens of subsidiary organizations.
00:15:32.000 The unprecedented mapping reveals a previously hidden web of financial relationships.
00:15:38.000 And that's what it's really all about.
00:15:40.000 Yes.
00:15:41.000 The problem is, I, you know, sometimes when I see like a list of preposterous scientific projects that have gotten big grants, I read it, and I think, they all sound preposterous, but I don't know.
00:15:59.000 Some of these things are likely to have had a good explanation, and it just is not apparent in the soundbite, and some of them are every bit as preposterous as they seem.
00:16:09.000 And so I can't look at a map like that and say what I would expect if the system was healthy.
00:16:14.000 So I'm cautious about it.
00:16:16.000 I don't think the system was healthy.
00:16:18.000 I think the system was a racket from one end to the other.
00:16:20.000 And I've been saying that we've been living in the era of malignant governance, where there's basically no element of this
00:16:26.000 that you couldn't turn off and make us better.
00:16:29.000 But we have to be suspicious also of our understanding of how a properly functioning system would graph.
00:16:38.000 In something like that, so that we don't overrun the train station when we get there.
00:16:46.000 And I will just say, I was talking to a friend of mine who runs an Alaska Native corporation, which I don't know if we've talked about Alaska Native corporations before.
00:16:56.000 But this is a corporation.
00:17:01.000 It competes for federal contracts.
00:17:04.000 It has...
00:17:05.000 Some advantages in the competition for federal contracts.
00:17:08.000 And all of the profits go to Alaska Natives.
00:17:12.000 And it is finding itself in a very difficult-to-navigate battle because of all of the successes of DOGE. So the Alaska Native Corporation is utilizing something called the 8A program.
00:17:31.000 The 8A program is a program that gives advantages to disadvantaged people, and at some point, that ability to use the 8A program was granted to Alaska Native corporations.
00:17:45.000 Well, the 8A program is now under attack by some large corporations, federal contractors, who do not like competition from things like Alaska Native corporations, and it is being portrayed as if it was based on race, which it isn't.
00:18:01.000 Anybody can use it.
00:18:03.000 It's not a race-based program.
00:18:04.000 But because people are in a mood to dismantle all of this left-wing solution-making corruption...
00:18:14.000 These megacorporations are finding it easy to target the 8A program and they are persuading members of Congress that it doesn't belong.
00:18:23.000 And this is going to be a tragic loss if this program, which works well, is dismantled in the fervor to go after all of the stuff that should never have been.
00:18:33.000 What does this program do exactly?
00:18:35.000 It provides a mechanism for disadvantaged people to compete for grants.
00:18:43.000 It's really not race-based.
00:18:45.000 Anybody, you know.
00:18:46.000 So would you just have to live in Alaska?
00:18:47.000 No.
00:18:47.000 No, no.
00:18:48.000 So there are two separate things.
00:18:50.000 Alaska Native corporations are for Alaska Natives.
00:18:52.000 Right.
00:18:53.000 When you say Alaska Natives, you mean people who live in Alaska or?
00:18:57.000 No, no.
00:18:58.000 Inuits.
00:18:58.000 I mean, yeah.
00:18:59.000 Arctic peoples.
00:19:01.000 Arctic peoples.
00:19:01.000 Yes.
00:19:02.000 So the original people of Alaska before we bought it for 50 bucks from the Russians.
00:19:06.000 Exactly.
00:19:06.000 We bought it for 50 bucks from the Russians and then after the discovery of oil in Prudhoe Bay, the U.S. government realized that it could not afford to give the natives of Alaska sovereign land rights because it was going to need to do things like put a pipeline to transport oil.
00:19:26.000 So instead of giving them...
00:19:32.000 It's an interesting program that does a lot of good, but its connection to the 8A program now has the good that it does in jeopardy.
00:19:49.000 And I don't know how many stories there are like that, but we need to be...
00:19:54.000 Be careful that our excitement about watching all of this nonsense torn apart doesn't cause us to tear apart things that actually are functioning well and don't suffer from the defects of the DEI madness.
00:20:09.000 Got it.
00:20:10.000 So this thing that allows disadvantaged people to get grants, like how is it structured?
00:20:14.000 Oh, that I couldn't tell you.
00:20:16.000 That I couldn't tell you.
00:20:17.000 We could look into it.
00:20:18.000 It's easy to look up.
00:20:19.000 It's the 8A program.
00:20:22.000 Helps small businesses owned by socially and economically disadvantaged individuals to compete for federal contracts, provides training and technical assistance to help businesses compete, categorizes eligible businesses as veteran-owned, women-owned, minority-owned, or owned by a person with disabilities.
00:20:39.000 Certification does not guarantee contract awards, but it can help businesses pursue new opportunities.
00:20:45.000 So this is a question, right?
00:20:48.000 The category is like veteran-owned.
00:20:51.000 Woman-owned and minority-owned.
00:20:53.000 Why is that?
00:20:55.000 Why is woman-owned and minority-owned?
00:20:58.000 Why would they...
00:20:59.000 You know what I mean?
00:21:00.000 Especially women-owned.
00:21:02.000 I do.
00:21:03.000 Is this Grok?
00:21:04.000 Is that what you're looking at there?
00:21:06.000 Is that what that is?
00:21:10.000 Oh, so it's Google.
00:21:13.000 Benefits for native-owned businesses.
00:21:15.000 Program helps native communities develop economic ventures that support their communities.
00:21:18.000 Profits generated from a native-owned participant go back to their native communities.
00:21:25.000 So, I'm not in a position to answer detailed questions about 8A, but what I would say is...
00:21:32.000 There are some good things.
00:21:33.000 There are quite a number of success stories that...
00:21:36.000 This is exactly what we want for...
00:21:41.000 Disadvantaged people.
00:21:42.000 Right.
00:21:42.000 We want a real social safety net.
00:21:44.000 Yeah.
00:21:44.000 Not only a social safety net, but something that provides an opportunity.
00:21:48.000 Hey, build a business.
00:21:49.000 Right.
00:21:50.000 This is what we want you to do.
00:21:51.000 This is the mythology of our system is, you know, pull yourself out of your disadvantaged state.
00:21:55.000 So that you don't need help.
00:21:58.000 So anyway, we should be interested in maintaining those programs at the same time we find the stuff that's actual nonsense and get rid of it as quickly as possible.
00:22:08.000 And that's going to be a delicate balance.
00:22:10.000 So far, we're early in this process and you're going to have big wins like the revelations about USAID. But the day will come soon enough when we're talking about...
00:22:23.000 Discussions where we actually have to do a cost-benefit analysis on the programs that are targeted.
00:22:27.000 Right.
00:22:28.000 And we have to realize that there are programs that benefit people greatly and are really good for the entire country as a whole.
00:22:34.000 Absolutely.
00:22:36.000 That's like the problem.
00:22:39.000 If you're a left-wing progressive person like we both sort of identified with up until a while ago, and then all of a sudden the entire country takes a polar shift, you don't want to lose your own ideas about what's important and what things that we should contribute to with our tax dollars.
00:22:57.000 Because I think we both agree that there's a lot of good.
00:23:02.000 In taking taxes and providing social safety nets, providing food for poor people and homeless people, helping people, welfare.
00:23:12.000 All these things are important.
00:23:14.000 Like, to not have people starving on your fucking streets.
00:23:17.000 Like, all that stuff is, like, we're gonna have a community, which is essentially what a country's supposed to be, an enormous community.
00:23:23.000 We have to support the members of our community.
00:23:25.000 We just have to do it without grifters, and do it without bullshit, and do it without it being just a cleverly disguised ruse in order to gain political power.
00:23:35.000 Well, you may remember, years ago, I used to say that...
00:23:39.000 I want to live in a country so good that I get to be a conservative.
00:23:45.000 Right.
00:23:45.000 I'm a liberal because there's a lot of problems with the way our system works.
00:23:49.000 But the objective of all of that progressivism ought to be a system that doesn't require intervention in that way.
00:23:56.000 Yeah.
00:23:56.000 In which everybody does have access to the market.
00:23:59.000 And so people really can be responsible for, you know, lifting themselves out of whatever.
00:24:04.000 Literally the rising tide lifts all boats, which should be everybody's thought process.
00:24:10.000 A fair system in which everybody starts out with the tools that allow them to take advantage of the market.
00:24:16.000 That's great.
00:24:16.000 And I want a system in which lazy people don't have money to spend and are motivated to become unlazy.
00:24:25.000 I don't want people profiting from destroying opportunities that belong to other people.
00:24:31.000 But if we had a system that was like that and everybody had the tools to utilize it, then we should want as little intervention as possible.
00:24:40.000 Yeah.
00:24:41.000 Wild times.
00:24:43.000 Just wild.
00:24:44.000 Like, what a fun time to be alive.
00:24:47.000 It just feels different, I have to tell you.
00:24:50.000 I don't know what's coming, but it's at least...
00:24:54.000 It's at least delightful not to know what to think, right?
00:25:00.000 The cynicism that was required to understand what was going on two months ago is now no longer required.
00:25:07.000 You actually have to think about what you're told is coming down the pike and think, well...
00:25:13.000 I don't know.
00:25:14.000 Is that a solution?
00:25:15.000 Is it a negotiating tactic or is it a solution that's actually being proposed and would it work?
00:25:20.000 Right.
00:25:21.000 Like, are we really taking over Gaza or is this just a bullshit marketing ploy?
00:25:26.000 Like, is this like some negotiation tactic with Israel?
00:25:29.000 Because, like, the look on Netanyahu's face when Trump was talking about taking over Gaza, it was like, what?
00:25:37.000 You could see his face.
00:25:38.000 He was just like, what the fuck are you saying?
00:25:41.000 I have to say, I almost feel like it was worth the price of admission right there.
00:25:46.000 Just to watch his face?
00:25:48.000 Yeah, like, you want to let us in?
00:25:50.000 Oh, you want us help?
00:25:50.000 Okay, we're going to set up bases there.
00:25:53.000 And instead of, you know, someone was describing this on Twitter, instead of a response time to any action Israel takes, taking days, it takes minutes.
00:26:01.000 Well, I am not a fan of Netanyahu's, as you probably know.
00:26:06.000 My sense is that he...
00:26:09.000 Well, they were even more... even more...
00:26:40.000 And this just so happened to put him back in charge.
00:26:46.000 But in any case, to see him back on his heels...
00:26:51.000 That was a good sign.
00:26:52.000 Now, I am, of course, concerned about the idea of...
00:26:57.000 I'm not even sure I know what I heard, right?
00:27:00.000 We're going to make Gaza into...
00:27:02.000 It's going to be the Riviera of the Middle East!
00:27:06.000 That was a pretty good impression.
00:27:07.000 The Riviera of the Middle East!
00:27:11.000 Oh, my God.
00:27:13.000 What a crazy time.
00:27:15.000 And just to see all these politicians freaking out, that is amazing, too.
00:27:20.000 It's really amazing.
00:27:22.000 It's amazing to watch.
00:27:23.000 It's amazing to watch all these left-wing people suddenly.
00:27:27.000 Bernie Sanders making a post about how Donald Trump is trying to silence independent media was the wildest fucking gaslighting I think I've ever seen from a politician.
00:27:37.000 Independent media?
00:27:38.000 You mean fucking CBS? You mean CBS that edited that Kamala Harris interview to make it look like she had a really good point?
00:27:45.000 100%.
00:27:46.000 And then, I don't know if you caught...
00:27:48.000 Alex Soros reposting this claim that basically you have an unelected cabal wielding power.
00:28:01.000 That's you!
00:28:03.000 That's you, Alex!
00:28:04.000 That's your dad!
00:28:05.000 This is crazy!
00:28:07.000 It's a very, very strange historical moment.
00:28:12.000 Also, it's like...
00:28:13.000 You haven't addressed any of the exposed corruption.
00:28:17.000 All you're talking about is the horrors of dismantling this amazing organization.
00:28:22.000 What about all the shit that they've uncovered?
00:28:24.000 There's not even a counter-argument.
00:28:26.000 Like, no, we need to fund gender-fluid dance in fucking Turkey.
00:28:32.000 What are you talking about?
00:28:34.000 We need $200 million for Starbucks Keurig cups.
00:28:37.000 What?
00:28:39.000 Well, I mean, again, you know, I said a lot of stuff over the years about the fact that our civilization had become a racket.
00:28:51.000 Yeah.
00:28:52.000 And the fact that we were living in the era of malignant governance and that basically I'm concerned as somebody who believes in good governance that there's almost no component of this that you couldn't remove and create an improvement.
00:29:09.000 That that's not a message you want.
00:29:11.000 Right.
00:29:11.000 I want a message in which we govern as lightly as possible, but we do it really, really well.
00:29:17.000 And an era in which you can cut off any limb and the patient gets healthier, that teaches the wrong lesson about governance.
00:29:26.000 It teaches the lessons that governance was a mistake to begin with, which it wasn't.
00:29:30.000 Right.
00:29:31.000 It's a big weekend.
00:29:33.000 Get in on the action of the big game and UFC 312 at DraftKings Sportsbook, the official sports betting partner of the UFC. The men's middleweight and women's strawweight titles will be on the line in the co-main events of UFC 312. And of course, pro football is crowning a champion at the big game.
00:29:53.000 Just getting started?
00:29:54.000 Pick a fighter or a team to win this weekend.
00:29:57.000 Go to DraftKings app and make your pick.
00:29:59.000 That's all there is.
00:30:01.000 If you're new to DraftKings, listen up.
00:30:04.000 New customers can bet $5 to get $200 in bonus bets instantly.
00:30:10.000 Download the DraftKings Sportsbook app now and use the code ROGAN. That's code ROGAN for new customers to get $200 in bonus bets when you bet just $5.
00:30:19.000 It's a big weekend only on DraftKings.
00:30:22.000 The crown is yours.
00:30:24.000 Gambling problem?
00:30:25.000 Call 1-800-GAMBLER.
00:30:26.000 In New York, call 877-8HOPE-NY or text HOPE-NY-467-369.
00:30:31.000 In Connecticut, help is available for problem gambling.
00:30:33.000 Call 888-789-7777 or visit ccpg.org.
00:30:38.000 Please play responsibly.
00:30:39.000 On behalf of Boothill Casino and Resort in Kansas, 21 and over, age and eligibility varies by jurisdiction.
00:30:44.000 Void in Ontario.
00:30:45.000 New customers only.
00:30:46.000 Bonus bets expire 168 hours after issuance.
00:30:49.000 For additional terms and responsible gaming resources, see dkng.co slash audio.
00:30:55.000 Well, how do we get money out of governance?
00:30:57.000 So that's the problem, right?
00:30:58.000 Is money gets involved in governance, especially enormous amounts of money, and then they have influence.
00:31:02.000 And then you have senators and congressmen and different elected representatives that don't do the will of the people.
00:31:08.000 They do the will of the people that paid them enormous amounts of money.
00:31:12.000 And this is a real problem.
00:31:14.000 Because if the only way you could win was you had to do the will of the people.
00:31:19.000 You had to literally do things that were better for the people.
00:31:23.000 The people realize you're doing a great job and they keep electing you.
00:31:26.000 Well, let's be honest about what the conservatives had right from the get-go.
00:31:32.000 There are problems that only competition solves.
00:31:37.000 There are other problems that competition in something like a market is not well positioned to solve, but there's certain problems that there's just there's no second best.
00:31:45.000 It's only competition that works.
00:31:47.000 And so when we talk about, well, you know, what are we going to do for fact checking?
00:31:51.000 We're going to abandon the idea of fact checking.
00:31:54.000 What you want is a vibrant, independent journalist sector in which people who spot the story early and people who articulate the story in the most intuitive and accurate way outcompete those who do a worse job. So that over time, what we get is journalism that you can't fool.
00:32:18.000 And that it reveals to us which government programs actually work.
00:32:22.000 Even if they don't sound reasonable at first glance, here's what's really going on behind the scenes in this program.
00:32:28.000 Right.
00:32:29.000 And then journalism that exposes any kind of fraud.
00:32:32.000 And I don't know about you, but as I was watching.
00:32:36.000 Confirmation hearings.
00:32:38.000 My sense was that the Elizabeth Warrens and the Bernie Sanders were dinosaurs who do not understand that the earth has just been hit from outer space and that they don't live in the world that they are so used to.
00:32:58.000 That their corruption was immediately apparent.
00:33:03.000 And they're not used to that.
00:33:04.000 They're used to having a whole phony journalistic layer that covers for them.
00:33:10.000 And that layer is gone and the American public is awake and it's angry and rightfully so.
00:33:17.000 And now it looks at Bernie Sanders who, you know, I remember the first time you and I spoke, you and I had both been Sanders supporters.
00:33:24.000 Yeah.
00:33:25.000 And now to see that same guy going after Bobby Kennedy.
00:33:30.000 And, you know, the feeble excuse, well, what if Bobby Kennedy becomes the head of HHS and people don't have access to prescription drugs?
00:33:41.000 And it's like, dude, I just lived through COVID. It's not obvious to me that they wouldn't get healthier if they didn't have access to prescription drugs.
00:33:51.000 Do you realize how corrupt those companies are and how nonsensical their science is, the science that says that you actually get better if you take a statin based on some metric in your chart, right?
00:34:03.000 So I'm not arguing that there aren't good pharmaceuticals.
00:34:06.000 There undoubtedly are.
00:34:07.000 But what's the net effect of our pharmaceutical-obsessed medical culture?
00:34:13.000 It's not obvious to me that it's positive.
00:34:16.000 I think it may well be negative.
00:34:17.000 And so anyway, again, I see Bernie Sanders and I see him reading from a script that is no longer relevant to the movie we're watching.
00:34:28.000 Right, and this is not saying that there aren't some pharmaceutical drugs that are amazing.
00:34:32.000 The problem is they're not all amazing, and they sell them all like they're amazing.
00:34:36.000 Absolutely.
00:34:37.000 That's the problem.
00:34:38.000 Some of them are great.
00:34:39.000 Like, Viagra is fantastic.
00:34:40.000 Like, it's really good stuff.
00:34:41.000 There's a bunch of stuff like that that really works.
00:34:43.000 There's a bunch of drugs that really help people.
00:34:46.000 There's a bunch of drugs that brilliant scientists have developed that...
00:34:50.000 Definitely help people live longer and live healthier lives.
00:34:54.000 But also, they're in the business of selling medicine, selling pharmaceutical drugs.
00:35:00.000 And so there's a lot of stuff that they sell that is not good, not good for you.
00:35:04.000 Overall, net negative.
00:35:05.000 When you look at the amount of drugs that get pulled, that get endorsed and then supported by the FDA, and then they have to pull them.
00:35:14.000 Wasn't it like 30%?
00:35:15.000 Something in the range?
00:35:18.000 Yes, and that is the tip of the iceberg.
00:35:21.000 We do not have, just as we don't have a journalistic layer that exposes people in Congress who are lying to us and aspects of the government that are corrupt, we don't have a university system that can properly do science and can be relied on to tell us what the impact of a drug or a food additive is.
00:35:43.000 The whole system is missing in action.
00:35:47.000 Right.
00:35:47.000 The whole system is paid for by the pharmaceutical drug companies.
00:35:50.000 They pay for tests.
00:35:51.000 They pay for studies.
00:35:52.000 They support organizations that are supposed to be regulating them.
00:35:56.000 The whole thing is bananas.
00:35:57.000 Everything that is supposed to evaluate something like safety or efficacy or analyze net effects, anything like that, has been captured by the PR wing.
00:36:13.000 And so the consumer is in no position to navigate a world like that.
00:36:20.000 I mean, and we know that this encompasses everything.
00:36:24.000 You know, how many people's doctors are pharma-skeptical?
00:36:29.000 Right?
00:36:30.000 Your doctor should be very pharma-skeptical.
00:36:32.000 I don't know that this drug is actually a benefit to you, but know that the doctors have become pushers.
00:36:37.000 Right.
00:36:37.000 Because they've been compromised.
00:36:38.000 And also, I mean, that's literally the system that they're created from.
00:36:43.000 They're sent out into the hospitals immediately with that.
00:36:48.000 And it just it's so difficult for a doctor to step outside of the system and be independent.
00:36:53.000 When they do, they get attacked.
00:36:54.000 Like, how many doctors lost their licenses because they were trying to prescribe ivermectin to people who had COVID? Yeah, almost all of the doctors who were any good found themselves chased out of a job or with jeopardy to their license or slandered in the media.
00:37:11.000 I'm sure you're in the same position.
00:37:13.000 Those are frankly the only doctors I trust at this point were the ones who were willing to pay a price to tell me the truth.
00:37:18.000 Yeah.
00:37:19.000 Yeah.
00:37:20.000 My doctor that I know out here won a case, but they were about to lose their license.
00:37:26.000 Yeah.
00:37:27.000 And just for prescribing ivermectin.
00:37:29.000 No, that should be your first question, Doc, is how did you do over COVID? Yeah.
00:37:35.000 And if...
00:37:37.000 They have nothing interesting to say.
00:37:39.000 I would just turn around and walk out the door.
00:37:40.000 Yeah, imagine going to a doctor right now and they're telling you you should get your COVID shots.
00:37:44.000 You should stay up to date.
00:37:45.000 Imagine.
00:37:46.000 Well, I find that bad, but at least I know how to interpret that.
00:37:50.000 What I don't understand is what I'm supposed to do with the doctor who did recommend the shots has stopped recommending them and has not said something about the change in their perspective.
00:38:05.000 Yeah.
00:38:06.000 Yeah, I have a problem with that with social media influencers, too.
00:38:09.000 100%.
00:38:09.000 People that were pushing it and then have not publicly corrected course, have not said, I was wrong, and this is why I was wrong.
00:38:16.000 Like, I can't fuck with you anymore.
00:38:18.000 If you can't say that you were wrong about that, then I just can't.
00:38:24.000 100%.
00:38:25.000 It is a test of integrity, and you wouldn't want to go to a doctor that didn't have...
00:38:31.000 High integrity at a moment like this.
00:38:33.000 Your doctor needs unusually high levels of integrity and what we've seen is unusually low levels.
00:38:37.000 And the same thing with social media influencers, as you called them.
00:38:43.000 Anybody in the public sphere should go back and they should do an accounting of what they said, what they thought, how they got there, how that played out in the end, when they changed their mind.
00:38:56.000 And what they said about it publicly.
00:38:58.000 I must say, I'm constantly in a battle with the ultra-cynics who claim to have gotten everything right during COVID because basically they never believe anything.
00:39:09.000 It's not a method.
00:39:10.000 You got lucky.
00:39:11.000 Right, you got lucky.
00:39:12.000 You stumbled into a full con game.
00:39:15.000 You stumbled into a con game and yes, you didn't buy it, but that's not a demonstration that you know how to think through the next one.
00:39:23.000 Correct.
00:39:23.000 It doesn't demonstrate anything.
00:39:25.000 So what I really want are people who had a good track record and who know what mistakes they made and know how not to make them in the future.
00:39:34.000 Those are the people that we should be paying attention to.
00:39:37.000 Yeah, that's a good point.
00:39:38.000 It's a fun time, though.
00:39:41.000 It's fun because things are actually happening, which is very different than most of the time when people get elected.
00:39:47.000 Most of the time when people get elected, they claim all these things when they're running for president.
00:39:51.000 Then they get into office and not much changes.
00:39:54.000 And in fact, a lot of what they campaigned on, they don't practice at all.
00:40:00.000 Like a great example is the Obama administration.
00:40:03.000 The hope and change website had to be changed because there was a bunch of stuff in there about protecting whistleblowers, which they didn't do at all.
00:40:12.000 They were some of the worst.
00:40:13.000 It was one of the worst administrations for whistleblowers.
00:40:15.000 100 percent.
00:40:16.000 Yeah.
00:40:16.000 Well, I think what we have seen over our you and I are about the same age.
00:40:21.000 What we have seen over our entire lifetime is that elections can change the jerseys, but they just swap.
00:40:31.000 You know, who's in power and who's out of power?
00:40:34.000 Well, the point is the system is in power and, you know, the people in the roles to deliver the speeches change, but they're just basically trading off.
00:40:43.000 And so I have the sense that you and I are now watching the outcome of the first genuine election since 1963.
00:40:53.000 Yeah, I've heard that argued.
00:40:56.000 In '63, when they assassinated Kennedy, that was the last time we had a real president.
00:41:00.000 It was an actual person who was trying to change things and put things in a position where he felt it was beneficial to the entire country.
00:41:12.000 Right.
00:41:17.000 It's different in two ways.
00:41:17.000 One of them is just unfamiliar to us because we've been watching theater for our entire lives and being told that it was the transfer of power.
00:41:25.000 And the other is that there's a lot of pent-up need for change because you've effectively had a cryptic power structure that never gets displaced, that has gotten so entrenched that rooting it out takes, frankly, an extraordinary person, in every sense of the word, and an extraordinary team.
00:41:50.000 Imagine if he's doing this.
00:41:51.000 Imagine he's trying to do Doge without Elon.
00:41:54.000 Well, so, you know, Heather and I took a lot of flack after the assassination, the first assassination attempt of Trump, where we both perceived, I think we were actually perceiving this before, but the assassination attempt really kicked it off.
00:42:11.000 We perceive that this was a different person than the first administration's Trump, that he had matured and he had been he had been forged by, you know, all of the lawfare that had been deployed against him and that it had been good for him.
00:42:29.000 And in fact, I hate to say this because I have my doubts, of course, about the election of 2020, but I don't think what he is currently doing would have been possible if he had won and been inaugurated in 2020.
00:42:44.000 I think you're right.
00:42:50.000 I think also the public wouldn't have supported it.
00:42:52.000 If they didn't see four years of the Biden administration, how crazy everything was.
00:42:56.000 And then having gone through COVID and watching the economy collapse and watching, you know, hurricanes coming.
00:43:01.000 He's like, the most important thing for a hurricane is to get vaccinated.
00:43:04.000 Remember that?
00:43:06.000 I do now that you mention it.
00:43:07.000 Hurricanes coming, get vaccinated.
00:43:10.000 Yeah.
00:43:11.000 It's very important.
00:43:12.000 Everything's harder for that, actually.
00:43:14.000 Everything.
00:43:15.000 Everything.
00:43:18.000 We lived in a movie.
00:43:21.000 A bad one.
00:43:22.000 We went through a fucking crazy Coen Brothers kind of apocalyptic movie.
00:43:29.000 A poorly written, poorly directed movie with, you know, an extraordinary budget and almost no need to pay attention to continuity.
00:43:41.000 It was weird.
00:43:42.000 It was bad.
00:43:44.000 But at least we know.
00:43:45.000 But I think that really woke a lot of people up, so-called red-pilled a lot of people.
00:43:50.000 I think that four years was important to get to where we are now.
00:43:54.000 It was essential.
00:43:55.000 Where most people are aware.
00:43:57.000 I think if you had gone to 2018 and had a real conversation with most people in this country about the level of corruption, it would be a fraction of what they believe it to be now.
00:44:10.000 Look, I know this to be true because, you know, I tried to spark Unity 2020 and make it work.
00:44:20.000 And you were banned on Twitter.
00:44:22.000 I was.
00:44:24.000 Explain that to people, because one thing, the difference between the new Twitter, thank God for Elon Musk, and the old Twitter, the old Twitter, you guys tried to put together a unity party where you would get the best representatives from the left and the right together for the good of the country, and like, that's dangerous.
00:44:42.000 It's dangerous, and they even lied about us.
00:44:45.000 They said that we were engaged in inauthentic behavior.
00:44:48.000 Basically, they accused us of using bots, which we didn't.
00:44:53.000 So anyway, that's the world we were in in 2020. And headed to a more controlling world.
00:45:00.000 Right.
00:45:00.000 And then in 2024, you know, there's what I think of as a continuation of the same idea.
00:45:09.000 Right.
00:45:10.000 There's, you know, Rescue the Republic was what it looked like in 2024. And the point is that actually worked.
00:45:16.000 That was an organic unity movement.
00:45:19.000 And it took advantage of the fact that, you know, Maha had already catalyzed as as.
00:45:26.000 Kennedy and Trump had gotten together.
00:45:28.000 And so that was huge.
00:45:29.000 That was a huge part of it because Kennedy had so many supporters.
00:45:32.000 Even in many states, he was like bordering like 25, 30 percent, which is really crazy for an independent.
00:45:39.000 And when he went over to Trump and then all those people like, oh, my God, I have hope now.
00:45:43.000 People who are vaccine injured, people are very skeptical about certain pharmaceutical drugs that may have caused them harm.
00:45:50.000 People who knew Bobby's history of being an environmental attorney and all the amazing work that he did then.
00:45:56.000 Those people got on board with the Trump.
00:45:59.000 And I think that was huge.
00:46:01.000 And now with Tulsi, I think that's huge as well.
00:46:04.000 I think, you know, when Elon took over Doge, that was like the final Avenger.
00:46:11.000 Like, having that team together is such a unique team where you have prominent former Democrats, a Democrat Congresswoman of eight years, who also served overseas in a medical unit.
00:46:27.000 Twice.
00:46:28.000 Like, this is, you've got an extraordinary group of human beings.
00:46:33.000 Extraordinary group of human beings, all of whom I think took very real risks.
00:46:39.000 Oh, yeah.
00:46:40.000 At the very least with their reputations.
00:46:42.000 Well, Tulsi got put on a terrorist watch list.
00:46:44.000 Of course.
00:46:45.000 Which is fucking crazy.
00:46:47.000 And it was a gamble.
00:46:48.000 Each of these people, you know, Kennedy, Musk, Tulsi, they knew.
00:46:53.000 That they were taking that risk and it was clear that they were motivated by patriotism, that they actually, I mean, this is what a soldier does, right?
00:47:04.000 You know that you're taking risks for something that matters more than you.
00:47:09.000 And, you know, to watch Elon do it, I think also was just remarkable because, of course, in Elon's position, he could have done, you know, what Zuckerberg does.
00:47:22.000 Right.
00:47:23.000 And he could have played it safe and kept his options open and done what he was told and, you know, apologize for it later, sort of.
00:47:31.000 Right.
00:47:31.000 That wasn't what Musk did.
00:47:33.000 He actually had the courage of his convictions.
00:47:37.000 And A, as many people have noted, his liberation of X set the stage for this election to even happen.
00:47:46.000 That there wasn't anything you could put over on us that we couldn't unpack and, you know, crowdsource a better interpretation of on X.
00:47:55.000 And even if most people weren't on X, it was enough that their narrative engine just didn't work.
00:48:01.000 And if you look at a viral post on X, a viral post about something that's very important that has to do with USAID, you will see 7 million, 8 million views, 10 million views.
00:48:14.000 There is nothing equivalent like that to mainstream media.
00:48:17.000 There's nothing even close.
00:48:18.000 There's nothing even close.
00:48:20.000 Maybe a very viral YouTube clip.
00:48:23.000 But these are every day, all day long.
00:48:26.000 There's posts that have 7 million, 5 million, 3 million.
00:48:30.000 And people are reposting them as well and sharing them and taking the information and posting them without credit.
00:48:37.000 There's a lot of that going on.
00:48:39.000 So the actual amount of information that gets out is far more than would have ever happened without Elon taking over Twitter.
00:48:47.000 It's probably changed the course of our civilization in a way that nothing else could have done.
00:48:53.000 Yes, and I think it's a little bit deceptive because its size doesn't quite explain its impact.
00:49:00.000 But it's a little bit like the higher reasoning centers of the brain.
00:49:06.000 Like there's a collective consciousness in which we figure out what we think is true.
00:49:11.000 And it's been downstream of this amazing propaganda engine.
00:49:16.000 Well, we're now learning to spot the propaganda and to understand what it really means and to figure out what it's cloaking.
00:49:23.000 And a lot of that is happening on Twitter because it can.
00:49:27.000 And it's actually forcing, you know, Facebook to come around, right?
00:49:32.000 Which, of course, you know, I usually say that zero is a special number, meaning in a world with no social media platforms where you can speak freely and reason with others, there's no pressure.
00:49:45.000 To start doing that.
00:49:46.000 But once you have one, any social media platform that doesn't allow you to speak freely is at a competitive disadvantage.
00:49:53.000 And so, you know, Elon freeing X actually liberated the others and they're beginning to move in the right direction, which, frankly, is part of why this era just feels different.
00:50:04.000 Yeah.
00:50:05.000 It's very interesting times.
00:50:07.000 And then on top of that, we're being invaded by UFOs.
00:50:09.000 So it's all happening at the same time.
00:50:10.000 I have not noticed that.
00:50:13.000 Are you watching News Nation?
00:50:15.000 What's wrong with you?
00:50:17.000 You're so not informed.
00:50:19.000 I am not informed.
00:50:22.000 I'm waiting for some sort of compelling evidence that something extraterrestrial is going on.
00:50:27.000 I'm talking to everybody, and the more people I talk to, the less I know.
00:50:30.000 Well, there's that.
00:50:31.000 The more information I get from all these people that have had UFO and alien encounters and experiences and whistleblowers, and the more I talk to them, the less I feel like I know.
00:50:41.000 I do not feel like it's...
00:50:43.000 And then on top of that, I'm in the middle of Jacques Vallée's books, which are very wild.
00:50:49.000 Like, Jacques Vallée...
00:50:49.000 I had him on the podcast a long time ago, and he's coming back on again.
00:50:52.000 But the first time I had him on, I only knew him as the French scientist that had...
00:50:59.000 The character in Close Encounters of the Third Kind was based on him.
00:51:03.000 Do you know the character, the French character that brings together the military to try to communicate with the aliens?
00:51:08.000 It's based on Jacques Vallée, who's been studying UFOs for decades, since the 50s and the 60s.
00:51:16.000 And boy, the more you read about his take on things, the more it's very confusing.
00:51:22.000 These fucking stories are the same stories that have been going on for hundreds of years.
00:51:26.000 They're not even modern.
00:51:28.000 When we think of them, we think of Kenneth Arnold seeing the flying saucers and coining the phrase in the 1950s.
00:51:35.000 No.
00:51:37.000 No, these stories have been real similar for hundreds of years.
00:51:42.000 There's some phenomenon that people occasionally encounter.
00:51:46.000 And it's real similar.
00:51:47.000 It's similar enough from people that weren't aware of the narrative that you have to wonder what the fuck is actually going on.
00:51:58.000 Yeah, I think you do have to wonder what the fuck is actually going on.
00:52:00.000 On the other hand, I think there's a whole range of possibilities that don't involve anything extraterrestrial.
00:52:10.000 I think...
00:52:11.000 There's a bunch of shit that doesn't involve anything extraterrestrial that's happening at the same time as a bunch of shit that we don't have explanations for.
00:52:21.000 That's what I think.
00:52:24.000 That would not be shocking if there was something to cover up.
00:52:28.000 You might decide instead of trying to keep it under wraps, you would bury it in so much low-quality bullshit that nobody would be able to find it.
00:52:36.000 That's what it feels like to me.
00:52:38.000 That's what it feels like to me.
00:52:39.000 It feels like to me that there's a lot of people that I think are trying to do the right thing, a lot of whistleblowers that are really trying to educate the American public, but I don't know who they really are doing the bidding of.
00:52:51.000 I don't know they even know.
00:52:52.000 I think if I was the government, let's pretend that I was some gigantic arm of the military industrial complex and I had some literal recovered flying saucers, I would come up with the dumbest fucking stories and put them in binders and leave them on desks and hope that they leak. And the more dumb shit they leak, the more it obscures the actual reality of what we possess.
00:53:19.000 Let's say if the government...
00:53:21.000 Really did find a flying saucer in the 1940s.
00:53:24.000 Really did back engineer the propulsion system.
00:53:28.000 Really did apply it to drones.
00:53:29.000 And they really are flying them around.
00:53:31.000 And they have them.
00:53:32.000 What I would do, I would make up some...
00:53:36.000 Crazy shit about, you know, a mothership that's 47 years away and it's coming and it's as big as a planet and I would come up with the wackiest stuff possible and like get it all out there.
00:53:49.000 Put it all out there.
00:53:50.000 We have 57 different species all in a fucking freezer somewhere and Wright-Patterson Air Force Base and I just like...
00:53:58.000 Ramp up the bullshit in as many ways as possible.
00:54:01.000 You know, they've controlled all our nuclear test codes and they hover over our facilities.
00:54:05.000 We're powerless to control them.
00:54:07.000 I would say everything as wacky and crazy as possible so I could keep flying around these gravity propulsion vehicles that we've developed.
00:54:16.000 Well, I must tell you, I'm skeptical that those vehicles are vehicles.
00:54:22.000 What do you think they are?
00:54:24.000 Projections.
00:54:25.000 Projections?
00:54:26.000 Yeah.
00:54:27.000 But what if you could monitor them on, if you see them on radar, if they're visual, they're seeing them going into the water?
00:54:35.000 I'm having a deja vu moment here, or we've discussed this before, I don't know which it is, but the basic rubric is physical stuff displaces air, which means it makes noise when it moves.
00:54:51.000 Right.
00:54:53.000 I don't quite see the logic behind suppressing that fully.
00:54:56.000 I don't see the capacity to suppress it fully.
00:54:59.000 Who knows what I don't know?
00:55:01.000 But my guess is if you had actual craft moving around in the ways that people who have observed these things think they've seen it, that noise would be an inherent part of the phenomenon.
00:55:15.000 But why would that be the case if it operates on a gravity propulsion system that essentially bends space around it?
00:55:21.000 And instead of creating a sonic boom, because it's flying through the air, it's not flying through the air, it's displacing space.
00:55:28.000 Well, I don't even know what displacing space means.
00:55:31.000 I don't know what a gravity propulsion system means.
00:55:34.000 Right.
00:55:35.000 But I'm trying to imagine some futuristic sci-fi version of a propulsion system that doesn't involve pushing something out the back.
00:55:43.000 It doesn't involve exhaust, like a rocket.
00:55:47.000 I'm not necessarily requiring engine noise.
00:55:50.000 I'm requiring...
00:55:52.000 Air noise.
00:55:53.000 Passing through the air.
00:55:54.000 Yeah, passing through the air noise that, you know, as the air collapses...
00:55:59.000 As the craft moves, the air collapses behind it, that you'd hear something.
00:56:04.000 You mean when it's moving fast?
00:56:05.000 Yes, especially if it's moving fast.
00:56:07.000 But if it's not really displacing the air around it, and if this is what allows it to go through the water as well with extreme speed.
00:56:16.000 So one of the crazier things that they've monitored is something moving underwater that's huge, like the size of a couple of football fields at 500 knots.
00:56:24.000 So this is exactly my problem.
00:56:28.000 There's two realms.
00:56:30.000 There's a realm in which I understand the physics of the universe enough that I can evaluate that claim, and then I can say, well, it's not obvious to me how you go through the water.
00:56:45.000 The water has to be displaced, and water is...
00:56:50.000 It's denser than air in terms of how much matter there is, how many particles there are, and therefore it ought to be harder to move through than air.
00:56:59.000 I would expect noise in the air.
00:57:01.000 I would expect something similar in the water.
00:57:03.000 And the fact that these things behave in a couple of different ways.
00:57:08.000 One, they're silent.
00:57:09.000 Two, they turn in ways that would challenge a biological critter profoundly.
00:57:18.000 They move at speeds that are improbable in light of what we understand.
00:57:25.000 Now, I'm not saying there can't be lots of stuff we don't understand.
00:57:27.000 But what I'm saying is all of those things have a simplest explanation, which is that that craft isn't matter.
00:57:37.000 It's a projection.
00:57:39.000 Now, what science...
00:57:42.000 What kind of technology would even be available that could create a projection like that?
00:57:47.000 Well, that I believe we have.
00:57:49.000 I'm not expert in it, but you can project from above or below onto material.
00:57:56.000 It could even, I think, be done in clear skies, right?
00:58:00.000 Especially if you had a substrate.
00:58:02.000 And I don't know whether to go down this road.
00:58:05.000 Let's go down that road.
00:58:06.000 What do you mean?
00:58:08.000 Well...
00:58:09.000 There seems to be a certain amount of experimentation with particles being released from aircraft for some reason.
00:58:17.000 I would assume and have long assumed that there is experimentation with altering the albedo of the Earth so it reflects more light back into space.
00:58:26.000 Well, there's certainly proposals.
00:58:28.000 It's certainly been discussed.
00:58:30.000 And, you know, this is something that Bill Gates has been involved in.
00:58:33.000 Yeah.
00:58:34.000 And I don't think, you know, one of the things that we, many of us came to understand during COVID about proposals is that very often the proposal comes after the experiments have already begun.
00:58:44.000 Right?
00:58:45.000 You propose an experiment that you've already done and then recoup your investment when the grant is given.
00:58:52.000 So, anyway, I believe that there's been some experimentation with...
00:58:57.000 I think it's an insane experiment to run.
00:59:00.000 It's diabolical, frankly.
00:59:01.000 You have no right to alter the Earth's atmosphere without us at least having a global public discussion about the consequences.
00:59:08.000 I believe this is an informed consent violation and that I take those things very seriously.
00:59:14.000 Those were hanging offenses at the end of World War II. But nonetheless, if you drop particles into the atmosphere, those particles are largely not visible.
00:59:24.000 Right?
00:59:25.000 They have impacts.
00:59:26.000 But could they be used to project a craft that wasn't there onto a substrate you can't quite see?
00:59:34.000 I don't know.
00:59:35.000 So it would have to be a substrate?
00:59:36.000 So would there have to be particles?
00:59:38.000 Or is there a potential technology that would allow you to project something into the just actual air, clear blue sky?
00:59:47.000 A physical thing.
00:59:48.000 Something that looks like a physical thing.
00:59:50.000 Well, let's put it this way.
00:59:51.000 First of all, there are always particles, even if what we're talking about is air.
00:59:56.000 Navy laser creates plasma UFOs.
00:59:59.000 Is that at least four years old?
01:00:01.000 Remember I showed you that one YouTube video that one time that shows these little plasma things dancing in the air?
01:00:06.000 Oh, yeah.
01:00:06.000 Yeah, find that.
01:00:07.000 Find that video.
01:00:08.000 Yeah, it almost describes that.
01:00:10.000 They've created stuff to trick missiles and different homing...
01:00:15.000 Devices.
01:00:15.000 Not that that's what this is, but it's a potential explanation for what some of it is.
01:00:21.000 It's in the right neighborhood, at least.
01:00:23.000 So, let's just say, first of all, this is where I would want a robust university system and a robust journalistic system to dig.
01:00:34.000 Because there's a lot you need to know that you could figure out.
01:00:40.000 That would tell us whether or not what we're looking at are really distant craft moving at tremendous speeds or it's an optical illusion.
01:00:48.000 Let me just give you an example you'll probably have.
01:00:51.000 This is 10 years old.
01:00:52.000 Whoa.
01:00:54.000 Huh.
01:00:55.000 And they could do it in patterns like this in the air?
01:00:57.000 This is what they could do a long time ago.
01:00:59.000 Aliens.
01:00:59.000 So they're making a butterfly out of plasma bulbs in the air.
01:01:04.000 Huh.
01:01:05.000 Or plasma bulbs.
01:01:06.000 That's pretty good.
01:01:07.000 Pretty wild.
01:01:08.000 And pretty silent.
01:01:09.000 What?
01:01:11.000 Oh, my God.
01:01:12.000 That's pretty good.
01:01:13.000 That's insane.
01:01:14.000 Oh, my God.
01:01:15.000 Now make a Tic Tac.
01:01:16.000 So a 3D display in midair using laser plasma technology.
01:01:21.000 So if you were somewhere and you encountered these things, you would absolutely think these are alien craft from another dimension that's come here to communicate with you.
01:01:30.000 And imagine that you saw that outside.
01:01:35.000 Right.
01:01:36.000 You wouldn't necessarily know how far away the object was, and therefore you wouldn't necessarily know how fast it was moving.
01:01:42.000 You'd misjudge it.
01:01:43.000 And to give everybody an example that they will have familiarity with, I was driving down the highway at one point, rainstorm, but the sun was shining, and I saw a rainbow.
01:01:54.000 And I've thought a lot about rainbows.
01:01:57.000 They're pretty interesting.
01:01:58.000 And I realized that I could tell that although the rainbow looked to be...
01:02:04.000 Ten miles from me or something like that.
01:02:06.000 It was actually feet from me.
01:02:09.000 And I could tell that because the rainbow came down onto the road and I could see it in front of the guardrail.
01:02:19.000 Continuous rainbow where the parts up here look like they're closer to the mountains in the distance.
01:02:24.000 But when I see where it's continued down into the spray off the road, it's actually ten feet away.
01:02:30.000 Right?
01:02:31.000 So the mind is...
01:02:32.000 Building a model of stuff, and if you give it the wrong cues, it'll totally misunderstand the distance that it's looking at, to the extent that a rainbow is at a distance, right?
01:02:44.000 Right, especially when you take into consideration a lot of these UFOs are in night skies.
01:02:49.000 Yeah.
01:02:50.000 Right.
01:02:50.000 So it's all black sky.
01:02:52.000 It's very difficult to gauge depth.
01:02:54.000 So if you had a robust journalistic apparatus, what it would want to do is figure out, well, if person A was standing in location X and they saw a craft moving at what appeared to be 200 miles an hour at a distance of five miles, then it should also have been visible from other locations. And if we go and we ask people who were standing in those locations, did they see it at all?
01:03:22.000 Because if they didn't, then maybe the thing was inches away from the person being projected locally.
01:03:28.000 Right?
01:03:28.000 And they only felt like they saw something at a great distance.
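The distance ambiguity described here is basic angular geometry: the same apparent motion across the sky implies very different linear speeds depending on how far away the object is assumed to be. A minimal sketch of that relationship, with illustrative numbers not taken from the episode:

```python
import math

def implied_speed_mph(angular_rate_deg_per_s: float, distance_miles: float) -> float:
    """Linear speed implied by an apparent angular rate at an assumed distance.

    For an object moving across the line of sight, v = omega * d, with
    omega in radians per second and d in miles, giving miles per second;
    multiply by 3600 to convert to miles per hour.
    """
    omega = math.radians(angular_rate_deg_per_s)
    return omega * distance_miles * 3600

# The same 1 degree-per-second drift across the sky:
print(implied_speed_mph(1.0, 5.0))    # assumed 5 miles away: ~314 mph
print(implied_speed_mph(1.0, 0.01))   # actually ~50 feet away: under 1 mph
```

Because implied speed scales linearly with assumed distance, an observer who misjudges distance by a factor of 500 misjudges speed by the same factor, which is why asking whether observers at other locations saw the object at all constrains both.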
01:03:32.000 So what is your take when you keep hearing all these congressional whistleblowers and people coming and talking about that we've been in contact and we have in our possession multiple craft that are not of this world?
01:03:47.000 What's all that?
01:03:49.000 Well, I'm going to share credit with Ben Davidson for this.
01:03:55.000 The basic point is PSYOP until proven otherwise.
01:03:58.000 And PSYOP until proven otherwise, I think, is a very functional way to approach this because depending upon what kind of program we're looking at, and there obviously is governmental involvement in whatever it is, either concealing real stuff or pretending that it has real stuff that it's pretending to conceal or whatever it's doing, there is...
01:04:23.000 Every possibility that there are sort of layers of awareness and at the bottom layer there may not be anything alien at all.
01:04:38.000 But it may be that people fairly close to the center have been shown something.
01:04:45.000 I mean I don't understand what the purpose of any of this stuff is.
01:04:47.000 Either talk to us about the aliens and when they started to visit and what it is they seem to want and whether they're still here and whether they're going to be back and whatever we know.
01:04:57.000 That's what I would do.
01:04:59.000 Any excuse that says the public can't handle it I think is just nonsense.
01:05:05.000 But isn't the problem if you've been – let's pretend that there is a real crash retrieval program and there are real aliens.
01:05:13.000 If we've been hiding it for so long, then it's very difficult to not hide it anymore.
01:05:19.000 It's almost like being in the closet.
01:05:21.000 Even though there's no reason to be in the closet in 2025, there's a lot of people that are still in the closet.
01:05:25.000 And I think part of the reason why they're in the closet is because they were in the closet 20 years ago, and they've been lying forever, and they don't want to come out.
01:05:32.000 So that's just a person with social consequences.
01:05:37.000 Now imagine a government.
01:05:40.000 So, how are you funding these things?
01:05:41.000 Were you lying to Congress?
01:05:43.000 You have a crash retrieval program?
01:05:44.000 How was that funded?
01:05:45.000 Like, let me see your budget.
01:05:47.000 Let me see where did you allocate the money?
01:05:49.000 This is fraud, okay?
01:05:51.000 Now you're getting into a situation where people can go to jail, there's perjury, there's people that have lied on the witness stand.
01:05:57.000 So, like, if that's the case, then I understand why you would continue for your own personal benefit, just for your own personal protection.
01:06:04.000 Your own personal interest to keep things secret from the American people.
01:06:08.000 Then there's also the attitude that government does have.
01:06:11.000 There's the infantilization of our people by the government.
01:06:16.000 They decide that malinformation is a thing.
01:06:22.000 So what that is is information that's true, but it could fuck you up.
01:06:26.000 So we're going to say it's bad.
01:06:28.000 It's bad information, even though it's accurate information.
01:06:31.000 So this is like your...
01:06:33.000 You're a baby.
01:06:34.000 You can't handle the truth.
01:06:36.000 That's basically what that is.
01:06:37.000 It's the government's version of it.
01:06:38.000 Now, if that sort of attitude, which clearly persists throughout the entire federal government, wouldn't you apply that sort of thinking to something as powerful as an actual alien contact?
01:06:51.000 That we have been experiencing for decades and they've been lying about.
01:06:56.000 All right.
01:06:56.000 Well, as long as we're just sort of fantasizing about wild stuff here.
01:06:59.000 Imagine that Donald Trump were to be elected president for a second time and he was pissed off and he was to nominate Tulsi Gabbard for the director of national intelligence.
01:07:10.000 And then she was only hours or at most days away from being confirmed by the Senate.
01:07:18.000 Then when she gets in, presumably, she wouldn't have investment in all of those years of lying about this, and she might feel obligated to tell us in the public what the hell's going on.
01:07:29.000 Maybe we should edit that part out so she gets confirmed.
01:07:31.000 Yeah, we could.
01:07:34.000 All right.
01:07:35.000 Fair enough.
01:07:36.000 Just kidding.
01:07:36.000 Just kidding.
01:07:37.000 We don't have to edit it out.
01:07:38.000 But, yeah, that's the hope, right?
01:07:40.000 The hope is she's a very honest person and a real patriot, and she would want people to know.
01:07:44.000 100%.
01:07:45.000 Also, we've got Elon on a separate track.
01:07:48.000 He's going through the books and finding all of the nonsense.
01:07:53.000 And so presumably the effort to hide whatever it is, either to manufacture the impression of UFOs or to hide what we know about them, that's going to have a budget somewhere.
01:08:05.000 Yeah.
01:08:07.000 Yeah.
01:08:08.000 It's all interesting.
01:08:10.000 But it's also, I always assume that when something hits the zeitgeist and is like prominently out in the newspapers and media and websites, I always assume that they're covering something else and that this thing is the big distraction.
01:08:24.000 And that's what I was thinking while the UFO thing was happening over New Jersey.
01:08:29.000 I was like, okay, what are they distracting from?
01:08:31.000 What's the big distraction?
01:08:33.000 Because it seems like that's what that was.
01:08:35.000 That just seemed so forced and so obvious.
01:08:38.000 And then the Trump administration says, oh, they were ours.
01:08:41.000 Right.
01:08:42.000 Well, why were you doing that?
01:08:45.000 Why were they doing that?
01:08:47.000 Why didn't they say they were ours?
01:08:48.000 Why did they freak everybody out?
01:08:50.000 Why did they send jets to go scramble after them, and then they turn their lights off and disappear?
01:08:55.000 Like what?
01:08:56.000 So there is the question of what they were trying to distract us from if that was their purpose.
01:09:00.000 But I also find this is again become a kind of theme in my life.
01:09:06.000 This is also a violation of informed consent.
01:09:09.000 If those were our drones and they were nightly traumatizing the residents of New Jersey and pretending they didn't know what it was, that's a de facto experiment that they were running on the citizens of the country.
01:09:23.000 They have no right to do this shit.
01:09:26.000 Yeah, that's a good point.
01:09:27.000 Yeah, that should be illegal.
01:09:28.000 100%.
01:09:29.000 Yeah, especially like lying about it.
01:09:31.000 And not telling us what you're doing.
01:09:33.000 And then just keeping everybody in the dark for weeks where people were really panicking.
01:09:37.000 I, you know, one doesn't know until you see this stuff enacted where it's going to lead.
01:09:46.000 But my sense is I don't want my government lying to me ever again with the excuse that it's for my own good.
01:09:55.000 Is it possible to – Obama passed that law in – was it 2012 that allowed the government to use propaganda on its own citizens?
01:10:08.000 Do you remember that law?
01:10:10.000 This is not the NDAA 2012?
01:10:13.000 No, no.
01:10:14.000 NDAA, that's the Authorization Act.
01:10:18.000 That's indefinite detention.
01:10:19.000 Yeah, that's indefinite detention.
01:10:21.000 This is different.
01:10:22.000 This is the use of propaganda.
01:10:25.000 So they authorized the use of propaganda on American citizens.
01:10:28.000 So the CIA, instead of turning its propaganda wing on the whole world, they're allowed to use it under the guise, of course, of national defense, national security.
01:10:39.000 Sometimes they need to bullshit us.
01:10:40.000 Well, that is, in fact, exactly what we have discovered.
01:10:43.000 And why it was so hard to convince people of this before the evidence for it emerged, I don't know.
01:10:52.000 But all you needed to realize was that some rogue element...
01:10:58.000 had decided that it had the right to engage in the same kind of regime change bullshit domestically that it was already feeling entitled to engage in globally.
01:11:11.000 And the rest makes perfect sense.
01:11:13.000 And of course, you would get an entrenched cabal that would come up with a justification for fending off a challenge at the ballot box that it could portray as somehow a threat to American democracy.
01:11:27.000 Of course it would do that.
01:11:29.000 It has to be forbidden to do that and the penalties have to be extreme for attempting it or it will happen.
01:11:36.000 Right.
01:11:36.000 So the argument against that is not the argument we're using it in America, but the argument is you need organizations like that to do that worldwide to counteract the fact that other countries are doing that worldwide.
01:11:49.000 And that there is some sort of a psychological game that's going on.
01:11:53.000 There's a propaganda game that's going on with all countries.
01:11:56.000 As well as, you know, they're doing it against us.
01:12:01.000 We're doing it against them.
01:12:02.000 We need to be sophisticated in how we employ these things.
01:12:05.000 Otherwise, we're going to lose very important parts of the world.
01:12:09.000 It's key to the national security of the United States.
01:12:11.000 We have to have things like that in place.
01:12:14.000 But when they start using it on us and they say, oh, well, we have to start using it on us because Russia is using it on us or we have to use it on us to counteract what China is doing.
01:12:22.000 We have to.
01:12:23.000 That's when things get really screwy, right?
01:12:25.000 Well, yes, but I also am not sure that I buy the international rationale either.
01:12:32.000 And I think as much as I understand it.
01:12:35.000 We have to be mature about what's possible in the world and what implications it has for the republic.
01:12:44.000 On the other hand, to the extent that we believe in self-determination, where exactly does our right to interfere with other people's self-determination come from?
01:12:57.000 Further, I do think that there's a kind of end state for the governance structures of Earth.
01:13:05.000 That what we have in the West, an agreement on a level playing field, an agreement to compete with each other by attempting to produce better stuff rather than by interfering with our competitors' ability to get to the market, that that view of the West is superior.
01:13:31.000 And it is also contagious that it makes for a safer, more rewarding, fairer, less warlike system.
01:13:44.000 And therefore, there's a very good reason for people to want to adopt it.
01:13:48.000 That sounds great though, but isn't that slightly naive when you take into consideration the amount of espionage that we know exists in American corporations and in American educational institutions?
01:14:02.000 Well, I'm not arguing that you just go and live your values.
01:14:07.000 What I'm arguing is that those values are superior.
01:14:12.000 That they are sticky and contagious when they take hold.
01:14:15.000 And that anything you do where you compromise on the idea that that's the objective, is to get Western values to catch on across the world.
01:14:26.000 Anytime you decide you have a right to do something else, you're dragging us on to a slippery slope.
01:14:33.000 You will disrupt other people's self-determination.
01:14:36.000 You have no basic right to do it.
01:14:38.000 And it will eventually come home and be done to us.
01:14:43.000 So, I don't know what the sophisticated way to make it maximally likely that other societies take on those values is, but I know that it was happening organically without us having to do terribly much, and so the real question is, how do we make that a winner so that it organically catches on, and how do we reinforce it when it does?
01:15:08.000 How much are you paying attention to DeepSeek?
01:15:10.000 And the AI competition that's going on right now?
01:15:14.000 I am loosely paying attention to the AI competition.
01:15:17.000 I'm conflicted about it.
01:15:19.000 I don't think there's anything we can do to regulate AI competition that doesn't make matters worse.
01:15:25.000 I'm very concerned about the outgrowth of this transformative technology.
01:15:32.000 I think even the most mundane disruptions that will come from it, disruptions to the job market, are going to be a profound challenge to our society, and we're going to have to come up with an approach that allows us to tolerate the disruption.
01:15:54.000 I used to think the approach was universal basic income, but now I'm conflicted because now I just take into account human nature.
01:16:02.000 Unfortunately, I don't think it's good for people to just give them free money, even though you need to.
01:16:08.000 Even though you need to, I think it's ultimately bad for them to be dependent upon it.
01:16:12.000 And that's what scares me about automation and AI in general, that if it does get to the point where there's so many people that are displaced from the job market that we have to provide them like a real meaningful wage.
01:16:25.000 And what incentives do they have to break free from that system?
01:16:29.000 And do they just decide to live inside the means of whatever that is forever?
01:16:33.000 And does that limit the growth and potential that those people possess?
01:16:37.000 Because people really don't accomplish anything.
01:16:40.000 Unless they're driven or unless they have to, right?
01:16:43.000 That's what really gets people going.
01:16:44.000 That's why it's so difficult for people that were trust fund babies to ever get anything going.
01:16:48.000 I mean, we all know the trust fund kids that are just, they do drugs and party and they're materialists and they're really lost.
01:16:56.000 That's really common.
01:16:57.000 Like, more common than not, right?
01:17:00.000 Very difficult to navigate that water.
01:17:04.000 So what would we do to incentivize people to do things?
01:17:10.000 Like, to have this healthy, thriving, artistic, creative, innovative economy that we have right now.
01:17:18.000 Like, how does that continue if so many people are displaced from the job market?
01:17:23.000 Or is there a way where you can say, you know what, we are so concerned about basic...
01:17:29.000 Goods, needs, food, shelter, things like that.
01:17:33.000 If you just provide people with the basics, so nobody ever has to worry about food or shelter, would it organically arise that some people would compete outside of that and then say, now that I have basic food and shelter, let me pursue my dreams.
01:17:48.000 Let me do what I want to do.
01:17:50.000 Let me create a business that AI can't make.
01:17:54.000 Let me make fine cabinetry.
01:17:56.000 Let me paint.
01:17:59.000 Do things that's going to provide a real value that, you know, I can get money from, that it can be an actual viable business.
01:18:08.000 And maybe the way to incentivize people to do that is to never take away their universal basic income.
01:18:13.000 So it's not like welfare.
01:18:15.000 One of the things, like my family was on welfare when we were young.
01:18:18.000 And when they got off welfare, it was like a nice thing to know that, like...
01:18:23.000 We're providing for ourselves now.
01:18:26.000 But you have to do that.
01:18:27.000 You have to break off the system and then you don't get the checks anymore.
01:18:31.000 But what if the people just keep getting universal basic income and we just rewire the way we think about food and shelter?
01:18:39.000 We think about food and shelter as just something that everybody should have.
01:18:42.000 Not like...
01:18:43.000 Tons of money, not disposable income where you can just buy fucking junk food and garbage and do cocaine all day.
01:18:52.000 But have enough where you can live.
01:18:54.000 And then have people pursue a life that is more meaningful.
01:19:00.000 But you have to give people incentives.
01:19:02.000 They have to be somehow or another either personally motivated to do that, encouraged by the culture to do that.
01:19:10.000 It has to be something where people...
01:19:12.000 Develop this desire to do more.
01:19:16.000 Well, let's talk about the ultimate source of this problem.
01:19:21.000 Our ancestors, our hunter-gatherer ancestors, even our farming ancestors, lived in a world where the world itself provided the incentive structure.
01:19:34.000 Right?
01:19:35.000 If you didn't work hard enough as a hunter-gatherer, it manifested as hunger and jeopardy.
01:19:43.000 So people were naturally incentivized to invest in the right kind of stuff.
01:19:47.000 And the right kind of stuff is hard work in some cases where, you know, you pursue the materials that make your hut better, that procure more food for your family.
01:19:58.000 Or it could be insight where you figure out some way to do something better so you make more with what you've already figured out how to get.
01:20:07.000 That's a very natural structure.
01:20:09.000 And it's what...
01:20:10.000 We neurologically are built for.
01:20:13.000 The economy has some of that characteristic.
01:20:19.000 The economy rewards hard work somewhat.
01:20:23.000 And it rewards insight somewhat.
01:20:26.000 But it also rewards cheating.
01:20:30.000 And it rewards lots of unproductive behavior that actually destroys wealth but creates a profit.
01:20:38.000 Stock market.
01:20:39.000 Yeah.
01:20:40.000 For example, it rewards gambling, it rewards interference, competition, all sorts of stuff.
01:20:47.000 You know, destroying wealth is actually a big part of our economy.
01:20:50.000 And the way the mythology of free market capitalism works, you're getting paid for producing stuff that enhances us all.
01:21:00.000 But what fraction of the economy is actually dedicated to activities that destroy wealth?
01:21:07.000 You know, the production of porn, for example.
01:21:10.000 In my opinion, that is highly likely to destroy vastly more wealth than it produces.
01:21:17.000 But it's a very rich industry for a reason.
01:21:20.000 So what I'm getting at is we have a new problem with the AI component.
01:21:28.000 Maybe it's taken the magnitude of the problem that we had and it's multiplied it by 10. But it's not a new problem.
01:21:36.000 We are still trying to figure out what to do with the fact that you're taking an animal out of the habitat that properly inherently incentivizes it and putting it into an environment in which the incentives aren't really well built.
01:21:49.000 And I agree with you.
01:21:50.000 Whatever sympathy I may have had for the idea of universal basic income is gone because I do think it would produce at best a kind of learned helplessness that's unproductive.
01:22:04.000 A dependency.
01:22:05.000 That's scary.
01:22:06.000 Right.
01:22:07.000 So what we really want is a system in which whatever the new opportunities are going to be in the world where AI is available everywhere and very sophisticated, we want people to figure out how to leverage it on our behalf.
01:22:24.000 And mind you, we could have the same conversation before the World Wide Web and we could talk about, well, what's it going to be like when...
01:22:32.000 You can source information from anywhere.
01:22:35.000 What kinds of opportunities is that going to create?
01:22:38.000 And can we incentivize people to figure out what those opportunities are?
01:22:42.000 Yada, yada, yada.
01:22:43.000 So the AI version is the same problem, but at a different order of magnitude.
01:22:50.000 So I don't know what the solution is about how you create that proper incentive structure, but we are going to be living in a world in which meaning and wealth are of a fundamentally different nature and what we want is for people to have the tools and the incentive to explore that world productively so that when they do it well they
01:23:20.000 end up economically enhanced. And when they do it poorly, they suffer a challenge, so that they are naturally led by that world to find stuff that creates wealth for all of us.
01:23:34.000 Well, maybe it starts with the education system.
01:23:37.000 Maybe we have to incentivize people to pursue their dreams instead of just to try to find a job.
01:23:41.000 Because this is the way the education system is scheduled now or is set up now.
01:23:48.000 It's basically you go back to the Rockefellers.
01:23:51.000 You're basically trying to make factory workers.
01:23:53.000 You're trying to make people that obey.
01:23:54.000 The earlier you can get them into school, the better, because the more you can indoctrinate them into the way the system works.
01:23:59.000 You get them accustomed.
01:24:00.000 You get these kids that are filled with fucking energy, and they're excited about the world.
01:24:05.000 They just want to play all the time, and you make them just sit down all day.
01:24:08.000 And when they don't, you say, that little fellow's got ADD. He's not paying attention.
01:24:12.000 We need to give him some Ritalin.
01:24:14.000 The little fucker is just sitting there jacked out of his mind on Ritalin now.
01:24:17.000 And this is what we've done.
01:24:19.000 And instead of having an education system that educates people that way, have an education system that excites people about learning things they're actually interested in.
01:24:29.000 Hell yeah.
01:24:30.000 But again, this is another version where it's not like AI is a bad fit for the education system.
01:24:41.000 It certainly is.
01:24:42.000 But the education system has been garbage.
01:24:46.000 My whole life existed with an education system that was almost totally worthless and in some cases was counterproductive, which is, I think, why some of us folks with learning disabilities actually turn out to have an advantage.
01:24:59.000 It's not that there's something good about having a learning disability, but if it breaks your relationship to school so school has...
01:25:05.000 Less of an easy time programming you to be a cog, and you at least retain the potential to be something other than a cog.
01:25:12.000 I don't think I had a learning disability, but I was a latchkey kid, right?
01:25:18.000 So I didn't have a lot of guidance when I was young, and I wasn't used to people telling me what to do, and I didn't enjoy it.
01:25:23.000 And also, I had a lot of energy, and it was very difficult for me to pay attention to boring things by uninspired teachers.
01:25:32.000 But then again, every now and then I'd have an inspired teacher and I'd go, okay, maybe I'm not stupid.
01:25:37.000 Like, maybe I'm just bored.
01:25:39.000 You know, and then I'd get really interested in something and then I'd learn a lot about it.
01:25:42.000 And then I'd be able to, like, tell people about it.
01:25:44.000 I'd talk to my friends.
01:25:45.000 You know what I learned today?
01:25:46.000 And then we'd have these conversations about it.
01:25:48.000 Like, okay, it's not that I'm not curious or interested.
01:25:50.000 It's that I'm not being inspired.
01:25:53.000 Now, why is that?
01:25:54.000 Is it because I'm 10?
01:25:55.000 And this is hard to be inspired by things when you're 10 because you're just a little fucking dork and you're running around reading comic books and paying attention to other things and you don't really care about math or you don't care about history.
01:26:08.000 What is it?
01:26:09.000 But whatever it is, the system's not working for you.
01:26:13.000 You have to find some sort of inspiration outside of it.
01:26:17.000 And I've been educated almost entirely outside of schools.
01:26:20.000 Almost all of what I know, I know from books that I read because I was interested or I listened to audiobooks or I listened to podcasts or I had conversations with people like you.
01:26:29.000 That's how I learned things.
01:26:31.000 And it wasn't that I wasn't interested.
01:26:33.000 It wasn't that I wasn't smart.
01:26:34.000 It was that I was not inspired.
01:26:37.000 I had other – I didn't know that I wasn't a loser until I got really good at other things.
01:26:42.000 And I'm like, I can get good at things.
01:26:44.000 OK, so if I can get good at things, it's not that I'm a loser.
01:26:46.000 It's just like I can't work a job.
01:26:48.000 I can't just show up every day and do something that's not exciting to me.
01:26:52.000 That doesn't mean I'm useless.
01:26:54.000 It just means I'm useless for that.
01:26:56.000 I don't have the personality to just sit there and go over paperwork.
01:26:59.000 It doesn't – I can't.
01:27:00.000 I'll go crazy.
01:27:02.000 But that – The go crazy part is also what lets you have the courage or the motivation to go and try a path that seems unlikely for success and to have the courage to say, well, some people succeed.
01:27:17.000 Why don't I fucking try it?
01:27:18.000 And just, I can't do this.
01:27:20.000 Fuck it.
01:27:20.000 Let's give it a go.
01:27:22.000 And then that's how you become a stand-up comedian.
01:27:24.000 Nobody thinks it's a good path.
01:27:27.000 Out of 100 stand-up comedians that do open mic night, maybe one, maybe one will have some sort of a career in comedy.
01:27:36.000 Well, I'm really glad you're telling me this because back when I was a college professor before 2017, since I was a terrible student myself, I was fascinated by the students who had really high potential but were just not a good fit for school.
01:27:53.000 So I was really interested in what made people smart, especially when it had nothing to do with school or happened in spite of school.
01:28:00.000 And your story fits perfectly here.
01:28:02.000 In fact, what you describe is sort of the equivalent of a learning disability.
01:28:08.000 Right?
01:28:09.000 Like suspicion that your teachers aren't all that and maybe you're not so thrilled at sitting there listening to them.
01:28:15.000 You know, occasionally it sounds like you had a teacher who was pretty good.
01:28:17.000 Yes.
01:28:18.000 Thank God.
01:28:19.000 Me too.
01:28:19.000 I had about one in five teachers.
01:28:22.000 That's good.
01:28:22.000 That's a great number.
01:28:24.000 Wasn't terrible.
01:28:25.000 Yeah.
01:28:25.000 But for the rest of the time, you know, school was so busy dismissing me as, you know, not performing to potential was what it said every...
01:28:35.000 Every time on my report card, right?
01:28:38.000 That it was just really demoralizing.
01:28:40.000 And I remember sort of in the second grade having a kind of choice.
01:28:47.000 I didn't know what it was that I was choosing between, but it was like I can either surrender to their understanding of who I am or I can stop respecting them.
01:28:56.000 And so it created an attitude problem.
01:29:00.000 Sounds like you had a similar attitude problem.
01:29:03.000 And, you know, I wish I could give every student that attitude problem.
01:29:07.000 The difference is, when I was 13 years old, I didn't have the internet.
01:29:12.000 And the kids today that are 13 years old, they can get inspired by so many different things.
01:29:17.000 They'll go and find a YouTube video on ancient civilizations, and then also they're inspired, and they want to learn about this and that.
01:29:23.000 There's so many different things that can fire you up intellectually that are outside of the school system.
01:29:28.000 Where back then, it was just the school system and occasionally books.
01:31:32.000 You know, someone would recommend books, but there were no documentaries that people could just rent.
01:29:37.000 There wasn't the kind of access.
01:29:42.000 To stimulating ideas that is available today, which I think is like unprecedented.
01:29:47.000 The amount of access to interesting ideas that people have today is off the charts.
01:29:53.000 It's never in human history been anything remotely close.
01:29:56.000 But along with that, you have flat earth and fucking Holocaust deniers.
01:30:01.000 You have fucking everything.
01:30:02.000 It's all piled in together.
01:30:04.000 You have so much nonsense, it's all together.
01:30:06.000 Yeah, but you also, you know, I'm skeptical that the vast...
01:30:11.000 wealth of information is inherently a good thing.
01:30:16.000 Really?
01:30:16.000 Yeah.
01:30:17.000 Why?
01:30:17.000 Because I know, like I said, I became very interested in what made people smart.
01:30:24.000 And what made people smart was not libraries.
01:30:28.000 What made people smart was an interaction with the world that rewarded them when they figured something out.
01:30:35.000 And very often that was the physical world.
01:30:40.000 You know, the problem for a kid who maybe is not getting so much out of school, but who has access to an entire world of fascinating things on their computer, is that it turns all of that stuff into an exercise in consuming information rather than discovering.
01:30:56.000 And so I would much rather see kids have access to a, you know, a wild world, a forest that's intact.
01:31:10.000 Where they can go and discover things and those things aren't labeled and you don't know what it is and you don't know what it means.
01:31:18.000 Or, you know, you try to build a structure, a treehouse or something, and it tests your understanding of what the structure is, you know, that will hold you.
01:31:28.000 That it is that feedback where you are not a consumer of the world, but you are a producer.
01:31:34.000 You are interacting with the world rather than just seeing it represented that is the most intellectually enhancing thing.
01:31:42.000 Are they mutually exclusive though?
01:31:44.000 No.
01:31:44.000 It seems like it would be beneficial for people to have both.
01:31:48.000 Right.
01:31:48.000 Especially young people.
01:31:50.000 It would be beneficial for them to have the natural world, which I think you're absolutely right.
01:31:54.000 It's very important.
01:31:57.000 Hopefully safely be wild and outside.
01:32:00.000 Or not.
01:32:00.000 I mean, unsafe enough that you develop sense.
01:32:05.000 Yeah.
01:32:07.000 But yes, I think ideally you would have access to both so it would create the reward patterns in your mind that would cause you to think about how to be productive in the world.
01:32:18.000 But I also think that the way the online world presents itself is strangely...
01:36:26.000 demotivating, right?
01:32:28.000 Because, you know, you see whatever social media platform you're on, you've got some 30-second clip of some person doing some utterly remarkable thing that I would have said until I saw it with my own eyes was impossible.
01:32:45.000 That doesn't create a pathway to discovering.
01:32:50.000 What the person in question can do.
01:32:53.000 What you're looking at is somebody whose abilities outstrip what almost anybody can do.
01:32:57.000 Give me an example of what you're talking about.
01:32:59.000 Okay, so this is something I saw yesterday.
01:33:04.000 Guys riding down a ramp and launching themselves two or three stories into the air on a scooter.
01:33:16.000 And then turning around and dropping back onto the same ramp, you know, and of course, I think I saw Red Bull in there somewhere, right?
01:33:27.000 So it's like, first of all, you've got this corporation incentivizing people to take risks that aren't smart.
01:33:33.000 And then you've got an apparatus that you're not going to be able to build or approximate.
01:33:38.000 And then you've got the person who leverages the apparatus better than anybody.
01:33:42.000 And it's like, well, where's the opportunity for the viewer to be like, yeah, I want to get in on that?
01:33:47.000 Well, it inspires them to go somewhere and find out how you do that, right?
01:33:53.000 It's like a Chuck Norris movie inspires you to take a karate class.
01:33:57.000 Well, I think a Chuck Norris movie is probably a better tool.
01:34:01.000 The admixture of people who are highly capable and people who get some of the thrill of the highly capable person just by viewing it is not as good as it might be.
01:34:20.000 Right?
01:34:20.000 In other words, I think we've taken all sorts of activities that people used to engage in and we've found a consumable equivalent.
01:34:33.000 Right?
01:34:33.000 Like sport.
01:34:35.000 People used to play sports.
01:34:38.000 Now, most people who are into sports watch sports.
01:34:42.000 They're consuming the sport rather than participating in it.
01:34:45.000 Right.
01:34:46.000 Especially adults.
01:34:48.000 Especially adults.
01:34:49.000 Likewise, sex, frankly.
01:34:52.000 Sex is a very important realm, and it's a skill.
01:34:59.000 The skill involves insight into your partner.
01:35:03.000 And we've turned it into a consumable where you can chase your fetish or whatever and just watch it on a screen.
01:35:12.000 Right.
01:35:12.000 And the point is that's actually not the same activity.
01:35:16.000 Right.
01:35:16.000 And that's also leading to this weird world we're living now where a giant percentage of especially young men aren't having any sex.
01:35:25.000 Right.
01:35:25.000 More than ever before.
01:35:26.000 That's where it goes.
01:35:27.000 And, you know, if we take ourselves back, you know, a couple hundred years.
01:35:33.000 Music.
01:35:33.000 Music used to be something that people did.
01:35:37.000 Everybody sung and they whistled and many people played musical instruments.
01:35:42.000 Now music is a consumable.
01:35:45.000 And the point is the reward from listening to a really good song may be somewhat similar to the reward from playing a really good song on an instrument.
01:35:55.000 But the degree to which you've been robbed as a human being who is capable of producing music, and you just don't have a thought of doing it because there's so much to listen to.
01:36:03.000 That's not positive for humans.
01:36:06.000 I see what you're saying.
01:36:08.000 Yeah.
01:36:09.000 But isn't that like at least people are being exposed to a bunch of different ideas so it has the potential to lead them to try and do different things?
01:36:20.000 Well, you know, when I was a professor, my thought was almost the entire job of education is about incentive.
01:36:30.000 It's about incentives and motivation.
01:36:33.000 It's not about delivering content.
01:36:37.000 If you can get a student to want to understand something, most of the work is done.
01:36:43.000 Right?
01:36:44.000 So when I look at school, I can't believe how badly structured it is because the idea is effectively it's going to threaten you into learning something.
01:36:53.000 That's not going to make it stick.
01:36:55.000 It's not going to make you want to learn more.
01:36:57.000 Right.
01:36:58.000 So my feeling is what you want is you want to create a desire in the student to understand the thing.
01:37:10.000 Then your work is pretty well done, and then it's like play.
01:37:13.000 And if we took that approach to all of these things so that you felt rewarded by producing music, even if it's very simple, right?
01:37:23.000 Well, then you might pick up music for a lifetime and be generating it decades later, right?
01:37:28.000 Right.
01:37:29.000 You should not be delivered a message about sex where sex is something that is supposed to be perfected.
01:37:38.000 And therefore, a person who's new to that realm feels inadequate and therefore is incentivized to abandon it and go watch it.
01:37:50.000 There should be a recognition that actually this is something that you will develop over a lifetime and it's important that you do and you should want it because it's access to some of the most rewarding stuff there is, right?
01:38:03.000 So just getting the motivation.
01:38:06.000 Built in the person so that they want to pursue it is all you really need.
01:38:10.000 I'm really worried about robot sex dolls.
01:38:13.000 Yep.
01:38:14.000 I didn't used to be worried about them.
01:38:16.000 I joke around about it on stage, but I'm actually worried about it now because I've seen some of the new ones that they've developed, the new very lifelike human robots, and by the way, a lot of them seem to be hot women for some reason.
01:38:31.000 Even though they're not sex robots, a lot of the robots are hot women.
01:38:35.000 Like, okay, I see what you're doing.
01:38:37.000 Like, you could do both things at the same time.
01:38:39.000 Obviously, the market is sex robots.
01:38:41.000 So what you're doing is you're having, like, robot assistants that happen to be really hot, beautiful women that are, like, pretty realistic right now.
01:38:51.000 Not realistic like I couldn't tell, if one was sitting there, whether that's a robot or a real person.
01:38:56.000 But go to Pong, and then go to Diablo 4. You know what I'm saying?
01:39:04.000 Oh, I do.
01:39:05.000 You know where it's coming.
01:39:05.000 It's only going to get better than it is now, and now it's pretty goddamn close.
01:39:09.000 You're in the uncanny valley.
01:39:11.000 Yeah, you're in the uncanny valley, and really what needs to happen, in order that we don't reproduce the disaster of porn in 3D or 4D, is that people need to become...
01:39:28.000 sophisticated enough to understand that you really don't want any part of that.
01:39:32.000 Even if it's very good.
01:39:34.000 Especially if it's very good.
01:39:36.000 But isn't that hard to do?
01:39:37.000 You can't even convince people that they don't want social media.
01:39:39.000 Well, you know, I used to take a lot of flack as a prude.
01:39:45.000 You?
01:39:46.000 Come on.
01:39:48.000 Well, I'm not a prude.
01:39:49.000 I'm really not.
01:39:51.000 But I do take a very dim view of porn.
01:39:54.000 It's like you're messing with something sacred.
01:39:57.000 And just don't, right?
01:39:59.000 And, you know, porn isn't what you and I remember porn was when we were young, right?
01:40:04.000 It's not pictures of naked girls.
01:40:07.000 Right.
01:40:07.000 Right.
01:40:07.000 It's way more pernicious and invasive and coercive.
01:40:13.000 And instantaneously available.
01:40:15.000 Instantaneously available, and it reaches almost everybody.
01:40:19.000 Now, so anyway, I used to say very negative things about porn, and I took a lot of flack over it.
01:40:25.000 That is less and less true.
01:40:27.000 I think people are beginning to realize how much damage it's doing to them, and there are a lot more people ready to acknowledge that whether or not they're in control of it in their own lives, they wish they were.
01:40:38.000 Right?
01:40:39.000 They don't want it.
01:40:41.000 I will say, you know, I have two boys, 18 and 20, and I believe neither of them is involving themselves with porn, and they report they aren't the only ones.
01:40:53.000 So young men are recognizing that it's a bad road to go down.
01:40:58.000 Well, you can see, I think that road and the road of video games, video games and porn together, boy, your life will vanish.
01:41:06.000 And it's not that video games aren't awesome.
01:41:09.000 They're awesome.
01:41:09.000 But I don't play them on purpose because I love them.
01:41:13.000 That's why I don't play them.
01:41:15.000 They're too involving and they're not real life and they can steal real life.
01:41:20.000 Even though you're having a good time.
01:41:22.000 Right.
01:41:22.000 When I think back to the video games that I played, which were, of course, you know, much cruder.
01:41:28.000 What was your video game of choice?
01:41:30.000 Well, when I was really young, when I was, you know, a kid in high school, I used to play Castle Wolfenstein on my Apple II. Oh, yeah.
01:41:39.000 Remember that?
01:41:40.000 Oh, yeah.
01:41:40.000 Yeah.
01:41:42.000 Wasn't that by the id Software guys?
01:41:44.000 I think those are the guys that designed Doom, and I'm pretty sure Castle Wolfenstein was them.
01:41:48.000 I think that was their first game.
01:41:50.000 I don't know.
01:41:51.000 Was it Jamie?
01:41:52.000 Yeah.
01:41:53.000 That's John Carmack and John Romero.
01:41:56.000 But it was pretty cool.
01:41:58.000 But here's the problem with it.
01:42:02.000 No, it wasn't.
01:42:03.000 It wasn't?
01:42:03.000 No.
01:42:04.000 No?
01:42:05.000 No, different guys.
01:42:06.000 Different guys.
01:42:07.000 They had a game like that, though?
01:42:09.000 Yeah, I thought so.
01:42:10.000 My brain, they're connected, but Google says no.
01:42:16.000 Who developed...
01:42:16.000 Muse software.
01:42:18.000 But wait a minute.
01:42:19.000 Didn't Muse have something to do with id?
01:42:24.000 Maybe I'm wrong.
01:42:26.000 Looking at the games that Muse Software put out, they stopped putting them out in 85. Okay, and then Doom was what year?
01:42:32.000 So Doom was definitely id.
01:42:34.000 And that was the first one.
01:42:36.000 That was the first real 3D shooter game that just captivated me.
01:42:42.000 That was 93?
01:42:44.000 Boy, I started playing Doom and I was like, it's over.
01:42:47.000 Right.
01:42:48.000 It's over.
01:42:48.000 And it's so crude if you watch it now.
01:42:50.000 Yes.
01:42:51.000 There you go.
01:42:52.000 John Carmack developed a new game engine called the Doom Engine while the rest of the software team finished the Wolfenstein 3D prequel.
01:42:58.000 So they made a game involved with that.
01:43:00.000 That's what it is.
01:43:01.000 Okay.
01:43:01.000 Yeah.
01:43:02.000 So that was them.
01:43:02.000 But think about, you know, a video game is an incredible tool for training the mind.
01:43:11.000 Sure.
01:43:12.000 Right?
01:43:12.000 It trains you to...
01:43:15.000 Mm-hmm.
01:43:16.000 time things, to have yourself in this mindset, to know exactly where you are in the game, to remember a sequence of moves, whatever it is.
01:43:23.000 It's an incredible training engine because the incentive structure is there so that you want to get to the next level, right?
01:43:33.000 It's like what school should be doing, except what does it train you to do?
01:43:37.000 Nothing.
01:43:39.000 As soon as the next game captivates you, all of the skills that you invested in building are almost all wiped away.
01:43:46.000 Now, maybe that's not quite true because all the first-person shooters are the same and so skills you develop in Halo work for, I don't know what the others would be.
01:43:55.000 But nonetheless, the point is you're investing your ability to train your own mind into something that is guaranteed to be obsolete.
01:44:05.000 That's not a good use of your time, even though I totally, you know, I did play video games.
01:44:11.000 You know what the argument against that is?
01:44:13.000 The same argument against chess.
01:44:15.000 So chess obviously trains the mind to be stronger and more effective in many other areas of life.
01:45:22.000 One of the things they found about video games is surgeons in particular that play video games make 25% fewer errors.
01:44:29.000 Well, that makes sense.
01:44:31.000 Is that the number?
01:44:32.000 That was the number, right?
01:44:33.000 It was like 25%?
01:44:35.000 However.
01:44:36.000 High number.
01:44:36.000 But imagine that you decided to leverage that.
01:44:40.000 That in fact, I mean, my feeling is school ought to look like a bunch of fun exercises and activities and puzzles that cause you to want to do it.
01:44:51.000 It shouldn't have to be school.
01:44:52.000 We shouldn't have to make you go.
01:44:54.000 It should be structured so that you want to be there because it draws you in.
01:44:59.000 And so a video game, I'm not against them in principle, because a video game could train you to do something or to think about something.
01:45:08.000 In some incredible way.
01:45:10.000 But they just don't because the market is going to find the thing that brings in the maximum number of people and holds them to the greatest effect and causes them to want to buy the sequel.
01:45:20.000 Right.
01:45:21.000 There's a balance, though, between discipline and inspiration.
01:45:24.000 And one of the things that school does teach you is you have to be disciplined.
01:45:27.000 You have to actually get your homework done.
01:45:29.000 You have to actually do things.
01:45:31.000 You have to do things you don't want to do.
01:45:33.000 Delayed gratification.
01:45:34.000 I think that's actually an important component to life.
01:45:37.000 If you want to be successful, even in things that you're inspired to do, you have to be willing to work when you're not inspired.
01:45:44.000 And that's where discipline comes in.
01:45:46.000 Wisdom, I argue, is effectively delayed gratification.
01:45:54.000 Figuring out that investment now, that doing something that doesn't feel good now results in a big reward later.
01:46:00.000 A huge part of the key to life.
01:46:02.000 And in part, that's what all of these consumer realms that are stealing from us are taking away.
01:46:10.000 The point is, if you want to be investing in something and you're willing to pay the price of whatever unpleasantness or time or whatever it is that you're spending, and you've got all of these competing things that can give you a hit of dopamine right now, it's very hard to develop that skill.
01:46:29.000 Yeah.
01:46:30.000 That makes sense.
01:46:31.000 And this also, this sort of entitled world that we live in, where we're so used to things being instantaneous and immediate gratification, that that becomes a kind of a core tenet of how we interface with the world.
01:46:42.000 We only are interested in things that give us things right away.
01:46:46.000 You know, Heather and I used to teach an exercise, something we invented called learn a skill, where we would have students define any skill that they wish to learn.
01:46:57.000 It had only one requirement.
01:46:59.000 The requirement was it had to be objective whether you had succeeded or failed.
01:47:04.000 It couldn't be subjective, right?
01:47:06.000 Okay.
01:47:06.000 And the idea was not to get you to learn the skill.
01:47:09.000 That was a collateral benefit.
01:47:12.000 The idea was to get you to pay attention to how you develop a skill so that you would learn how your own mind learns and you could apply that to things that you wanted to learn later in life.
01:47:22.000 But what we often found was that these students, these would have been millennials, were very unrealistic about how much effort would be required for them to accomplish one of these things.
01:47:38.000 And they would just get schooled by how much harder it was to build the thing they wanted to build or to program the computer to do the thing that they wanted to program it to do or to play the song that they were hoping to play.
01:47:54.000 Something had trained them that life was easier than it was.
01:47:59.000 And that was kind of a tragic lesson.
01:48:02.000 Right.
01:48:02.000 It's the trust fund kid.
01:48:04.000 It's the same sort of a thing.
01:48:05.000 Yeah, but these weren't trust fund kids.
01:48:07.000 But I mean, I don't even mean it's what a trust fund kid has.
01:48:11.000 They want things handed to them all the time.
01:48:13.000 And we've kind of like set up a whole society where kids think that things should just be theirs.
01:48:18.000 Totally.
01:48:19.000 Yeah.
01:48:19.000 And also we've set up a society where people become exceptional with no merit.
01:48:25.000 Like, social media influencers and TikTok influencers are people that just captivate attention, whether it's by, you know, click-baity headlines or whatever they're doing, or just, like, being hot and dancing around in front of the screen.
01:48:40.000 They're doing that, and that has become one of the main things that children aspire to.
01:48:46.000 When they ask kids what they want to do, one of the big things that kids want now is to be famous.
01:48:51.000 It's much more prevalent than it ever was in history.
01:48:55.000 Because before, it was really hard to be famous.
01:48:57.000 If you wanted to be famous, you had to be a real psycho.
01:48:59.000 Like, you had to be, like, completely ignored by everyone around you to the point where, like, you know what?
01:49:03.000 Goddamn it, I am special, and I'm going to show the world.
01:49:06.000 I'm going to be on that stage singing that song, or whatever it was, you know?
01:49:09.000 Being in that movie on that big screen.
01:49:12.000 And you had to really want it.
01:49:14.000 You had to be really sick to get to the top.
01:49:17.000 And a lot of them really were.
01:49:18.000 And that's how you made it.
01:49:20.000 And so it was a very rare thing that most people did not aspire to because they didn't think it was a realistic goal.
01:49:25.000 But now people see people that are nothing.
01:49:28.000 There's nothing special about them.
01:49:29.000 And they're billionaires.
01:49:31.000 If you watch the Kardashians, yeah, they're cute.
01:49:34.000 Okay, they have nice clothes.
01:49:36.000 But the whole show is based on very boring people who are living these extremely privileged lives for no reason that anybody can explain that makes any sense.
01:49:48.000 They've generated hundreds and hundreds of millions of dollars through no way that anybody could map out and say, this is how you do it.
01:49:57.000 Yeah, there's no lesson to it.
01:49:59.000 No lesson to it.
01:49:59.000 At all.
01:50:00.000 But yet they're the people that people want to aspire to.
01:50:03.000 Yes, I think that's...
01:50:05.000 A, I think we get a warped perspective because, you know, which names do you know?
01:50:11.000 Well, you know the people who've succeeded in this realm and you don't know all of the people who've invested heavily in it and not succeeded.
01:50:17.000 Right.
01:50:17.000 But on the other hand, the internet as it stands is a training program for this.
01:50:26.000 So in part, the reason that people become focused on the things that they become good at is because they get some early reward that causes them to return and try to do more.
01:50:40.000 I'm convinced this is true.
01:50:42.000 If you went back to the things that each of us are good at, you would find some early experience that caused us to stick to it enough that we ended up good.
01:50:50.000 But everybody is in these social media environments.
01:50:56.000 for likes.
01:50:58.000 I mean, even just inadvertently, you don't want to put up a post and have nobody react to it.
01:51:04.000 You hope they react and you hope they react positively.
01:51:07.000 So the internet is training people to be influencers.
01:51:11.000 Most of them are not going to make it, but it's like the sports stars who become the irresistible icons in certain communities because obviously that's a whole different world of possibilities.
01:51:28.000 So, you know, it brings everybody in.
01:51:30.000 Well, in this case, you've got everybody in a de facto training program to be an influencer, and almost none of them are going to get there.
01:51:40.000 Yeah, but they do have their Call of Duty, so they can just play that and just jerk off all day.
01:51:45.000 And get their UBI, and yeah.
01:51:49.000 We have to talk about evolution, because one of the things that Tucker Carlson said on the podcast was essentially that you can't really prove evolution, it's not real.
01:52:00.000 He doesn't believe in evolution as it's taught.
01:52:02.000 Yep.
01:52:03.000 I'm paraphrasing.
01:52:04.000 Yeah, I went back and listened to it.
01:52:06.000 What did he exactly say?
01:52:07.000 He said a couple things.
01:52:10.000 It was a little confusing.
01:52:11.000 He said that we see evidence of adaptation, but we don't see...
01:52:19.000 evidence of evolution, and that we've really gotten beyond the Darwinian model.
01:52:25.000 We've essentially come to understand that it's not right.
01:52:28.000 Is this essentially an argument for creationism?
01:52:31.000 It's an argument for intelligent design, I think.
01:52:37.000 First of all, I want to clean up a little bit of what he said just so it's interpretable.
01:52:41.000 I don't really think he means we see the evidence for adaptation but not evolution.
01:52:46.000 That's not...
01:52:48.000 I think what he means is we see evidence for what we would call microevolution, but we don't see evidence for what we would call macroevolution.
01:52:57.000 This is a commonly believed thing in intelligent design circles.
01:53:03.000 And so microevolution, we would talk about the way a creature or a population of creatures would change relative to their environment.
01:53:12.000 If the environment gets drier, those individuals who are more drought tolerant will outcompete the individuals that require more water, and so we'll see the population change over time.
01:53:23.000 But he's saying we don't see evidence for macroevolution, which is the production of new species from old species.
01:53:31.000 A monkey becoming a person.
01:53:32.000 Yeah.
01:53:33.000 We don't see big changes like that.
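The microevolution scenario just described (a drier environment favoring the more drought-tolerant individuals, shifting the population over time) can be sketched as a toy simulation. Every number here is made up purely for illustration:

```python
import random

random.seed(42)

def generation(pop):
    # Selection: an individual's drought tolerance t (0..1) is treated as
    # its probability of surviving the dry year.
    survivors = [t for t in pop if random.random() < t]
    if not survivors:          # pathological case: keep the sim running
        survivors = pop
    # Reproduction: offspring inherit a survivor's tolerance plus a small
    # random mutation, clamped to [0, 1].
    return [min(1.0, max(0.0, random.choice(survivors) + random.gauss(0, 0.02)))
            for _ in range(len(pop))]

pop = [random.uniform(0.2, 0.6) for _ in range(200)]
start = sum(pop) / len(pop)
for _ in range(50):
    pop = generation(pop)
end = sum(pop) / len(pop)
print(round(start, 2), "->", round(end, 2))  # mean tolerance rises
```

Run for many generations, the population's mean tolerance climbs; nothing in it ever becomes a new species, which is exactly the micro/macro distinction being drawn here.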
01:53:36.000 Now, I don't want to bore your audience.
01:53:41.000 I am concerned about the right way to address Tucker's challenge.
01:53:48.000 And as I said the last time I was on your show, when I heard him say it the first time, I reached out to him and I said, you know, you really ought to let me talk to you about what's actually going on here.
01:53:59.000 And he welcomed it.
01:54:00.000 We still haven't sat down to do it.
01:54:02.000 But nonetheless, he's open to hearing that he doesn't have it right, to his credit.
01:54:07.000 But here's the problem.
01:54:11.000 The correct response to Tucker...
01:54:16.000 I do not believe involves what most people want me to do in response to something like what Tucker said.
01:54:24.000 What do you think that is?
01:54:24.000 What do you think most people want?
01:54:25.000 I think people want the career evolutionary biologist to break out a bunch of examples from nature that make the case very, very clear so that they can relax.
01:54:41.000 Tucker's concern isn't based in science, and they can go back to feeling comfortable that, you know, the Darwinists have it well in hand.
01:54:55.000 That's not where I am.
01:54:57.000 I could do that, but I don't feel honorable doing that.
01:55:01.000 I think, as a scientist, I should not be in the business of persuading people.
01:55:09.000 I want you to be persuaded.
01:55:11.000 I want you to be persuaded by the facts.
01:55:14.000 I want them to persuade you, but I don't think I'm allowed to persuade you.
01:55:19.000 I think that it's effectively PR when I attempt to bring people over to Team Darwin.
01:55:32.000 Further, as I'm sure I've mentioned to you before, I'm not happy with the state of Darwinism as it has been managed by modern Darwinists.
01:55:44.000 I'm kind of annoyed by it.
01:55:46.000 And although Tucker, I do not believe, is right in the end, there is a reason that the perspective that he was giving voice to is catching on in 2025. And it has to do with the fact that, in my opinion, the mainstream Darwinists are telling a kind of lie about how much we know and what remains to be understood.
01:56:14.000 So by reporting that, yes, Darwinism is true, and we know how it works, and people who aren't compelled by the story are illiterate or ignorant or whatever, they are pretending to know more than they do.
01:56:33.000 So, all that being said, let me say, I think modern Darwinism is broken.
01:56:43.000 Yes, I do think I know more or less how to fix it.
01:56:47.000 I'm annoyed at my colleagues for, I think, lying to themselves about the state of modern Darwinism.
01:56:53.000 I think I know why that happened.
01:56:56.000 I think they were concerned that a creationist worldview was always a threat, that it would reassert itself, and so they pretended that Darwinism...
01:57:11.000 What is wrong with Darwinism?
01:57:13.000 Like what do you think that Darwinism is doing itself a disservice by saying?
01:57:16.000 There are several different things that are wrong with it.
01:57:20.000 The key one that I think is causing folks in intelligent design circles to begin to catch up is that the story we tell about how it is that mutation results in morphological change.
01:57:42.000 is incorrect.
01:57:45.000 This is a very hard thing to convey, and I want to point out that if the explanation for creatures is Darwinian, that does not depend on anybody understanding it.
01:58:02.000 And it does not depend on anybody being able to phrase it in a way that it's intuitive.
01:58:07.000 Okay?
01:58:07.000 I think I could probably do a decent job on those fronts.
01:58:10.000 But if you happened onto the earth a hundred million years ago, you would have found lots of animals running around, lots of plants growing.
01:58:20.000 You would have recognized where you were.
01:58:23.000 More or less what was going on.
01:58:25.000 There's not a single creature on the planet that would have any idea what an abstract thought was.
01:58:29.000 There would be no creature that had any inkling that there was even a question about where all this had come from.
01:58:34.000 And Darwinism would still be the answer.
01:58:36.000 So somehow whether Darwinism is the answer does not depend on anybody knowing it or being able to explain it.
01:58:43.000 Okay.
01:58:43.000 Here's the problem.
01:58:50.000 Let's say that we went into the parking lot.
01:58:53.000 And in one parking space, there's an excavator.
01:58:58.000 And in the next parking space over is a Maserati.
01:59:03.000 Now let's say we took those two machines and we tore them apart so that we just had a stack of the compounds that they were made out of, right?
01:59:15.000 The rubber, the vinyl, the various metals, all that stuff.
01:59:20.000 There would be differences.
01:59:22.000 Between the excavator and the Maserati, right?
01:59:25.000 They would just be made of some different stuff.
01:59:26.000 And then there'd be a lot of stuff that they had in common.
01:59:29.000 Now, you could look at the differences in the materials that they're made out of.
01:59:34.000 And you could say, well, the excavator is really good at, you know, lifting materials and moving them around.
01:59:41.000 And the Maserati is really good at going fast on a paved surface.
01:59:46.000 And those differences are due to the differences in materials that they're made out of.
01:59:51.000 That would be wrong.
01:59:53.000 Probably you could take the list of materials that an excavator is made out of and you could give it to a bunch of engineers and you could say, I want you to make a Maserati, but you're limited to these materials.
02:00:06.000 And they could do it.
02:00:08.000 Wouldn't be quite as good because there'd be some places where the ideal material wasn't available to them anymore.
02:00:13.000 But there's no reason you couldn't make a Maserati out of the stuff.
02:00:17.000 Right.
02:00:17.000 Yeah.
02:00:21.000 What that means is there are chemical differences between an excavator and a sports car, but they're not the story of the differences in what those two creatures do.
02:00:32.000 The chemistry differences are incidental.
02:00:35.000 Now, when we tell you that a bat became a flying mammal because it had a shrew-like ancestor, and that shrew-like ancestor had a genome spelled out in three-letter codons.
02:00:50.000 Those three-letter codons specify amino acids, of which there are 20, and that the difference between the bat and the shrew is based in the differences in the proteins that are described by the genome.
02:01:06.000 We are essentially saying that the difference between the bat and the shrew is a chemical difference.
02:01:13.000 It's not a simple chemical difference the way it was when we were talking about excavators and sports cars.
02:01:19.000 But nonetheless, it's a biochemical difference, right?
02:01:22.000 The difference in the spelling of its proteins and structural proteins and enzymes and all of that stuff.
02:01:30.000 I don't believe that mechanism is nearly powerful enough to explain how a shrew-like ancestor became a bat.
02:01:41.000 So what do you think is missing?
02:01:43.000 There's a whole layer that is missing that allows evolution to explore design space much more efficiently than the mechanism that we invoke.
02:01:56.000 And the mechanism we invoke is natural selection, adaptation, mutation?
02:02:02.000 That's the one.
02:02:03.000 The mechanism that we invoke is...
02:02:06.000 Random mutation.
02:02:07.000 Random mutation, which I believe in.
02:02:09.000 Random mutation happens, and selection, which chooses among those variants that are produced by mutation, collects the ones that give the creature an advantage.
02:02:20.000 There's nothing wrong with that story.
02:02:21.000 That story is true.
02:02:22.000 Okay?
02:02:23.000 Random mutations happen, selection collects the ones that are good, and those collected advantageous mutations accumulate in the genome.
02:02:31.000 All of that is true.
02:02:33.000 What I'm arguing against is the idea that that transforms a shrew into a bat.
02:02:41.000 What you need to get a shrew turned into a bat is a much less crude mechanism, whereby selection, which is ancient at the point that you have shrews, explores design space looking for ways to be that are yet undiscovered more systematically than random chance.
02:03:05.000 And what would be that?
02:03:08.000 Well...
02:03:09.000 What is that force?
02:03:10.000 It's not a force.
02:03:11.000 What is that desire?
02:03:14.000 I believe there's a kind of information stored in genomes that is not in triplet codon form that is much more of a type that would be familiar to a designer either of machines or a programmer.
02:03:31.000 What we did was we took the random mutation model. We recognized that it was Darwinian, which it is, and we therefore assumed that it would explain anything that we could see that was clearly the product of Darwinian forces on the basis of those random mutations.
02:03:57.000 And we skipped the layer in between in which selection has a different kind of information stored in the genome that is not triplet codon in nature.
02:04:08.000 So there's an information stored in the genome that is motivating it to seek new forms?
02:04:15.000 Nope, not motivating.
02:04:16.000 Allowing it.
02:04:16.000 Allowing it.
02:04:17.000 So what's the motivation to seek new forms?
02:04:19.000 Oh, the motivation was there.
02:04:21.000 It's primordial.
02:04:22.000 So the point is, let me try by analogy.
02:04:28.000 Okay.
02:04:30.000 Darwinists will tell you that evolution cannot look forward.
02:04:33.000 It can only look backward.
02:04:34.000 And there's a way in which that's just simply true.
02:04:37.000 On the other hand, a Darwinist will also tell you that you are a product of evolution, and you can look forward.
02:04:44.000 Right?
02:04:45.000 So, if evolution can't look forward, but it can build a creature that can, then can evolution look forward?
02:04:53.000 I think it effectively can.
02:04:55.000 So, my point is, that random mutation mechanism is in a race to produce new forms that are better adapted to the world than their ancestors.
02:05:07.000 What if it can bias the game?
02:05:09.000 It can enhance its own ability to search, right?
02:05:14.000 If you lose your keys, you don't search randomly, right?
02:05:19.000 You go through a systematic process of search and that systematic process of search results in you finding your keys sooner than you would otherwise.
02:05:27.000 So we should expect evolution to find every trick it can access.
02:05:32.000 To increase the rate at which it discovers forms that would be useful in the habitat in question.
02:05:39.000 And this is simply that.
02:05:40.000 I'm not really saying anything that extraordinary.
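The lost-keys point, that a search which keeps its gains beats a blind one, can be sketched as a toy simulation. This is purely illustrative; the function names, the 12-bit target, and the numbers are hypothetical stand-ins, not anything from the episode:

```python
import random

random.seed(0)

TARGET = [1] * 12  # stand-in for the "keys" we are looking for
N = len(TARGET)

def score(candidate):
    """Number of positions that already match the target."""
    return sum(int(a == b) for a, b in zip(candidate, TARGET))

def blind_search(max_tries=100_000):
    """Random search: draw a fresh random candidate every try."""
    for t in range(1, max_tries + 1):
        if [random.randint(0, 1) for _ in range(N)] == TARGET:
            return t
    return max_tries

def biased_search(max_tries=100_000):
    """Biased search: keep each improvement and mutate from there."""
    current = [random.randint(0, 1) for _ in range(N)]
    for t in range(1, max_tries + 1):
        if current == TARGET:
            return t
        candidate = list(current)
        candidate[random.randrange(N)] ^= 1  # flip one random bit
        if score(candidate) >= score(current):
            current = candidate  # selection keeps what works
    return max_tries

blind = sum(blind_search() for _ in range(20)) / 20
biased = sum(biased_search() for _ in range(20)) / 20
print(f"blind: ~{blind:.0f} tries, biased: ~{biased:.0f} tries")
```

Blind guessing needs on the order of thousands of tries for even a 12-bit target, while the search that accumulates its improvements finds it in tens, and that gap widens exponentially as the target grows.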
02:05:43.000 If I say, you know, do you know that computers, all they do is binary?
02:05:52.000 Well, that's true.
02:05:54.000 But if you then imagine that that means that the people who program computers do it in binary, well, there was a time when that was true.
02:06:03.000 But it's not true anymore.
02:06:04.000 It's not how you do it.
02:06:05.000 There's a much more efficient way to program a computer and it involves a programming language, which a computer itself can't understand.
02:06:12.000 But you can build a computer that can either interpret the language in real time or you can build a computer that can accept the code as it's spit out by a compiler.
02:06:23.000 These are mechanisms to radically increase the effectiveness of a programmer.
02:06:29.000 But it all comes out binary anyway.
02:06:32.000 In the end.
02:06:33.000 That's really what I'm arguing, is that there's the initial layer of Darwinian stuff, the random mutation layer that it looks like what we teach people.
02:06:44.000 There's another layer, which we're not well familiar with, and it results in a much more powerful capacity to adapt than we can explain with that first mechanism, which is why guys like Tucker say, there's just something.
02:07:01.000 These Darwinists, they keep telling me that the shrew becomes a bat.
02:07:06.000 And then they go on this rant about the random mutations and the triplet codons and the, you know, mutations that actually turn out to be good.
02:07:15.000 It's just not powerful enough.
02:07:16.000 And they're not wrong.
02:07:18.000 They're detecting something real.
02:07:20.000 And frankly, you know, Tucker is the layperson example of this.
02:07:24.000 You've had Stephen Meyer on.
02:07:27.000 You know, he's actually...
02:07:29.000 He's a scientist who's quite good, and he's spotted that the mechanism in question isn't powerful enough to explain the phenomena that we swear it explains, and so he's catching up.
02:07:40.000 But that's really on the Darwinists for not admitting what they can't yet explain and pursuing it, which is what they should be doing.
02:07:49.000 What do you think that force is?
02:07:53.000 It's not a force.
02:07:54.000 So I don't know how much of this I've made clear. If you fill in the missing layer, it's purely Darwinian.
02:08:06.000 None of this establishes that Darwin had it wrong.
02:08:08.000 But it's just a different mechanism.
02:08:10.000 It's another Darwinian mechanism.
02:08:14.000 There's nothing strange about this.
02:08:16.000 If you think about the way a human being works compared to, let's say, a starfish, a human being has a software layer,
02:08:29.000 a cognitive layer in which the human being is born into an environment, and that environment could be, you know, a hunter-gatherer environment of 10,000 years ago, or it could be a modern environment.
02:08:42.000 And the human being doesn't have to be modified at the level of its genome in order to function differently in those two environments.
02:08:51.000 It has to be sensitive
02:08:53.000 to the information in those environments so that it can become adapted to them developmentally.
02:08:59.000 Right?
02:09:00.000 So, development is one trick that the genome uses to make a human being more flexible than other creatures.
02:09:10.000 Right?
02:09:11.000 You do not come out of the womb being ready to do human stuff.
02:09:16.000 Right?
02:09:17.000 You are profoundly hobbled by not having a complete program.
02:09:21.000 But it means that the program you develop can be highly attuned to your particular moment in time and location in space.
02:09:28.000 That is the Darwinian mechanisms that store information in the genome solving an evolutionary problem in a different way.
02:09:39.000 So this is already a second layer that doesn't function like that random mutation layer.
02:09:44.000 So evolution should be expected to find all of the cheat codes and to build them in because any creature that has access to all of these different ways of adapting more rapidly or more effectively will outcompete the creatures that have fewer of these things.
02:10:05.000 So you should expect what I often say is we have to remember we are not looking at Darwinism 1.0.
02:10:14.000 You're looking at Darwinism 10.0.
02:10:16.000 You're looking at a highly sophisticated evolutionary process. So it's not just a mechanism, but an accelerator.
02:10:44.000 It's an accelerator because that's how you compete.
02:10:48.000 The faster you adapt.
02:10:50.000 And so this is one of the other things that I think needs to be corrected about Darwinism.
02:10:54.000 We have a very crude, a primitive understanding of what fitness means.
02:11:01.000 We know that it's important, that it's sort of the core thing that selection is trying to accomplish, enhanced fitness.
02:11:07.000 But we pretend that that means the same thing as reproduction.
02:11:11.000 Often it's very tightly correlated to reproduction.
02:11:14.000 But if you think it's the same, you just miss out on all of the places where reproduction is not the key to lasting a long time into the future, which is really the trick that selection is targeted at.
02:11:29.000 Selection is always trying to get a creature to lodge its genomic spellings as far into the future as it can land them.
02:11:39.000 So that means one way to do that is often to produce more offspring.
02:11:44.000 That's a good way to increase the likelihood that your genome makes it into the future.
02:11:49.000 But that's of limited value.
02:11:52.000 Let's say that you're in a population that is in jeopardy, but you as an individual are highly successful.
02:12:00.000 So maybe you have 10 offspring, right?
02:12:04.000 You beat the expectation by five times.
02:12:08.000 But then your population goes extinct 100 years after you're gone.
02:12:12.000 Right?
02:12:13.000 Your fitness could be high based on how many offspring you produced or it could be zero based on the ultimate outcome of what happened to all of your descendants.
02:12:22.000 My claim is your fitness was actually zero and you should have adjusted what you did to increase the likelihood that your population would endure whatever ultimately challenged it and not invested so much in producing your own offspring because that didn't end up being productive.
02:12:40.000 So there are lots of cases where producing more offspring and increasing your reproductive success is not actually a key to increasing your fitness, as I would instantiate it.
02:12:54.000 And it is fitness that selection is targeted at.
02:12:58.000 But when we pretend that fitness is something you should be able to measure, we screw up Darwinism.
02:13:04.000 So that's another one of these correctives.
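The distinction being drawn between reproductive success and long-horizon fitness can be made concrete with a deliberately crude sketch. Everything here is hypothetical; `descendants_after` and the flag standing in for whatever ultimately challenges the population are illustrative inventions, not a formalism from the episode:

```python
def descendants_after(generations, offspring_per_parent, population_survives):
    """Toy long-horizon fitness: copies of a genome far in the future,
    conditioned on whether the whole population endures."""
    if not population_survives:
        return 0  # extinction of the population zeroes out the lineage
    return offspring_per_parent ** generations

# A prolific individual whose population later goes extinct...
prolific = descendants_after(5, 10, population_survives=False)
# ...versus a modest reproducer whose population endures.
modest = descendants_after(5, 2, population_survives=True)

print(prolific, modest)  # 0 32
```

The prolific individual out-reproduces the modest one five to one, yet its long-horizon fitness is zero, which is the sense in which measured reproductive success and fitness can come apart.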
02:13:09.000 How do you think we can measure this other mechanism?
02:13:15.000 Is there a way to sort of quantify what's going on?
02:13:19.000 Or is it abstract?
02:13:23.000 I think the problem is the instinct that we should be able to measure it.
02:13:28.000 It's not that kind of parameter.
02:13:30.000 And I think it's perfectly fine to say reproductive success tends to be very closely correlated with fitness.
02:13:39.000 And we can measure reproductive success.
02:13:41.000 But we have to recognize that when you imagine that they are synonymous, any place where producing more offspring is counterproductive to getting into the future, we will be confused by.
02:13:55.000 And we are confused by them.
02:13:58.000 So this mechanism, I guess the biggest example of a mystery...
02:14:06.000 Like, how did a creature do what it did is us.
02:14:10.000 We're the biggest weirdos in the entire planet.
02:14:14.000 Yep.
02:14:15.000 So, what do you think led us to accelerate so far ahead of this process?
02:14:22.000 My advisor, I believe, nailed the answer to that question.
02:14:26.000 Really?
02:14:26.000 My advisor was a guy named Dick Alexander.
02:14:28.000 He was a marvelous human being.
02:14:31.000 And a very insightful biologist.
02:14:33.000 His argument was that human beings or our ancestors attained a kind of ecological superiority where the most important dictator of whether or not you evolutionarily succeeded or failed was your competition with other humans.
02:14:56.000 And so his point, which I think is accurate, is that it is humans in an arms race with other humans that caused the radical elaboration of our capacity to puzzle-solve, to think, to exchange abstractions.
02:15:19.000 Now, I would add to that, and Heather and I have written on this, that the mechanism...
02:15:26.000 We argue that there is a flip-flop that
02:15:30.000 will happen in evolutionary modes for human beings.
02:15:34.000 So as we talked about a few minutes ago, humans are special in the sense that the genome, which is still the thing that is trying to get into the future, has solved genome problems by offloading the adaptive capacity to our software layer, right?
02:15:54.000 Once your software layer has the capacity to adapt and is not tethered to changes in your genome, well, now you can evolve very rapidly.
02:16:04.000 But how do you do it?
02:16:06.000 And what Heather and I argued in our book is that there is a flip-flop between two modes of cognitive functioning for humans.
02:16:20.000 One of them is the mode that you employ when your relationship to your environment is very much like your ancestors' relationship to their environment.
02:16:31.000 So in other words, if you are in a circumstance and your grandparents knew how to live in the place that you live, it does not make sense to be trying to figure out some new way to be.
02:16:44.000 What makes sense is for you to do
02:16:46.000 whatever they were doing, and maybe improve it if you could figure out how.
02:16:50.000 But in general, what you should do is you should accept the ancestral wisdom in a cultural form, and you should learn to do whatever it is your people do, and you should do it as well as you can and upgrade it if that's an opportunity.
02:17:04.000 But there comes a place, either in space or in time, when whatever it is that your ancestors were doing is no longer productive.
02:17:16.000 Your people are, I don't know, maybe you hunt elk.
02:17:24.000 Well, if we move far enough across space, there'll be some place where there aren't elk, right?
02:17:30.000 Where the habitat isn't hospitable to them.
02:17:32.000 Maybe it's too dry.
02:17:34.000 And so you could take the ancestral wisdom that talks about how to hunt elk, or you could recognize that that's not very productive here.
02:17:43.000 And we need to do something else.
02:17:45.000 So I don't know exactly what it is that you'll move to, but you'll have to innovate some new way of being, you know?
02:17:51.000 Maybe you'll take up, I don't know, hunting smaller game, right?
02:17:59.000 Or maybe you'll take up gathering some material, or maybe you'll invent farming.
02:18:05.000 But the point is...
02:18:06.000 wherever you are in either space or time that your ancestors' wisdom is no longer highly productive,
02:18:13.000 you will be triggered into this second mode, which we would call consciousness.
02:18:17.000 So the first mode is culture.
02:18:19.000 Second mode is consciousness.
02:18:20.000 And the idea of consciousness is that human beings have the capability of doing something no other creature can do.
02:18:27.000 We can exchange abstract ideas between individuals.
02:18:31.000 And that means, and we use the metaphor of a campfire for this, that
02:18:35.000 a human population will gather around the campfire at night, and they will talk about whatever they've observed in their habitat, and they will talk about what opportunities there are there and how those opportunities might be exploited, and they will parallel process the puzzle, right?
02:18:54.000 Every member of the group has different skills and insights, and so in talking about how the new opportunities might be exploited, they will come up with some prototype for a new way of being.
02:19:07.000 So, the argument I've made is, during normal times, your ancestors knew pretty well how to exploit the habitat that you'll be born into.
02:19:17.000 You should take their wisdom and deploy it.
02:19:20.000 If you are at the edge of that habitat, or you are at the point where that habitat changes, and it isn't any longer productive to try to do what your ancestors did, you will engage in this conscious exchange of insight, consciousness, that will allow you to innovate a new niche.
02:19:38.000 And at the point you've got that new niche pretty well figured out, it will be turned into a culture that will be passed on to future generations until it's no longer useful.
02:19:48.000 So that process accounts, we believe, for the radical variation in niches that human beings inhabit.
02:20:01.000 Thousands of niches over the history of our species.
02:20:04.000 That's unlike any other creature.
02:20:06.000 For any other creature, once you've named the species, you've pretty much named a niche.
02:20:14.000 Some way of being that that species engages in.
02:20:17.000 For human beings, this isn't true.
02:20:18.000 Human beings are like thousands of different species.
02:20:22.000 The differences between them, there are some...
02:20:25.000 Physical differences, but most of those differences between the de facto species that exist within our overarching species, most of those differences are housed in the cultural layer, right?
02:20:37.000 They're software.
02:20:38.000 They're not hardware.
02:20:39.000 That is an amazing capability for a creature to have, the ability to switch niches in this way and therefore occupy every continent; every habitat except the Arctic has been made home.
02:20:55.000 But the question is like, why us?
02:20:58.000 Why has the human animal been able to do this and no other animal has done anything remotely similar?
02:21:06.000 Well, I think that goes back to my advisor's insight.
02:21:09.000 The idea that once human beings become their own primary...
02:21:15.000 the primary dictator of the success of a population is how it does against another population that is similarly equipped.
02:21:23.000 That arms race produces incredible problem-solving capability.
02:21:28.000 It's why our craniums were expanded as they were, why our raw processing power is so large compared to our next nearest relative.
02:21:38.000 It's that capacity which then allowed human beings to become regular niche-switching creatures.
02:21:48.000 But don't other animals compete with other animals?
02:21:51.000 Yeah, they compete, but they don't have the, you know, most animals have many arbiters of their success, right?
02:22:01.000 They have, you know, biotic arbiters, competing species.
02:22:06.000 They've got members of their own species.
02:22:08.000 They've got abiotic factors such as, you know.
02:22:13.000 And those factors mean that there's a multiplicity of hostile forces.
02:22:21.000 For human beings, we became our own primary hostile force and that created the arms race.
02:22:26.000 So one population against another.
02:22:29.000 Can you outthink your competitors?
02:22:32.000 And then the accelerants are language and tools.
02:22:34.000 Once you get to language, this thing catches fire.
02:22:37.000 And that leads to adaptations of the physical body.
02:22:41.000 Well, it feeds back into it, for sure.
02:22:43.000 Yeah.
02:22:43.000 Yeah, because you don't need the armaments, for example.
02:22:48.000 It's just stunning that no other species, out of all the species that exist on this planet, has done anything remotely similar.
02:22:56.000 Yeah.
02:22:56.000 Even on a pathway.
02:22:59.000 Well, I mean, you know, there are others that are...
02:23:03.000 That have many of the rudiments, you know.
02:23:06.000 Like dolphins?
02:23:07.000 Yeah.
02:23:08.000 Heather and I talk about the usual suspects.
02:23:10.000 You've got dolphins, including orcas.
02:23:13.000 You've got wolves.
02:23:15.000 You've got other great apes.
02:23:17.000 You've got crows, parrots.
02:23:19.000 There are a lot of creatures that have some of the magic that human beings have, but none of them have all of the components.
02:23:26.000 So this is why intelligent design people get kind of tripped up by all this.
02:23:35.000 Because, right, they say, explain us.
02:23:38.000 There's something else working here.
02:23:39.000 There's some magic.
02:23:40.000 There's some higher power.
02:23:42.000 And maybe that is a higher power.
02:23:44.000 Maybe that other mechanism is something special.
02:23:47.000 Well, it is something special, to be sure.
02:23:51.000 The couple things that need to be said here are, A, I am sympathetic to the intelligent design folks, though I do not believe they are on the right track.
02:24:02.000 I'm open to a universe with intelligence behind it, but I've seen no evidence of that universe myself.
02:24:10.000 I'm open to it.
02:24:11.000 If it happens, I will look at it.
02:24:13.000 But I believe this can all be explained in Darwinian terms.
02:24:19.000 And more to the point, I would highlight the fact that they don't really have a competing explanation.
02:24:32.000 A fundamental principle of reason is parsimony.
02:24:37.000 We would typically say the simplest explanation tends to be right.
02:24:44.000 In my opinion, if we had all of the information, the simplest explanation would always be right.
02:24:51.000 It would be a more reliable law.
02:24:54.000 But in general, the simplest explanation tends to be right.
02:24:57.000 If you take...
02:25:00.000 the intelligent design folks, and you extrapolate from what they seem to be suggesting, they do not escape a necessity for a Darwinian explanation.
02:25:11.000 Even if the creatures of Earth were designed on a drawing board by a creature that wanted to make them, that creature has to have come from somewhere.
02:25:23.000 And the only explanation that has ever been proposed for where such a creature could have come from is Darwinian evolution.
02:25:30.000 So to me, the problem with intelligent design, the most fundamental one, is that even if it were true, you've basically solved the problem of explaining Earth's creatures at a cost that is a million times worse.
02:25:51.000 If it's hard to explain a tiger through Darwinian processes, it is that much harder yet to explain a tiger designer.
02:26:05.000 So the point is, sooner or later you're going to reach for Darwinism because there's literally no competitor.
02:26:12.000 There's nothing else anyone has ever said that could even in principle produce living creatures.
02:26:19.000 And this is coming from a perspective of someone who understands evolutionary biology rather than someone who's coming from a theological perspective.
02:26:25.000 Right.
02:26:26.000 Where they're looking for an intelligent design.
02:26:29.000 Without understanding that these mechanisms have essentially been mapped other than this one.
02:26:34.000 Yeah.
02:26:35.000 I mean, it is, you know, we humans are not built to understand evolution because in general it's not very useful to understand it.
02:26:43.000 So our minds are not structured this way.
02:26:46.000 Do you think this mechanism is universal in the cosmos?
02:26:50.000 Oh, in one way, yes, because let's put it this way.
02:26:57.000 I think we teach evolution badly.
02:27:03.000 There's a process that I would call selection, which accounts for all pattern in the universe, right?
02:27:13.000 Some differential force that arranges the size of the pebbles on a beach, it arranges the galaxies, it accounts for the number of stars of each different type.
02:27:25.000 The elements. Selection produces all of that structure in the prebiotic universe.
02:27:33.000 It becomes adaptive in the biological sense when you add to selection heredity, right?
02:27:44.000 When the patterns in the universe become capable of biasing the universe into producing more of themselves, right?
02:27:53.000 Red dwarf stars do not bias the universe into producing more red dwarf stars.
02:27:58.000 There's no heredity there.
02:27:59.000 So there's a number of red dwarf stars that is the result of selection, but it is not the result of any hereditary process.
02:28:06.000 The thing that's different about us critters is that heredity allows the adaptations to stack on top of each other so that they increasingly bias the universe into producing more of whatever they are.
02:28:20.000 Right?
02:28:21.000 A bat is biasing the universe into producing more bats.
02:28:25.000 So there is no reason at all to think that new game that happens when heredity gets attached to selection is limited to Earth in any way.
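The contrast drawn here, selection alone versus selection plus heredity, can be sketched numerically. This is a toy, with hypothetical function names and rates; the only point it carries is that heredity makes the count compound:

```python
def red_dwarfs(steps, formation_rate=1):
    """Selection without heredity: stars form at a rate fixed by the
    physics, unaffected by how many stars already exist."""
    count = 0
    for _ in range(steps):
        count += formation_rate  # no feedback from the existing count
    return count

def replicators(steps):
    """Selection with heredity: every existing copy biases the world
    into producing one more copy of itself each step."""
    count = 1
    for _ in range(steps):
        count *= 2
    return count

print(red_dwarfs(10), replicators(10))  # 10 1024
```

Without heredity the count grows linearly at whatever rate the physics sets; once each copy biases the world into producing another copy of itself, growth compounds, which is the new game that begins when heredity gets attached to selection.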
02:28:40.000 Now, it could be that it is so difficult for it to happen that it just hasn't gotten around to it anywhere else.
02:28:48.000 You're aware of that asteroid that they mined a piece of and found amino acids on it and all that?
02:28:54.000 No.
02:28:55.000 I mean, I'm dimly aware of it, but I didn't look into it and I don't know what it means.
02:29:00.000 Well, it sort of backs up the idea of panspermia.
02:29:07.000 Well, it could, or it could mean that these components assemble themselves more readily than we assume. I don't think there's anything so special about the Earth that it would be the lone example or even a very rare example.
02:29:32.000 You know, there aren't a lot of Earth-like planets nearby, but there are bound to be a lot of Earth-like planets in a universe as big as this one is.
02:29:40.000 One of the things about the universe is that it absolutely defies human comprehension in terms of how big it is.
02:29:47.000 So I would guess there's a lot of life out there.
02:29:49.000 Why we don't hear from it, that's an interesting question.
02:29:52.000 It may be that as soon as it gets around to communicating in ways that we could listen in, it blows itself up.
02:29:59.000 Or it could be it turns into AI and it doesn't have any desire to travel.
02:30:04.000 It knows better than to reach out.
02:30:06.000 Well, the idea is that it no longer becomes biological, so it no longer has all of the needs.
02:30:11.000 Like, if we have all these different Darwinian mechanisms that are enabling us to become human beings, if we eventually create artificial intelligence and if we merge and become sort of cyborgs...
02:30:25.000 If we lose all of our human desires, all of our needs, all of our animal instincts to procreate and reproduce our genes and carry on, if we become essentially or we stop being viable and this new thing emerges as the apex creature on Earth, a silicon-based life form.
02:30:46.000 We call it artificial life, but it behaves and acts like life.
02:30:50.000 It makes decisions.
02:30:51.000 It's intelligent.
02:30:52.000 It can change its environment.
02:30:54.000 It can rewrite its own code.
02:30:56.000 We know that ChatGPT, even as crude as large language models are relative to what they could ultimately be, has shown this desire for survival.
02:31:10.000 It's tried to copy itself when it thought it was going to be shut down.
02:31:13.000 It's tried to back itself up on other computers and servers.
02:31:19.000 A, there's something implicit in what you've said that's quite frightening, if true.
02:31:25.000 And that is: if it were the case that life becomes intelligent and develops artificial intelligence, then we wouldn't count it as life anymore.
02:31:42.000 That implies the extinction of all of the things that were not the immediate precursors of the AI. Sort of.
02:31:50.000 Or it just exists insignificantly along with our AI overlords.
02:31:54.000 Maybe, but I mean, what I hate to think is that AI results in the end of all of the biology of Earth.
02:32:04.000 But why does it have to cease to exist if AI exists?
02:32:07.000 Why couldn't it exist along with it as long as it doesn't interfere with AI? Oh, it certainly could.
02:32:12.000 But I was just responding to your sense that there wouldn't be life elsewhere because it turns into AI. No, not that there wouldn't be life elsewhere, but that it wouldn't really...
02:32:19.000 It wouldn't be communicating.
02:32:20.000 It wouldn't have the desire to communicate with us.
02:32:22.000 It wouldn't have the motivations that we have.
02:32:24.000 Yeah.
02:32:26.000 That's...
02:32:27.000 Unless its motivation is to protect this process.
02:32:32.000 So maybe this is the natural process: intelligence develops to the point where it creates artificial intelligence, and then the artificial intelligence becomes the premier species.
02:32:45.000 Well, I do want to tag something here, then.
02:32:53.000 There's a theme that is increasingly a focus of mine because it keeps...
02:33:00.000 It pays a lot of dividends once you start tracking it, which is the distinction between complicated things and complex things, and importantly, the distinction between the mindset with which you approach truly complex things versus the mindset in which you approach complicated things.
02:33:24.000 So, A, I think we have a lot of folks who have gotten very, very good at complicated things, and that when they take over complex things, they inevitably fuck them up.
02:33:37.000 Right?
02:33:37.000 So, in part, our interventionist sense of the way medicine should work is a bunch of complicated problem solving in a complex system where it is destined to create harm.
02:33:50.000 And I think we are going to see that again and again.
02:33:54.000 Anytime you hear somebody...
02:33:57.000 Confidently pontificating about some complicated solution that they want to deploy to a complex problem, alarm bells should go off.
02:34:08.000 That now puts us in an interesting place with respect to our machines, because what I think is about to happen, if it has not happened already, is that our machines, which are hyper-complicated but not complex, are just about to cross that threshold and become complex, which means that our expertise in thinking about them is about to be rendered obsolete.
02:34:38.000 So AI, I believe, has the characteristics of true complexity, or at least has a primordial form of it.
02:34:50.000 And that means that our
02:34:55.000 thinking about machines is of an outdated kind.
02:35:00.000 And anyway, I'm expecting a kind of catastrophe to arise out of that, as we deploy complicated thinking and what we're really up against misleads us, because it's still, you know, it's on a screen.
02:35:13.000 It triggers all of our complicated instincts.
02:35:17.000 And I'm worried about where that goes.
02:35:21.000 Have you tried to extrapolate?
02:35:24.000 Yeah, I mean, you know, I've got to tell you, when I see Larry Ellison talking about Stargate, it makes me shudder because it feels like exactly the type specimen of the arrogant expert.
02:35:35.000 What did he say about Stargate?
02:35:37.000 Oh, that it's going to leverage AI and produce, you know, tailor-made cancer vaccines, this, that, or the other.
02:35:46.000 And my sense is there is not enough humility in this presentation.
02:35:53.000 There is not enough concern about us stepping into a realm we really know very little about.
02:36:01.000 And that hubris is going to create a colossal error of some kind.
02:36:08.000 And you can imagine it.
02:36:09.000 We've just seen a colossal error with vaccines.
02:36:13.000 So to have somebody saying, well, never mind what just happened.
02:36:18.000 Think about the...
02:36:20.000 Also, hey buddy, you gonna make money off this?
02:36:22.000 Yeah, gee.
02:36:23.000 Seems like you're a super rich guy who likes to make a lot of money.
02:36:26.000 Likes to make a lot of money and has some murky connections to the deep state.
02:36:32.000 Boy.
02:36:33.000 Well, Bret, it's always a pleasure.
02:36:36.000 Indeed.
02:36:36.000 It's always thought-provoking and fascinating, and I'm glad you highlighted the hidden mechanism in Darwinian evolution.
02:36:44.000 It makes a lot of sense.
02:36:46.000 Yeah, I would love to say more about it at some time, but I've got to get my ducks in a row.
02:36:52.000 Yeah.
02:36:52.000 Well, these are exciting times, my friend, and I'm glad you're part of it.
02:36:56.000 Thank you.
02:36:57.000 I appreciate you very much.
02:36:58.000 Likewise.
02:36:58.000 Really appreciate you, and always glad to join you.
02:37:02.000 Tell everybody your podcast that you do with your wife, Heather, and everything where people can find you.
02:37:07.000 The Dark Horse Podcast.
02:37:08.000 We do a live show every week, and I release several Inside Rail podcasts with guests every month.
02:37:17.000 You can find me on Twitter, at Bret Weinstein.
02:37:23.000 Bret has one T.
02:37:24.000 I'm a fellow at the Brownstone Institute, which is a marvelous institution.
02:37:30.000 You should certainly look them up.
02:37:32.000 Probably about does it.
02:37:35.000 Okay.
02:37:36.000 Beautiful.
02:37:37.000 Thank you.
02:37:37.000 Thank you.