TRIGGERnometry - November 27, 2024


Trump Will Renegotiate the World - Eric Weinstein


Episode Stats

Length

1 hour and 12 minutes

Words per Minute

162.1

Word Count

11,732

Sentence Count

808

Misogynist Sentences

8

Hate Speech Sentences

27


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

The election is over and Donald Trump is now the president of the United States. Eric Weinstein joins me to talk about the implications of this momentous election result and what it means for the future of the country and the world.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:01.000 What's really exciting to me is, we don't know if this is going to be a disaster or
00:00:06.160 whether this is going to be a new golden age.
00:00:09.000 He's really the most anomalous politician we've ever had.
00:00:12.560 And I think that a lot of the discomfort comes from people trying to put him in one box or
00:00:16.720 another.
00:00:17.720 He just won't fit.
00:00:18.720 So what's your concern?
00:00:20.140 This is really dangerous stuff.
00:00:21.880 What he's about to do, in my opinion, is renegotiate the world.
00:00:25.980 We needed to shake this up for sure, but this is going up to 11.
00:00:30.900 The Cold War is forever.
00:00:32.400 I don't think that the level of imagination needed to imagine nukes is very high anymore.
00:00:38.860 Eric Weinstein, welcome back.
00:00:41.860 Good to be with you guys.
00:00:43.260 It's good to have you.
00:00:44.260 Listen, the election is over.
00:00:46.460 It's happened.
00:00:47.520 Trump is the president.
00:00:48.520 He's got a clear mandate to govern.
00:00:52.060 You were somebody who took a lot of flack in the run up to the election from people
00:00:56.360 who felt that you were, I don't know, fence sitting or whatever the term might be.
00:01:00.440 We're not going to start there.
00:01:01.660 That's ridiculous.
00:01:02.660 We can get to that.
00:01:03.660 What we should be talking about, first of all, is that this is a stunning result.
00:01:06.940 It is.
00:01:08.080 And it's because that just allows the trolls to determine the narrative.
00:01:15.580 We have a new situation that has not occurred probably since Ronald Reagan.
00:01:22.400 And I think this is a dramatic moment.
00:01:25.020 And I'm just, I guess I'm mostly thinking about all the things that can happen now as
00:01:30.660 a guy who was trying to do all sorts of policy adjustments from like the 1980s through about,
00:01:38.260 I don't know, the second Obama term.
00:01:42.020 And so what's really exciting to me is we don't know if this is going to be a disaster
00:01:48.660 or whether this is going to be a new golden age.
00:01:50.780 It's kind of huge variance.
00:01:51.780 Well, this is what I was going to ask you because, and it's somewhat a confession for me,
00:01:56.780 like what I was going to ask you really is, are you unhappy with the result?
00:02:02.060 I don't think it's the right question.
00:02:03.660 I mean, I think that the idea is that this is a pretty desperate situation and it could
00:02:10.020 go horribly wrong or amazingly right.
00:02:13.000 And I think the wide variance is what's confusing people.
00:02:15.880 Okay.
00:02:16.380 That's very interesting.
00:02:17.380 So a lot of people just feel like finally the, you know, we just had Rob Schneider on and
00:02:23.860 his, his line was the left went so far to the left, they left the country behind.
00:02:28.460 So a lot of people feel, well, that's kind of, we've arrested that slide, right?
00:02:33.640 And therefore a lot of people are happy.
00:02:35.260 Even people who were in the middle like us, but you are concerned.
00:02:40.000 Yeah.
00:02:41.000 I'm very concerned.
00:02:43.000 Why?
00:02:44.000 Um, because you're, you're, you're basically breaking, um, a lot of structure.
00:02:51.360 That's what I think is about to happen and that could go horribly wrong.
00:02:54.640 Um, but a lot of that structure was really diseased.
00:02:58.720 And so as a result, you know, you have a load bearing wall and you've got a contractor who's
00:03:02.580 got a, a bold plan to, you know, knock out load bearing walls.
00:03:06.200 You don't know whether they're capable of doing it and the whole apartment building is
00:03:09.200 going to come down or whether finally, uh, we're going to get a solid foundation.
00:03:13.200 So I think, I think people just aren't understanding that this is not the continuation.
00:03:20.200 This is first of all, not the first Trump administration where I knew the error he was going to make.
00:03:25.820 He thought he was going to be governing and you can't govern in Washington unless you have
00:03:30.020 a giant team and there are only two giant teams and Trumpism didn't exist.
00:03:34.720 So it wasn't like Trump could hire a bunch of Trumpists because Trump just made stuff up
00:03:40.340 that was totally idiosyncratic and of the moment.
00:03:43.160 And he's the only person who can do Trump, right?
00:03:46.520 It's just this completely erratic drunken boxing routine.
00:03:49.340 Uh, Elon is another version of this.
00:03:51.340 He just constantly comes up with new weird things and you never know what he's going to
00:03:55.000 do next.
00:03:56.000 That's a terrible, um, situation if you have to hire 10,000 people to run a town that runs
00:04:03.100 the country that, you know, influences the world.
00:04:06.120 I think this time around he knows that, but you know, Trump was the only president, according
00:04:12.060 to my research, who's never held a government or military office, um, previous to getting into
00:04:19.960 the oval.
00:04:20.960 So he's really the most anomalous politician we've ever had.
00:04:26.000 And I think that a lot of the discomfort comes from people trying to put him in one box or
00:04:31.020 another.
00:04:32.020 He just won't fit.
00:04:33.020 So what's the load bearing wall or what are the load bearing walls?
00:04:36.020 Well, first of all, I think it's the international agreements, NATO, NAFTA, um, you know, trade
00:04:43.760 rounds and anything that basically requires huge coordination with other countries is a, is a problem
00:04:50.800 in a democracy because the democracy can always revisit these things every four years.
00:04:55.840 And so in essence, we have a permanent, uh, foreign policy establishment that plays keep
00:05:01.420 away ball with the American public to try to make sure that you always get two candidates
00:05:05.980 who are going to continue the agreements so that our allies don't think that we're wobbly.
00:05:10.300 Um, and our foes know that we mean business and it's not going to be put at risk every
00:05:16.000 four years with a populist.
00:05:17.460 And Donald Trump, uh, was particularly hated in 2016 for different reasons.
00:05:22.580 The reason Washington freaked out about him, in my opinion, was that nobody knew whether
00:05:27.880 he was going to continue the agreements.
00:05:29.400 Do you think that the election of Trump is in many ways a sign that America is in crisis,
00:05:39.580 that the ordinary party, the ordinary ways of doing things simply don't work.
00:05:46.540 So we need to go to Trump as a, almost as a last ditch resort.
00:05:52.800 It's a good question.
00:05:53.940 I don't know how to answer it exactly.
00:05:55.420 Clearly, if you go out into the street, um, even in a place like Los Angeles, you'll always
00:06:00.240 hear people say, you know, California has fallen.
00:06:02.800 Los Angeles is, is history.
00:06:05.240 It's still mostly Los Angeles.
00:06:07.140 Uh, the crisis in general, isn't at that level.
00:06:10.660 Now we've had riots, of course, with George Floyd.
00:06:13.380 Um, the bigger issue is just that we have a stale series of agreements that came off of
00:06:18.900 World War II.
00:06:20.060 And I would say that in a certain sense, the last time the United States of America absolutely
00:06:24.460 existed in the sense that we think of it might've been around World War II, we have a decaying
00:06:29.140 function where we had a lot of coherence, uh, not only among the military, the regular government,
00:06:35.180 the branches of government, but also, uh, pivotal, you know, industry sectors, or let's say
00:06:40.200 communications where America worked as one towards a common goal.
00:06:44.780 So we've been decaying for a very long time, but we had a spectacular win in 1945.
00:06:51.440 That thing is coming to an end.
00:06:54.140 And so what we consider to be normal life is nothing of the kind.
00:06:57.480 The period between 1945 and the present is the most anomalous thing that will never occur
00:07:02.300 again.
00:07:02.900 And it's over.
00:07:04.580 What, why is it the most anomalous thing that will never occur again?
00:07:07.520 Uh, in part because of thermonuclear weapons, which was November of 1952 with Ivy Mike in
00:07:14.820 the Pacific, that was a big deal.
00:07:16.960 And because, uh, the size, sorry, because of the size of the sword of Damocles, uh, it confuses
00:07:25.080 the Steven Pinker types into thinking, wow, this is just a marvelously peaceful era, but
00:07:29.600 everything is potential destruction.
00:07:31.380 So everybody was very well behaved now in 1962, uh, or maybe it was 1963, we had the test
00:07:37.360 ban treaty.
00:07:38.820 And so people haven't been exploding these fantastic weapons, uh, in ways that people can
00:07:44.200 see and feel them for a very long time.
00:07:46.580 And as a result, we don't really, we will for a while, we didn't think that nuclear weapons
00:07:51.220 were a part of our world.
00:07:52.600 We thought that that was, uh, you know, banished in the early 1990s with the demise of the cold
00:07:56.520 war.
00:07:57.360 So what's changed?
00:07:58.420 You're saying it's over, but I don't see people using nuclear weapons.
00:08:01.380 Um, I don't think that the level of imagination needed to imagine nukes, uh, is very high
00:08:10.800 anymore.
00:08:11.440 For some reason, the American mind got quite stupid when we declared victory and a peace
00:08:16.880 dividend, and Bill Clinton said a lot of things that weren't true.
00:08:20.420 Um, the, the cold war is forever.
00:08:22.420 There is no end of the cold war.
00:08:24.760 It may become a multipolar cold war.
00:08:27.120 It might be a safer cold war.
00:08:28.940 But if you look at, you know, the, uh, bulletin of the atomic scientists, the doomsday clock,
00:08:33.360 the last time I looked, it was the closest to midnight it had ever been at 90 seconds.
00:08:38.420 Um, you have a multi-polar situation with, uh, Tehran and Moscow and Kiev and, uh, Jerusalem
00:08:47.840 and potentially, you know, the Taiwan Strait; India-Pakistan could go.
00:08:51.840 You have one giant policeman with a relatively small number of carrier groups.
00:08:57.720 You have alliances that haven't really been tested.
00:09:01.340 Um, I think a lot could happen.
00:09:04.260 I think you have a lot of unskilled players and, uh, you know, very dangerous skilled players
00:09:12.180 are probably safer than nicer unskilled players.
00:09:17.600 And what is the dynamic with Trump?
00:09:19.960 Because you say it puts certain big international things at risk, but is he really going to pull
00:09:26.880 America out of NATO?
00:09:28.060 It seemed to me like he was just using that threat to get the Europeans to pay for themselves,
00:09:32.200 right?
00:09:32.600 Well, that, that's just the thing.
00:09:34.280 His variance, it's the same point again.
00:09:36.400 His variance is high.
00:09:38.840 He knows that it's two different styles.
00:09:42.620 Let's put it that way.
00:09:43.460 Trump's style in business is to become uncertain.
00:09:46.580 You don't know what he's going to do.
00:09:49.100 Um, very few people, you know, like our, our friend, mutual friend, Sam Harris thinks he's
00:09:54.660 an idiot.
00:09:55.280 Now, clearly he isn't.
00:09:56.840 He's got multiple different ways of playing certain games.
00:10:00.400 If you study him carefully, he keeps selecting, um, permutations of the tricks he knows, but
00:10:07.060 it's a, it's a, it's a large combinatorial set of possibilities when he communicates and
00:10:11.660 he's still very effective in trapping people.
00:10:13.940 You know, that thing that he did with the Liz Cheney, um, the issue about, you know, he
00:10:18.800 slips in that hand her a rifle because he's, he's trying to get at the chicken hawk issue,
00:10:22.760 but he knows that the news media will mishear it.
00:10:25.240 And that way, um, they will jump on him and then he will be in a position to play
00:10:30.360 the next move where he corrects them and shows that they're completely non-trustworthy.
00:10:34.360 So I think in part in negotiations, you don't know what he's going to do next in governance.
00:10:40.020 You don't know what he's going to do next.
00:10:42.060 He's very creative, very unpredictable.
00:10:44.560 He's the most interesting politician of our time.
00:10:46.900 So that then comes back to ultimately in that situation, if you think that someone's predictable,
00:10:52.300 I guess most people would seek to judge what that person's likely to do based on their
00:10:58.800 perception of that person's skillset, intention, and character, right?
00:11:02.460 Do you trust this guy to have America's interests in heart or at heart?
00:11:06.860 That, that will be the, the way to assess it, right?
00:11:08.820 I trust that he has America's interests at heart.
00:11:11.000 I don't think for a guy who supposedly has Trump derangement syndrome, I don't think I
00:11:15.760 have the usual variant.
00:11:16.640 I don't think anyone's ever accused you of having Trump derangement syndrome.
00:11:20.320 Well, the problem is, is that everything is quantized now.
00:11:22.940 So we try to snap everyone to grid.
00:11:25.140 So it's like, do you love him or do you hate him?
00:11:27.880 The point is, I very clearly appreciate him.
00:11:31.920 And I don't, I would not trifle with that man.
00:11:34.460 That is a very skilled, very unusual life form.
00:11:38.260 And that the question in my mind isn't, does he love America?
00:11:42.560 I think he, he really cares about legacy and greatness.
00:11:46.400 Um, and I think he really loves his children.
00:11:49.060 I think that's one of the things that you really have to go by is, is that those children
00:11:52.540 are really devoted to their, to their father.
00:11:55.380 And that tells me a lot.
00:11:57.240 I believe that he really loves America.
00:11:59.520 He's never been given his due.
00:12:02.900 So what's your concern?
00:12:05.380 That this is really dangerous stuff.
00:12:09.120 What he's about to do, in my opinion, is, uh, renegotiate the world.
00:12:18.020 One of the things that he, he excels at is if you have a tradition, he's not excited to
00:12:24.520 be the Nth person to carry the tradition on.
00:12:26.780 He's very excited to be the first person to break it.
00:12:29.680 You know, like moving the embassy in Jerusalem, um, throwing out the first pitch was a, was a,
00:12:36.680 an easier one to do.
00:12:38.280 I think he's going to do all sorts of stuff that hasn't, hasn't been done before, particularly
00:12:42.780 with the game theory that he's not seeking reelection.
00:12:45.040 And he knows what happened like the last time around and his, uh, the bench in the democratic
00:12:52.540 party is so weak.
00:12:55.380 These people have not been able to field, um, new recruits of high talent because there's
00:13:03.720 no room to move in that party.
00:13:06.260 I think it's a really good analysis of what you've said, because I think Trump is a natural
00:13:10.620 disruptor.
00:13:11.900 He's a natural disruptor.
00:13:13.500 That's what he does.
00:13:14.420 That's essentially what populism is.
00:13:17.400 But cometh a man, cometh a moment.
00:13:19.640 Don't we need a natural disruptor now because the natural order of things no longer works?
00:13:25.440 That's what we talk about all the time.
00:13:26.800 Do you mean to reverse the order of that?
00:13:28.360 It's usually cometh the hour, cometh the man.
00:13:29.620 Yeah, sorry.
00:13:30.140 No, no, no, because it was interesting to think about it the other way.
00:13:34.060 Yeah.
00:13:34.460 And he, it's a, he, that's what he's, he, that's what he's been put in place to do is
00:13:40.300 to disrupt, is to challenge.
00:13:42.160 The status quo isn't working.
00:13:43.980 We all acknowledge that.
00:13:46.300 So actually what we need is someone like him and we have no other choice.
00:13:51.100 So we need to roll the dice and just see what happens.
00:13:54.400 Well, but you're rolling the dice squared.
00:13:55.900 Elon, for example, is the other person I would associate with, I mean, I guess my, my view
00:14:03.940 of Elon, I've, I've really never met him.
00:14:06.680 I've, I've started minor correspondence with him.
00:14:08.980 Um, I view Elon as being like the only adult on planet earth at the moment.
00:14:15.820 And he's a completely chaotic, like your friend's crazy dad, but definitely dad.
00:14:20.780 So I think Elon, uh, has the long-term view of humanity.
00:14:25.880 He's absolutely brilliant.
00:14:27.920 He's totally chaotic.
00:14:30.300 And you're, it's basically Trump times Elon equals lit.
00:14:35.720 It's going to be lit.
00:14:37.440 It will.
00:14:37.980 Absolutely.
00:14:38.600 But if you look at what's going on in the Middle East, isn't chaos a good thing?
00:14:43.620 Isn't chaos where they can't predict which way they're going to go?
00:14:47.680 Iran can't predict which way America is going to go.
00:14:50.200 Isn't that good?
00:14:51.540 If we look at it as a term of these people are our enemies, we have pandered to them
00:14:56.500 for too long.
00:14:57.540 We need chaos.
00:14:59.420 We need disruption and we need to put them on the back foot.
00:15:06.060 Pretty bold.
00:15:06.980 I, too rich for my blood.
00:15:08.800 I'm out.
00:15:09.160 Uh, I don't know what to say.
00:15:12.320 Okay.
00:15:12.960 The problem is this thing I've talked about before, maybe with you guys, I can't remember:
00:15:16.600 message violence.
00:15:17.600 Message violence is incredibly important.
00:15:20.300 And the way most people learn about message violence is from the Godfather, where the vest
00:15:25.500 is put on the table and then there's a fish in it and nobody knows what it means.
00:15:29.240 Somebody says, oh, it's an old Sicilian message.
00:15:31.240 Luca Brasi sleeps with the fishes.
00:15:32.600 Iran just practiced message violence and it did it in Gaza coming across the border.
00:15:42.600 It did it with Hezbollah.
00:15:44.560 It did it with the Houthis in Yemen.
00:15:46.180 And most importantly, it did it with hypersonic ballistic missiles, killing almost no one in
00:15:55.940 Israel.
00:15:56.740 And what was the, what was the point of that? It was to say, you have no Iron Dome, and Tel
00:16:03.580 Aviv has no Islamic holy sites.
00:16:07.300 It's got really good hummus and falafel in Jaffa, but there is no, what percentage of
00:16:13.400 the earth's Jewish people live in greater Tel Aviv, completely unprotected from a ballistic
00:16:22.640 barrage sent with love from Tehran?
00:16:24.820 I don't know.
00:16:26.240 It's, it's not small.
00:16:27.620 You're talking about, you ever been to war?
00:16:32.040 I haven't.
00:16:32.660 No.
00:16:33.080 Okay.
00:16:34.440 Even our military in the U.S. mostly hasn't been to war.
00:16:38.980 It's been to all sorts of operations.
00:16:40.360 It's been to, uh, you know, conflicts, war is serious stuff.
00:16:49.060 And most of us who are, you know, the world is in the middle of a masculinity crisis.
00:16:54.360 People are talking tough because they're trying to remember what it's like to be male.
00:16:58.520 But I just think we've forgotten about how dire the consequences of a miscalculation are.
00:17:07.240 And when you have so many people who have not learned how to speak violence, like one
00:17:12.620 of the things in message violence is you make the killing that you do do so completely, I
00:17:19.180 don't know, picturesque, disgusting, horrible.
00:17:21.720 Like what ISIS would do with those videos with the 4k production values and all that.
00:17:26.840 That is intended to say a small amount of violence goes the longest distance.
00:17:33.100 And so we don't really think about this stuff very accurately because we don't speak this
00:17:38.140 as our native language.
00:17:39.300 And I just worry that we can say all these things and we have no idea what we're actually
00:17:44.600 viscerally buying when we say, yeah, we've got to shake things up.
00:17:48.460 Yeah.
00:17:48.780 And look, because there is an element of risk, but I think we can all agree that the way
00:17:52.980 that the Democrats handled Afghanistan, the Middle East, wasn't working.
00:17:59.260 I'm with you.
00:17:59.960 So there needs to be something else.
00:18:04.860 Yep.
00:18:05.540 You have me there too.
00:18:06.560 My point is there was a lot of room between the bizarre behavior of the Democrats and business
00:18:14.420 as usual.
00:18:16.080 And this wild improv troupe loaded with talent, but that doesn't have a traditional power base.
00:18:27.380 There's a lot of skill in Washington, D.C., and I think this is something that we don't
00:18:31.620 talk enough about.
00:18:34.140 Washington, D.C. is largely filled with geeks.
00:18:37.960 People who, you know, if you're in the FDA, you really know your nutrition stuff.
00:18:41.960 And, you know, if you're in the Department of Energy, you've got to know about nuclear
00:18:47.560 weapons and blast radii and all this kind of stuff.
00:18:50.140 People, very often, the permanent class in Washington, D.C. is just incredibly skilled
00:18:57.880 and very geeky.
00:19:00.400 And I worry that this new team, I don't know who's coming.
00:19:07.700 I have no idea if they know how to staff this thing and how the political appointees are going
00:19:14.100 to work with the permanent class.
00:19:17.720 So it's exciting.
00:19:19.940 I mean, I think you can probably see from my demeanor, it's super exciting.
00:19:24.800 It's just, it's also terrifying.
00:19:28.020 We needed to shake this up for sure.
00:19:30.540 But this is going up to 11.
00:19:33.020 So just sticking with Iran, we've had a lot of people on the show recently who feel that
00:19:40.440 given that the source of the threat to Israel is Iran, inevitably that clash is going to
00:19:46.680 manifest itself in some way.
00:19:48.600 Why?
00:19:49.760 Because if Israel's, their argument would be if Israel's purpose is to eliminate the threat
00:19:55.060 to Israel, given that that threat is emanating from Iran, it would mean the necessity of having
00:20:01.640 to degrade their nuclear facilities, destroy their oil revenues, that their argument is
00:20:09.340 you can, you can, you know, destroy Hamas and Hezbollah as long as you want, but ultimately
00:20:15.720 that's not where the threat is coming from.
00:20:18.460 This is why I worry about podcasts.
00:20:21.920 Let me first say that I don't speak Farsi.
00:20:25.060 My Hebrew is in terrible disrepair and I don't speak Arabic.
00:20:29.840 In order to say such things, you really should know something about the region.
00:20:35.520 And I worry that what you have is a lot of us jawing off as if we know what we're talking
00:20:40.900 about when we don't.
00:20:42.880 I would like to think that we know enough to realize that the population of Iran is not
00:20:48.520 necessarily on the side of the mullahs.
00:20:50.740 And you only have this issue that eventually Israel and Iran will come to some massive conflagration
00:21:01.020 if you don't believe that regime change inside of Iran is possible.
00:21:06.440 I personally think, and this could be completely wrong, that Tehran is home to very modern,
00:21:16.460 very progressive people who are living under tyranny, that there isn't a profound hatred
00:21:25.020 of all things Jewish or Israeli.
00:21:27.760 And I just seriously think that we need to make sure that the people making the decisions
00:21:33.880 don't have a simplified picture of what is or is not inevitable.
00:21:37.680 Back in the days of Shimon Peres, he was always upset that the Palestinian Arabs were occupying
00:21:42.560 the Israeli mind.
00:21:43.760 And he was saying, look, we have a nuclear, we have a potential nuclear theocracy.
00:21:48.100 Nobody knows what a potential nuclear theocracy is.
00:21:51.880 It's a very strange game theoretic structure, particularly if you have a culture of death
00:21:57.580 in paradise.
00:21:59.040 So, you know, my feeling about this is it would be better if we had people who were true experts
00:22:09.380 in the region making these pronouncements.
00:22:12.740 I worry that there's a lot of hot talk because it's fun.
00:22:15.240 And I worry that there are a lot of men with masculinity issues who like talking about this
00:22:22.040 stuff in ways that could be very, very dangerous.
00:22:25.400 Well, and not men.
00:22:26.400 There's lots of people.
00:22:27.240 That's why I was putting an argument to you.
00:22:28.860 I certainly wouldn't give any super credibility to it.
00:22:32.560 But the logical conclusion of what's been happening so far is that, which is why I think a Trump
00:22:40.080 presidency has the potential to change things, because I don't think we want things to go
00:22:44.200 in that direction.
00:22:45.860 I don't know.
00:22:48.460 I mean, this is a lot of, it's just, it's very weird.
00:22:52.460 Everybody knows stuff, you know?
00:22:54.280 People said, well, is it going to be Biden or Trump?
00:22:58.140 I said, I don't know whether Biden would make it to November.
00:23:00.700 And then they said, no, this is going to be a nail biter.
00:23:03.820 And I would say, how do you know that?
00:23:06.000 It doesn't seem like a nail biter.
00:23:08.040 I worry that there's this epidemic of certainty that the internet brings out in us and that more
00:23:13.700 or less, I keep saying, I don't know, I don't know, I don't know.
00:23:16.280 And I still think I'm outperforming almost everybody in terms of a lot of these predictions.
00:23:22.360 And I don't know what to do about that, because I'm not very confident of this stuff myself.
00:23:29.500 And I've just seen this wall of very certain people, the same people, you know, who tell
00:23:36.280 me that Putin will never use nuclear weapons because he has so much to lose are the same
00:23:41.440 people who told me that Putin will never invade Ukraine.
00:23:43.500 It's exactly the same people.
00:23:46.300 I don't know what to make of it.
00:23:48.620 It's a profound point.
00:23:50.720 And it's a position that comes with a lot of integrity.
00:23:53.540 And actually, it's brave to say, I don't know.
00:23:56.540 Because now we live in a world where everybody's on a team.
00:23:59.800 Everybody has a particular worldview.
00:24:02.440 And what worldviews give you is certainty.
00:24:06.580 Because it gives you a lens to see the world.
00:24:08.420 You guys aren't on a team?
00:24:10.580 Are you?
00:24:11.140 You weren't on a team the last time I talked about it.
00:24:12.920 No, no, we're not on a team.
00:24:14.820 Yeah.
00:24:15.400 But it's saying most people.
00:24:16.880 No, I just.
00:24:17.720 We're better than most people.
00:24:18.740 Yeah.
00:24:20.420 Pure, virtuous.
00:24:22.260 Just, you know, just better.
00:24:24.440 Okay.
00:24:25.100 Be that as it may.
00:24:27.060 Yeah, I don't know what to do about this.
00:24:29.100 I'm, you know, I always try to tell people that the Objectivists and Ayn Rand were collectivists
00:24:37.140 because they had a team and a name.
00:24:38.940 You know, to me, the thing I'm most interested in is what system will allow radical individualism,
00:24:45.820 and I mean profoundly radical individualism to thrive.
00:24:48.640 And my concern is, is that the teams are taking up all the oxygen.
00:24:56.160 Wokistan versus Magistan.
00:24:59.160 Okay.
00:24:59.560 Most of us aren't really either.
00:25:02.240 For all the talk about blue-haired people, mostly I don't see people with blue hair.
00:25:05.900 And I don't think I've ever seen a MAGA hat, despite having been in Florida and Texas over
00:25:12.620 eight years.
00:25:13.080 I don't think I've ever seen somebody wearing a MAGA hat just in the street.
00:25:16.680 So I really think that we have this completely flawed impression of who we are because our
00:25:26.300 online selves aren't even really tied to our real selves.
00:25:28.680 And as we've discussed with many people, there is a lot of bots and there's a lot of foreign influence and all sorts
00:25:35.860 of things going on there.
00:25:37.060 Completely.
00:25:37.800 Which are amplifying, in my opinion, quite deliberately, the divisions that may or may not exist.
00:25:44.840 And actually, if you look at policy-wise, I don't think, you know, your interview will probably go out before the interview
00:25:53.360 of Batya.
00:25:54.260 But when we had Batya on, she made this point very well that most people agree on like 80% of the policies.
00:26:03.720 But, you know, the one thing I've really taken away from this trip to America, you know, we come here regularly,
00:26:09.600 is I used to think that the media embellish and exaggerate and, you know, they're sort of like a little bit
00:26:20.740 directionally accurate, but they exaggerate and they...
00:26:24.240 That's adorable.
00:26:26.120 I know.
00:26:26.480 Thank you.
00:26:26.860 Thank you for that patronizing and condescending view.
00:26:29.840 I've been around for too long.
00:26:32.080 However, you see, I also think that actually the vast majority of people, especially outside America,
00:26:38.140 but actually many, many people inside America also think like that.
00:26:41.840 Okay.
00:26:42.120 That's been my experience, right?
00:26:43.680 I saw this in New York Times, therefore, you know, it is, you know, he's not Hitler, but he's kind of like,
00:26:49.380 you know, 20% towards Hitler type of thing.
00:26:53.240 And this trip, having, you know, we went to a Trump rally to see it with our own eyes and we blah, blah, blah, blah, blah.
00:27:00.200 Did you get the hat?
00:27:01.200 No.
00:27:01.580 No.
00:27:02.080 But we did see Hulk Hogan do a speech, so...
00:27:04.620 And what I took away from it now is they're not embellishing and exaggerating.
00:27:12.000 They're just lying.
00:27:15.620 So it's not just bots and social media and people on Twitter.
00:27:20.020 It's the entire information space that is completely distorted.
00:27:24.500 They're maxing out.
00:27:25.420 This is an old structure in its dying throes.
00:27:30.320 Right.
00:27:30.520 And so, you know, as the old advertisement goes, what are you saving the Chivas for?
00:27:36.120 There's a point where you liquidate whatever you have left because you're losing.
00:27:40.980 But, yeah, the lying isn't working as well.
00:27:46.960 In particular, you know, I think Elon used Trump getting shot and the actual bravery to say,
00:27:52.860 oh, perfect moment to jump out of the Democratic Party because at least we know that was authentic.
00:27:59.380 Right.
00:27:59.860 And then the idea is that you have these preference cascades and Timur Kuran's theory of how you foment a revolution
00:28:06.820 is that you get brave people like Elon to say, this I believe, you know.
00:28:13.320 And then the question is, is there a slow clap of the next person and does the slow claps spread?
00:28:19.200 And I think that partially what you're seeing is a giant exodus from the Democratic Party
00:28:25.140 and the mind control of mainstream media.
00:28:28.720 Now, mainstream media in large measure, the gated institutional narrative, is traded between
00:28:34.620 institutions.
00:28:36.360 If nobody was watching CNN, Pfizer would still need it in order to talk to its regulators,
00:28:42.900 you know.
00:28:43.200 And in a certain sense, more and more people don't believe it.
00:28:48.680 You do have to keep track of it because it's a set of instructions as to what you're allowed
00:28:52.920 to believe while at work or what you're allowed to believe if you still wear a suit and tie
00:28:58.480 and go to meetings and you're in that class.
00:29:03.180 So it's very important that you receive your instructions on an almost daily basis as to
00:29:08.240 how to coordinate your points of view.
00:29:10.540 And I always give this example of Tahrir Square when Mubarak was overthrown and the New York
00:29:15.700 Times was not using the word revolution.
00:29:17.300 People would say, oh, the demonstrations seem to have resulted in Mubarak's departure.
00:29:22.440 I said, you mean the revolution?
00:29:23.540 They're like, no, no, no, it wasn't a revolution.
00:29:25.660 He said, what are you talking about?
00:29:27.380 Well, I wasn't hooked up.
00:29:28.360 I was using Arabic accounts that tweeted half in English.
00:29:34.500 And so they were using the word revolution, but we weren't using the word revolution.
00:29:37.860 So it's amazing how mind control actually works.
00:29:40.000 It was also the case that our minds are divided.
00:29:42.160 You have a part of your brain that knows that the New York Times is more or less lying.
00:29:48.440 And you have another part that agrees to believe it.
00:29:52.140 Yeah.
00:29:53.620 And it's that part of the brain that agrees to believe it.
00:29:57.020 It's so powerful.
00:29:58.280 And I find myself, even now, when I see a New York Times headline or I see a mainstream media headline, I immediately go, oh, that's what's actually happened before I actually have to check myself.
00:30:12.880 And I believe both in different parts of my brain.
00:30:19.300 You know, it's like I used to talk about having a jihadi sandbox in my mind so that I could understand how jihadis think.
00:30:26.520 So, you know, you have a place in your brain.
00:30:28.900 Do you do accents ever when you were a kid?
00:30:31.300 Yeah.
00:30:31.880 I still do them now.
00:30:32.780 Okay.
00:30:33.960 We frown on this officially.
00:30:35.200 You know, the part of your brain that suddenly knows how to talk cockney or, you know, with a Gujarati accent or whatever is a different module.
00:30:45.480 And we have a bunch of different modules.
00:30:47.980 And I just find it funny that we still talk.
00:30:50.840 Well, do you believe this?
00:30:51.760 Do you not believe this?
00:30:53.440 Very often I believe something intellectually that I don't believe viscerally.
00:30:58.300 But reality is still reality.
00:31:00.040 There still is a reality, which is why if you make trades based on a false narrative, reality will slap you in the face.
00:31:06.380 Right.
00:31:07.040 So the whole conversation in this moment really is about whether the left and the Democrat Party is going to face reality, which is the way they talk and the things they say and the way they operate are not popular with Americans.
00:31:24.820 Or they can continue to imbibe a narrative that isn't accurate and that does not reflect the reality of what their fellow citizens think.
00:31:35.520 Isn't that the issue here?
00:31:37.320 I think that you're talking about something that's hopefully about to happen, which is that a certain kind of base reality is too difficult to deny.
00:31:45.200 Well, if you keep losing elections, it's too difficult to deny.
00:31:48.660 Right.
00:31:49.440 Well, they've lost one just now.
00:31:51.800 But this is going to be a very consequential one.
00:31:53.780 First of all, it puts J.D. Vance, who I consider a friend, on deck.
00:32:00.560 Man, is that guy smart and good, combines all sorts of aspects of progressivism.
00:32:06.580 I think he ran a campaign with Donald Trump as a loyal number two.
00:32:11.920 But J.D. is a powerhouse in and of himself.
00:32:16.540 I think he could run a campaign that would just be irresistible to all sorts of people.
00:32:20.940 So I would like to just point out that you could easily have 12 years coming off of this election and you could have a Supreme Court that was completely dominated by Donald Trump and J.D. Vance.
00:32:34.600 And it will transform the country.
00:32:38.360 So this is a very consequential election to have screwed up.
00:32:41.400 It's not just that they lost an election.
00:32:43.520 It's also the end of something.
00:32:45.060 Because, you know, Obama, it doesn't matter what you think of him and whether he was a disappointment to you in office, that was real star power.
00:32:55.200 And you watched him degrade himself as he threw his support behind Kamala Harris.
00:33:02.840 The Clintons are highly degraded.
00:33:06.940 Gavin Newsom could not keep a straight face with the switch when Kamala was put forward.
00:33:13.740 He said, oh, yes, it was a top-down, inclusive affair.
00:33:16.400 Ha, ha, ha, ha.
00:33:17.220 This was such a bad story that no one knew how to defend it.
00:33:22.720 And I also think that, you know, Kamala Harris's apparent drop in IQ is due to the fact that nobody can explain the Democratic Party.
00:33:32.420 It's a series of horse trades and intellectual half measures.
00:33:35.800 And nobody knows how to, it doesn't have any coherence.
00:33:41.140 Are you the party of sweetness and light?
00:33:44.320 Are you the party of the working class?
00:33:46.200 Or are you really the party of, you know, the transgendered and financial billionaires worried about the carried interest exemption?
00:33:55.900 It just didn't make any sense.
00:33:58.080 And there was no way to defend it.
00:33:59.520 And still got close to 50% of the popular vote.
00:34:03.240 Because so many people are dependent on these narratives.
00:34:05.800 To me, the Democrat Party, and we're talking about reality, they're so divorced from reality in so many different ways.
00:34:15.500 Like, they talk about being democratic, but Kamala Harris didn't go through any primary.
00:34:20.560 They just appointed her.
00:34:21.640 This is a great point, because I think people didn't really update as to what happened after that point got made.
00:34:28.820 So I started talking to my academic friends.
00:34:31.100 I said, you can't say democracy is on the ballot.
00:34:34.240 But there's no primary.
00:34:37.100 And immediately, all of my, like, poli-sci-aware academic friends said things like, oh, no, no, no.
00:34:43.780 A political party is free to choose whoever it wants.
00:34:47.680 And democracy is, you know, we've only really had these primaries since the 68 convention in Chicago of the Democratic Party.
00:34:55.680 And the real thing is called the invisible primary.
00:34:58.500 And everybody in the beltway knows this whole thing.
00:35:00.960 You're saying, oh, cool.
00:35:02.100 So you just pulled the mask off and said, democracy is a binary choice between two parties that have given you every single president since Millard Fillmore.
00:35:12.180 And that's democracy.
00:35:14.020 Nobody is going to send their son with a rifle to fight for this vision of democracy.
00:35:20.820 The thing that inspires us, that gets us to put our right hand over our heart, is the idea of a government by, of, and for the people not perishing from this earth.
00:35:29.280 The idea of, you know, this concept of perfectly legal.
00:35:34.000 Something is legal means that you're allowed to do it.
00:35:37.300 Something is perfectly legal means somehow there's a loophole in the law and you can't get tagged for it, even though it's completely wrong.
00:35:44.580 Well, it's perfectly legal, perfectly permissible to just select a candidate.
00:35:49.640 And it was an abomination.
00:35:52.580 And it's kind of just fun because the same people had this tagline, democracy is on the ballot.
00:35:59.380 Like those t-shirts were already printed figuratively.
00:36:02.400 And so, yes, you've installed a candidate who was the worst candidate available until she became America's sweetheart.
00:36:10.340 And the whiplash from that period of time just forced the machinery to reveal itself.
00:36:17.900 And it seems like that's one of the logical fallacies within the Democrat Party.
00:36:23.520 But it's just one after another after another.
00:36:26.220 And look, I know, nobody's fully coherent, no political party, blah, blah, blah.
00:36:31.580 There are always inconsistencies.
00:36:33.360 But you look at the thing, you know, they talk about being the party of minorities.
00:36:36.280 Yet when black men don't go out to vote for them, they admonish them as a group and go, well, how dare you?
00:36:42.600 All you black men, what I mean, you've done this.
00:36:44.540 And you're going, is that anti-racist?
00:36:46.900 You know what we're doing?
00:36:47.460 We're doing therapy.
00:36:48.940 We've been through like a North Korean brainwashing experiment.
00:36:53.560 And we can't believe that this happened.
00:36:58.080 It's just, it's so bad.
00:37:00.660 And every single person of any kind of originality of thought or independence of mind rejects it.
00:37:08.740 Now, why did this ever work?
00:37:10.520 I learned in part a lot about the U.S. from having a podcast.
00:37:15.560 And it was very popular with people I didn't expect.
00:37:21.180 Truck drivers who really care about physics.
00:37:26.180 Electricians who care about the Middle East.
00:37:28.160 There are all sorts of people who solve puzzles every day, general contractors.
00:37:33.860 And these people are in a creative field because every day is different.
00:37:39.480 You never know what house you're going to show up or what trucking route or who knows what.
00:37:43.220 And they've got time on their hands.
00:37:44.840 And they're not beholden.
00:37:45.840 Nobody cares whether your electrician is MAGA or woke.
00:37:50.060 As long as they get the job done and they leave your house in decent order and they don't charge you too much.
00:37:54.540 I would argue, Eric, that it's impossible to be a bricklayer and be woke.
00:38:00.160 Because to be woke is to be disconnected from reality.
00:38:03.240 And to be a bricklayer is to be very much grounded in reality.
00:38:06.860 I agree.
00:38:07.880 And so, in general, most of the point I was trying to make.
00:38:11.000 Sorry, Eric.
00:38:11.700 That's all right.
00:38:12.440 It was very forceful and commanding.
00:38:14.200 I kind of enjoyed it.
00:38:16.280 The way I saw it is that these people, by and large, were America's secret brain trust.
00:38:23.140 And they're not having it because there's no reason.
00:38:26.880 Now, if you take that and you compare that to the very highly educated, one of the arguments that the Democratic Party likes to make is, you know all the people with master's degrees and PhDs and JDs and MDs vote for us.
00:38:40.500 And the answer is because you're an interest structure.
00:38:43.300 And the workplace, if you're in a woke workplace, particularly large institutions that have to, I forget, what is it, above 15 employees, you have to follow the civil rights legislation about inclusion, all this kind of stuff.
00:39:00.040 Those workplaces are constantly demanding fealty to this woke parchment.
00:39:08.960 And that's what's going to have to collapse.
00:39:11.500 The contractors in general, they might vote Democrat or they might vote Republican, but they're not easily taken in because they have a relationship with the unforgiving.
00:39:23.720 An electrician who starts to believe in crazy ideas is going to encounter 110 or 220 relatively quickly.
00:39:31.940 It's a short ride.
00:39:32.700 And that's the beauty of talking to these people, because exposing yourself to reality encourages a simplicity of thought.
00:39:43.480 You keep saying reality.
00:39:45.820 Yeah.
00:39:46.840 Okay, physics.
00:39:47.880 Well, I call it a relationship with the unforgiving.
00:39:50.740 You try to fudge something that isn't fudgeable.
00:39:54.640 Like you see this very often if you take somebody who doesn't go hiking into the mountains.
00:40:01.380 At some point, they'll just sit down and say, I'm tired.
00:40:06.120 And you say, okay, I want to go home.
00:40:10.160 Well, it's 8.7 miles down that way.
00:40:12.960 We've got 45 minutes of daylight.
00:40:15.160 So how does that work?
00:40:16.880 The person just has no idea that they have to actually negotiate the unforgiving.
00:40:21.980 So, you know, do something in your life.
00:40:25.440 Play an instrument.
00:40:26.360 Climb a rock wall.
00:40:28.860 Do electrical work.
00:40:30.400 You'll very quickly figure out whether you have a positive relationship with the unforgiving.
00:40:34.560 So you mentioned that this is what we're doing is therapy because the fake worldview, let's say, that we've had or had imposed on us is shattered.
00:40:44.800 Is that why a lot of people who may not have been huge fans of Trump are nonetheless relieved and happy that he's been elected?
00:40:50.660 Because it has demonstrated that they're not insane.
00:40:54.980 Yeah.
00:40:55.280 You know that scene in the Terminator series?
00:40:57.740 Come with me if you want to live.
00:41:01.320 He's what's available.
00:41:04.120 Love him or hate him.
00:41:05.240 Now, you know, then the next thing you get is you get that once people say, I am pro-Trump, they start to excuse all of his excesses and the MAGA movement's excesses.
00:41:15.740 And so people need this kind of cognitive consonance.
00:41:19.860 And so they get rid of their democratic dissonance and then they become cognitively consonant with the idea that everything is great on the MAGA side.
00:41:29.900 And, you know, this is very hard to avoid.
00:41:32.140 Yeah, especially if you're a public figure, because people will then try to tag you with that.
00:41:38.800 Like, and this comes back to the criticism that we were discussing, because ultimately I think what it comes down to is you are an independent thinker and that's why we like you.
00:41:49.920 But every four years there's an election and you have to condense all that complexity into a binary choice.
00:41:56.880 Two binary choices.
00:42:00.880 Well, actually, one quaternary choice and one up to quaternary choice.
00:42:07.820 There's who you vote for and what you recommend.
00:42:12.060 So, I went to the polling place.
00:42:18.020 I may have voted one way or the other.
00:42:19.800 I may have thrown away my vote.
00:42:22.020 I may have not voted in that particular race.
00:42:25.260 But the question is, can I recommend something to my first cousin once removed?
00:42:29.920 And if I looked at the amount of hatred that came my way, there was no way you could ask me to tell some sweet 75-year-old gal, hey, get into this car with these people.
00:42:48.460 Stupid Jew.
00:42:50.320 Oh, my God.
00:42:52.180 And, you know, this gets to my frustration with both the Democratic and Republican movements.
00:42:58.100 Are there no men?
00:43:00.920 And I hate to say it this way, but it's like, are there no men who stand up and say, you cannot talk that way?
00:43:10.020 I just watch these pile-ons and these incited online mobs and this vitriol and hatred.
00:43:21.420 It's like, somebody stand up as an individual.
00:43:24.920 We can't do that in the MAGA movement.
00:43:28.400 We're not going to be going after individuals like that.
00:43:31.480 Well, no.
00:43:32.420 See, the problem with what the right did is during the left's control over the entire system, the right made free speech a sacred value.
00:43:44.320 Thank God.
00:43:44.820 Of course.
00:43:46.280 But what they also did is they confused the freedom for people to speak with the necessity of accepting all speech as being equal, essentially.
00:43:58.620 Right.
00:43:58.980 So we can't say to somebody, stop attacking somebody online because it's just rude.
00:44:05.980 There's no need to do it.
00:44:06.920 No, no, no.
00:44:07.340 Because that violates.
00:44:08.540 Well, are you trying to shut down my speech?
00:44:10.700 That's what people say.
00:44:11.560 100% I'm trying to shut down your speech.
00:44:14.440 I can't believe this isn't taught in settings.
00:44:20.020 The easiest part of public parks is building public parks.
00:44:24.380 The hardest part is keeping the syringes and the gang members and the sex in the bushes away from the swings and the toddlers.
00:44:32.280 The easiest part of free speech, and I very much appreciate Elon for doing this, is opening the pipe and keeping it open.
00:44:41.020 The hardest part is the culture so that it never occurs to anyone to act like this except for a tiny fringe few.
00:44:52.700 The culture of free speech is absolutely about the social restraint of dangerous, bad speech.
00:45:01.260 This idea that the cure for bad ideas is good ideas is nonsense.
00:45:07.900 Ideas compete on fitness.
00:45:11.940 We have a fitness landscape where a really bad idea can easily outcompete good ideas, better ideas.
00:45:19.240 The only thing that works is if people of good character use something that precedes cancellation.
00:45:27.880 Cancellation is a terrible concept.
00:45:29.860 I don't know where it came from.
00:45:31.820 But shunning, social shunning of bad speech in a free speech environment is not the same thing as saying you cannot say that.
00:45:40.800 There's an important concept, and I don't know why this isn't taught.
00:45:44.720 I don't know where it went.
00:45:46.060 It's mustn't.
00:45:47.980 Mustn't isn't shouldn't.
00:45:49.600 And mustn't isn't can't.
00:45:51.020 You must not say or do certain things in a free speech environment because if free speech doesn't pay a dividend, gentlemen, it's going to go away.
00:46:01.280 The whole key to free speech is the culture of self-restraint, decency and prosperity so that people feel that they've got a lot to lose.
00:46:11.200 Yes, the problem with that is the Internet is anonymous and for very good reasons, and we would want to protect some anonymity, I think.
00:46:21.120 But an anonymous environment, because of the way human beings are, we've all been in a car behind a windscreen.
00:46:30.120 Human beings don't seem to be capable of behaving in that way without the threat of repercussions, right?
00:46:37.320 And if you're anonymous, there is no threat of repercussions because you're anonymous.
00:46:41.200 So you've made some assumptions that we should have this very high level of anonymous speech and that the real problem is coming from anonymous accounts.
00:46:54.460 Well, so look, Jordan's talked about this.
00:46:56.880 Jordan Peterson's made this point that the anonymous trolls on the Internet behave in the way that they behave because they're anonymous.
00:47:04.300 So his shortcut was, well, we've got to end anonymity.
00:47:07.560 And a lot of people, I think, rightly said there are some problems with this because anonymity is actually highly valuable.
00:47:13.900 It's valuable for journalists in Iran.
00:47:15.640 It's valuable for this.
00:47:16.460 It's valuable for that, right?
00:47:17.760 It's valuable for people to be able to retain that anonymity in certain situations.
00:47:22.940 You're frowning at me.
00:47:24.640 I'm just laying out the argument, right?
00:47:26.000 No, I'm just, I want to make sure we don't build in assumptions.
00:47:29.460 I don't think that we know that Jordan Peterson is wrong.
00:47:33.980 I don't think we know.
00:47:35.440 Well, wrong and right, these are very simplistic concepts.
00:47:38.280 The point I'm making is there are tradeoffs to everything.
00:47:40.840 You could ban anonymity on the Internet and that would have lots of positive benefits.
00:47:44.940 It would also have lots of terrible consequences.
00:47:47.380 Sure.
00:47:47.880 Right.
00:47:48.060 You could also not ban anonymity on the Internet and then you'd get what we have now.
00:47:53.080 So those are the two Schelling points.
00:47:55.160 Yeah.
00:47:55.760 And then you have all the interesting stuff in between them.
00:47:58.700 And as a guy who's never opened borders or closed borders or never pro-life or pro-choice, etc., etc.,
00:48:04.080 I don't know how the Schelling points completely derange us.
00:48:07.740 But you can have anonymous accounts in which you have to become known to the platform in order to open one.
00:48:16.500 Which will get hacked and then the government of Iran is going to find out where you live.
00:48:20.520 So I'm going to continue with this.
00:48:23.100 There are all sorts of technological measures and countermeasures.
00:48:27.240 And it's not clear to me that if you choose to use anonymity that you shouldn't bear some risk.
00:48:33.080 Because the risk that you're putting other people at, it's what you just said before.
00:48:38.780 It's about constraints.
00:48:40.300 It's about trade-offs.
00:48:42.360 And it's about objectives.
00:48:44.520 What we have now, I'm concerned, is not going to deliver us free speech in the long run.
00:48:50.220 No, I agree.
00:48:50.860 The ability to ruin lives using these tools is too easy.
00:48:56.080 The friction for ruining a life.
00:48:57.880 You know, even sometimes what you have is you have a situation whereby it's very hard to catch anybody.
00:49:04.260 But if you do catch them, the penalties are so dire that the expected return of that strategy turns negative.
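The deterrence argument here is just an expected-value inequality: a low catch probability can be offset by a severe enough penalty. A back-of-the-envelope sketch with hypothetical numbers:

```python
# Deterrence as expected value: a bad-actor strategy is deterred
# when its expected return goes negative. All numbers hypothetical.

def expected_return(gain, p_caught, penalty):
    """Expected payoff of a bad act, given the gain if you get
    away with it, the probability of being caught, and the
    penalty if you are."""
    return (1 - p_caught) * gain - p_caught * penalty

# Very hard to catch (1% chance), but a dire penalty flips the sign:
print(expected_return(gain=100, p_caught=0.01, penalty=20_000))  # negative: deterred
# The same catch rate with a mild penalty leaves the act profitable:
print(expected_return(gain=100, p_caught=0.01, penalty=1_000))   # positive: not deterred
```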
00:49:10.820 I think that what I'm very concerned by is that we have a group of people who are used to being controlled,
00:49:17.080 who've decided that they want no controls whatsoever.
00:49:20.540 And clearly, the right thing to do is to get the controls as far away from law and restriction as possible.
00:49:29.860 And as much towards mustn't.
00:49:32.760 How do we do that?
00:49:33.320 I would say the question is, how did we do that?
00:49:39.720 And in large measure, you know, I don't mean to give you a big ego, but you did say something at the ARC conference that I wish I could remember exactly.
00:49:49.020 You said, I'm not a conservative, but I will tell my conservative friends that if they want their culture conserved, they're going to have to deal young people in.
00:49:59.460 People need a stake.
00:50:00.640 People need to feel that they have a lot to lose.
00:50:05.500 And you are not going to be able to preserve the West, whatever that means, if people don't feel that this is a glorious place and we are so lucky to have it.
00:50:19.060 The whole thing is about the returns to decency.
00:50:23.260 You know, I can't tell you how dispiriting it is when you see these discussions about, you know, nice guys never get the girl.
00:50:35.240 You're like, well, maybe not nice guys, but what about good guys or good guys despite themselves?
00:50:43.460 You know, I've always said that the secret is not the pickup artistry, but is the Han Solo character.
00:50:50.100 He was a good guy despite himself.
00:50:52.780 And, you know, the vibe is always, do I have to save the world again?
00:50:56.300 And I really was hoping to make some money this week.
00:50:58.820 Okay.
00:50:59.760 You know, extreme altruism and decency has to pay a dividend.
00:51:06.400 And I just don't know how to get all of this fake masculinity out of the equation.
00:51:10.740 You know, if you, if you're worried about whether or not you're a man, try standing up for a friend, try standing up to a mob, try standing alone.
00:51:19.200 You know, that's, that's like, it's terrifying.
00:51:22.280 I find it really unpleasant in many ways, but at least I'm not worried about being a coward.
00:51:28.020 The word coward is the one word, I think, that for me just doesn't land.
00:51:32.700 I've had to do so much unpleasant stuff.
00:51:35.120 The thing that really makes me sad is I don't feel like I've inspired people.
00:51:42.180 I don't feel like I've inspired people to stand up for their friends.
00:51:46.260 You know, I have a policy that says there is, if you're a friend of mine, there's nothing you can come after my friend with that will cause me to repudiate him.
00:51:56.180 And the thinking behind it is, is that I don't want to set up an incentive structure.
00:51:59.700 Do you condemn or condone?
00:52:01.120 You see all these, I neither condemn or condone my friend.
00:52:04.300 I don't know what he did.
00:52:06.140 It was 8.5 years ago.
00:52:07.520 I didn't know him yet.
00:52:08.100 You know, like that kind of weakness.
00:52:10.780 Men deal with things in private.
00:52:14.200 And, you know, I might be entirely supportive of you in public.
00:52:19.980 And then I might, you know, take you out behind the woodshed and say, what the F?
00:52:26.540 Men need friends and men need to know how to stick together and men need space to be male.
00:52:31.340 Deciding that every group of men is a threat to women is a disaster.
00:52:38.340 If men are going to stand up for women, they need space to work out their stuff.
00:52:42.940 So, you know, my feeling about this is we need a culture that creates mustn't.
00:52:51.400 Yes, but I asked you how you do that on the internet and we're not closer.
00:52:58.440 Well.
00:52:59.560 Because it's the internet you're talking about.
00:53:03.120 That's where this happens, right?
00:53:04.780 So my claim is, is that most of the time I don't see in the street, like you see these
00:53:12.020 videos where people are doing horrible things.
00:53:14.480 Like there's some old woman and some guy will just like clock her.
00:53:18.300 You know, what do they call it?
00:53:19.380 Happy.
00:53:20.660 Happy slapping or something.
00:53:21.940 Happy slapping or something like that.
00:53:23.500 And we're just aghast because we can't believe that that kind of internet level behavior exists
00:53:31.620 in the real world.
00:53:33.980 We need at some level to recognize we are partially solving this already in the real world.
00:53:41.120 For all the talk about how terrible the streets in Los Angeles are, you can sit out at a cafe
00:53:45.700 and not have any expectation something horrible is going to happen to you.
00:53:48.960 We need to figure out how to return costs to bad actors.
00:53:55.220 But that's why I'm asking, sorry, Francis, just to finish.
00:53:57.380 No.
00:53:57.640 That's why I'm asking about the online environment.
00:53:59.720 So I keep saying things and you guys keep saying, well, we have to protect this person
00:54:03.040 in Iran.
00:54:04.420 Hold on.
00:54:05.100 No, no, no.
00:54:05.480 I'm just asking you how it gets solved online because your argument of the real world is
00:54:09.820 very different.
00:54:10.440 In the real world, in the physical realm, there are very clear material potential consequences
00:54:15.860 to bad behavior.
00:54:16.740 So you post a bond and the idea is that if you're found to be misbehaving, it's under
00:54:22.960 the sole discretion of the platform whether to confiscate money that you put up or you
00:54:27.900 have some sort of a series of cryptographic signatures where, in general, you need multiple
00:54:34.240 agents in order to figure out whose account it is.
00:54:36.600 But you actually have to have an ID.
00:54:41.660 I could probably generate 13 different IDs.
00:54:44.380 But if you decide that anonymity is sacred and I should be able to create 10,000 accounts
00:54:50.760 for free.
00:54:51.560 No, you shouldn't be.
00:54:52.840 Yeah.
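The bond scheme Eric proposes, an anonymous account posting a stake the platform can confiscate at its sole discretion, can be sketched minimally as follows (the class, amounts, and suspension rule are all hypothetical illustration, not any platform's actual design):

```python
# Minimal sketch of a bonded anonymous account: the pseudonym is
# public, the real identity is held separately (e.g. behind the
# multi-party cryptographic scheme mentioned in the discussion),
# and misbehavior costs real posted money.

class BondedAccount:
    def __init__(self, handle, bond):
        self.handle = handle   # public pseudonym
        self.bond = bond       # stake posted at account creation
        self.active = True

    def slash(self, amount):
        """Confiscate part of the bond for misbehavior, at the
        platform's discretion; suspend the account once the
        stake is exhausted. Returns the remaining bond."""
        self.bond = max(0, self.bond - amount)
        if self.bond == 0:
            self.active = False
        return self.bond

acct = BondedAccount("anon_critic", bond=50)
acct.slash(30)                   # first offense costs real money
print(acct.bond, acct.active)    # 20 True
acct.slash(30)                   # stake exhausted, account suspended
print(acct.bond, acct.active)    # 0 False
```

The design point is that this returns a cost to bad actors without banning any speech: the account stays anonymous to the public, and creating 10,000 throwaways stops being free.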
00:54:53.340 But I think this is the conundrum.
00:54:58.200 The ideas you listed don't seem to me like they're particularly workable.
00:55:02.440 Well, you didn't like my idea of solving the Middle East problem last time either.
00:55:05.540 I thought it was a lot better than you thought it was.
00:55:07.800 Well, I'm sure it was a lot better than I thought it was.
00:55:10.020 But my point is that I think the dilemma for social media companies is massive.
00:55:19.620 And the problem you've identified is totally true.
00:55:22.920 There is a culture, particularly on Twitter, where everyone performatively acts like a dick
00:55:30.580 and you can't say, why are you being a dick?
00:55:35.640 Because then you're being a pussy.
00:55:37.680 That's kind of how it works.
00:55:38.960 That's the dynamics of the conversation.
00:55:42.080 So then you have this current situation where you drive out every regular human being and
00:55:48.300 only the monsters have their ball.
00:55:54.040 Guys, I'm sorry.
00:55:55.920 I've dealt with this before.
00:55:57.300 I've watched Jordan.
00:55:58.460 In my opinion, Jordan wanted to get rid of anonymity.
00:56:00.820 I don't necessarily want to get rid of anonymity.
00:56:02.440 I want to come up with a schedule to return costs.
00:56:04.960 The key point is, you know, get 100 super smart people who understand economics, who understand
00:56:10.740 technology, cryptography, into a room and task them.
00:56:16.620 How do we return costs to bad actors without banning bad speech?
00:56:19.920 How do we create mustn't?
00:56:23.520 It's a puzzle.
00:56:24.480 Have an X prize.
00:56:27.020 You know, come up with scenarios where you have speech simulations and you try to figure
00:56:31.460 out, you know, in general, we have type one error where we create a decision boundary
00:56:36.720 as to what must and mustn't correspond to.
00:56:41.900 Must you allow us?
00:56:42.920 Mustn't you allow us?
00:56:43.860 How do you return the cost?
00:56:45.340 And let's get busy.
00:56:46.560 Let's have fun.
00:56:47.280 I just, the conversation is trapped in the old terms of service, free speech versus censorship
00:56:53.880 thing.
00:56:55.660 Anytime you see that kind of polarization, try to push it off to the side because in general,
00:57:02.020 what it's saying is, is that Schelling points are going to determine, you know, when does
00:57:06.820 life begin?
00:57:07.360 Let me guess.
00:57:07.820 Viability, moment of conception, birth. Nobody says, you know, I don't know, let's look at
00:57:14.860 the embryology and figure out where neural tube formation is, something like that.
00:57:20.600 Eric, are you hopeful for America as we're sitting here?
00:57:25.980 I mean, if you want to get into radical hope, I'm the most hopeful person.
00:57:33.420 I mean, like, I have hope at levels that would be offensive to you guys.
00:57:39.160 But also terrified.
00:57:40.460 Yeah.
00:57:41.600 Well, because I don't understand what the hell the rest of you look for.
00:57:47.340 To me, I'm having an illusion, which is that I'm the only sane person on planet Earth,
00:57:52.240 and that's never a good sign psychologically.
00:57:55.680 You do live in Los Angeles.
00:57:57.000 What?
00:57:57.440 You do live in Los Angeles.
00:57:58.560 And I am, yes, and I am narcissistically the center of my own world.
00:58:03.420 The really exciting thing, and Elon is the other guy talking about it, is interplanetary
00:58:12.060 diversification so that we don't have all of our human eggs in one basket.
00:58:16.720 And there's basically two routes, either chemical rockets or not chemical rockets.
00:58:23.980 So Elon's on chemical rockets.
00:58:26.180 I don't think that's what happens next.
00:58:28.700 I think that what happens next is that the string theory juggernaut that is blocking the
00:58:35.300 exit is about to die.
00:58:38.660 And there are huge things happening right now, but there was a tremendous podcast of Curt
00:58:45.920 Jaimungal on Theories of Everything, interviewing Leonard Susskind, where for the first time, a
00:58:51.600 string theorist totally misgauged the quality of the interviewer, because normally an interviewer
00:58:57.300 can't push back.
00:58:58.360 So you have interviewers who are string theorist cheerleaders like Neil deGrasse Tyson or Sean
00:59:04.680 Carroll or Brian Greene.
00:59:07.260 But suddenly there was a guy who seemed to be running a normal channel who knows way too
00:59:12.600 much about theoretical physics and mathematical physics.
00:59:15.680 And he started pushing back on one of the fathers of modern string theory.
00:59:19.200 And during the interview, Leonard Susskind goes from saying, you should always
00:59:25.420 follow the conventional wisdom because it's almost always right.
00:59:29.140 Don't go with the weirdos.
00:59:31.500 And Curt says, well, do you think string theory is the only game in town?
00:59:35.500 He's like, more or less, yes.
00:59:38.680 And Curt says, well, what about the other theories?
00:59:42.100 He says, what other theories?
00:59:43.900 So Curt starts listing the theories that Curt is aware of.
00:59:47.400 And Leonard Susskind was forced to spin on a dime.
00:59:51.160 First time in 40 years, somebody got an answer to the question.
00:59:53.860 How do you know that string theory is the only game in town if you don't know anything
00:59:57.500 about what's going on outside of string theory?
00:59:59.460 It's the dumbest question.
01:00:00.940 And for 40 years, nobody's been able to crack it.
01:00:04.440 Curt Jaimungal just figured it out.
01:00:07.400 The interview is there for all time.
01:00:09.420 And Leonard Susskind suddenly becomes a much better version of himself.
01:00:12.580 I called him an asshole on Chris Williamson's show because that's what he is.
01:00:16.980 He was asked on that program.
01:00:18.900 It's very funny.
01:00:20.020 He was asked, well, what do you think about Eric Weinstein?
01:00:21.380 He said, I don't know who that is.
01:00:22.600 I was sitting next to him after having talked to him at Nati Seiberg's lecture at Stanford
01:00:29.400 in 2017.
01:00:30.240 I have a picture where he's right next to me.
01:00:32.120 It's ridiculous.
01:00:33.780 Something huge is about to happen, gentlemen.
01:00:36.640 And that is that we are about, I think, to go beyond the standard model and general
01:00:43.320 relativity.
01:00:43.860 And if you think about Richard Nixon as putting in the speed limit of 55 miles an hour around
01:00:51.340 1974, something like that, which lasted until the mid-90s, we've had the speed limit being
01:00:58.240 the speed of light, or C, since 1905 until the present.
01:01:02.320 And just imagine if the successor theory to Einstein's general theory of relativity doesn't
01:01:09.420 have the same restrictions.
01:01:10.560 Instead of fantasizing about wormholes or time dilation or generation ships, you may find
01:01:17.500 that there are new degrees of freedom.
01:01:19.040 And if you want to talk to me about what I'm really excited about, what I'm really optimistic
01:01:23.700 about, is I'm hopeful that we're going to recapitulate within physics something that
01:01:29.280 happened in technology.
01:01:30.800 And I don't know that this is true or possible, but here's the hope.
01:01:34.280 In the early 2000s, there was a demonstration at a TED talk, and you can see it online, where
01:01:40.740 this guy says, here's our new interface, and I'm touching the screen and moving pictures
01:01:46.720 around.
01:01:47.100 And then he takes a picture and he goes like this.
01:01:50.420 It's called a multi-touch gesture.
01:01:52.680 And in that particular case, it's called pinch to zoom.
01:01:55.100 The biggest hope that I have is that the next theory has something like pinch to zoom.
01:02:06.260 And that the entire concept of being limited by the speed of light and this Einsteinian
01:02:11.340 moat, where it would take us over 100,000 years to get to the next star, let alone the
01:02:16.200 optimal star, going as fast as man has ever traveled before.
01:02:20.540 There's no way you can do that.
01:02:21.980 You know, my people are like a bit over 5,000 years old.
01:02:25.600 I guarantee you, you're not going to get along on a generation ship for 100,000 years.
01:02:31.420 The idea of not having the speed.
01:02:34.660 Israel and Hamas on the spaceship.
01:02:37.300 Guys, we have to get along.
01:02:41.800 What about possible free energy?
01:02:45.200 What about the idea of dark chemistry?
01:02:48.140 Physics has been the provider of our economy.
01:02:53.660 The World Wide Web comes from it.
01:02:55.000 The semiconductor.
01:02:56.220 All of molecular biology was founded by physicists.
01:02:59.340 And physics at its highest level has been dead since 1973.
01:03:05.080 The thing that governs it is called the Lagrangian.
01:03:07.980 The Lagrangian of our world has not moved in 51 years.
01:03:12.140 And 40 of those years have been taken up by string theory blocking the exit.
01:03:18.060 I am so hopeful that with Elon, that with new talent at the National Science Board, the National Science Foundation, OSTP, PCAST, all of the science agencies, and the Department of Energy, which, by the way, is really the Department of Physics.
01:03:33.380 It came out of the old AEC, the Atomic Energy Commission, the hope that the U.S. could restart cowboy science, that we could drive a wooden stake through the heart of DEI, which devitalizes us, that we could bring back the great man theory of physics, which is just a belief in the individual rather than the committee, or that something is in the zeitgeist or the volksgeist.
01:03:55.380 It could build us the most fantastic weapons we've ever seen, but it's now time to risk it all.
01:04:08.480 The thing that I'm really excited about is the end of the stagnation in theoretical physics.
01:04:15.440 And in particular, I have an entrant in that, which is called geometric unity, which has been completely misunderstood, because nobody has actually really gone into it and taken it seriously.
01:04:26.360 And it's going to be hysterically funny if for 40 years, I've been trying to have the conversation with a community that refuses to have the conversation openly and honestly.
01:04:36.960 There's been all this backstabbing and skullduggery and interference competition, all of that, because for the same 40 years that string theory has taken up all the oxygen, geometric unity has been available.
01:04:47.560 And if I'm correct, there's so much great work to be done and there's so much hope and there's so much possibility, because if we can start believing that the stars are a vacation destination, that we can homestead in the cosmos.
01:05:01.520 It's going to be that moment with Ferdinand and Isabella on a scale that we've never seen.
01:05:07.860 And I think it's just, it's hard to talk about because nobody has a, there are very few unprecedented things.
01:05:14.420 If you think about the hydrogen bomb in 52, nobody had ever seen a thermonuclear explosion on planet earth.
01:05:22.500 That's the kind of thing that only happens in the sun.
01:05:26.020 This is one of the only things that's never happened before.
01:05:28.580 And the other one is AI.
01:05:31.520 And you can just tell how bad this campaign was by how little AI figured in it.
01:05:38.960 It should have been all about AI.
01:05:43.080 And the one person who had a really interesting new idea about AI was barely heard.
01:05:48.700 And that was Nicole Shanahan.
01:05:52.500 You could like barely hear it.
01:05:54.460 It was a whisper.
01:05:54.960 AI is going to require something that isn't UBI, and it's called Coasean analysis.
01:06:03.100 And the economists know all about it.
01:06:04.540 It's the deepest idea to come out of economic theory, potentially.
01:06:09.120 And in order to keep AI from destroying the world's labor market and destroying capitalism itself, I'm very excited that Coasean analysis is going to be able to restructure the way in which AI crashes over our labor market.
01:06:25.920 So what does that involve, Eric?
01:06:28.140 The idea being that here's the crazy idea.
01:06:33.040 Assume that you and I are at odds with each other.
01:06:36.480 You're a polluter, and I need the environment pristine in order to fish in a lake.
01:06:40.480 The question is, how should we figure out how much pollution you should be allowed for your furniture factory and how much pristine water I need in order to feel comfortable about selling my fish?
01:06:52.180 What Coase figured out, and it was a complete shock when he introduced it at the University of Chicago in the money and banking seminar of Milton Friedman, was that if you assign rights to pollute the lake, we can either give you the rights as the furniture guy, and then I could buy them away from you as the fishing representative.
01:07:08.540 Or you can give them to the fishermen, and the furniture factory can buy the rights.
01:07:15.780 Now, if we give them to you, you're going to get rich.
01:07:18.680 And if we give them to me, I'm going to get rich.
01:07:20.160 And if we distribute them, we'll get something else.
01:07:23.420 But the weird thing is that the lake gets used the same way because the market figures out the same efficient allocation of the underlying resource.
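The invariance Weinstein is describing is the Coase theorem, and it can be sketched with a toy calculation. All numbers below are invented for illustration, and the sketch assumes zero transaction costs, which is the theorem's key caveat:

```python
# Hypothetical lake supporting up to 10 "pollution units". Each unit has a
# marginal profit to the furniture factory and a marginal harm to the fisherman.
# Profits fall and harms rise with each extra unit (illustrative numbers only).
factory_marginal_profit = [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]
fisherman_marginal_harm = [3, 4, 5, 6, 7, 8, 9, 10, 11, 12]

def factory_buys_rights(profit, harm):
    """Fisherman starts with the rights: the factory buys a permit for each
    unit whose profit exceeds the harm, since a mutually acceptable price exists."""
    return sum(1 for p, h in zip(profit, harm) if p > h)

def fisherman_buys_back(profit, harm):
    """Factory starts with the rights: the fisherman buys back each unit
    whose harm to him exceeds the factory's profit from it."""
    return len(profit) - sum(1 for p, h in zip(profit, harm) if h > p)

# Either way, bargaining stops at the same level of pollution; only the
# direction of the payments (who ends up richer) differs.
print(factory_buys_rights(factory_marginal_profit, fisherman_marginal_harm))  # 4
print(fisherman_buys_back(factory_marginal_profit, fisherman_marginal_harm))  # 4
```

Both negotiations converge on 4 units, the point where marginal profit stops exceeding marginal harm, which is the "same efficient allocation" regardless of who holds the initial rights.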
01:07:31.520 I wrote a paper in 2002 called Migration for the Benefit of All, where the whole problem was that the conservatives were for open borders.
01:07:43.560 Nobody remembers that the Wall Street Journal was leading the charge for a five-word constitutional amendment.
01:07:48.700 There shall be open borders.
01:07:51.960 And so we were trying to figure out how to get around it.
01:07:53.640 Somebody at the UN named Manolo Abella saw me give a crazy talk.
01:07:59.360 Nobody liked the talk except for Manolo.
01:08:00.980 He said, this is great.
01:08:03.380 Why don't you write it up?
01:08:04.340 So I wrote up a thing that said that the workers in each field should have the Coasean rights and license them to the employers to bring in foreigners if all of the costs are incorporated.
01:08:17.860 And what that did was say, imagine you're the only American cab driver in New York City.
01:08:24.040 You don't want to have a restriction.
01:08:26.820 You want a ton of people to come in and drive cabs.
01:08:30.380 But then you're going to get licensing income as your wage income goes down.
01:08:34.740 So your wage income goes down, your licensing income goes up, and you're not going to make a deal unless you're made Pareto improved or better off.
01:08:42.180 So imagine you do that with AI.
01:08:44.400 Imagine the idea is that the workers license the right to use the AI that is trained on this corpus of human knowledge and that their wages will go down and their licensing income will go up and the underlying market will not be destroyed.
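The acceptance condition in the cab-driver story, and its AI analogue, comes down to a single inequality. The wage and fee figures below are invented for illustration, not taken from the conversation:

```python
def worker_accepts(wage_before, wage_after, licensing_income):
    """The Pareto condition from the text: a worker signs off on licensing
    out the field (to migrant workers or to AI) only if total income,
    wages plus licensing fees, does not fall below the pre-deal wage."""
    return wage_after + licensing_income >= wage_before

# Hypothetical driver: automation pushes the wage from 50k down to 30k, but
# the field's licensing pool pays each worker 25k from fees charged to the
# AI operators, so the deal clears.
print(worker_accepts(50_000, 30_000, 25_000))  # True  (55k >= 50k)

# If the pool only paid 15k, the workers would refuse the licenses.
print(worker_accepts(50_000, 30_000, 15_000))  # False (45k < 50k)
```

The design point is that the fee is set by bargaining, not by a planner: licenses only get issued at prices that leave the workers Pareto improved, which is what distinguishes this from UBI.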
01:09:00.540 We have to try to keep AI from destroying the market because otherwise we're in command and control.
01:09:06.860 And then you have UBI, and UBI is cancer when it comes to dignity.
01:09:10.840 Here. Here's some money. Go amuse yourself. You're now useless. This is terrible.
01:09:17.280 But if you're an owner, you're saying, look, I have a right to my labor market.
01:09:22.420 I'm going to allocate so many licenses to use AI.
01:09:26.580 They're going to beat up my wage.
01:09:27.880 But I'm going to make money from the licensing, and you're going to make money from the product and the gain in efficiency.
01:09:31.960 So it's super, super exciting that Nicole was talking about this, and Jay Bhattacharya, Pia Malaney, myself, and Nicole started ideating about it.
01:09:45.320 And then we got overwhelmed with the drama of the election.
01:09:48.400 Let's get the drama behind us.
01:09:49.780 Let's fix CPI, which was broken by the Boskin Commission, appointed by Patrick Moynihan and Bob Packwood in order to artificially represent the inflation rate as lower.
01:10:03.660 That way, taxes would go up because they're shielded by tax brackets, and entitlement payments would go down because they're cost of living adjusted.
01:10:10.200 That's super important.
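The arithmetic behind that claim can be sketched with made-up rates. Suppose, purely for illustration, that the official CPI understates true inflation by one point, compounded over 20 years:

```python
# Stylized sketch: true inflation runs at 3% while the official CPI says 2%.
# Tax brackets and entitlement cost-of-living adjustments are indexed to the
# official number, while nominal incomes and actual living costs track truth.
true_inflation = 0.03
official_cpi = 0.02
years = 20

bracket_growth = (1 + official_cpi) ** years     # indexed tax-bracket thresholds
income_growth = (1 + true_inflation) ** years    # nominal wages

# Incomes outgrow the brackets, pushing more income into higher brackets
# ("bracket creep"), so effective taxes rise with no legislated increase.
print(income_growth / bracket_growth > 1)  # True

# Meanwhile a benefit indexed at 2% loses purchasing power every year against
# costs rising at 3%: after 20 years it buys roughly 18% less.
real_benefit = bracket_growth / income_growth
print(round(real_benefit, 2))  # ~0.82
```

This is the mechanism Weinstein attributes to the Boskin Commission: a single understated point, compounded, quietly raises taxes and shrinks entitlements at the same time.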
01:10:12.380 Let's reverse the whole H-1B nonsense, which was the thing created by Peter House and Erich Bloch at NSF and the National Academy of Sciences as a conspiracy to pass the Immigration Act of 1990.
01:10:25.500 I've been tracking all of these adulterations to the U.S. code.
01:10:33.140 We've been getting farther and farther from what worked by making laws.
01:10:36.940 And what I hope and pray, and I don't care whether I get a job in Washington or not, is that somebody calls and says,
01:10:45.700 Eric, what do you know about unfucking the United States of America?
01:10:49.620 Because we have made so many bad laws, whether it's the Bayh-Dole Act, the Mansfield Amendment, the Eilberg Amendment, the Immigration Act of 1990, the Smith-Mundt Modernization Act.
01:11:01.080 We've got to know how we screwed ourselves over historically.
01:11:04.800 And I don't know whether you guys even know this, but I used to be the co-founder of the Science and Engineering Workforce Project at Harvard and the National Bureau of Economic Research.
01:11:16.220 I conducted a giant survey of cell biology for the American Society of Cell Biology in order to show that people were exploiting their graduate students and that China was getting a periscope into everything we were doing.
01:11:29.580 So for a long period of time, I was very focused on policy, but there was nothing to do because basically the Dick Morris innovation was that the Republicans and the Democrats would compete to steal the silver and sell it rather than keep the American machine humming.
01:11:45.580 So what I hope and pray is that Elon and Trump and company just ask, how did we get into this terrible spot?
01:11:55.080 How can we start fixing our official statistics, our laws, our code of federal regulations, and getting rid of all of the rot of the last 32 years?
01:12:06.000 Eric, thanks for coming back on the show.
01:12:08.040 Head on over to Substack where we're going to ask Eric your questions.
01:12:11.020 This is from Phil Morrissey: if asked by the Trump administration, what role would you be best suited to fulfill, and what could you achieve in said role?