TRIGGERnometry - July 28, 2024


"The Real Risk is Totalitarian World Government" - Peter Thiel


Episode Stats

Length

1 hour and 6 minutes

Words per Minute

176.24084

Word Count

11,687

Sentence Count

492

Misogynist Sentences

2

Hate Speech Sentences

19


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ .

In this episode, I sit down with Peter Thiel to talk about his vision for the future of technology and AI, and why courage is in short supply in the modern world. Peter Thiel is a billionaire entrepreneur, venture capitalist, and venture capitalist. He s been around for a long time, and is one of the most influential people in the world in terms of venture capital and venture philanthropy. He has invested heavily in venture capital, and has been a regular contributor to The New York Times, Forbes, and The Huffington Post.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny .
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target .
00:00:00.700 Broadway's smash hit, The Neil Diamond Musical, A Beautiful Noise, is coming to Toronto.
00:00:06.520 The true story of a kid from Brooklyn destined for something more, featuring all the songs you love,
00:00:11.780 including America, Forever in Blue Jeans, and Sweet Caroline.
00:00:15.780 Like Jersey Boys and Beautiful, the next musical mega hit is here, The Neil Diamond Musical, A Beautiful Noise.
00:00:22.660 April 28th through June 7th, 2026, The Princess of Wales Theatre.
00:00:27.120 Get tickets at Mirvish.com.
00:00:31.000 We've been in a relatively stagnant period for something like 40 or 50 years.
00:00:37.720 There is a tremendous amount that's happened in the world of bits, around computers,
00:00:42.180 but many other things that would have been called technology have really stagnated,
00:00:46.820 and the sort of world of adepts has seen dramatically less progress.
00:00:51.040 Are we going to get this dangerous AI that disrupts our society and maybe it becomes this dangerous weapon,
00:00:58.960 or are we likely to get the one world nanny state that stops it from being built?
00:01:04.940 Since people are worried about the former, not the latter, that tells you you should worry about the latter, not the former.
00:01:09.440 Peter Thiel, thank you so much for giving us your time, hosting us here in your office.
00:01:15.880 It's great to be with you.
00:01:17.540 The first thing I want to ask you is actually two questions that are very tied in together.
00:01:21.240 You're one of the most visionary people, I think, that we could possibly talk to.
00:01:26.020 Where are we today, the Western world, and where are we going to be 10, 15, 20 years from now, do you think?
00:01:32.740 Well, it's hard to tell.
00:01:38.120 Questions about the future are really hard for people to answer.
00:01:40.740 My thesis that I've articulated in various forms over the last 15, 20 years is that as regards questions of technology and science,
00:01:56.960 we've been in a relatively stagnant period for something like 40 or 50 years, since maybe the 1970s.
00:02:05.840 That, you know, there is a tremendous amount that's happened in the world of bits around computers, internet, mobile internet, maybe crypto, maybe now AI.
00:02:19.340 But many other things that would have been called technology in, let's say, 1967, the year I was born,
00:02:25.720 like rockets, supersonic aviation, underwater cities, the green revolution, agriculture, new medicines, have really stagnated.
00:02:37.060 And this sort of world of adepts has seen dramatically less progress.
00:02:42.380 And this is sort of a thesis that I've articulated about, and it's about the present and maybe the last 50 years or so of our past.
00:02:51.340 It gets debated a lot.
00:02:53.760 People don't, you know, agree with it.
00:02:55.180 It's very hard to figure out because it is perhaps also a feature of late modernity that things are extremely specialized.
00:03:04.600 And so, you know, who are you to say that there's no progress in string theory?
00:03:09.980 It takes half a lifetime of study to even, you know, get a handle on it or what's going on in quantum computers or cancer research.
00:03:17.520 And so it's extremely hard to get a handle on these things.
00:03:22.440 But my sense is that in many areas of the sciences, many of these STEM fields have been quite corrupted, politicized, bureaucratized,
00:03:34.860 you know, de-risked, where people aren't willing to have bold ideas take risks.
00:03:39.240 And we have this sort of incrementalism and snail's pace type of a progress.
00:03:45.200 And it shows up in a lot of different ways.
00:03:48.960 But economically, it shows up in these sort of stagnant living standards in, you know, governments that have ever-rising debt because, you know,
00:03:59.200 it's okay to have debt and borrow from the future if you have a lot of growth.
00:04:03.040 And if you don't have growth, it's very, very dangerous.
00:04:06.520 And it shows up in sort of a variety of different ways.
00:04:11.940 And that's what I think has been the story for some time.
00:04:16.340 And probably if one were to extrapolate my median case is that that's what continues.
00:04:23.520 We continue to have, you know, a certain powerful progress in the world of bits and continued regulation, stagnation, sclerosis,
00:04:36.300 whatever you want to call it, in the world of atoms, which is, in my judgment, the more important part because, you know,
00:04:41.140 we're physically embodied beings and, you know, we don't just want an app that tells you that you're going to get dementia.
00:04:48.420 We want actually cures for that, too.
00:04:50.580 So, you know, it's not just on the level of data and information.
00:04:54.620 And you said in a speech around 10 years ago now that in the modern West, courage is in shorter supply than genius.
00:05:02.280 A, what did you mean exactly?
00:05:03.840 And B, why does that matter?
00:05:06.840 Well, there sort of are a lot of different applications of it.
00:05:14.460 But it's, you know, it's, you know, I often I often have said that, you know, our greatest political problem, in a sense, is the problem of political correctness, of the sort of conformity of thought.
00:05:30.240 And then, you know, if we're, and the question is always, you know, in order to get fresh new ideas and, you know, a larger surface area of discussion and debate, do we need, do we need great genius?
00:05:44.720 Or is it just like some sort of courage of going against certain kinds of social norms and things like that?
00:05:57.560 And, you know, and, you know, my intuition is the problem of political correctness, you know, is, is, is very great.
00:06:03.700 The pressures for conformity, you know, in some sense are, and people don't get sent to the gulag, but they, they, they somehow the felt sense is, is that these pressures are extremely high.
00:06:15.320 And that's why, you know, even a, you know, a little bit more courage would probably do a lot of good.
00:06:19.780 Peter, isn't genius a combination of the two, because you need to have the brilliant idea, but you also need to vocalize the idea.
00:06:27.400 And then you also need to implement it.
00:06:29.500 And if you think about the geniuses, and we can mention right the way through history, practically every one of those ideas was controversial.
00:06:37.880 And in some time, and at some points, even it could have ended the life of the person who vocalized it.
00:06:44.740 Yeah, that, that, that surely is, is correct.
00:06:50.640 I think I was using genius, you know, in a more narrow sense of someone who has just a very high IQ, which would probably be, you know, the, the sort of IQ mania, Silicon Valley perspective where, you know, people talk, you know, Google tries to give people quasi IQ tests and these algorithm tests when they hire people.
00:07:08.360 And, and, and, and, and then, and so I do, I do think you, you, in some sense get people who are very intelligent in, in, in some sort of definition, but, but end up being not very creative, not very impactful in, in all these other ways.
00:07:25.580 And it's, it's quite strange that, you know, the, the academia version of it would be, you know, there was, there was a type that was already going extinct when I was in college in the 1980s, which I would describe sort of the, the, maybe the brilliant, but eccentric, the eccentric professor type who had, you know, was somewhat of a polymath, had ideas about a lot of different, different subjects.
00:07:46.500 And, and, and that type has, has, has, has basically gone extinct, whereas you probably still have a lot of people in academia still who would score reasonably highly on an IQ test.
00:07:59.420 But that's a very interesting point. Can you blame people for not having courage when they're incentivized to conform? Because, you know, we can, we talk about Google. Google don't want people challenging.
00:08:12.580 Google don't want people who are, you know, so creative that they're difficult to control. They want people to come in to do the job and then leave.
00:08:21.700 Well, you can, you can, you know, it's a great deal. This is surely done at the margins. And so, you know, yeah, I'm not encouraging people to be foolhardy or to be martyrs or to be suicidally insane or anything like that.
00:08:42.580 But, you know, like that, but, you know, at the margin, I think it would do a lot of good if people had just a little bit more courage in all of these cases.
00:08:50.160 There, there are probably ways you can, you can blame people for putting themselves into these contexts.
00:08:57.280 So if, you know, if someone goes into academia and thinks that they will, they'll have this sort of creative intellectual career, you know, at some point, they should figure out that the pressures are enormous and militate against that in very powerful ways.
00:09:16.360 And, you know, they should try to figure out a way to do something creative outside of academia because they will not be able to do inside.
00:09:24.420 So, so yes, I think, I think there's a way in which if you look at it locally, it's always not your fault.
00:09:29.460 You know, you're going to Google that day and, you know, you don't want to create a hostile work environment or something by, by articulating some heterodox view on gender relations or something like that.
00:09:40.020 But then if you, if you widen the aperture, I think you can blame people a little bit more for, you know, having made the decisions to put themselves in these situations, not thinking about the social context enough and, you know, having, having had the blinders on in some sense.
00:09:54.420 Do you think that's one of the reasons why there's so many autistic people who go on and found big companies or in these huge level entrepreneurs because they see the world in a different way and they're not as connected to wanting to be part of a group as much as neurotypical people are?
00:10:15.160 Yeah, it's, it's, it's probably, probably if someone's fully autistic, there, there, there probably are ways that that's, that's pretty not helpful in, in all sorts of ways.
00:10:26.720 But, but yes, certainly there are probably people who are very mildly on the spectrum where, where it's, it's strangely been, been helpful and there probably are a number of people in tech where, where something like that is, is true.
00:10:40.960 Um, but then I, I think this is, this is not a pro Asperger's, this is, this is a, this is more a commentary on, uh, wow, our society is really insane and deranged.
00:10:51.300 Well, in a, in a, in a, you know, in a, in a healthy society, uh, someone who, who had Asperger's would, would be a less functional person and would be able to get less done.
00:11:00.780 And, uh, we must be just in an incredible social pressure cooker society where, you know, an average person with a pretty good social EQ just picks up on all these pressures and, uh, um, you know, uh, knows to censor and get rid of every heterodox idea they ever had.
00:11:21.100 Hmm. Well, very much on the point of society being deranged, I want to come back to the idea of the future. And I, I, I understand perfectly well what you say, making predictions, especially about the future is a bad idea, as they say.
00:11:32.880 However, in your career, the thing that many people will know you for is making investments or making moves that were visionary. That's why that I introduced you the way I did. I wasn't sucking up to you. It's just an observation of your trajectory through life.
00:11:48.000 So I would be curious, if you don't mind, just finding out what you see coming down the pipe. And, and we're not going to say, you know, Peter Thiel in 2024 predicted this and now he's an idiot because it didn't happen. I guess we're just curious of what you might see as the things that are likely to occur.
00:12:04.780 Well, I'm always, I'm always extremely hesitant with, um, the categories, the buzzwords. And, uh, so, you know, if, if, if we, you know, there's, in Silicon Valley tends to, you know, traffic in, you know,
00:12:18.000 mobile internet or cloud computing or big data machine learning, you know, the, the current one is AI, which, you know, people have been overusing for a long time and has gone completely into, into overdrive. And, you know, as a, as a venture capitalist or investor, you know, I, I want to invest in successful businesses. And, uh, you know, I think, I think the, the really successful businesses, um,
00:12:47.140 have to, um, have to do something that's unique, you know, it's, um, you know, uh, they have a moat or dare we, dare we say even a monopoly, um, you know, around, around the business, they're, they're doing something that, um, isn't just this commodified, uh, competition. And so, you know, there, there, there are, and I always think of a restaurant as the, uh, as the sort of paradigmatic example of a bad business. If you want, um, you know, if you want, you know, um,
00:13:15.140 nature bared red and tooth and claw, uh, bloody competition, you should open a restaurant. And, uh, I always, you know, my one heterodox idea I have is that competition and capitalism are opposites. In a competitive world, you compete away the profits and you do not accumulate, uh, capital. Competition is for losers. You want to find, you want to find, uh, something where you're doing something that's unique.
00:13:40.140 You have a significant head start. You're, you're, you're ahead of people. And so, so I don't know, on a technology side, I would, I would, I would say that I think the AI breakthroughs are important. They're going to, they're going to have an enormous impact on our, our society in, in, uh, in very different ways. Uh, but, um, as investments, they're very, very treacherous at this point. And I think it's roughly, I mean, the rough analogy is that AI in 2024 is like,
00:14:10.120 like the internet in 1999, it's clearly going to be important, big, transformative, have all kinds of interesting social, you know, political effects, maybe even effects about how humans think about themselves. Uh, but, uh, on a business level, it's very, very treacherous because there were, you know, there were a lot of different internet businesses that failed and even the ones that succeeded.
00:14:32.120 It was, you know, it was quite a rollercoaster. You know, Amazon was the leading e-commerce site in 1999. It was $113 a share on a free, on a pre-split adjusted basis on a, in terms of the price of the time in December of 1999.
00:14:46.120 By October of 2001, it was five and a half dollars. You know, you had to wait till the end of 2009 to get back to the 99 level. And then it went up, you know, it went up 25 X from there.
00:14:56.120 So if you'd held it from December 99 to today, you would have made 25 times your money, but you would have first lost 95%. And then if you'd bought it in October, 2001, you would have made 500, 500 times your money or so.
00:15:07.120 And, um, and so, so in some sense, Amazon was, was the obvious internet company to invest in. And even that was, you know, quite, quite a rollercoaster.
00:15:17.120 And my, my suspicion is that's, that's roughly where we are in AI. It's, uh, it's correct as a technology, but then, uh, you know, extremely, uh, bubbly and crazed as a, you know, as a company building thing or as a, as a sector to invest.
00:15:33.120 A lot of people are very concerned about AI from the perspective of the things that you mentioned, which is the impact it's going to make on human beings, the way we relate to each other, whether we have jobs to go to, which have been the source of not just money, but meaning and purpose to a lot of people over time, the source of social connections where you meet, you might meet your spouse.
00:15:55.120 I mean, the, the, the, the, the things that people think might happen as a result of AI could potentially transform humanity in a way that will be extraordinarily significant. Do you see that happening?
00:16:10.120 Well, um, you know, I, I suppose there are, uh, there are a range of dangers and risks. Um, there is, yeah, there are questions how it affects the labor market, whether is it fundamentally a compliment to human labor that makes humans more productive or, or is it a substitute for human labor where, uh, human workers will get paid, paid less at the end of the day? Uh, you know, most, you know, the history of the industrial revolution,
00:16:39.120 was that, uh, you know, most of the time the Luddites have been wrong. Machines, you know, didn't really replace people altogether. They freed people up to do more productive things and, um, in some sense increased the GDP per, per human being. And, uh, and my intuition is that that's, that seems like the far more likely outcome with these AI technologies that is, was, you know, also what computers did and, you know, in some sense, what, you know, what has been happening.
00:17:08.120 Since the industrial revolution, um, I think there are, you know, there are complicated distributional questions. Will, will a lot of the gains be captured by a few big tech companies? Will it be more, um, more evenly distributed in our society? I don't think people have particularly good models on that. And of course you have, you know, and of course you have also, uh, all these, uh, scarier existential risks where, you know, maybe, I mean, you know, I, I don't, I don't really believe the science fiction version where the AI
00:17:38.100 AI becomes, you know, um, you know, um, you know, a superhuman godlike being and decides to destroy the world. Why not? Uh, I, I think, I think well before you get to that point, um, it can be weaponized by humans in a military use, which is probably also just as scary.
00:17:55.500 And so, um, you, um, there's sort of all, so there are all sorts of, you know, there are all sorts of intermediate scenarios that are dangerous. But I, I, yeah, look, I, I would concede that it is a, you know, it's a fairly, uh, there are, there are some great dangers in the technology.
00:18:10.380 And I understand why people are nervous or scared about it. Um, I, however, the, the place where I do strongly come down on the opposite side of the precautionary principle and the effective altruists and the East Bay rationalists and the, you know, Eliezer, Bostrom, Cabal of, of, of people is that, uh, you know, if we, if we talk about different, you know, different kinds of existential risk in the world, you know, in the, you know, nuclear war,
00:18:40.360 there's the, uh, the, the AI that kills everybody, maybe dangerous biotech, maybe, uh, you know, um, maybe climate change or various types of, um, environmental factors.
00:18:53.360 And parenthetically, it's always interesting that the, the people who talk about these things, uh, are always just focused on their own one.
00:18:59.360 And so you can, you know, I always think there's a critique of someone like Greta that she's insufficiently apocalyptic in her thinking because she's not worried about AI and she's not worried about nuclear weapons.
00:19:08.360 And then the AI people that aren't worried enough about climate change and maybe we should get them all in a room and have them fight it out first.
00:19:14.360 But, uh, the existential risk that I always want to also put into the hopper, if, if we, if we were to have a comprehensive discussion of these risks is, um, is the risk of a totalitarian one world government.
00:19:27.360 And I think, uh, and I think, uh, and I think that, um, the answer, the implicit answer to so many of these existential risks is a totalitarian one world government.
00:19:36.360 And so, you know, Greta thinks climate change is the biggest problem. Everybody should ride a bicycle.
00:19:41.360 I would submit that, uh, the way you would actually do this would be, uh, would be, um, you know, going from the frying pan into the fire of, of, of this.
00:19:50.360 And in a similar way, if we, um, if we were to really regulate and, and stop, um, AI from a precautionary principle, you would need something like global compute governance or, or something like this.
00:20:04.360 Um, which, uh, would have to be pretty heavy handed because, you know, anyone can program a computer and it can done on this very local level.
00:20:11.360 So it has to be, you know, much more heavy handed than, you know, the international regulatory bodies that regulate, let's say nuclear weapons proliferation, where, you know, it's, it's hard to build a nuclear, nuclear weapon.
00:20:22.360 And so you, you don't need necessarily a super, um, heavy handed one world government to stop it.
00:20:27.360 You, you would for AI.
00:20:29.360 And so, uh, a lot of it, uh, has the, you know, has the character where I think that that risk is, is much greater than, um, you know, the risk people want to talk about.
00:20:38.360 And then if I, if I had to do the sort of the, I don't know, the, not sure, the contrarian take or the, you know, if you held a gun to my head and said, which do I think we're going to get?
00:20:45.360 Are we going to get, you know, this dangerous AI that disrupts our society and maybe, you know, it becomes this dangerous weapon?
00:20:53.360 Or are we, are we likely to get, uh, the one world nanny state that, uh, stops it from being built?
00:20:59.360 Um, since people are worried about the former and not the latter, that tells you, you should worry about the latter and not the former.
00:21:04.360 It's interesting you say that because you well know a Harari who's a very interesting thinker who identifies many of the, uh, global threats, the ones that you describe.
00:21:14.360 He's open in terms of calling for global coordination in order to deal with them.
00:21:20.360 Is your concern about that based on the fact that no one else is concerned about it?
00:21:25.360 Or are you saying you see steps being taken towards that outcome, the global government outcome?
00:21:32.360 Uh, I, my, my intuitions are that a, a true global government would be, uh, would be quite bad.
00:21:42.360 It would be, it, it, uh, it would have a totalitarian character.
00:21:48.360 It would have a character that there could be no escape from it.
00:21:51.360 Um, you know, my classical liberal intuition would be that the marginal tax rates would, um, be somewhere between 95 and 100%.
00:21:59.360 So like the UK then?
00:22:01.360 Yeah.
00:22:02.360 If you could actually stop people from leaving the UK.
00:22:06.360 Yeah.
00:22:07.360 With the, you know, what, uh, I don't know what, um, what Corbyn would set the tax rates in the UK if, uh, if you could actually prevent people from leaving.
00:22:14.360 But what I'm asking is Germany, if you built a wall to stop people from leaving.
00:22:18.360 But what I'm asking is Peter, do you see things that are currently happening that are taking us in that direction?
00:22:25.360 Do you see people, you know, coordinating in the shadows, so to speak, to make that world government a reality?
00:22:31.360 Well, I think, I think that is, I think in a way that is the, the implicit answer to all these, all these existential risks.
00:22:38.360 Do you think that's why they are being talked about so much?
00:22:41.360 Um, I, you know, I, I think there are, there are good, there, look, there always are true believers.
00:22:49.360 They're useful idiots.
00:22:50.360 There are, uh, people who are in part of a racket.
00:22:53.360 And so, you know, is, is environmentalism.
00:22:56.360 Are there people who genuinely believe it's a problem? Yes.
00:22:59.360 Are there people who are useful idiots and, uh, just tools for others' agendas? Yes.
00:23:05.360 And are there people who are part of a corrupt racket? Yes.
00:23:08.360 It's, and it's always, these things always have elements of all three.
00:23:11.360 If it was, if we could just collapse it to one of them, they, uh, they wouldn't be as, as, as powerful as, as they are.
00:23:17.360 But I think, um, but I think, um, but yes, I, my, my, my, my sense for it is that, uh, you know, a number of these things, the, the implicit answers, uh, require this sort of, uh, supranational coordination in a, in a very deep way.
00:23:35.360 And, and then my sort of political philosophy, uh, sense is that that kind of coordination, um, you know, would, it would, it would be very non-democratic.
00:23:45.360 You'd have, you know, you'd be deferring even more to experts, even more, um, even more to, um, you know, extremely large centralized, uh, structures.
00:23:55.360 So it would be very non-democratic, very bureaucratic, uh, probably fairly high taxes.
00:24:01.360 It's, you know, in some ways the, the kind of transformation that you've had in the, you know, as, um, as Europe has turned into the EU.
00:24:08.360 You know, it was in some ways the, the common market was envisioned in, you know, in 1979 by, by Thatcher when she was pro EU in 1979, because it would be, you know, you'd have this level playing field and you have, you'd have this, this market and it would be a way to, to weaken the unions and all these things.
00:24:27.360 It would sort of be push things in a more capitalist direction.
00:24:30.360 But then as, um, you know, as the, as the common market got created, it came with, you know, this bureaucracy in Brussels that regulated the size of bananas and everything else you could think of.
00:24:40.360 And, uh, that's not in the econ one textbooks that you have free trade.
00:24:44.360 The trade always comes with, um, a super national bureaucracy that regulates it and standardizes and things like this.
00:24:51.360 And, and, uh, yeah, my, my judgment is that, uh, that trade off, uh, would be, uh, you know, maybe it's still okay on the level of Europe because, you know, one can still leave, one can still leave Europe.
00:25:03.360 Um, but, uh, on the level of the world, uh, it would be quite another matter.
00:25:07.360 Broadway's smash hit, the Neil Diamond musical, a beautiful noise is coming to Toronto.
00:25:14.060 The true story of a kid from Brooklyn destined for something more featuring all the songs you love, including America forever in blue jeans and sweet Caroline like Jersey boys and beautiful.
00:25:25.200 The next musical mega hit is here.
00:25:27.480 The Neil Diamond musical, a beautiful noise, April 28th through June 7th, 2026.
00:25:33.480 The princess of Wales theater, get tickets at mirvish.com.
00:25:37.360 Broadway's smash hit, the Neil Diamond musical, a beautiful noise is coming to Toronto.
00:25:44.360 The true story of a kid from Brooklyn destined for something more featuring all the songs you love, including America forever in blue jeans and sweet Caroline like Jersey boys and beautiful.
00:25:55.540 The next musical mega hit is here.
00:25:58.000 The Neil Diamond musical, a beautiful noise, April 28th through June 7th, 2026.
00:26:03.800 The princess of Wales theater, get tickets at mirvish.com.
00:26:07.360 Are you optimistic, Peter, about the future of the United States or are you one of those people who looks at it now and sees that we're in a period of steady decline?
00:26:21.420 I am, I always, I always dislike the, you know, frames of extreme optimism or extreme pessimism because, you know, in some ways, you know, my question about the future, you know, maybe the place where I should disagree with the whole premise of your question is, it's not like the future is, you know, written out there somewhere.
00:26:44.880 And it is that all we have to do is, you know, and it is that all we have to do is, you know, sit back and eat some popcorn and watch the movie of the future unfold.
00:26:52.760 You know, my, my bias, it always comes down to individuals or small teams of people and that the question of agency is extremely important.
00:27:00.400 And, you know, and, you know, and we get to decide in part what kind of a future we want to build.
00:27:05.800 And, and, and if you, if you are extremely optimistic or extremely pessimistic, I think they both end up, both of those attitudes lead to, lead to a kind of, you know, lead to a kind of passivity.
00:27:24.000 Extreme pessimism, there's nothing that can be done.
00:27:26.600 Extreme optimism, there's nothing that needs to be done.
00:27:29.340 And so I think of both of them as sort of code words or euphemisms for sloth and laziness in practice.
00:27:36.100 And so, you know, probably a healthier attitude is moderate optimism, you know, moderate pessimism, where, you know, at the margins, you know, a lot can be done.
00:27:46.700 So with that, you know, big qualifier, yes, you know, there are all sorts of places where one can have very serious concerns about the United States.
00:27:57.820 You know, the deficits are out of control.
00:28:00.500 There's sort of all sorts of things that seem to be on a deeply unsustainable trajectory.
00:28:06.280 The, the thing that I think is very paradoxical about it is that, you know, maybe we have absolute stagnation or even, maybe even decline.
00:28:17.760 But on a relative sense, there's just this felt outperformance.
00:28:22.560 And I, I've started to wonder whether the, you know, the absolute crisis and the relative outperformance are somehow very deeply linked.
00:28:31.380 Because if someone who, you know, is somewhat pessimistic lists all these places where the U.S. has very deep problems, the rebuttal from people like you coming from the U.K.
00:28:44.300 will always be something, well, do you want to move to the U.K.?
00:28:46.940 Or where would you want to go?
00:28:47.960 And, and then that's very hard to answer.
00:28:51.060 And somehow, maybe, maybe, maybe it's a, yeah, it's a, there are these problems in the U.S.
00:28:56.660 And it's, it's coupled with a sense that there are, there are really no other places that are doing better at all.
00:29:04.360 Or that these, that the problems are maybe even more acute.
00:29:07.820 And, you know, the demographic crisis is more acute in Europe or the, the sort of tech stagnation is even more felt.
00:29:14.880 We, we at least have, still have the tech piece, you know, the, you know, the, you know, the IT computer piece is still working in the United States.
00:29:21.080 So, Peter, what are the things that you would change about this country in order to make it more effective, work better?
00:29:30.580 And I think one of the things that we can all agree on is tackling that deficit.
00:29:35.340 Because I'm no economist and I'm not a numbers guy, but I look at the numbers and I'm like, I'm pretty worried.
00:29:44.160 Well, it's always, you know, yeah, in theory that if you could wave a magic wand, there are all kinds of things you would, one would try to do.
00:29:53.440 You know, my, you know, probably my, my policy intuitions are, are still broadly quite libertarian in terms of what one should do.
00:30:04.580 And so, I think, I still think there is a lot that one could do by deregulating, having, you know, a less severely regulated economy.
00:30:15.880 I mean, everything from, you know, the zoning laws here in Los Angeles, you know, if you look out, you look out the window, we have all these skyscrapers, you do not, you don't, you don't see a single construction crane.
00:30:26.300 And that, you know, that tells you, that tells you something about, you know, an incredibly, you know, bad regulatory regime where it's very, very hard, very expensive to build, to build new buildings.
00:30:41.760 And, and, and there's, and, and so I, yeah, my intuition would still be that there's, you know, a lot that one can do on the regulatory side.
00:30:51.880 You know, I think, I think the, the answer that the left has is that you have to raise taxes like crazy.
00:30:59.080 I, I don't think that's the one we should try to do.
00:31:03.660 It's, you know, the, the Republicans probably don't have a great answer to this right now.
00:31:08.380 And, and I think they're, they're implicit answers.
00:31:10.420 We're just going to keep borrowing money indefinitely.
00:31:14.040 And I worry that that's not going to be adequate at the end of the day.
00:31:18.520 And we will eventually get sort of a, you know, a, a very big move to the left if we, if we don't figure out some way to get back to growth.
00:31:27.820 Can I just ask on this point, I've been wondering later, Peter, and feel, obviously, you feel free to disagree with me entirely.
00:31:33.400 But it seems to me people often talk about political polarization, and it's tangible, of course, in both our countries.
00:31:40.420 But the one thing that I'm wondering is, is the inability to deal with the deficit a reflection of that polarization?
00:31:48.660 In other words, if you were running a small company and you had 40 employees, let's say, and you hit tough times, and you said to all your employees,
00:31:57.240 guys, look, in order for us to survive as a business, and for all of us to keep our jobs, we all got to take a 10% pay cut, you're going to get less money, I'm going to get less money, you're going to have less money to spend on your family, you're going to have less money for social benefits, you're going to have less money for healthcare.
00:32:11.480 But that's how we're going to make it, as we're a team.
00:32:15.740 That works if you feel that you're one team.
00:32:19.040 But if you've got a society in which half the country, suspicious and hateful, you might argue, of the other, that seems to me to be the position where you might struggle to tackle something like, effectively, what we're doing is spending more money than we have, right?
00:32:34.360 Yes.
00:32:34.740 Do you think that these things are connected?
00:32:36.240 They are somehow connected, but probably the causation is very different from the way you're articulating it.
00:32:41.660 The way I would articulate it is that maybe a sort of representative democracy, sort of a constitutional Republican government of the sort the U.S. has, it always works.
00:32:56.440 You know, you have a lot of checks and balances.
00:32:58.420 The decision-making process isn't fast.
00:33:01.860 It requires a lot of complicated compromises to make decisions.
00:33:06.960 And perhaps it works best when you have a lot of growth going on in the background.
00:33:12.600 And so if you have an ever-growing pie, then, you know, there's always some question, how do you divide up this growing pie?
00:33:18.280 And if you're, like, a very difficult, obnoxious political actor, you don't get a bigger piece of the pie for yourself, and that sort of a person doesn't do well.
00:33:29.880 But then if the pie is not growing and it becomes this very brutal zero-sum thing where there's a winner for every loser or something like that, you know, I would expect the politics to have, you know, a much nastier sort of edge.
00:33:44.700 So I, again, my, you know, sort of a man with a hammer sees a nail everywhere, but I would say that the sort of relative stagnation that we've had, you know, I think of the polarized and nasty politics as downstream from that.
00:34:00.660 And then, you know, and then probably, you know, the kind of bad compromise you always end up with in that is, well, we just keep borrowing money because that way, you know, we can sort of pretend that we have growth.
00:34:12.780 And the future will take care of it, even though it obviously won't if the growth doesn't arrive.
00:34:19.040 Because it all comes down to a weakness of leadership, in my opinion, Peter, in that we are a society that seeks comfort.
00:34:28.540 And everything has been tailored to our own comfort.
00:34:32.340 So why are our politicians going to make us feel discomfort?
00:34:35.960 And that discomfort that we feel is just going to make, it's going to be even more shocking because we've done our, we've spent our whole lives avoiding it.
00:34:47.600 Yeah, surely there are, there are elements of all of these things that are correct.
00:34:52.040 But it's, you know, it is, it is, it is sort of unclear what kind of leadership one is likely to get in a, you know, in a deeply stagnant zero sum world.
00:35:03.320 And it's, you know, it's, it's, it's likely to be, you know, very polarized and not very charismatic and not very unifying.
00:35:13.540 So, Peter, that being the case, effectively feels like every rock we lift with you, stagnation is under it.
00:35:21.880 Is there a way for, for our, it almost doesn't feel like you're just talking about the West.
00:35:26.900 You're talking about the entire world really at this point, right?
00:35:29.120 Is that fair?
00:35:29.520 It's, you know, there's a, there's a way that the crisis takes different forms.
00:35:34.820 But, you know, I would say in the, the developed countries, I always think the progress requires us to do new things.
00:35:42.680 And, and so it, if the younger generation will do better than their parents, we have to have some kind of innovation.
00:35:49.900 You know, there, there may be other ways to do it, but, but, but technology, I think is, is this incredibly technological progress, scientific progress.
00:35:59.520 Are these incredibly vectors for the developed world.
00:36:02.520 You know, for the developing world, um, there probably is some kind of globalization story where, you know, China maybe does not need to invent anything new.
00:36:12.260 If they just copy or steal or whatever, all the intellectual property from the West, maybe they can just catch up to our, to our living standards.
00:36:20.400 And, uh, and, uh, and then we can get into questions, you know, whether, whether, um, globalization without technology can work or how well that's going to work.
00:36:29.320 But there, there is, there seems to be, there's some kind of globalization convergence story that one can tell for, um, you know, um, the, um, the, the less developed countries.
00:36:39.920 But I always think, uh, yeah, if we divide the world, you know, and again, if we went back to the fifties and sixties, you would have divided the world into the first world and the third world.
00:36:48.200 The first world was the part that was technologically advancing.
00:36:51.140 The third world was just sort of messed up and stuck.
00:36:53.340 So it was a pro tech, uh, story, but a non globalizing thing.
00:36:57.580 They were just these separate worlds.
00:36:59.340 And now we divide it into the, uh, developed and developing worlds, which is, um, a pro globalization story because it's a story of convergence.
00:37:06.900 The developing countries will become developed, but then it's also, uh, implicitly a story of stagnation where the developed world is that which is done, finished.
00:37:15.300 There's nothing more to do.
00:37:16.140 We are developed.
00:37:17.380 And so, yeah, so the, the kind of, I don't know if you, if you wanted, I don't know, a slogan or something would be something.
00:37:23.340 Like, you know, how, how do we, um, um, um, re start developing again?
00:37:30.480 How do we have progress in the so-called developed world?
00:37:33.980 How do we move beyond the so-called developed world?
00:37:36.740 Right.
00:37:36.980 It's always, you know, it sounds good, but it's actually this very pessimistic description of, of where we are.
00:37:42.580 Well, we don't know what a woman is and we're told that's progress, but in terms of actual progress, isn't, isn't AI the answer here?
00:37:51.820 It boosts our productivity minimum, as you say, if not just takes over and makes everything free of charge effectively.
00:37:57.940 Isn't that how we get out of this?
00:37:59.760 Is that possible?
00:38:00.380 It's, um, well, I, I, I certainly think it, it can help and it's something that should be, that should be pushed.
00:38:08.500 But this is again where I, I, I would come back to the, um, the, um, the internet circa 1999, which, um, you know, it led to a lot of great companies.
00:38:17.980 Um, it, uh, it probably did increase the GDP, you know, some, it did increase productivity some, but, um, you know, in a sense, when this was the only new thing that really happened in the last quarter century, um, it, it, it probably was not enough to, you know, transform the living standards.
00:38:38.740 We had, we had this manifesto we wrote, uh, for my venture fund back in 2011, where we had the tagline, you know, they, they promised us flying cars and all we got was 140 characters.
00:38:48.560 And, uh, and it was not meant, you know, it was, you know, in some ways there's all these ways you can make fun of Twitter or I guess, uh, now, now X and, uh, you know, where, uh, but, but it, it worked on the level of a business, right?
00:39:01.660 It was, you know, a few thousand people, they had very cushy jobs, um, they could work from 10 a.m. to 3 p.m. and smoke marijuana at the office or whatever they were doing.
00:39:12.080 And so it, it, it worked on the level of the business, but it wasn't necessarily enough to, you know, increase living standards across the board for our society.
00:39:21.220 And then that's, you know, that, that I, I, my sort of placeholder would be that AI is something like the internet.
00:39:28.720 It will, you know, you know, yeah, there are all sorts of places where you can, you know, ring efficiencies out of the system.
00:39:35.780 Um, but, uh, but I, I don't know if it will be as, um, as economically transformative.
00:39:41.220 And you mentioned, as, as we need, you mentioned social media.
00:39:45.600 Um, I think a lot of people, well, I'm certainly one of them, rather than saying a lot of people, I'll say, I am concerned about the impact social media is having on our brains.
00:39:54.820 Uh, not only young people, but the way we relate to each other.
00:39:58.120 And increasingly, um, I see with younger generations, we've had numerous people on the show where from younger generations where it's clear, we all know because we grew up without the internet,
00:40:09.020 that most of the way the conversations are had online is bullshit, but younger generations don't.
00:40:15.320 Um, are you worried about what social media is doing to us?
00:40:19.200 You know, I, I always think it's too easy to turn social media or various other Silicon Valley, uh, tech companies into, into the scapegoats for all of our problems.
00:40:28.860 And, uh, surely the, the, the bigger problem, you know, maybe, you know, surely the bigger problems are things like, um, the failure of, of the schools, the wokeness of, you know, K through 12 schools, the, uh, derangements of the universities.
00:40:49.220 The, the, the, the, the, something that's gone really haywire on the educational thing.
00:40:53.440 Um, and then there probably are, there are ways in which, um, you know, I, I, I think a lot of younger people don't know what they should be doing with their lives.
00:41:03.480 Uh, and this, this again would be more, more of the stagnation than that you're spending too much time on, on TikTok or, or something like this.
00:41:12.260 Um, and you know, uh, there are things I, I, I don't, I don't like, I don't like all these things that push us towards conformity.
00:41:20.200 I think, you know, the critique of social media that I would, the political critique I would have would not be that it's polarizing our society, that actually that it's homogenizing our society.
00:41:28.700 Um, there's, there's less heterodox thinking, but, uh, but again, if, if you think of it as a compliment or alternative to the mainstream media, um, we probably had still have a wider range of ideas.
00:41:42.260 That you can explore on the internet than you could before.
00:41:44.680 So, uh, so yeah, there's probably, I don't know, there's something probably wrong with radio and television and, and all these forms of media also, you know, in some ways made people dumb.
00:41:53.260 In some ways, uh, you know, people shouldn't be working all the time, you know, that you have some downtime, some entertainment.
00:41:59.660 And, uh, if you, if you, if you think of it as a, you know, is it, is it really, is it really worse than television was for people?
00:42:08.620 It's an interesting point.
00:42:09.920 I mean, I mean, so people would actually argue that social media is a pipeline.
00:42:14.080 I mean, if you think the wokeness, let's compare it to a virus.
00:42:16.960 I mean, that's the standard, uh, metaphor.
00:42:20.540 It's really the, the, it's, it's how it's, how it transmitted really into everybody's brain.
00:42:27.080 It started at the university and then it went into, yeah, it leaked out the lab into Twitter, into Facebook, into Instagram.
00:42:36.300 And that's when it started to proliferate.
00:42:41.700 Yeah, but I, I, um, I still, I, I still think that was not, again, we can, it's very hard to know these, these cultural arguments, but I don't know.
00:42:52.460 I, I, I'd be open to sort of a religious interpretation that it's, um, it is, you know, it is, uh, Christianity, the, you know, the main religion of the Western world, uh, you know, it always takes the side of the victim.
00:43:05.360 And, uh, and there's something where, um, it is like some kind of deformation or intensification.
00:43:13.080 And maybe you should think of wokeness as ultra Christianity or hyper Christianity.
00:43:17.960 It's, uh, it's just like a extreme intensification.
00:43:21.500 And, uh, and, you know, it's maybe there's no forgiveness.
00:43:24.120 And so it's sort of, uh, it's, uh, it's, it's, it's, you still have original sin and you have all these bad things that happened in the past.
00:43:30.220 The past is terrible and you can never overcome it, but, uh, but there's, there's surely is a religious interpretation of this is sort of, you know, what, what, what happened is, um, as let's say the church lost a certain amount of authority, but people didn't become, you know, rationalist, atheist people.
00:43:45.700 They, uh, they, uh, they, they, they went into the sort of, uh, woke, uh, religion, which, you know, has to be, which I would interpret as, as, you know, a certain, you know, extreme form of Christianity.
00:43:59.440 Yeah, because, you know, there's a religious interpretation, there's, you know, there's an economic one, you know, it's, uh, there's a, um, there is a, uh, there is a, uh, there's an educational one.
00:44:10.440 And then, you know, there's obviously some technology piece, but, uh, but that was, you know, it, I don't know, it was probably channeled by, you know, I think, I think the liberal, the bad liberal idea has been channeled by Hollywood for decades.
00:44:22.660 Hmm. And because it's, it's an interesting point about the, you know, you were saying about bad Hollywood ideas, because it, it seems to me that we were sold a lie with, with the whole new atheism movement, where it was kind of said, we don't need religion anymore because we have rationality.
00:44:41.140 We have science, we have facts, science, rationality, and facts.
00:44:45.580 I mean, great, but they're not going to fill that particular part of you that needs filling that religion does so beautifully.
00:44:52.120 Well, it, you know, it's, it's, you know, I, I, I always, I always think one should try to steal man.
00:44:58.620 So there are all these things I disagree with, the new, disagree, disagreed with the new atheists on.
00:45:03.700 But if I had to steal man, new atheism circa 2005, um, you know, I think it was a very politically correct way to be anti-Muslim.
00:45:13.880 You sort of grouped all these religions together, Judaism, Christianity, Islam, a bunch of others, and then, uh, they're violent and intolerant, and they just randomly kill people, and, um, and it was a problem with Islam, or maybe fundamentalist Islam, or, or, or something like this.
00:45:31.760 It was sort of a politically correct way, um, to be, and, uh, to be opposed to that.
00:45:36.900 And there was surely some, some need for that.
00:45:41.300 Maybe there still is today, if you look at, you know, um, you know, the, um, the sort of, uh, I don't know, um, murderous insanity of the Hamas, uh, people in Gaza, and things like that.
00:45:52.680 Um, and then, um, and then the, um, the sort of geopolitical way in which it lost its way is, at some point, um, you know, the crisis, the, the, the danger to the world.
00:46:06.900 The, the, the, the West is surely more, uh, from communist China than it is from, um, you know, um, medieval Islam, or something like this.
00:46:17.380 And, uh, and I think the new atheists did not have anything to say about communist China, which is, you know, it's, it's, it's, it's, it's a consensus theory of truth.
00:46:28.440 It's, uh, you know, it's a social theory of truth.
00:46:30.660 It's the wisdom of crowds or the wisdom of the communist party, which somehow distills the, the collective.
00:46:36.120 It is, uh, it claims to be scientific.
00:46:38.740 You know, it's probably not, but of course the word science always gets misused.
00:46:42.420 It's almost always, whenever people use the word science, it's almost always a tell that it's not science.
00:46:46.560 So it's, you know, we don't call it physical science or chemical science, um, but it's social science, political science, climate science.
00:46:53.640 So it's, you know, so.
00:46:54.540 That's how you know it's bullshit.
00:46:55.480 I'm in favor of, I'm in favor of science, but I'm not in favor of, uh, people using the word science most of the time.
00:47:00.500 And, uh, and, uh, and so it's scientific socialism and, uh, the new atheists were, were, were, um, you know, they, they were, they were good at, um, you know, um, explaining why bin Laden was a bad person.
00:47:15.520 They were a lot weaker when it came to Xi Jinping thought.
00:47:18.520 Because what the new atheists did is they took away the idea of religion.
00:47:23.920 The flaw with the new atheists is they didn't really know how to replace it.
00:47:29.220 And what you created was a vacuum and something is going to fill a vacuum.
00:47:34.360 Uh, yeah, I mean, there's sort of a lot of different levels.
00:47:37.160 I, I, I, I sort of, I, I don't really like going as much as you're going into the sort of, uh, spiritual, moral, life, life direction.
00:47:45.780 Uh, uh, I, I think it's, if you say it's a, a critique of things that, uh, yeah, it's, it's a critique of societies that are organized in a certain way.
00:47:56.200 Um, and I, I think it was important to have a critique of a medieval Islamic society that, that was, that, that, that, that, that there were a lot of things that were not desirable about that.
00:48:06.900 Um, I think it is perhaps equally important to have a critique of a totalitarian communist society.
00:48:14.320 Um, and I think that, in my judgment, that is a greater threat.
00:48:18.120 And, um, and that's, that's one where, um, there's something about the methodology and the approaches, you know, they, they weren't able to say.
00:48:25.940 Right. Yeah. And so, yes, you know, maybe, maybe, uh, religion sometimes brings very bad things out of people and we should find a way to criticize religion when it does, does that.
00:48:35.220 Uh, but the, the notion that only religion brings bad things out of people, you know, maybe you can defend this in 1780 before the French revolution, but surely that's been, that's been out of date since 1789.
00:48:48.180 Right. Well, it's interesting that you mentioned, um, how you see comparatively the threat of communist China and the threat of Islam, because, uh, as I'm sure you're well aware, uh, certainly in Europe, particularly on the right, the concern about the demographic dimension of that, the concern about the fact that European societies are failing to integrate their Muslim populations.
00:49:12.560 Well, certainly less well than the United States is now giving rise to very strong sentiments about immigration generally, but about Muslim immigration in particular.
00:49:23.420 And a lot of people in Europe would say, actually, you know, Muslim terrorists are way more likely to have a, a, a material impact on my life or a grooming gang in England or whatever versus communist China way far off in the distance.
00:49:36.620 It's not really affecting me personally. How do you see those two threats and why do you say you're more concerned about China?
00:49:42.560 Um, yeah, well, they, they, look, they, they, um, there probably are, are ways that one has to be able to talk about more than one thing at a time.
00:49:53.660 Yeah.
00:49:54.020 But, uh, and I, I, there, there, there, there probably are all sorts of things that, uh, where people were, you know, too cavalier about these things.
00:50:08.880 Uh, you know, the, the, the, the, the, the number, the number, the demographic number that I've, I've seen, um, is, and if you look at continental Europe, so not the UK and not, um, not Russia, Soviet Union.
00:50:21.200 But, um, um, in 1930, um, had something like 10 million Jewish people and something like 5 million Muslims, mostly like in the Balkans.
00:50:33.000 And, uh, today it's something like, uh, maybe less than 2 million Jewish people.
00:50:38.600 Uh, so there's a Holocaust and then a lot of Jews left, uh, and something like 50 million Muslims.
00:50:44.960 And so the ratio of Jews to Muslims went from twice as many Jews to 25 times as many Muslims, you know, less than a hundred years later.
00:50:53.060 And if you have a 50 to one demographic change, surely that's, that's something, you know, one, um, one, one, one, one should have thought about what that, what that meant.
00:51:04.220 And then people were, you know, too cavalier about, it doesn't matter because the education institutions work great.
00:51:11.300 And people, all these people will become, you know, um, modern liberals and productive members of our society.
00:51:18.660 And so it was, it was, it was, yeah, there were, there were demographic questions people didn't want to talk about.
00:51:23.020 There were educational things people didn't want to talk about.
00:51:25.280 And they, they were, they were, they were all linked to the, uh, the, you know, the, the, the way I see the China one is, um, sort of qualitatively quite different is that it is, you know, it is in some.
00:51:38.940 Um, it is in some zero, you know, it is, it is determined to beat the West, to catch up to the West in these, you know, questions of science and technology.
00:51:53.060 And then, uh, you know, in some ways, uh, to, you know, exert some, some leverage through that where, where it can, you know, it can dominate the planet.
00:52:03.540 And, uh, why do you say that, Peter, what's your evidence for that?
00:52:05.620 You're, you, you're someone in, in the public space who seems to be uniquely vocal about this.
00:52:11.260 Why are you so focused on that issue?
00:52:15.280 I, you know, I think, I think that that's, uh, well, I don't know, I think there's, there's different, different levels.
00:52:26.020 One could, one could, uh, look at it, but, uh, it is, um, I, I think that's the way the, the China,
00:52:35.620 this leadership sees it.
00:52:36.620 Let's, you know, so it's, it's, it's always, you know, you have these sort of questions about the, the, the, the, the Thucydides trap, the, the rising power meets the, the great existing power.
00:52:47.380 And does this often lead to conflict?
00:52:49.180 And, and, and I think in the Western world, you know, we've, uh, people have generally looked at this, you know, rather optimistically.
00:52:58.560 I don't, I don't really think that's the way people in, in China think of it.
00:53:02.940 They, they, they think of themselves as, you know, on an, on an ever greater collision course with, with the West.
00:53:09.280 And then, you know, that on some level, maybe it leads to armed conflict over Taiwan.
00:53:14.280 Maybe, maybe it leads to some sort of, um, really violent decoupling in, in, in, in different ways.
00:53:20.960 Um, and, uh, and then that's something, you know, that I think, you know, we, we have to think about very hard.
00:53:28.100 Peter, there are people who, sorry, there are people who go, look, the threat of China has been wildly, you know, blown out of all proportion.
00:53:37.140 And they'll point out to the fact of, you know, the demographic problem that China has with the old people, which is due to the one child policy.
00:53:44.540 They'll look at the fact that the economy is not doing particularly well.
00:53:49.040 In fact, some people are predicting it to go into recession in the next couple of years.
00:53:54.200 So we're overstating the threat of China.
00:53:57.360 You know, where would you push back on in those points?
00:53:59.700 Well, I, I certainly think we understated it for a very long time.
00:54:03.820 Yes.
00:54:04.120 And so, uh, and there was probably, I don't know, there's sort of always, I think, a prehistory where, uh, in, uh, in 19, uh, 1989, you would have, you know, I always think, you know, we had, you had Tiananmen in June of 89, the Berlin Wall comes down in, uh, November of 89.
00:54:21.300 If it had been reversed and Tiananmen happened in June of 1990.
00:54:24.300 So in summer of 89, Brent Scowcroft, the Bush 41 national security advisor, flies to Beijing, reassures them, we don't care about all these people who are killed in Tiananmen because you're anti-Soviet.
00:54:34.240 You're blocking the Soviet Union.
00:54:36.020 Six months.
00:54:36.620 Um, and if it happened the other way around, maybe we would have, uh, we would have rethought the China thing back in 1990 or, or something like this.
00:54:45.480 And, and there were, yeah, there were surely, um, a lot of, you know, very dubious decisions.
00:54:52.300 There was the decision to admit China at the WTO there, um, and, and sort of to, you know, to hollow out, you know, a great deal of the, of, of, of, of, of the economies of, of the Western world.
00:55:03.280 Um, and, um, you know, maybe it doesn't matter if we're, if we're only concerned about, you know, how much it costs the consumer to, you know, to buy, you know, to buy, uh, to buy an electric car.
00:55:12.440 But if, if, if these things have a military dimension, um, and, uh, you know, you no longer have a shipbuilding industry in the UK, uh, and then you will not have a Navy and you will not, and the UK will not have a role to play in protecting Taiwan.
00:55:25.180 And, uh, and so, so, you know, there were, there were sort of these, these, uh, these, um, these, uh, these questions that one should have thought about.
00:55:33.780 Uh, and so I think there was, there was a great power version of this.
00:55:37.900 And then, uh, I think there was also an ideological version of this where, uh, where maybe, you know, maybe there was a way to, to, um, manage the rise of China.
00:55:48.540 If it had been, uh, if it had been transformed into a liberal Western democracy, you might've still had rivalry, might've still had, it was, you know, it was, it was non-trivial to have the handoff from, you know, the British empire to this American centric world.
00:56:02.260 But there was a way, there was a way that could work.
00:56:05.560 Uh, and then, uh, and then there was some sense in which China was just not becoming a liberal democracy.
00:56:11.500 And this is sort of where the, you know, Fukuyama end of history thing was, uh, has been, has proven to be comically wrong.
00:56:18.460 And people should have figured that out much earlier, you know, and I think they would have figured it out in June of 1990 if Tiananmen had happened one year later.
00:56:26.680 And, but for the, because of that one year delay, it was something like, you know, maybe that maybe took until the Trump presidency that this even, you know, this even started to, to, to register, uh, to register as an issue.
00:56:40.620 Um, and yes, I, I, I, I, you know, I'm not, I don't, I don't think we should go to war with China.
00:56:48.240 Um, I don't think, um, but I think, I think, I think we should be very realistic about how, how deeply misaligned we are.
00:56:57.840 Uh, how, um, you know, how the, uh, sort of totalitarian ideology, uh, is deeply incompatible with our values and, and all these ways where, uh, where, um, you know, in some sense, China wants to become America and, and, uh, that's, you know, wants to become the leading power.
00:57:21.240 And that's, that's a, you know, that's a setup for, for a very, very difficult thing to manage.
00:57:25.860 Peter, it's been, let me ask, actually, a couple more, uh, we've got a little bit of time.
00:57:31.060 Uh, on geopolitics, Peter, you mentioned totalitarian ideology.
00:57:36.760 We have Iran, we have China, we have Russia, all making moves, to put it mildly, around the world.
00:57:45.100 How do you see the geopolitical situation?
00:57:47.180 Well, they are, there's, there's some way where they are, they're all entangled.
00:57:57.920 You know, they're all, they're all entangled with each other.
00:58:03.940 There are all these things one has to, you know, one has to think of, of separately.
00:58:08.980 Um, it's, uh, I don't know, there's a lot of things one could say about, about each of them.
00:58:17.280 You know, probably, um, there probably are, it's, it's probably, there, there's probably a way that, uh, um, the Middle, Middle Eastern policy of, of, of the US, of the Western world should be focused extremely squarely on Iran and the Iran problem.
00:58:35.720 And I think there are critiques one can have of the neoconservatives, of Obama, of all, all these different people the last, uh, 20 years where the focus was on Iraq or on, um, on all these different, all these different things.
00:58:49.060 And it ended up being a distraction from Iran.
00:58:51.580 And the reason I would say Iran is the most important one is, uh, you know, if, if they, if they achieve a nuclear weapon, um, I think that has the effect of radically destabilizing the Middle East.
00:59:03.800 I don't think they would use the nuclear weapon, but it, it would mean that they could, um, they could, uh, support Hezbollah, Hamas, other things with far more impunity.
00:59:12.840 And you'd, you'd get sort of a violent regional war.
00:59:16.360 You know, the Korean war starts in 1950, one year after the Soviet Union gets the bomb.
00:59:20.780 Vietnam war starts 1965, one year after China gets the bomb.
00:59:24.300 And, um, because the bomb means that we can't really, um, we can't really hit back at the, the people who are supporting the North Koreans or the North Vietnamese.
00:59:35.900 And so you get a very nasty regional war.
00:59:38.560 And that's, that's, that's why, uh, I think you want to, you want to do a lot to prevent the Iranian, uh, the Iran from getting the bomb.
00:59:46.100 And that's, that's, that's, you know, that's the Middle East focused that, you know, there's, there's a, there's a great deal that one, you know, that one can say about, uh, about, about Russia.
00:59:57.240 Um, I think I was probably in 2016, I, you know, I gave two, two speeches, uh, uh, for Trump at the, one of the convention and one of the Washington press club.
01:00:08.280 I was not pro-Russia, but I was, I was sort of anti-anti-Russia involved that this was not the battle we have.
01:00:15.200 You know, we have a bigger crisis with China and it's, it's a distraction from that.
01:00:19.520 Um, and, uh, you know, the, the, the place where I'm a little bit more confused on that at this point is that, uh, I, you know, I think in some sense, uh, I think of Russia as, as a, you know, the, the Ukraine-Russia war is almost already a proxy for the conflict over Taiwan.
01:00:36.940 And in some ways, um, Russia is a, is a Chinese client state of sorts.
01:00:41.660 It's like North Korea or, or something like that is very different, but, um, but, uh, and then, uh, and then, and then the real China.
01:00:49.520 One of the challenges is China, which, uh, in some ways, um, you know, uh, um, I mean, maybe the broader Chinese playbook is to sort of, uh, you know, uh, organize the developing countries against the developed world.
01:01:07.900 And this is sort of, you know, Iran is, you know, this poor country in the Middle East versus the wealthier Saudi Arabia or something like this.
01:01:17.100 And, and then Russia is the, you know, the former Eastern bloc country against Western Europe.
01:01:22.600 And then there's a version of that, of that playbook in, in many other parts of the world that China wants to play.
01:01:28.000 That's, yeah, that's, that's where the, the problem of, of, of, of these is, is, is the way that they're all entangled with each other.
01:01:34.760 You know, I was, I was going to actually ask because we've touched on it, but we haven't spoken about it, which is Taiwan.
01:01:42.160 Taiwan, I mean, how do you see that situation evolving?
01:01:51.640 It's, um, man, it's, it's, it's, it's, it's a, it's a, it is a, it's quite a big black box.
01:02:03.240 I, I don't, I don't know if we're capable of defending Taiwan.
01:02:08.480 And so I think we have to somehow be realistic about what our actual, uh, military capabilities are.
01:02:15.820 And, um, um, but I, I think if, you know, if the, if the Taiwan crisis comes to head, I don't know if we end up with a, with a, uh, you know, a full scale war with China.
01:02:27.380 I think you end up with, you know, with extreme economic decoupling.
01:02:31.700 So, uh, you know, uh, I, I still don't think TikTok will be banned until 24 hours after China invades Taiwan.
01:02:39.360 And, uh, and, you know, you got the, you know, you got the Nord Stream pipeline between, uh, Russia and Germany, and we have the equivalent of a hundred pipelines between China and the West and the pipelines.
01:02:51.340 They will all blow up the day of the, of the Taiwan invasion.
01:02:55.620 And then, um, and then I think, um, we would be well advised to, uh, think about the decoupling to prepare for the decoupling in advance and, uh, not have this fake notion that the coupling somehow creates stability.
01:03:11.340 In the case of the Nord Stream pipeline, the coupling of Germany and Russia led to instability because it made Putin think you could invade Ukraine and Germany would not go along.
01:03:20.240 And then the Germans didn't understand anything about energy.
01:03:22.880 And so they, they, they were actually tough on Ukraine, but it almost blew up the whole economy.
01:03:27.940 And, uh, you know, I think the China, Taiwan thing, you have to think of the 100 pipelines between the West and, um, and China will blow up.
01:03:37.380 And, uh, surely it's better if that happens on our timetable than theirs.
01:03:42.120 Well, hey, at least there won't be a global government, right?
01:03:46.280 Not for the time being.
01:03:47.420 Not for the time being.
01:03:48.340 Peter, it's been an absolute pleasure talking with you.
01:03:50.340 We're going to go to locals to ask our supporters questions before we do.
01:03:54.000 The final question of the interview is always the same, which is, what's the one thing we're not talking about?
01:03:58.660 You talked earlier about political correctness, preventing people from saying what they should.
01:04:03.000 What should we be talking about?
01:04:06.120 Man, it's, it's just, it's, it is, it is just always this, this crisis of the West, how we get back to the future that we've been going through.
01:04:14.580 And then that, that, that's, that's surely, uh, you know, how do, you know, how do, how do, how do we create a, a better world for the young generation in, in, in, in these Western societies?
01:04:26.000 Well, give us some ideas on that.
01:04:27.380 It's, uh, it's, uh, it's, you know, I, I, I think so much, the, I, I, I always say the paradoxical answers, you know, maybe, maybe, maybe I'll disagree with the premise of the question, you know, undercut this interview too much.
01:04:40.700 It's, it's, it's always the, the, the UK bias is too much on the level of speech, too much on the level of, you know, sort of some Oxbridge, uh, rhetoric, debating society, uh, and the UK.
01:04:54.220 He's having a go at me, man.
01:04:55.240 Yeah, exactly.
01:04:56.000 Well, right, done.
01:04:56.640 I just did a great speech.
01:04:57.660 You did very well.
01:04:58.760 You're, you're fantastic at, at, at that, that sort of a thing.
01:05:01.720 And then, um, but, you know, uh, the sophists, what they have in common with the biblical God is they believe in the omnipotence of speech and, uh, and it's also, we just, you know, we need to act and we need to do things.
01:05:14.660 And, uh, and, uh, you know, I, I, I, you know, I, I think talking about it is perhaps necessary, uh, but it surely is not sufficient and, uh, we need to actually act on things and, uh, and build the future.
01:05:27.840 As a great philosopher once said, uh, a little less conversation, a little bit more action, please.
01:05:33.100 There we go.
01:05:33.680 We end the interview with me getting criticized by Peter Thiel.
01:05:36.600 Head on over to Locas, where I'm sure he'll do a lot more of that.
01:05:40.700 Peter, you're given the option of following the teaching and example of just one philosopher in all human history.
01:05:47.040 Who would it be and why?
01:05:57.840 Thank you.
01:06:05.700 Thank you.