The Tucker Carlson Show - February 21, 2025


Ray Dalio: America’s Hidden Civil War, and the Race to Beat China in Tech, Economics, and Academia


Episode Stats

Length: 44 minutes
Words per Minute: 152.04
Word Count: 6,831
Sentence Count: 512
Misogynist Sentences: 1
Hate Speech Sentences: 5


Summary

In this episode, Tucker Carlson and Ray Dalio discuss America's internal conflict and political polarization, the national debt, the AI transformation and the technology and manufacturing race with China, how universities and work may change, and why community matters more than wealth for health and happiness.


Transcript

00:00:00.000 Want to own part of the company that makes your favorite burger?
00:00:03.420 Now you can.
00:00:04.560 With partial shares from TD Direct Investing, you can own less than one full share,
00:00:08.380 so expensive stocks are within reach.
00:00:10.380 Learn more at td.com slash partial shares.
00:00:13.100 TD, ready for you.
00:00:15.540 So I've heard you say that the United States is in a civil war,
00:00:19.300 and I think most Americans don't perceive that.
00:00:22.540 Can you tell us what you mean by that?
00:00:23.980 Well, what I mean by a civil war, I should say a type of civil war, right?
00:00:29.220 And what I mean is that there are irreconcilable differences
00:00:35.080 that each side is willing to fight for in order to get the outcomes that they want,
00:00:45.000 and that in that environment, the issues of how does the legal system work,
00:00:50.460 is that going to stand in the way of that fight,
00:00:53.800 or is there going to be a fight that will make the cause more important than anything?
00:01:03.340 So that's the type of situation that we're in.
00:01:07.220 And those gaps, we understand there's wealth and values gaps that are entering into this.
00:01:13.660 This is, we've seen this through history.
00:01:16.660 So where that goes is a different question,
00:01:19.360 but we are in that type of civil war, are we not?
00:01:22.420 Clearly we are.
00:01:23.680 Clearly we are.
00:01:24.840 How are they resolved?
00:01:26.820 I mean, clearly they can be resolved through violence,
00:01:29.020 but what are the other ways you resolve the kind of conflict we have?
00:01:32.480 Normally, they're resolved through conflict,
00:01:36.500 because you get to the point where both sides can't reach agreements.
00:01:41.260 Both sides don't even want to talk.
00:01:43.740 Both sides don't want to respect the rule of the law.
00:01:48.600 So when we're dealing with things like Sanctuary City issues,
00:01:51.900 and we're dealing with enforceability,
00:01:54.300 who has the enforceability, okay, and you almost have to play out,
00:01:59.460 okay, enforceability means police forces and such things.
00:02:03.340 Right, people with guns, yeah.
00:02:05.540 Right, people with guns and, you know, causes.
00:02:09.080 Just because the legal thing says they shouldn't do that,
00:02:11.900 that's not going to stand in the way.
00:02:13.340 We have that kind of a situation.
00:02:15.620 So we are probably past the point of being able to resolve that
00:02:21.220 by compromise and empathy and all of that.
00:02:27.160 So normally it goes that way.
00:02:29.520 I mean, the only thing that can be done is to have the fear of that
00:02:35.640 create a necessity for having another path.
00:02:43.320 You know, like we were talking about the debt situation.
00:02:46.200 Yes.
00:02:46.620 So can there be a fiscal commission that gets together
00:02:51.480 and then achieves those things or not?
00:02:54.300 I think it's unlikely.
00:02:55.800 I think we're going to more fragmentation.
00:02:58.500 States, there are some states and other states,
00:03:01.860 I think we're going to see more fragmentation.
00:03:04.520 And so, but, you know, it's like this dynamic through history.
00:03:09.800 This isn't the first time this happened.
00:03:11.140 This happens repeatedly through history.
00:03:13.080 And usually, you know, it runs its course.
00:03:18.700 So the way our leaders in the United States have dealt with it
00:03:22.000 over the past 30 years has just been to ignore it,
00:03:25.140 completely just ignore it.
00:03:26.040 Well, there's a cycle, you know, the cycle was, let's say, I don't know,
00:03:33.360 Ronald Reagan and Tip O'Neill, and they get together
00:03:36.460 and they were operating in a certain way.
00:03:39.320 And the manifestations of the circumstances, you know,
00:03:44.480 the manifestations of debt or wealth gap or values gaps were not as great.
00:03:48.980 So you didn't, over that 30-year period, have as much polarity in many different ways.
00:03:56.480 So now you've gone to greater polarity.
00:03:58.800 If you watch statistics, everything I do comes from measuring things.
00:04:03.220 So I look at statistics.
00:04:04.420 The gap in measuring conservative or liberal votes in the House and the Senate
00:04:17.480 is the greatest gap since 1900.
00:04:21.740 And the voting across party lines is the least since 1900.
00:04:29.100 So you see this gap.
00:04:31.600 You see it in the elections, right, the green and red.
00:04:35.680 So the blue and red.
00:04:37.780 So we're not, it's not just an evolutionary, it's where we are,
00:04:44.640 where we have gotten to, that is the irreconcilable questions.
00:04:49.320 Does the Supreme Court, you know, we thought about the Supreme Court differently
00:04:54.620 not long ago, right?
00:04:56.800 The Supreme Court was the Supreme Court.
00:04:59.660 And so now it's different.
00:05:01.600 So what accounts, I mean, there are a million ways to measure this and all of them.
00:05:07.160 Do you agree?
00:05:08.100 Well, of course I agree.
00:05:09.440 Of course I agree.
00:05:10.240 And I agree with you that every measurement shows the same result, which is the country
00:05:13.580 is polarized.
00:05:14.000 Okay, so we know where we are.
00:05:15.600 But what, completely.
00:05:17.340 And we're not sure how it's resolved.
00:05:19.200 But I think it's also worth pausing to ask,
00:05:21.500 what happened, what was the change that led to the polarization that was unimaginable even
00:05:27.000 35 years ago?
00:05:27.920 Um, the change was, um, a combination of the system not working well for the majority of the people,
00:05:39.580 which has to do with the majority of the people not being productive.
00:05:44.500 You have productivity equals income.
00:05:49.000 Right.
00:05:49.640 Okay.
00:05:49.900 So now if you take education, um, and you take measures of how productive or how well-trained
00:05:57.700 you're going to be, you see, and therefore also income, your productivity, you see by all of
00:06:04.280 these measures, great, great gaps that exist.
00:06:07.700 So by way of example, um, is, um, unicorns and the changes that we're seeing, fabulous changes
00:06:15.120 in what we're seeing in, uh, technologies.
00:06:17.640 Yes.
00:06:18.120 But it really comes down to, if you take the number of people who have been making those
00:06:24.340 changes and having unicorns in this wonderful world, they go to the best universities and
00:06:29.760 they make these wonderful things happen.
00:06:32.020 That's about 3 million people in a country of a little over 330 million people.
00:06:38.260 And if you take the average, 60% of Americans have below a sixth grade reading level, 60% of
00:06:46.960 Americans.
00:06:47.760 So when we deal with education, you have to make that population productive and, um, and
00:06:55.620 through productivity, they become educated and they become productive.
00:07:00.360 They earn money, and then you have a better society.
00:07:03.020 So a number of things changed that, um, it was the combination of globalization and technology.
00:07:12.180 Think about, I remembered, and you probably remember what the middle class working on an
00:07:17.980 assembly line at an auto plant was like and how manufacturing occurred.
00:07:22.980 Yes.
00:07:23.540 Okay.
00:07:24.440 A combination of foreign producers and, uh, automation changed all that.
00:07:31.300 Of course.
00:07:32.360 So that produces a larger wealth gap.
00:07:35.300 And then with that wealth gap, we also have very large, um, uh, values gaps.
00:07:42.600 But it's driven by the wealth gap.
00:07:44.800 It's, it's both, you know. Um, here we are at the World Government Summit in,
00:07:51.400 uh, Dubai, and you have globalization, and everybody, the elites, let's call them the
00:07:59.060 elites, are here doing deals and, and, you know, facing questions
00:08:04.540 and all of that.
00:08:05.780 Um, and at the same time, then there's those who are dealing with their basics.
00:08:11.240 So, um, wealth gaps contribute to it, but there's also values gaps.
00:08:16.920 Technology is part of the reason that we're here.
00:08:22.800 Religion.
00:08:23.700 Yes.
00:08:24.880 You know, belief systems.
00:08:27.940 These are important too.
00:08:29.540 Of course.
00:08:30.920 But we're on the edge of this AI transformation, which seems like it's going to accelerate the
00:08:36.560 trends that have led us to where we are right now.
00:08:38.580 So what do you, I mean, if artificial intelligence, you know, increases efficiency, but leaves an
00:08:44.620 even greater number of people without meaningful work in the United States, what happens?
00:08:52.140 There needs to be a game plan.
00:08:55.620 Yeah.
00:08:56.100 Okay.
00:08:57.440 There needs to be a game plan.
00:08:59.720 That's the main thing.
00:09:00.620 In other words, I can describe the circumstance.
00:09:03.100 Yes.
00:09:04.060 Okay.
00:09:04.420 And we can agree that there needs to be a game plan.
00:09:08.960 Well, let me ask you since.
00:09:10.000 I'm not responsible for the game plan, you know.
00:09:12.640 But since you know everybody and you spend your life talking to everybody and, you know,
00:09:16.200 you're one of the world's biggest investors.
00:09:17.500 So, you know, you're taking these questions seriously.
00:09:20.760 Are you familiar with a game plan in progress?
00:09:23.520 There is not a game plan.
00:09:25.120 At all?
00:09:25.900 No.
00:09:26.680 That seems crazy.
00:09:29.360 You know, yeah, it seems crazy.
00:09:31.200 It seems so.
00:09:31.740 We are going from a transition.
00:09:35.820 I'm just being analytical, right?
00:09:37.620 Yes, of course.
00:09:37.940 Mechanical.
00:09:38.800 Okay.
00:09:39.280 We are going from a transition in which there is, I don't know, collectivism, a multinational,
00:09:53.360 you know, all the constituents working together kind of environment that has also created a
00:09:59.300 bureaucracy and inefficiencies and so on.
00:10:03.240 So you're going from an environment in which there was a World Health Organization, a World
00:10:08.360 Trade Organization, a World Bank, and all of that, to unilateral, in my own interest.
00:10:17.820 In other words, as a country or within a country, as a constituency, my tribe, what is
00:10:24.560 my interest?
00:10:25.800 And you're going to fight for it.
00:10:27.200 So we have evolved into that kind of lay of the land situation.
00:10:34.080 And so the question is almost, what is the we?
00:10:37.440 Who is in control?
00:10:39.440 Okay.
00:10:40.120 I mean, we change control very quickly.
00:10:43.700 I've noticed.
00:10:44.340 And then who has the plan?
00:10:47.300 So now you get in control.
00:10:48.840 You fight to get into control.
00:10:50.120 You're in control.
00:10:50.880 You got to do things quickly.
00:10:52.220 And you're doing things quickly.
00:10:53.860 You know, we don't have the continuity to be able to work together, to be able to have
00:11:00.180 a plan.
00:11:03.520 Well, to be more specific about it, I would say that people developing the technologies,
00:11:07.640 so they would be various Chinese companies, of course, but also Google and Microsoft, Sam
00:11:13.880 Altman.
00:11:14.180 Tucker, you're so idealistic, but realistic.
00:11:18.240 Yes, we all should work together.
00:11:20.600 No, no, no, no, no.
00:11:21.200 I'm merely saying if I'm, you know, unleashing something on the global population, then I think
00:11:26.840 it's fair to ask me, like, what, you know, like, what do you expect to happen to everybody?
00:11:31.960 I think, no, no, no, but I think that's what I mean.
00:11:34.800 The notion that it's fair to, denies the reality that we're in an environment of pursuit of self-interest.
00:11:44.600 So if you take the fight, let's say, of technologies, of course, the one who wants to get the latest
00:11:52.120 AI out wants to beat the other one who does it, let alone an American firm and a Chinese firm.
00:11:59.500 And I'm just trying to describe the reality to you, Tucker.
00:12:03.880 So now let's look at those realities.
00:12:06.580 That's the reality.
00:12:08.040 So when you say they should, okay, that's the theoretical should.
00:12:12.880 They should come up with rules that is better for the harmony of the people as a whole.
00:12:19.620 Okay, I agree they should.
00:12:21.840 I guess what I'm saying is I'm once again disappointed by people.
00:12:27.200 You've got to grow up, Tucker.
00:12:28.480 I know, I'm 55, I'm still disappointed.
00:12:31.600 But you're absolutely right.
00:12:33.900 And I'm not here to sort of inspire a moral lecture from you or deliver one.
00:12:38.420 I just want to kind of know what you think is going to happen.
00:12:42.000 So you have these technologies.
00:12:43.840 Is it fair to say that they really are as transformative, they're as big a deal as?
00:12:47.740 Hugely.
00:12:49.000 The greatest.
00:12:50.020 So I've studied history, right?
00:12:51.560 Yes.
00:12:51.780 I study, you know, what was the impact of the printing press?
00:12:57.300 And what was the impact for the industrial revolutions and so on?
00:13:01.580 This, in my opinion, is the biggest impact that we have because it will revolutionize all thinking that applies to everything.
00:13:11.180 It applies to everything.
00:13:12.420 So whatever you're doing, it will make it much more efficient, much more powerful.
00:13:19.180 But that includes wars, too.
00:13:21.580 You know, everything is going to be radically transformed because anything that we apply thinking to is going to be very much transformed by it.
00:13:30.400 We did an interview with a woman called Casey Means.
00:13:34.480 She's a Stanford-educated surgeon and really one of the most remarkable people I have ever met.
00:13:40.760 In the interview, she explained how the food that we eat produced by huge food companies, big food, in conjunction with pharma, is destroying our health, making this a weak and sick country.
00:13:54.240 The levels of chronic disease are beyond belief.
00:13:57.180 Casey Means, who we've not stopped thinking about ever since, is the co-founder of a healthcare technology company called Levels.
00:14:06.120 And we are proud to announce today that we are partnering with Levels.
00:14:09.580 And by proud, I mean sincerely proud.
00:14:12.580 Levels is a really interesting company and a great product.
00:14:15.640 It gives you insight into what's going on inside your body, your metabolic health.
00:14:20.300 It helps you understand how the food that you're eating, the things that you're doing every single day are affecting your body in real time.
00:14:26.820 And you don't think about it.
00:14:28.100 You have no idea what you're putting in your mouth and you have no idea what it's doing to your body.
00:14:31.080 But over time, you feel weak and tired and spacey.
00:14:36.160 And over an even longer period of time, you can get really sick.
00:14:38.820 So it's worth knowing what the food you eat is doing to you.
00:14:43.520 The Levels app works with something called the Continuous Glucose Monitor, a CGM.
00:14:47.860 You can get one as part of the plan or you can bring your own.
00:14:50.820 It doesn't matter.
00:14:52.140 But the bottom line is big tech, big pharma, and big food combine together to form an incredibly malevolent force, pumping you full of garbage, unhealthy food with artificial sugars, and hurting you and hurting the entire country.
00:15:07.820 So with Levels, you'll be able to see immediately what all this is doing to you.
00:15:11.160 You get access to real-time personalized data, and it's a critical step to changing your behavior.
00:15:17.100 Those of us who like Oreos can tell you firsthand.
00:15:20.380 This isn't talking to your doctor in an annual physical, looking backwards about things you did in the past.
00:15:25.660 This is up to the second information on how your body is responding to different foods and activities, the things that give you stress, your sleep, et cetera, et cetera.
00:15:34.820 It's easy to use.
00:15:37.240 It gives you powerful, personalized health data, and you can make much better choices about how you feel.
00:15:42.500 And over time, it'll have a huge effect.
00:15:44.700 Right now, you can get an additional two free months when you go to levels.link slash Tucker.
00:15:49.900 That's levels.link slash Tucker.
00:15:52.880 This is the beginning of what we hope will be a long and happy partnership with Levels and Dr. Casey Means.
00:15:58.860 They speak of darkness and danger, but totalitarian novels also give us hope.
00:16:04.820 Showing us how to defend our society from the horrors of tyranny.
00:16:09.280 In Hillsdale College's free online course, Totalitarian Novels, Hillsdale President Larry Arnn teaches us lessons from classic novels like George Orwell's 1984 that are as relevant today as ever.
00:16:22.240 Sign up now for Hillsdale College's free online course, Totalitarian Novels, at tuckerforhillsdale.com.
00:16:29.680 That's tuckerforhillsdale.com.
00:16:32.200 It's the time of year we focus on the people who matter most in our lives.
00:16:35.660 Now, if there's one way to show your family, the people you love, that you love them, it's by protecting their health and their safety.
00:16:42.100 And a really obvious way to do that is by preparing for unpredicted moments.
00:16:47.180 And there are a lot of those breakdown of supply chains, overwhelmed hospitals, natural disasters, wars, whatever happens next.
00:16:54.920 You can't see it coming, but you can be prepared for most of it.
00:16:57.400 And that's why a Jace case works.
00:17:00.000 A Jace case is a personal supply of prescribed emergency medications.
00:17:04.280 So if things fall apart, you're OK.
00:17:07.280 There's an unexpected global disruption.
00:17:09.220 You can protect yourself and your loved ones.
00:17:12.200 So this February, show them you care.
00:17:14.400 Get the Jace case today.
00:17:15.680 You'll have the right meds on hand when you need them.
00:17:19.080 You only need them once.
00:17:20.300 You ought to have them.
00:17:21.920 Can you give us some concrete examples that you believe will come to pass, you know, in the next few years?
00:17:28.360 What will change that we at this point can understand?
00:17:31.380 Well, right now, AIs, all of them, can pass tests that are equivalent to the PhDs in all fields in one mind.
00:17:57.500 So there is what we call polymaths, people who can think across domains.
00:18:06.720 So in that, we have these operating so that it's not just like a PhD in one area.
00:18:13.660 It's like a PhD in all areas.
00:18:16.800 And that it could look across those areas and give you answers and operate that way.
00:18:22.620 That has created the, there's an acceleration of this because it compounds as it learns the learning compounds and it produces that.
00:18:31.460 So that is a reality today.
00:18:34.580 People just haven't yet experienced all of that.
00:18:37.660 And so you're very quickly going to be in a situation where the problems are going to be given to it.
00:18:45.500 You're going to ask it strategies and so on that can take into consideration all of the things that are happening from everywhere and how the cause-effect relationships work.
00:18:57.520 Think about it this way.
00:18:59.260 There's so much complexity in the world.
00:19:01.640 Everything that we're, you know, what happened, there's economic policies or economic things.
00:19:07.980 There are financial things.
00:19:09.860 There are health things.
00:19:12.100 There are all of these things.
00:19:12.940 And they all relate to each other.
00:19:15.140 It's, you know, it's called, I think, the butterfly syndrome.
00:19:18.680 You know, if a butterfly changes, flaps its wings, it has these secondary consequences.
00:19:25.100 This is all very complex.
00:19:27.040 It's very complex for the human mind to think about those things.
00:19:30.620 Yes.
00:19:30.900 We're now having a situation where it can all be taken into consideration and be a partner, a thought partner,
00:19:38.680 that can actually go beyond our capacities to think about those relationships.
00:19:45.020 And in thinking about those relationships and so on, it has an enormous impact.
00:19:51.880 Somebody in the medical area was giving me the example of learning about all the causes,
00:19:58.920 but with the data that they're going to have on each one of us about what were our experiences
00:20:04.840 and what is our diagnosis of each of the parts.
00:20:08.100 And you watch that over time.
00:20:09.900 What air do you breathe?
00:20:11.760 What environment are you in?
00:20:13.360 What stress are you in?
00:20:14.580 And all that.
00:20:15.480 That there will be the understanding of these cause-effect relationships that in turn change things.
00:20:21.780 And then you go down to the microscopic level of dealing with practically at the molecular cell level in dealing with these problems.
00:20:31.320 Okay.
00:20:31.780 The changing of DNA and these types of things.
00:20:35.440 All of those we are in the midst of a tremendous revolutionary change.
00:20:41.700 What about the field of economics?
00:20:43.800 Can you take kind of the art and the guessing out of it at this point?
00:20:46.620 So you're saying, you have said many times, you've written a lot about it, but the need of governments to get down to 3% of GDP with their debt.
00:20:55.940 So or else everything collapses.
00:20:58.200 How do you do that?
00:20:59.380 And all these political consequences, the population doesn't want less money spent on them.
00:21:03.720 Obviously, you could get, you know, governments falling and stuff.
00:21:06.580 Wouldn't AI just solve that for you?
00:21:10.500 You said it produces strategy.
00:21:11.780 There's, there's, does AI control human nature?
00:21:18.400 No, but these are human nature.
00:21:22.420 My bet is that human nature is going to be the biggest force and it's all going to come down to like how we are with each other.
00:21:30.460 I'm so glad though.
00:21:31.780 So there'll be room for human beings still, even as we're changing DNA and implanting chips in our brains and stuff.
00:21:37.280 Well, if not, we're lost.
00:21:39.800 And if so, we're dealing with each other.
00:21:43.420 Yeah.
00:21:43.800 I don't know how well that's going to go either.
00:21:46.360 But I wonder like that, you know, there's still debates.
00:21:49.740 I mean, you're effectively an economist.
00:21:51.820 The debates about supply side versus demand side, like what is, you know, what is the near and far term effect of these economic causes?
00:21:58.300 We're going to, we're, we're going to be able to understand at a micro level how things work better.
00:22:04.520 Exactly.
00:22:05.000 Okay.
00:22:05.620 So, by the way, again, I put out this, this writings, this study that shows the mechanics.
00:22:13.900 And I want to convey that mechanic so everybody can see the mechanics.
00:22:17.380 But you're going to go down to a molecular level.
00:22:20.360 That means like nowadays, we or policy makers like the Federal Reserve think about something like inflation.
00:22:29.540 Right.
00:22:29.980 And there'll be maybe five measures of inflation.
00:22:33.520 And we're using the term inflation because our minds are limited in its capacity of the number of things we could think about.
00:22:42.540 When we're now in this new reality, which we now are, you can go down to a molecular level essentially in saying, I could see all the different transactions of what was bought and what was sold and why.
00:22:57.940 And now I can really have a level of understanding.
00:23:01.460 We don't have to be at this grand level that we don't.
00:23:04.820 We're going to be at the molecular level of understanding individual transactions and what's affecting them and be able to deploy resources at the individual molecular level, just like we can do it in biology or physical existence and so on.
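[Editor's illustration: the following is a minimal hypothetical sketch, not anything Dalio or any policymaker actually uses, of the difference between a single headline inflation number and the transaction-level view he describes. All data in it is made up.]

```python
# Hypothetical illustration: aggregate inflation vs. transaction-level detail.
# The records below are invented; the point is only that per-transaction data
# lets you decompose one "inflation" number into finer cause-effect views.
from collections import defaultdict

# Each record: (category, price_last_year, price_this_year, quantity_bought)
transactions = [
    ("groceries", 4.00, 4.40, 120),
    ("rent", 1500.00, 1575.00, 1),
    ("fuel", 3.20, 2.90, 60),
    ("health", 200.00, 230.00, 4),
]

def weighted_inflation(records):
    """Spending-weighted price change across all transactions."""
    base = sum(p0 * q for _, p0, _, q in records)
    now = sum(p1 * q for _, _, p1, q in records)
    return now / base - 1

def by_category(records):
    """Price change per category, detail an aggregate index hides."""
    groups = defaultdict(list)
    for cat, p0, p1, q in records:
        groups[cat].append((cat, p0, p1, q))
    return {cat: weighted_inflation(recs) for cat, recs in groups.items()}

print(f"Headline inflation: {weighted_inflation(transactions):.1%}")
for cat, rate in by_category(transactions).items():
    print(f"  {cat:>9}: {rate:+.1%}")
```

With this kind of breakdown, you could in principle see which categories and which buyers are driving the aggregate number, which is the "molecular level" being described.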
00:23:22.800 So, I mean, this will, lots of downsides to AI.
00:23:26.080 Obviously, I dread it.
00:23:27.500 I would end it if I could.
00:23:28.860 But there are upsides, and this sounds like one of them.
00:23:31.240 So, like if you.
00:23:31.840 Of course.
00:23:32.540 So, we have a COVID again, and we're thinking about should we issue COVID checks with AI?
00:23:37.140 We can know the effects of it.
00:23:39.060 Right, what the money's going for.
00:23:40.660 And that'll change everything.
00:23:41.960 It'll change the controls.
00:23:43.640 But it is, of course, a two-edged sword, right?
00:23:46.680 It'll change who controls it, who has access to it, who can use it detrimentally to other people.
00:23:53.580 All of these things are part of the question.
00:23:56.600 Is there any way to avoid, like, totalitarian social controls under AI, with AI?
00:24:01.200 I think there's a question of whether you can have social and totalitarian controls, or maybe you just have anarchy.
00:24:11.660 I mean, I don't know where we're going.
00:24:13.100 I can't tell you.
00:24:14.500 I cannot tell you what this world.
00:24:17.180 I do believe we're going to go through a time warp.
00:24:20.040 Okay?
00:24:20.480 What I mean, it's going to feel like, pooh, you're going through over the next five years.
00:24:25.860 And that environment is because of these five major forces, all of these things, and the changes in the technologies, particularly artificial intelligence and related technologies.
00:24:38.840 So the world five years from now is going to be a radically different world.
00:24:44.120 And I don't know what that's going to look like.
00:24:45.900 When you go into the world of quantum computing and what quantum is like in so many different ways, it raises questions of, you know, what is that like?
00:24:54.820 I'm not smart enough to tell you what that world is going to look like.
00:24:58.040 But as an investor for 50 years-
00:24:59.620 Who is in control?
00:25:00.660 I don't know who's in control.
00:25:01.520 Okay, so, but, right.
00:25:02.980 So this is, like, the opposite of what you've done your whole life, where you try to predict, you know, five years hence.
00:25:08.020 That's your whole business, right?
00:25:09.420 Well, that's, I'd say my business is to try to predict, but I'd say, first thing, whatever success I've had in life has been due more to my knowing how to deal with what I don't know than anything I know.
00:25:26.480 Okay?
00:25:27.140 So how you deal with what you don't know-
00:25:29.960 I believe that.
00:25:30.460 Okay?
00:25:30.980 Is so important.
00:25:32.460 So, yes, my business in a nutshell is I try to find a bunch of bets that I think are good bets, but to diversify well so that I have a bunch of diversified bets because I do not know.
00:25:46.620 I mean, in terms of my actual track record, I've probably been right about 65% of the time.
00:25:52.700 Okay?
00:25:53.300 And any one bad bet can kill you.
00:25:55.740 So I've known how to deal with that.
00:25:58.080 That's what I've learned, including how to deal with what I don't know.
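[Editor's illustration: a hypothetical sketch, not Bridgewater's actual process, of the diversification point. It simulates why a 65% hit rate is dangerous on one bet but fairly reliable when spread across many uncorrelated, equally sized bets.]

```python
# Hypothetical sketch: chance of a losing period with a 65% hit rate,
# as a function of how many uncorrelated bets the capital is spread across.
import random

def losing_period_odds(n_bets: int, trials: int = 100_000, win_prob: float = 0.65) -> float:
    """Fraction of simulated periods ending with a net loss, assuming each
    bet is independent, equally sized, and pays +1 on a win / -1 on a loss."""
    losses = 0
    for _ in range(trials):
        pnl = sum(1 if random.random() < win_prob else -1 for _ in range(n_bets))
        if pnl < 0:
            losses += 1
    return losses / trials

if __name__ == "__main__":
    for n in (1, 5, 15, 30):
        print(f"{n:>2} uncorrelated bets -> chance of a losing period: {losing_period_odds(n):.1%}")
```

The numbers are illustrative only; the mechanical point is the one made above, that diversification, not being right every time, is what keeps any one bad bet from killing you.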
00:26:01.420 So now you're describing an environment where you can't really know anything about the world in five years.
00:26:07.860 You, well, I can know, I could place good bets.
00:26:11.140 Okay?
00:26:11.460 There are some things that are highly knowable.
00:26:14.820 Okay?
00:26:15.480 Highly knowable, like they say, you know, death and taxes.
00:26:19.760 Right.
00:26:20.140 Okay?
00:26:21.640 Demographics.
00:26:23.080 Okay?
00:26:23.480 So I can know or have a view, for example, that owning, I believe, owning debt assets is not going to be a good thing.
00:26:32.340 So I could think about alternative storeholds of wealth.
00:26:36.480 I can think about that.
00:26:38.380 I can place some bets that allow me, you know, they're not the certain bets, but I can place enough bets and have enough diversification that I can be relatively confident of some things.
00:26:52.280 But never absolutely all totally confident.
00:26:56.160 But I think when we're coming back anyway, that's the reality.
00:26:59.380 I'm just describing our reality the best I can.
00:27:03.160 That's why people who are confident in the future and are just experiencing the present, you know, right now, people are describing, all of them are describing how things are.
00:27:16.940 And almost everybody thinks the future is going to be a modified version of the present.
00:27:23.940 A pure extrapolation forward.
00:27:25.240 Yeah.
00:27:25.840 Okay?
00:27:26.420 Yeah.
00:27:26.560 Things are good.
00:27:27.700 All right.
00:27:28.040 I get it.
00:27:28.440 Okay?
00:27:29.320 Things are...
00:27:30.000 Well, I'll guarantee you, there will be big changes.
00:27:33.220 So those are dumb people, you're saying, making those comments.
00:27:35.700 I'm not, I'm saying it's understandable, but when you study change and the nature of change, it's a, you know, the world changes in dramatic ways because of causes that we can look at and get a good understanding of.
00:27:54.360 But we can't be sure about anything because of the nature.
00:28:00.080 But in this specific case, I mean, that's always true and wise people understand that.
00:28:03.860 Like, you don't, you're not in control of the future.
00:28:05.440 Of course, you're not God.
00:28:06.420 But in this specific case where there are specific technologies whose development we understand because we're watching it, it almost feels like there's no human agency here.
00:28:16.300 Like, not one person ever suggests, like, well, why don't we just stop the development of the technologies by force?
00:28:20.880 Well, there's, I think you're being theoretical again, you know.
00:28:27.120 Well, I don't know.
00:28:27.520 Just look how the system works, okay?
00:28:29.460 Yes.
00:28:30.000 Who makes what decisions how?
00:28:32.840 Like, I think you have the bias of we should stop all these technologies and just stop it.
00:28:39.280 Somebody else has another view and the other people have other views.
00:28:44.040 And as a result, and then there's a means by which those views turn into actions, okay?
00:28:49.640 And so it's correct, the system that we're dealing in will make those types of decisions.
00:28:56.460 And we could discuss the pros and cons of all of those things.
00:28:59.820 But that's just how it works, right?
00:29:02.140 I don't know.
00:29:02.860 I mean, there are all kinds of pernicious, longstanding things that we've stopped, like the global slave trade.
00:29:07.620 The Brits are like, we're not for this, we're stopping it.
00:29:09.560 And they did.
00:29:10.420 Tucker, you are using that we stopped, okay?
00:29:14.220 No, Britain stopped it.
00:29:15.080 Okay, but I'm just trying to say, you have to look at the system and say, who has their hands on the levers of power?
00:29:23.040 Right.
00:29:23.880 And what will they do and what are their motivations?
00:29:28.080 How does the system work?
00:29:29.760 I think that's right.
00:29:30.380 Okay, how does the machine work to make decisions?
00:29:34.560 Okay, so if we can agree on this person makes these types of decisions about these things and it works this way, then we can say we, the collective we, can do that.
00:29:46.260 But this theoretical collective we that is going to make decisions, like we could sit here and be very theoretical.
00:29:52.700 No, no, no, I get it.
00:29:54.380 A determined nation probably can't stop this.
00:29:57.460 So that, my second question is, you keep hearing that there's this AI race between the United States and China.
00:30:02.940 Yeah.
00:30:04.120 Is it true that one country will be completely dominant by the end of this race and that that will be meaningful?
00:30:10.340 No, I think that what's going to happen is, and again, I'm speaking now probabilistically, I think that there will be different types of developments.
00:30:20.880 But by and large, it's very difficult to protect intellectual property when you take the products of the intellectual property and you put them in the public.
00:30:31.540 Exactly.
00:30:32.040 That'll last about six months, I mean, at most.
00:30:35.280 And others will develop the nearest equivalent, and so intellectual property protections and isolation are probably not going to work.
00:30:45.720 So and so now we're going to have different advantages and disadvantages.
00:30:52.120 Let's say, for example, in China's case, there are many fantastic chips, not quite at the same level.
00:30:59.640 See, we design chips, but we can't produce chips effectively.
00:31:04.020 We can't produce, by and large, we can't produce things, any manufactured goods, as effectively, cost effectively, by and large.
00:31:13.180 We have a problem doing that.
00:31:15.040 So what we'll do is we'll design those better chips.
00:31:19.080 You won't have the intellectual protections.
00:31:21.760 And you're going to then have the production of things in China at a very inexpensive way.
00:31:28.540 Manufacturing.
00:31:29.880 China has about 33 percent of the manufacturing in the world, which is more than the United States, Europe and Japan combined.
00:31:38.840 They manufacture effectively cheaply.
00:31:41.960 They will embed chips in the manufacturing.
00:31:44.460 The application of chips, they're probably more ahead on.
00:31:49.240 China's more ahead on the application of chips.
00:31:51.560 There's robotics.
00:31:52.920 So we're talking about just thinking.
00:31:54.840 But when you connect the thinking to bodies that are automatic bodies, too.
00:31:58.720 Yes.
00:31:59.020 And you have robotics and so on.
00:32:00.660 They're ahead on that type of thing.
00:32:02.060 So different entities are going to be ahead in different ways.
00:32:06.240 And we're going to then be in this world in which there is competition in that world.
00:32:11.020 And then there's an attempt to be protectionist or whatever or to fight those differences.
00:32:16.160 And that's what the world looks like.
00:32:19.180 Does the combination of AI and robotics bring manufacturing back to the United States?
00:32:24.740 We are behind in both of those areas.
00:32:28.760 Greatly behind.
00:32:29.520 You know, so, you know, I would say we're not going to have competitive advantages in those things.
00:32:38.740 What we're competitive in is that small percentage of the population that is uniquely inventive in terms of inventiveness.
00:32:52.300 You know, the number of Nobel Prize winners in the United States.
00:32:56.460 The United States dominates Nobel Prize winners in the world.
00:33:00.260 The inventiveness, best universities and so on.
00:33:02.580 We have a system that is a legal system and a capital market system, and we can bring the best from the world all to the United States to create an environment.
00:33:14.240 If we can work well together in that inventiveness with rule of law working and all of that working, we have those things that are our competitive advantage.
00:33:25.140 We do not have manufacturing, and we're not going to go back and be competitive in manufacturing with China in our lifetimes, I don't believe.
00:33:34.900 OK, so now the question is how we deal with that.
00:33:38.800 Our inventiveness, you just said, and many have said, comes from our education system, from our universities.
00:33:44.480 But then you began the conversation by saying that AI is already.
00:33:47.300 And foreigners.
00:33:48.040 If you look at that population, there's three million people who are, you know, just basically making those changes.
00:33:54.160 About half of them are foreigners.
00:33:55.900 If you can attract the best and the brightest.
00:33:58.260 Right.
00:33:58.640 And there's a lot to be attracted to in the United States from the best and the brightest, because we are a country of all of these different people operating this way.
00:34:07.540 And we create these equal opportunities.
00:34:10.040 Look who's running some of the countries, companies.
00:34:12.600 They come from different places.
00:34:13.680 If we can have the best in the world come in that kind of environment to be creative and so on, we can invent and so on, but we can't produce.
00:34:24.820 But those, the people you're describing have come to our universities.
00:34:28.960 That's basically.
00:34:29.380 That's right.
00:34:29.820 Silicon Valley's there because Stanford's there.
00:34:31.320 That's right.
00:34:32.140 I was in a restaurant the other night, in fact, this weekend, and I had a little trouble hearing what people were saying.
00:34:37.360 And I thought to myself, I'm a little young to go deaf.
00:34:39.920 Why?
00:34:40.360 Well, because I grew up shooting, bird hunting, target shooting.
00:34:45.180 And I remember my father saying, just stick a Marlboro filter in your opposite ear and you'll be fine.
00:34:49.920 I wish we'd had suppressors, but we didn't.
00:34:54.180 You can now.
00:34:55.680 Check out Silencer Central.
00:34:57.760 Silencers play a crucial role in improving accuracy, maximizing your experience, and protecting your hearing.
00:35:05.000 They're not dangerous or scary.
00:35:06.580 It's just the opposite.
00:35:07.740 Not using them can be dangerous.
00:35:10.740 Have dinner with me in a restaurant and you'll know what I mean.
00:35:13.940 Silencer Central can fix your problems immediately.
00:35:16.540 They will find the perfect silencer for you and make it very easy to buy one.
00:35:21.160 It's not the hassle you thought it was.
00:35:23.000 I know because I just went through it.
00:35:24.580 So you get approved and then Silencer Central ships your order straight to your door.
00:35:29.420 No hassle whatsoever.
00:35:31.460 It is easy.
00:35:33.660 It doesn't get any better, in fact.
00:35:35.200 So if you thought it was impossible to shoot suppressed, you were wrong.
00:35:39.660 Go to silencercentral.com right now.
00:35:42.340 Start browsing.
00:35:43.400 Use the code TUCKER10 for 10% off your first purchase of Banish Suppressors.
00:35:49.700 Highly recommended.
00:35:50.420 But you began the conversation by saying that AI is now at the point where, you know, the machines have the equivalent knowledge of a PhD in every different topic.
00:36:00.240 So, like, at some point, are you going to have universities?
00:36:04.160 We'll redefine what universities are like.
00:36:06.780 But you're going to have that combination of things working together.
00:36:11.340 Because, still, we're a long way from, not a long way, but we're away from the point of the decision-making will be made by the AI.
00:36:22.000 Because, okay, and the wisdom will be by the AI.
00:36:26.180 Like, you're not going to have the AI determine how you raise your kid.
00:36:30.720 And different people will raise their children differently and so on.
00:36:33.700 But the actual, you'll rely on it, but it's really the magic for the foreseeable future is remarkable people with remarkable technologies producing remarkable changes.
00:36:47.700 And then we're going to have, then, the consequences of that.
00:36:51.500 As long as human decision-making plays a role, I'm totally fine with it.
00:36:54.820 But you say that the university is going to change.
00:36:58.380 I mean, how could it not?
00:36:59.360 I thought the Internet was going to get rid of universities.
00:37:01.640 It didn't happen.
00:37:02.200 But, I mean, how long does it take for the current model to change?
00:37:08.080 It's pretty resistant to change.
00:37:11.320 I think it's, yes, it's slow to change.
00:37:14.320 And those who change slowly will be left behind.
00:37:18.100 And then you'll have the best.
00:37:20.520 I would say you see things taking place.
00:37:26.220 But, anyway, I would expect that it's going to,
00:37:30.040 life is more like a game, I think.
00:37:34.720 You know, it's almost like, it's almost like a video game.
00:37:38.800 And you're going to be able to have real-world learning experiences in many different ways to be able to provide education.
00:37:46.420 But it's going to be interactive.
00:37:47.980 You're going to see a type of merging, whether you like it or not.
00:37:51.560 You're going to see a type of merging of the man and the artificial intelligence.
00:38:00.020 Does that worry you?
00:38:02.960 Of course it worries me.
00:38:04.680 And then it excites me.
00:38:07.040 You know, what worries me most fundamentally is how people are with each other.
00:38:12.060 Can we put harmony and happiness, togetherness?
00:38:19.760 Can we resolve decisions, issues?
00:38:22.620 Can we deal with these issues together?
00:38:25.220 That is the most important thing.
00:38:27.440 I think we emphasize too much the wonderful, remarkable things that we get from AI.
00:38:36.300 Like, we'll have greater life expectancy and less disease, and we can have all of those things.
00:38:42.120 But the question is, do we have harmony, quality of life?
00:38:48.020 You know, I did a study.
00:38:49.860 Also, I put it out free online, which is ratings, lots of statistics used for rating various conditions of countries, 24 top countries.
00:38:59.140 It's called the Global Powers Index.
00:39:03.000 It's online free for anybody who wants to look at it.
00:39:04.980 And I rated different powers, economic power, military power, education power, and so on.
00:39:13.520 Then I rated health, how long you live, the diseases you're encumbered by, and so on.
00:39:20.180 And happiness, what your happiness level is.
00:39:23.520 And what's interesting about that is that the measures of power, past a certain level of living standards, don't have a correlation with health, which is amazing, because you have all the money to produce the health, and have no correlation with happiness.
00:39:48.100 That, like, for example, in the United States, which is the most powerful country in the world, by these measures, our life expectancy is five years less than Canadians.
00:40:03.560 They're right next to us, and five years less than countries of equal income levels, okay?
00:40:11.140 So health, we don't, there's poor correlation.
00:40:14.880 And in happiness, there's no correlation.
00:40:20.420 Like, Indonesia has the second highest happiness rating, you know?
00:40:26.620 So all I'm saying is, think about also how we work with each other, how we are with each other.
00:40:33.880 The highest determinant of happiness is community.
00:40:41.120 Of course.
00:40:42.260 If you have a good community, you have happiness, and it has a positive effect on health.
00:40:48.060 So I think it all comes down to how we are with each other, that is going, in dealing with all of the questions that you're raising.
00:40:58.060 There's been no advance in, like, changing human nature over time, right?
00:41:02.500 Right. I mean, unlike technological advance, there's been no advance in getting along with each other.
00:41:06.660 Yeah, it goes in ebbs and cycles.
00:41:10.020 I would look at it this way.
00:41:12.280 There are many religions in the world.
00:41:16.160 Most of the religions have two components to them.
00:41:19.960 The first component is, you know, follow the word of God, and you have to follow it, you know, in that way.
00:41:26.740 But the others are about harmony.
00:41:31.100 And they are, in other words, how do we create a harmonious society?
00:41:36.520 You know, so you look at the Ten Commandments or something, and there are rules for how do you achieve harmony.
00:41:43.320 And there are things like karma.
00:41:45.180 Okay, how do you achieve harmony?
00:41:46.880 So different societies have different ways of trying to achieve harmony.
00:41:52.160 So I think that's important.
00:41:53.980 Harmony.
00:41:54.580 I think it comes down to some basic things.
00:41:56.820 Like, I watch, I study this thing, and I go around the world.
00:42:01.240 I think, first, do you educate, raise your children well?
00:42:07.360 Okay, in other words, educate them in capability, so they're capable, but also in civility and how they are with each other.
00:42:16.900 Because a capable, civil person will come out to a society in which they can work well together to be productive.
00:42:26.720 That people have to be productive, right?
00:42:29.520 So, but in order to be productive, you have to have a harmony.
00:42:32.480 They have to deal with the questions.
00:42:34.180 I'm answering your question about how we deal with this.
00:42:38.040 I'm saying we have to deal with it together.
00:42:39.920 It's only in an environment where there's harmony rather than fighting that you're going to be able to address the types of questions that you're raising, right?
00:42:50.120 How do you, you know, when you ask me, you know, is what's going to happen with AI?
00:42:56.740 And then you say, we need to do this and we need to do that.
00:43:01.380 It strikes me that how the we's deal with each other, to be able to deal with those things, is the most important thing.
00:43:09.840 And there are basics of how we deal with each other.
00:43:12.520 That's the most important thing.
00:43:13.980 I agree with that.
00:43:15.140 Last question.
00:43:16.360 You didn't grow, I don't think you grew up in like a billionaire world.
00:43:20.280 No, my dad was a jazz musician and, you know, we had a lower middle class family, but I had everything I ever needed.
00:43:27.540 Well, so that's kind of my question.
00:43:28.580 Now you live obviously in a billionaire world, which is, where do you run into more happy people?
00:43:32.800 Oh, almost general.
00:43:35.360 If you get past the things that you, your basics, you know, if I can, you know, health, education, habitat, and you get past those, you've got everything you need.
00:43:47.340 And then if you have community, you have everything you need.
00:43:51.100 That is the best.
00:43:53.940 Ray Dalio, thank you very much.
00:43:55.360 Thank you.
00:43:55.760 Ray Dalio, thank you.