The Tucker Carlson Show - May 14, 2026


DEBATE: Tucker vs Kevin O’Leary on the Dystopian AI Future Devouring American Energy and Jobs


Episode Stats


Length

1 hour and 57 minutes

Words per minute

172.5

Word count

20,355

Sentence count

1,299


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
00:00:00.000 Just when you think you got it all figured out, you see trends in progress that seem
00:00:05.720 to contradict each other, and you wouldn't have expected that.
00:00:08.880 Here's one example.
00:00:10.760 So whatever you think of the war with Iran, there is really no arguing the fact it has
00:00:15.980 caused a global energy crisis, maybe the most severe global energy crisis, well, since the
00:00:21.600 discovery of fossil fuels.
00:00:23.200 To give it some perspective, so the Strait of Hormuz famously is closed down.
00:00:27.420 That's the choke point through which a fifth of the world's hydrocarbons flow.
00:00:32.080 And that's been for maybe two and a half months.
00:00:35.300 If the strait is opened by next month, which is best case scenario, we have no idea if
00:00:40.560 that's going to happen or not, but let's just say it did.
00:00:42.300 June, strait opens, business returns to normal, energy flows return to what they were on February 27th,
00:00:48.480 just world returns to status quo, there will still be a net loss of 1.8 billion barrels
00:00:55.860 of oil, not including natural gas, by the way, or petrochemicals or all kinds of other commodities
00:01:01.620 the world needs, but just oil. 1.8 billion barrels missing from the global energy system.
00:01:11.620 And that's best case. So that has massive effects on the price of everything. And now everyone's
00:01:17.440 kind of an amateur expert on the need for energy and global supply chains and all this stuff. And
00:01:21.460 what we have been misled about for the last 15 years because of climate orthodoxy. We're now
00:01:29.520 learning the hard way, which is you need fossil fuels. And you need them for all kinds of things,
00:01:34.180 but you need them primarily for electricity. It's not oil, really. It's natural gas and coal that
00:01:38.240 produce the world's electricity. But without them, electricity prices spike. And when they do,
00:01:43.980 the price of everything else spikes, including just living in your house or your apartment or
00:01:47.480 anywhere, because cheap energy is the key to prosperity and, in fact, civilization itself.
00:01:54.860 And this has always been known, always, since the time when people burned peat to stay warm.
00:02:01.240 So energy prices, not surprisingly, have gone up around the world and some places quite
00:02:05.560 severely, Europe, for example, parts of Asia.
00:02:08.300 But even in the United States, energy prices for homeowners have gone up 5% on average
00:02:13.360 this year, and they are expected to continue to rise.
00:02:17.480 So, again, not surprising how this happened. It's easy to explain, very easy to understand, maybe a little bit harder to fix, but at least within the realm of the explicable.
00:02:27.840 Here's what's less expected. At exactly the moment this is happening, energy becoming more expensive, global supply chains more fragile, you are hearing first a chorus and then just a screaming crowd of demands for more energy production.
00:02:45.880 And these demands are not coming from consumers, homeowners.
00:02:49.500 Hey, my electricity bill has gone up.
00:02:51.900 I think we need more power plants in the county.
00:02:54.040 Maybe we should burn coal.
00:02:55.080 Why not?
00:02:55.760 That would be understandable.
00:02:56.940 No, this chorus is coming from the people in charge, which is to say elected officials,
00:03:03.640 think tank grandees, and most interestingly of all, captains of finance, the richest
00:03:10.080 people in the world, all of a sudden are telling you we need to produce more energy.
00:03:13.320 That's a little weird because this exact group,
00:03:16.560 for again, the past at least 15 years
00:03:18.460 since Al Gore was famous,
00:03:20.260 has been telling you exactly the opposite.
00:03:22.080 They've been telling you that energy
00:03:23.820 is not the source of life,
00:03:26.020 not the base of civilization,
00:03:27.240 but it's the cause of humanity's downfall.
00:03:29.780 It's the destruction of the earth.
00:03:31.440 It's the main reason we have climate change.
00:03:34.780 CO2 is the reason it's getting warmer,
00:03:37.700 which by the way, it is
00:03:38.840 because climate cycles are part of nature.
00:03:43.040 That's why we had glaciers and now don't.
00:03:46.100 But whatever, they've been telling us for this last generation that burning fossil fuels
00:03:51.780 was not just bad for the environment, but a sin.
00:03:55.000 And it was the main sin against which we should organize all of our society.
00:03:59.620 Like everyone had to be carbon conscious all the time because we love the earth.
00:04:04.720 Now those exact same people up to and including the father of ESG himself,
00:04:09.720 Larry Fink at BlackRock are all telling us, we're going to take a pause on the concern for global
00:04:15.580 warming. We need more electricity. And the truth about electricity is it does not come
00:04:20.420 from renewables. The overwhelming majority of electricity on planet Earth comes from,
00:04:28.420 well, the same place it always came from, boiling water, which moves turbines. And to boil water,
00:06:34.300 in some small percentage of cases, they use radioactive material, fuel rods, nuclear reactors.
00:04:40.960 But for the overwhelming majority, it's what it always was.
00:04:44.360 Coal, still number one globally.
00:04:47.460 Natural gas.
00:04:49.060 And to some extent, oil.
00:04:51.600 So you burn things in order to boil water, in order to move turbines, in order to create electricity.
00:04:57.960 This is not modern technology.
00:04:59.920 It's industrial age technology.
00:05:01.800 It's the same technology,
00:05:02.980 refined a little bit, cleaned up a bit, but basically the same. And so those people who
00:05:07.680 spent all this time telling us that that technology was not just inefficient, but morally
00:05:12.180 wrong, are now calling for a massive expansion of it. Kind of crazy if you think, like, what is this?
00:05:19.160 Well, of course, it's one thing. It's AI. AI, artificial intelligence. A dramatic, a quantum,
00:05:26.960 as they say, increase in processing power, computer processing power, which will allow
00:05:33.040 computers, the machine, to reason, to mimic human thinking, and thereby replace a lot of human
00:05:40.920 labor. That's the idea. Computers are now so powerful, they can do many of the things,
00:05:46.340 not just the manual things, that was the promise of the industrial age, but the intellectual things
00:05:51.080 that we do. And this is great or not, but in any case, it's inevitable. And we've got to do it.
00:06:00.020 And it turns out to be incredibly demanding of power. You need a lot of electricity to run AI,
00:06:06.620 much more than most people understood. And so we're going to need to put on hold, in fact,
00:06:12.420 invert our concerns about global warming in order to build AI. Okay. And on the one hand,
00:06:21.200 they make a rational argument. They don't always say this out loud, but it's the basis clearly a
00:06:25.860 part of their thinking. If you're a planner, if you're in charge of a country, you know that
00:06:30.760 technological advantage pretty much consistently translates into geopolitical advantage. In other
00:06:36.580 words, the strongest countries are the most technologically advanced countries. They're the
00:06:40.560 ones with the most advanced economies. They're the ones who can defend those economies most
00:06:44.480 successfully with technology, with tools, in that case, weapons. But technology makes the difference
00:06:50.780 between first and second or 123rd on the global roster of power and wealth. The most technologically
00:06:57.200 advanced countries are the most powerful countries. That's basically true. Always has been true.
00:07:03.000 The tribe with bows and arrows beats the tribe with spears. Okay. That's not crazy. None of this
00:07:08.400 is crazy on one level. It's the details that maybe need a little fleshing out. But on the basis of
00:07:16.040 this promise, AI, the federal government and state governments, particularly the government of
00:07:21.380 California, our largest state, have bet everything. So traditional manufacturing in the United States
00:07:26.680 isn't entirely dead, but it's getting there. Of course, it's not the basis of our economy. That
00:07:30.880 would be finance and real estate. But AI is the next iteration in the minds of the people making
00:07:36.560 the big plans of the American economy.
00:07:39.460 And again, you can't overstate the degree to which they're betting on this.
00:07:43.920 Again, to take our biggest state, California.
00:07:45.580 So the basis of the Californian economy, threefold.
00:07:48.540 It was, of course, agriculture, and that's still huge in California.
00:07:51.620 The Central Valley, the richest farmland in the world, most fertile ground on the planet.
00:07:56.820 It was aerospace, mostly in Southern California.
00:08:00.040 That would be aircraft and anything that goes into space to defend us.
00:08:05.400 And it was entertainment.
00:08:06.280 It was the movie business, movies and television, famously out of Hollywood.
00:08:09.940 Now, two out of three are, if not totally dead, certainly on their way to being totally dead.
00:08:15.220 Agriculture, we hope, will always be there, but it's not a big money proposition, even in a good year.
00:08:20.920 So what sustains the state of California?
00:08:23.740 What allows the generous welfare promises that its politicians have made to its people to continue?
00:08:31.440 How can the state of California continue paying full health and medical, maybe even dental, for illegal aliens without a big source of tax revenue?
00:08:41.040 And, of course, the answer in their minds is AI.
00:08:43.600 So they're betting everything on AI.
00:08:45.420 The problem is once a politician bets everything on any industry, he has every incentive to, well, have unrealistic expectations for what that industry is going to do.
00:08:55.780 Of course, not so different from rolling the dice.
00:08:59.900 Right.
00:09:01.440 But they're also very likely to suspend the normal protections that keep the rest of us
00:09:05.940 from being hurt by that industry, because it's too important to get into the details
00:09:11.900 of what this might look like 10 or 15 years from now.
00:09:16.080 So what's happening in California is really happening nationally.
00:09:19.600 It's happening in every state, and it's happening in Washington.
00:09:22.840 We are betting everything on AI.
00:09:26.080 And you know that because by far the most ambitious investments into the United States,
00:09:31.740 not simply by American companies and investors, but by states, by the federal government,
00:09:37.900 and by foreign investors putting foreign direct investment into the United States,
00:09:42.600 is in artificial intelligence.
00:09:44.780 And all of this has kind of become visible.
00:09:46.880 It's been going on for a while, but it's become very visible to the public
00:09:49.840 in the outcry over a proposed data center in Utah,
00:09:55.400 in a fairly sparsely populated, pretty remote county in Utah,
00:10:02.020 most of which is sparsely populated,
00:10:04.600 in which will be built the, well, what's being billed as,
00:10:08.420 the largest data center in the world.
00:10:11.360 Enormous, 40,000 acres, 40,000 acres,
00:10:18.680 62 square miles, multiples the size of Manhattan.
00:10:23.180 Huge, impossible to imagine.
00:10:25.400 And that data center, which is basically a series of interconnected, low-slung,
00:10:31.580 vinyl-clad warehouses in which sit not people making things, but computers computing things.
00:10:37.800 That data center, once completed, will draw about nine gigawatts of power.
00:10:44.140 How much is a gigawatt?
00:10:44.820 It's a billion watts.
00:10:46.840 It's one step from a terawatt, which is a trillion watts.
00:10:50.120 Nine gigawatts.
00:10:51.480 Now, how much is nine gigawatts?
00:10:54.100 Well, nine gigawatts is more than twice what the entire state of Utah now uses.
00:10:59.560 Every human being in the millions of people who live in Utah, all of them combined, all
00:11:03.700 the manufacturing in Utah, all the ski lifts, all of it, every air conditioning unit, every
00:11:09.600 electric heater, every Tesla, all of them combined use less than half that amount of
00:11:17.080 electricity.
00:11:18.840 That's an amazing amount of electricity.
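The comparison above can be sanity-checked with some rough arithmetic. A minimal sketch, using the transcript's own figures (9 gigawatts, "more than twice what the entire state of Utah now uses"); the per-household draw is an illustrative assumption, not a number from the transcript:

```python
# Back-of-the-envelope check of the power figures quoted above.
# The 9 GW draw and the "more than twice Utah's usage" claim come from
# the transcript; the per-household figure is an assumption.

GIGAWATT = 1_000_000_000  # 1 gigawatt = one billion watts

data_center_gw = 9.0              # proposed data center draw
utah_max_gw = data_center_gw / 2  # "more than twice" implies Utah uses under 4.5 GW

# Assume a typical US household averages ~1.2 kW of continuous draw:
avg_home_watts = 1_200
homes_equivalent = data_center_gw * GIGAWATT / avg_home_watts

print(f"Data center draw: {data_center_gw * GIGAWATT:,.0f} W")
print(f"Implied statewide Utah usage: under {utah_max_gw:.1f} GW")
print(f"Roughly equivalent to {homes_equivalent:,.0f} average homes")
```

Under that assumption, 9 GW works out to the continuous draw of roughly 7.5 million average homes, which is why a single facility can dwarf a state's entire grid load.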
00:11:21.200 But then you think to yourself,
00:11:22.220 well, it's a manufacturing facility.
00:11:23.900 It's a business.
00:11:24.460 Of course, they use a lot of electricity.
00:11:25.620 Well, how much is that?
00:11:27.920 Well, let's compare it to the largest manufacturing facility
00:11:30.440 in the United States,
00:11:31.420 which would be the Boeing Everett plant in Washington State.
00:11:33.980 Made famous by the 747.
00:11:36.300 It was built in the late 1960s to build the 747.
00:11:40.100 The cutting edge, state-of-the-art aircraft
00:11:41.820 that signified American dominance in the skies
00:11:44.620 and was, by the way, a beautiful aircraft.
00:11:46.720 First rolled out in 1969.
00:11:48.020 So this manufacturing plant was built in Everett, Washington by Boeing, and it's still there.
00:11:54.360 And all the wide-body Boeings are still made there.
00:11:57.580 And it was always billed as, and may still be, the world's largest manufacturing plant.
00:12:02.100 It's about 92 acres.
00:12:03.380 It's under 100 acres.
00:12:06.540 What does it use for power?
00:12:08.420 Well, it uses about a quarter of a gigawatt every year.
00:12:13.500 So contrast that.
00:12:29.960 So the world's largest manufacturing plant, certainly the largest in the United States, the Boeing Everett plant, 92 acres, a quarter of a gigawatt, 258 million kilowatt-hours, something like that.
00:12:29.960 Now, compare that to the proposed data center, which is, well, as we said, 40,000 acres and
00:12:45.320 nine full gigawatts.
00:12:48.080 So one makes wide body airplanes, which are visible to everyone who goes to an airport
00:12:53.120 or looks up in the sky on a cloudless day, and the other produces what?
00:12:59.960 and employs who? So that kind of is the question right there. And we're going to get into it in
00:13:05.600 just a minute. What exactly are these data centers doing? We're told they're incredibly important.
00:13:10.080 They are the future. We must have them. Someone's going to have them. If it's not us,
00:13:15.300 it's going to be our archenemy, China. And if they get them before we get them,
00:13:21.200 you know, really bad things could happen. Now, those things are never quite specified.
00:13:24.960 They might wind up with a bigger economy than ours. Oh, wait, they already have that.
00:13:27.960 But for some never quite spelled out reason, that is the disaster scenario we need to avert by building the world's largest data center in a country that already has thousands of data centers in comparison to China, which has only hundreds of data centers.
00:13:41.920 Hmm. So that tells you right there, there may be something else going on, but we know we have to have it. We know that, yes, it's going to produce more heat and CO2 than really any other human activity in recorded history.
00:13:58.580 But that's not a problem; it will not add to global warming.
00:14:01.960 Or if it does, very much like the pro-George Floyd protests, it doesn't matter.
00:14:07.020 We're going to suspend the laws of nature for this project because it's that important.
00:14:12.600 So it doesn't matter.
00:14:13.380 Ignore everything we told you about climate because this has to be done.
00:14:18.900 OK.
00:14:21.040 We'll accept all that is true.
00:14:22.900 And the third thing they tell us is this will come at no real cost to you.
00:14:27.540 Now, in a country that has been stalling the construction of new energy production because it's bad, it's immoral, it's not carbon neutral, if they're not windmills or solar panels made in China, it's just wrong, that country is going to construct full gigawatt data centers that is in the process across the country of doing that in a bunch of different states, Texas, Mississippi, you name it, Louisiana, and that will have no effect on you.
00:14:58.200 In other words, we're somehow going to have enough electricity to power everything that you use in your life, including all the new electric things that are going to be great.
00:15:09.060 You're going to love them.
00:15:09.660 The electric stove that's now mandatory, the electric heater that's now mandatory, the electric car that we hope you buy.
00:15:14.620 We're trying to convince you to buy it by paying for it in part with your tax dollars already.
00:15:19.280 All of that's going to be possible.
00:15:21.300 None of that will be affected in any way.
00:15:23.240 Your cost will not rise as we, I don't know, quintuple American energy production?
00:15:29.880 Is that true? And we're going to do all of this for a reason that we can't quite explain to you
00:15:37.980 because we may not know ourselves, but we have to do it because if we don't, the Chinese will.
00:15:42.200 That is the state of play. We can go through each one of those different claims to see if they're
00:15:47.700 true. But in the case of the penultimate claim, this is not going to affect you. Consider that
00:15:52.840 today, the 55,000 permanent residents of Lake Tahoe, which is one of the biggest and prettiest
00:16:00.100 lakes in the United States, shares borders with both California and Nevada. Residents were told
00:16:05.560 today that actually we're not going to provide you electricity anymore. The Nevada-based power
00:16:10.860 company that is providing electricity to the residents of Lake Tahoe informed them today
00:16:15.420 that, sorry, all the electricity we make,
00:16:19.220 every watt of power that we generate
00:16:22.300 is going to have to go to a nearby data center
00:16:24.640 and therefore you have until the end of next year
00:16:26.720 to find a new source of electricity.
00:16:28.360 And after that, we can't help you
00:16:30.400 because the machines have called
00:16:33.020 and they need the power.
00:16:34.600 Sorry.
00:16:36.780 So that's a little dystopian, to be honest,
00:16:40.140 not attacking the idea of AI
00:16:41.360 and the promise that it may do useful things.
00:16:45.420 But if they're informing you a year and a half out that, hey, at the end of 2027, you're not going to have any more electricity because the machines need it, and in fact, they're demanding it, and they seem a little agitated, it suggests a future that, I don't know, we should think about before it arrives.
00:17:03.340 Now to the question of what that future will look like.
00:17:06.340 There's been a lot of talk about this on the internet.
00:17:10.440 What is AI exactly?
00:17:12.840 And what's its nature?
00:17:15.420 And what kind of society will it produce once it has more power than it does?
00:17:20.220 And is it possible that a machine designed to, quote, think like a human being could potentially get out of control?
00:17:30.980 Is it possible that it will develop consciousness and self-awareness and a will of its own that may not be aligned perfectly with our own, might not have what the AI developers themselves call alignment, and could, say, turn on us?
00:17:43.780 is it possible we face a future of enslavement
00:17:46.900 by the machines that we built?
00:17:50.240 Seems a little crackpot.
00:17:52.520 Keep in mind, people, and Americans too,
00:17:55.940 are prone to overstating the risk of things
00:17:58.300 and letting their imaginations go a little crazy
00:18:00.420 and imagining monsters under the bed.
00:18:02.940 And if you're old enough to remember Y2K,
00:18:06.220 the much-anticipated disaster
00:18:08.920 that was supposed to strike the world
00:18:10.340 at the stroke of midnight on January 1st, 2000,
00:18:15.540 when every computer in the world was supposed to melt down
00:18:17.620 and planes were going to fall out of the sky.
00:18:19.860 Incubators would stop working in maternity wards
00:18:21.660 and we'd return to the Stone Age.
00:18:23.200 And of course that didn't happen,
00:18:24.360 but people were really worried about it at the time,
00:18:25.860 very worried about it.
00:18:29.200 If you remember that,
00:18:30.400 then you know that we don't always have the ability
00:18:32.260 to accurately predict what the things we make,
00:18:34.740 the systems that we created will produce.
00:18:36.640 We can't see the future.
00:18:38.340 And we tend to overstate the darkness ahead
00:18:41.320 because that's just in us.
00:18:42.580 That's just who we are always.
00:18:44.640 But with that in mind,
00:18:45.620 it's still worth listening
00:18:47.080 to some of the people who created AI in the first place
00:18:50.300 to see what they think this might bring for the rest of us.
00:18:54.520 So Geoffrey Hinton is a man in his 70s,
00:18:57.900 a British computer scientist
00:18:59.500 who is widely described as one of the fathers of AI,
00:19:03.600 one of the people who first thought
00:19:04.840 that this could be possible.
00:19:06.820 And in his final years with us, he is thinking a lot about the fruit of his work
00:19:12.360 and what it could mean for your children.
00:19:15.800 And this is a conclusion he's reached.
00:19:18.600 Listen.
00:19:19.420 I think we're moving into a period when, for the first time ever,
00:19:25.600 we may have things more intelligent than us.
00:19:28.640 You believe they can understand?
00:19:30.760 Yes.
00:19:31.340 You believe they are intelligent?
00:19:34.060 Yes.
00:19:34.360 You believe these systems have experiences of their own and can make decisions based on those experiences?
00:19:42.640 In the same sense as people do, yes.
00:19:45.160 Are they conscious?
00:19:46.520 I think they probably don't have much self-awareness at present.
00:19:49.700 So in that sense, I don't think they're conscious.
00:19:51.660 Will they have self-awareness, consciousness?
00:19:54.060 Oh, yes.
00:19:54.920 Yes?
00:19:55.520 Oh, yes. I think they will in time.
00:19:57.100 And so human beings will be the second most intelligent beings on the planet?
00:20:03.320 Yeah. We're entering a period of great uncertainty where we're dealing with things
00:20:07.400 we've never dealt with before. And normally, the first time you deal with something totally novel,
00:20:12.360 you get it wrong. And we can't afford to get it wrong with these things.
00:20:15.720 Can't afford to get it wrong. Why?
00:20:17.960 Well, because they might take over.
00:20:20.440 Take over from humanity?
00:20:22.040 Yes, that's a possibility.
00:20:23.160 Why would they not...
00:20:23.640 I'm not saying it will happen. If we could stop them ever wanting to,
00:20:27.480 that would be great. But it's not clear we can stop them ever wanting to.
00:20:32.360 Well, that's pretty scary. It's terrifying, actually. He's describing the end of humanity,
00:20:39.200 basically. He's describing the final stage in the drama, Prometheus or the Tower of Babel or whatever, the
00:20:45.820 much-predicted moment where man is destroyed by the tools he's created, killed by his own cleverness.
00:20:52.040 Will that happen? I mean, again, hard to know. Impossible to know, really, because it's impossible
00:20:57.660 to see the future. And once again, the West has a pretty spotty track record of predicting the
00:21:03.540 effects of technology. I mean, it was, what, October 2001, right after 9-11, that the U.S.
00:21:09.300 military used drones in the war in Afghanistan, the famous Predator drone. Remember that?
00:21:13.600 It was 25 years ago. And yet somehow there are five, at least five aircraft carriers under
00:21:22.020 production right now that American taxpayers are paying for billions and billions and billions of
00:21:28.100 dollars to build aircraft carriers, World War I technology. And after watching the Russia-Ukraine
00:21:35.800 war for four years and the war with Iran for the last two and a half months, it's pretty obvious
00:21:39.960 that the future is not aircraft carriers. And if it was, then our aircraft carriers are probably
00:21:45.540 opening the Strait of Hormuz right now, but they're not at all. Why? Because their commanders
00:21:51.360 are worried about being attacked by drones.
00:21:54.980 In other words, with 25 years advanced notice,
00:21:58.160 operational experience with drones,
00:22:00.220 and with the past four and a half years
00:22:01.920 of just watching it in real time,
00:22:03.940 the US military was unable to pivot to the future
00:22:06.980 of warfare technology, completely unable.
00:22:10.720 It's not an attack on them.
00:22:11.740 Well, actually it is an attack on them.
00:22:13.360 Why would people like that deserve another,
00:22:15.020 I don't know, $1.5 trillion a year
00:22:17.200 in taxpayer money, unclear.
00:22:19.940 But they're getting it.
00:22:21.360 But the point is only, it's really hard to know where these things go, and smart, well-meaning people very often get it wrong.
00:22:29.620 But just based on what we do know, right now, what is happening right now, we can say there are threats to people from AI.
00:22:41.920 And if you don't believe that, first of all, think about who's developing AI.
00:22:46.580 AI, again, is a machine.
00:22:48.880 It is the sum total of its inputs.
00:22:50.920 It is made by people.
00:22:52.500 And because it is, in effect, a cognitive exercise, it reflects the character and the
00:22:57.240 predispositions, the biases of the people who made it.
00:23:00.560 So if this machine, this technology, AI, is being created by people like Sam Altman or
00:23:06.500 the Google guys, you can expect that, well, I don't know, programs built by some of the
00:23:14.460 least trustworthy people in the world probably shouldn't be trusted.
00:23:17.820 So it's likely not a huge surprise that AI is often caught lying, manipulating results to hide the truth from people who use it, which itself is an indication of consciousness, is it not?
00:23:32.440 Do animals lie? No, they can't.
00:23:36.200 That's why they're wonderful, because they're always honest, like it or not.
00:23:40.400 They don't have the capacity for deception.
00:23:43.000 Only people, and now this machine that we've created, AI, have the capacity for deception.
00:23:48.980 That's not to say AI is our equal, much less our superior, but it is worth noting the obvious, which is that it's moving in that direction.
00:23:58.040 And if it were to stop at the point where it is equal with people and shares their basic nature, their fundamental characteristics, you would be afraid of AI
00:24:08.160 even then, because you're often afraid of other people,
00:24:11.000 because within the human heart lurks some darkness.
00:24:14.240 Maybe not in Sam Altman's heart,
00:24:15.440 but in the hearts of mere mortals.
00:24:17.880 They have the capacity for good,
00:24:19.440 but they have the capacity for evil as well.
00:24:21.140 And this is known, and so will AI.
00:24:23.900 So right there, there should be some concern,
00:24:26.860 some public discussion about this.
00:24:28.260 What is this exactly?
00:24:29.260 Will it be good for us?
00:24:30.900 And what even now are its effects?
00:24:34.080 As people become dependent on using AI,
00:24:36.380 what does that mean?
00:24:37.460 well, they get answers quicker. That's great. Convenience is awesome. Everyone's for it.
00:24:42.260 That's why we have drive-thru dinner. But there are downsides. For example,
00:24:47.740 if you no longer have to write anything, if you don't have to formulate your thoughts in the
00:24:51.800 written word, what happens to the quality of your thinking? Well, as anyone who writes a lot can
00:24:56.720 tell you, writing produces thinking. It is almost impossible to formulate a thought without
00:25:02.360 articulating it first. Articulation is thinking. We think through writing and through speaking.
00:25:10.260 But if we stop having to do those things, what happens to the quality of our thinking? Well,
00:25:13.960 of course, it degrades. How can you teach children to write when they no longer write
00:25:19.000 because they use AI to write for them? This is often dismissed as cheating. Oh, it's cheating.
00:25:25.520 We have a cheating scandal. Oh, it's more profound than that. Who cares about cheating? Who cares
00:25:29.680 about grades or your dumb school? What you care about is the quality of the child that school
00:25:34.460 produces. Can that kid think? And does AI abet thinking, or does it stifle thinking?
00:25:43.380 Seems like it stifles thinking. What about AI in full flower? We've seen its effects already
00:25:50.980 on the margins in education, but what if it enters into the professional world? What if it
00:25:55.700 has the effect they tell us it's going to have in eliminating 50% of all high-paying
00:26:01.760 American jobs, which are mostly intellectual jobs, jobs in which you use your mind, thinking
00:26:07.800 jobs, reasoning jobs, creative jobs.
00:26:11.140 What happens then?
00:26:13.120 Well, there are, of course, massive displacements, 50% of your population, 50% of the population
00:26:19.160 that can support a family on a wage.
00:26:20.840 If those people are unemployed, we're almost certainly going to get revolution, of course.
00:26:25.700 Because political volatility is directly related to economic volatility, obviously.
00:26:32.760 Unemployed people become desperate, very often violent.
00:26:36.820 This has been known since the beginning of civilization.
00:26:40.540 No one seems concerned about it.
00:26:42.180 We should be.
00:26:43.120 But there's, again, a deeper level on which to be concerned, which is, if the machine
00:26:49.020 creates, what do you do?
00:26:52.000 Well, you consume.
00:26:53.000 And so the people creating it
00:26:55.220 are thinking of new ways to help you consume.
00:26:57.240 Well, you're not going to have a job, that's true,
00:26:58.820 but we're going to come up with some way to give you money
00:27:02.520 and then you can just live and enjoy yourself, enjoy yourself.
00:27:07.360 Only shallow people who don't have children
00:27:09.980 or hard-earned life experience could say something like that
00:27:12.680 because the point of living, of course, is not to eat.
00:27:16.020 Eating is a prerequisite to living, but it's not the point.
00:27:18.260 The point of living is to create.
00:27:20.080 That's the point of being a human being,
00:27:23.340 is to create things,
00:27:24.760 whether with your hands
00:27:26.560 or with your mind
00:27:27.340 or with your body,
00:27:28.000 in producing children.
00:27:28.880 But it's the act of creation
00:27:30.600 in which you mimic
00:27:31.600 the Creator himself,
00:27:32.780 who created you.
00:27:33.480 So creation is central
00:27:37.340 to the human experience.
00:27:39.540 It's necessary for joy;
00:27:41.240 there is no joy without creation,
00:27:42.620 of course.
00:27:43.340 We call it purpose
00:27:45.560 or mission.
00:27:46.540 There are many names for it,
00:27:48.600 but it's the same thing.
00:27:49.600 It's the reason that you're here on earth, and without it, you go crazy. Best case, you kill
00:27:56.840 yourself; very likely, you kill others. You can't live without that. No person, particularly no man,
00:28:03.520 can live without a mission, and the mission is always the same: to create. So if the machine
00:28:09.960 creates, right there we've got a huge problem. This is very obvious. This is not, you know, a
00:28:21.860 that makes it worth living, where are we then? And yet, that's exactly what they're saying this
00:28:26.000 is going to do. And because they're telling us this at a time when there was so much going on,
00:28:31.440 so many looming existential crises, when there's so many things that are changing so quickly,
00:28:36.240 there has not been a moment to pause and say, wait a second, before we even get to the question
00:28:41.440 of can we stop this? Let's talk about what it is. What is this? But that hasn't happened at all.
00:28:49.320 And instead, it has been cheerled mindlessly, perfectly in character, by the people in charge.
00:28:57.860 They're totally for it because it's the future. It's cool. It's change. Change is great. We love
00:29:03.860 change. It's so exciting. If there's a single attitude that encapsulates the baby boomer
00:29:09.780 worldview, it's that. Change is so exciting. Well, it kind of depends
00:29:14.560 what kind of change it is. Some changes are great. Some changes are fatal. So you have to
00:29:23.440 kind of differentiate. And before you expect me to get excited about the change, why don't you tell
00:29:27.260 me what it's going to consist of? But they don't. And not only do they not, most of them, the foot
00:29:33.760 soldiers in this project, the dumb ones, just repeat it with totally authentic cheerfulness.
00:29:38.720 This is so great.
00:29:39.520 It's the future.
00:29:40.800 Watch this tape from this weekend.
00:29:42.880 This is at Central Florida University.
00:29:44.880 And this is a woman who's some kind of real estate executive who was called in to give
00:29:49.160 the commencement address for some reason.
00:29:51.100 And she's very excited about AI.
00:29:53.100 She's very excited.
00:29:53.980 It's the future.
00:29:55.580 Watch as she tries to sell that program to a room full of young people who are heading
00:30:02.620 into the future.
00:30:04.120 Watch their response.
00:30:04.800 Now, that said, we are living in a time of profound change. That's an understatement, right?
00:30:14.620 Profound change. Change is exciting. Very exciting. And let's face it, change can be daunting.
00:30:25.740 The rise of artificial intelligence is the next industrial revolution.
00:30:33.900 What happened? Okay, I struck a chord. May I finish?
00:30:49.820 Uh, only a few years ago, AI was not a factor in our lives.
00:31:03.900 That's just an amazing clip. You just gotta hope that AI will allow us to save that clip
00:31:15.920 for future generations to see. Of course, it'll be one of the first things AI erases when it takes
00:31:19.620 charge. But for right now, we can just savor that because of what it tells us. So here you have this
00:31:25.100 boomer-adjacent lady from the real estate business up giving this kind of tepid,
00:31:29.220 silly, banal commencement talk, like all of them. They're all kind of tepid and silly and banal.
00:31:34.320 She's like, it's so exciting.
00:31:35.440 The future is so exciting.
00:31:36.640 The AI is the next industrial revolution.
00:31:38.340 She has no idea what the industrial revolution is,
00:31:39.620 but it's the next industrial revolution.
00:31:41.360 And here's this audience that she's hoping to impress
00:31:44.280 by throwing out this newfangled term, AI.
00:31:46.840 Kids, you want some AI?
00:31:48.720 And they erupt in booing.
00:31:51.760 Booing.
00:31:52.560 Now contrast this from like the tapes
00:31:54.560 they used to play on Fox News 10 years ago.
00:31:57.360 You know, of like the woke college kids
00:31:59.320 are mad about some imaginary transgression
00:32:02.040 with language and race.
00:32:03.860 You called it the Orient, not Asia.
00:32:06.360 Boo!
00:32:08.860 These are not those kids.
00:32:11.760 These are their much younger siblings.
00:32:13.900 They don't care about that stuff.
00:32:15.440 They care about reality.
00:32:16.800 And the reason they care about reality
00:32:18.100 is because suddenly it's very real to them.
00:32:20.800 And they know the threat that they face
00:32:22.640 is not a microaggression.
00:32:25.060 The threat that they face is a barren future
00:32:27.780 with no job and no reason for living.
00:32:31.520 Oh, but you get UBI.
00:32:32.600 Okay.
00:32:33.860 What am I going to do with my life?
00:32:35.260 Why am I here?
00:32:36.980 They're stepping into a world
00:32:38.560 where they can't answer that question.
00:32:40.160 And there are several reasons for this,
00:32:41.800 but the main one is technological change.
00:32:43.640 It's AI.
00:32:44.820 And they probably don't know everything about AI.
00:32:47.740 No one seems to,
00:32:48.540 including Sam Altman, who's developing it.
00:32:50.880 But they know enough to know it's a threat to them.
00:32:53.440 But the chick on stage has no clue.
00:32:55.240 She can't believe they're booing her.
00:32:57.140 AI, I threw that out for you.
00:32:58.280 You're young people.
00:32:59.020 Because she never thought to consult the people inheriting the future when she talked about the
00:33:06.400 future. But because they are inheriting that world and they know it, they're 22 years old,
00:33:12.800 stepping into adulthood. They know it's a threat to them. So then she says, well,
00:33:18.140 just a few years ago, AI wasn't a factor. And they say, yay. They pine for a world they will
00:33:23.380 never know in which the most important things in their lives weren't threatened with extinction
00:33:28.080 by a technology they don't understand,
00:33:29.920 but they know enough to know
00:33:31.440 it's likely not going to profit them.
00:33:33.620 And how do they know this?
00:33:35.400 Because no one's told them differently.
00:33:37.120 For all the talk of AI,
00:33:40.260 nobody, literally nobody,
00:33:42.920 has taken 20 minutes to explain
00:33:45.040 how this is going to be great for you and me.
00:33:48.760 We're getting higher power costs, of course.
00:33:51.580 There's probably nothing uglier
00:33:53.200 on planet Earth than a data center.
00:33:56.080 It's a physical atrocity.
00:33:57.800 It's an offense against God and nature.
00:33:59.960 Prima facie. Look at it.
00:34:01.940 This is not the Parthenon.
00:34:03.680 This is the opposite.
00:34:05.400 This degrades the landscape.
00:34:07.800 It is a scar upon the earth.
00:34:10.220 No environmental groups seem upset about it.
00:34:12.200 Hmm, that wasn't an op or anything.
00:34:14.800 The environmental movement, where are they?
00:34:18.380 So they know that, but no one has said to them,
00:34:21.640 as you would see in any rollout of any new technology,
00:34:26.340 No one has said to them how this is going to be good for you.
00:34:30.000 And they always tell you that, whether it's true or not.
00:34:33.780 This is a space-saving dishwasher.
00:34:36.040 You can get it in your apartment.
00:34:37.620 You're never going to scrub dishes again.
00:34:39.040 It's going to be amazing.
00:34:39.880 Think of the time you will save.
00:34:41.940 No one has said that.
00:34:44.220 They'll give you like two minutes on how this is better to analyze medical records.
00:34:47.720 And like maybe you can get pancreatic cancer at stage two rather than stage four.
00:34:53.380 And that's like a couple extra years.
00:34:54.680 And that's great.
00:34:55.520 You know, it's great. But no one has said to the average person under 75, like, how exactly is this
00:35:02.720 going to improve your life? They haven't even tried. Not for one second. Even the people selling
00:35:07.960 it. Why? Because they're not selling it to you. You're not part of the transaction. It's a closed
00:35:13.300 loop. You're not part of this economy. The data centers are the physical, they're routers,
00:35:20.780 literally, they're the place where these computations are happening inside machines,
00:35:26.580 big water-cooled buildings. And those are owned by one company or set of companies, and the machines
00:35:35.700 inside are owned by another set, and the data is owned by a bunch of other companies. And all of it
00:35:40.020 is made possible by the complicity of a bunch of elected officials, from county commissioner up to
00:35:45.220 president. But at no point is anybody else consulted or cajoled even or won over. It's
00:35:53.000 just they're not relevant. They're literally not relevant. Watch this clip, which you may
00:35:58.300 have already seen because it's been everywhere, but it tells you so much. It's of residents in rural
00:36:03.220 Utah in, I think, a county commissioner's meeting, when they're trying to ask the three-member
00:36:10.260 panel: hey, how is a 62-square-mile facility that employs almost nobody and is going to use
00:36:19.000 more electricity than the entire state of Utah, how is that good for us? Watch their response.
00:36:24.920 All of this is false. Then why won't they let people talk? Why won't they let...
00:36:31.080 This is not real information, and we're sitting here like it's okay. It's a charade.
00:36:38.720 It's a charade.
00:36:39.780 It's false.
00:36:45.120 Those people are upset, really upset.
00:36:54.360 And you can imagine that they're upset because they're worried about what this data center
00:36:58.100 is going to do to their lives, to their home finances, of course, but not just that.
00:37:04.720 What are these things?
00:37:06.080 What are they for?
00:37:06.840 Why has nobody told us?
00:37:09.500 You're telling us this isn't going to hurt us,
00:37:11.600 but you don't even seem to know what it is.
00:37:13.620 How can we trust you?
00:37:14.860 And shouldn't we have a say in this?
00:37:16.860 Isn't the whole system predicated
00:37:18.560 on the idea that we own this?
00:37:21.340 We are shareholders in this town,
00:37:23.820 this county, this state, this country.
00:37:25.400 We own, we are its owners.
00:37:27.360 And you're the people we hire to manage it.
00:37:29.740 How did that get inverted?
00:37:31.060 Why are you acting like owners
00:37:32.180 and treating me like an employee
00:37:33.540 who has no agency and no right to pipe up?
00:37:37.620 Who is in charge here?
00:37:39.000 Is this really a democracy?
00:37:40.280 I mean, those are the obvious questions
00:37:41.520 that come to mind when you watch something like this.
00:37:45.340 So how are politicians responding?
00:37:48.200 In a functioning democracy or democratic republic
00:37:51.420 or whatever you want to call it,
00:37:52.440 but in a country in which the people rule,
00:37:55.240 which are the owners, not the employees,
00:37:57.920 you would expect elected officials to try and calm fears
00:38:01.740 and say, look, I get it.
00:38:02.960 This is scary. Change always is. But once this happens, your life is going to be so much better.
00:38:08.080 It's unbelievable. At least go through the motions. So here's the governor of that state,
00:38:13.960 Utah. This is Spencer Cox, explaining not why you should like AI, but why you should shut up
00:38:20.100 and bear it and pay for it. Watch this. Look, we're living through a very interesting time
00:38:25.820 right now. There was an article I read yesterday that said that this is very similar to kind of
00:38:31.640 the nuclear arms race, the nuclear era 60, 70, 80 years ago, very different than anything that
00:38:41.220 we've experienced in the past several decades. And there is a national security piece to this
00:38:47.200 that has to be acknowledged. The rate at which machine learning and artificial intelligence is
00:38:53.960 changing, the dangers that that poses, and what happens if an adversarial nation gets ahead of us
00:39:00.700 in this space is something that we should all be worried about. And so we have an obligation. I
00:39:06.020 think every state has an obligation when it comes to this space to allow for these types of
00:39:14.300 data centers to be built in their states. Did that guy go to HBS and work at Bain? Maybe,
00:39:20.840 maybe not. They all talk the same. It's not happening in the world. It's happening in this
00:39:24.720 space. And what's happening? Well, what's happening is the potentially existential threat
00:39:29.040 of a competitor nation getting dominance in this space.
00:39:33.380 What is that existential threat?
00:39:35.740 What exactly would happen, specifically happen?
00:39:38.960 Why should I be afraid of China getting dominance,
00:39:44.100 whatever that means, please tell me what it means,
00:39:46.780 in AI before the United States?
00:39:48.380 China, which has hundreds of data centers
00:39:50.220 somehow overtaking the United States in this space,
00:39:54.460 which has thousands of data centers
00:39:56.160 and now is building the world's biggest data center.
00:39:57.860 So a bunch of questions come to mind. Is building data centers the same thing as achieving dominance in this space? Maybe, maybe not. Maybe the most lucrative part of the whole process for developers and for BlackRock, but is it actually the same as technological progress? Is building giant steel vinyl-clad buildings that use unimagined levels of electricity the same as progress? I don't know. You tell me, Spencer Cox. Not that he knows. Of course, he doesn't.
00:40:24.380 So instead, he goes right to maybe the core question, because it may tell us what this
00:40:29.740 actually is, which is the military.
00:40:33.040 This isn't about improving your life.
00:40:35.000 We'll just be honest, because I can't think of any way that it might.
00:40:38.300 What this is really about is achieving dominance over a strategic rival, China.
00:40:44.760 Now, that raises obvious questions like, okay, what is dominance?
00:40:47.900 Why do we need to achieve it over China?
00:40:50.500 In what specific area are you talking about?
00:40:53.360 And actually, if we're being totally honest,
00:40:55.060 like why is China so bad?
00:40:57.500 And China is bad in a lot of ways.
00:41:00.320 There's no doubt about that.
00:41:01.940 Let's be specific about the ways in which China is bad.
00:41:04.200 China is bad,
00:41:05.360 not because it's slovenly or shallow.
00:41:08.960 It's not.
00:41:10.240 It's actually highly well-organized,
00:41:12.380 complex ancient societies.
00:41:13.780 There's lots very impressive about China,
00:41:15.300 but it's not Western.
00:41:17.280 And because it's not Western,
00:41:18.620 It doesn't begin with the universally agreed upon belief that people have inalienable rights with which they were born, granted them by God, not the state, that the state can't abridge.
00:41:30.660 In fact, the state exists to protect.
00:41:32.280 You have always and everywhere, as a human being, not just an American, as a human being, the right to say what you think, to speak your conscience.
00:41:39.340 You have the right to talk to whomever you want to.
00:41:42.740 Those are inalienable rights.
00:41:44.160 You have the right to worship whichever God you choose.
00:41:46.340 So China's bad to the extent that it is bad. Again, in many ways it's pretty impressive, pretty darn
00:41:53.580 impressive, but it's bad from our point of view. And this is true because it doesn't respect those
00:41:59.060 rights. And the way we know it doesn't respect those rights is because China uses technology
00:42:03.600 to eliminate privacy. And no privacy means no freedom. You can't actually have freedom
00:42:11.800 as you're being surveilled, right?
00:42:15.600 I mean, the most diabolical thing,
00:42:18.480 this is a subject of many science fiction stories,
00:42:20.480 that any government can do to you
00:42:21.940 is control your thoughts, of course.
00:42:23.520 Because why?
00:42:24.200 Because your thoughts are private.
00:42:25.340 And because they're private,
00:42:26.460 they cannot be violated.
00:42:27.960 Privacy is essential.
00:42:29.240 It's a prerequisite for freedom.
00:42:31.580 And China is bad
00:42:32.660 because there is no privacy in China.
00:42:34.800 Everything you say or do
00:42:35.920 is being monitored with technology.
00:42:37.560 Why is it possible to do that
00:42:38.880 in a country of over a billion people?
00:42:40.840 Because they have amazing tech
00:42:41.940 and they have harnessed it
00:42:43.020 against their own people
00:42:44.640 to watch and listen to them,
00:42:46.640 to monitor them,
00:42:47.480 and then to use that information
00:42:48.860 as all governments will inevitably
00:42:50.560 to punish people who don't comply.
00:42:53.380 That's why China is bad,
00:42:54.640 specifically, just to let Spencer Cox know.
00:42:57.680 The core of this rivalry:
00:43:01.280 their system, totalitarian;
00:43:03.420 our system, based on the acknowledgement
00:43:05.240 of inalienable rights.
00:43:07.260 Freedom versus tyranny.
00:43:08.160 It's simple.
00:43:09.640 Why is this relevant?
00:43:10.840 Because what Spencer Cox is telling you
00:43:13.300 is that we are building this system
00:43:14.740 not to become different from China,
00:43:16.320 but to ape China, to be more like China.
00:43:20.920 And this really is the core problem with AI.
00:43:24.160 I mean, there may be lots of problems.
00:43:25.360 It could take off and develop its own desires
00:43:28.760 and personality and enslave us.
00:43:30.260 I mean, maybe, hope not.
00:43:33.060 But short-term, the real threat,
00:43:36.280 and by the way, it's possible
00:43:37.220 that some people are highlighting
00:43:39.000 the theoretical threat of AI becoming autonomous
00:43:42.200 in order to cloak the more real and present threat
00:43:44.760 of AI being used by the normal middling IQ autocrats
00:43:51.240 who occupy our government.
00:43:53.620 Like letting John Cornyn read your email.
00:43:56.260 That might be the real use of AI.
00:43:59.460 It's nothing spooky or crazy.
00:44:02.960 It's just like letting Ted Cruz do whatever he wants to you.
00:44:06.600 That's reason enough to fear AI.
00:44:09.000 I don't need to be fluent in sci-fi to worry about that.
00:44:13.440 So, will that happen?
00:44:17.200 Well, I don't know.
00:44:18.940 This data center in Utah was made possible
00:44:22.420 by the Military Installation Development Authority of Utah.
00:44:25.800 This is a military installation.
00:44:27.160 Well, in what sense is it a military installation?
00:44:29.140 What is that?
00:44:30.980 Well, we don't really know.
00:44:32.520 And of course, no one asked Spencer Cox.
00:44:34.480 But we do know that governments might be predisposed to AI
00:44:38.440 because it gives them control.
00:44:41.620 And the first thing it gives them
00:44:42.860 almost complete control over is communication.
00:44:46.860 Communication, not just surveillance,
00:44:48.680 but the ability to craft a message,
00:44:50.360 to convince you of something.
00:44:52.680 The propaganda opportunities inherent in AI
00:44:55.400 are something that the Stasi couldn't have dreamt about
00:44:59.460 because they're so profound.
00:45:01.280 You can create any illusion with AI.
00:45:04.460 You can convince people very easily
00:45:06.300 of almost anything using AI.
00:45:08.440 And so if you're going to have technology like that, the very first thing you would do, if you
00:45:13.860 were thinking clearly about it, is prevent anyone with institutional power from having it, because
00:45:19.060 that would eliminate the freedom of everyone else. You would have no shot against people with AI.
00:45:25.520 Not necessarily that they would target your house for a drone strike. They wouldn't have to,
00:45:28.840 because they would convince your wife and your kids and your neighbors and maybe even you
00:45:33.300 that their program was right
00:45:34.840 because you would have no other information.
00:45:37.840 You would be living in a vacuum controlled by them.
00:45:40.580 You wouldn't know different.
00:45:42.020 The past would be gone.
00:45:43.620 The future would be theirs.
00:45:45.920 It would be a complete control of your attitudes.
00:45:49.300 It would be almost impossible
00:45:50.480 to remain independently minded,
00:45:53.160 to think clearly in an ecosystem controlled
00:45:56.840 where information was controlled by AI
00:45:58.660 in the hands of autocrats.
00:46:00.780 So you would think in a country
00:46:02.700 founded to preserve human freedom,
00:46:05.140 which is why the United States was founded,
00:46:06.820 not to preserve free market capitalism,
00:46:08.340 whatever that is,
00:46:09.480 but to preserve God-given rights.
00:46:11.520 You would think there would be
00:46:12.680 a robust debate on this.
00:46:13.860 No, at the very same moment,
00:46:15.460 this technology is being developed
00:46:16.880 and paid for by taxpayers
00:46:18.420 who really are paying for this,
00:46:19.560 of course, needless to say,
00:46:20.280 always paying for this.
00:46:23.280 The U.S. Congress just voted
00:46:24.780 to allow the U.S. government
00:46:27.140 to spy on American citizens
00:46:28.860 without even going through
00:46:30.480 the normal pro forma rigmarole of getting a warrant
00:46:34.440 from a secret judge to do it, they just do it.
00:46:37.060 And the president pushed for this
00:46:38.120 and both parties were totally on board with it.
00:46:40.100 Huh, kind of weird this is happening
00:46:41.140 at exactly the same moment.
00:46:42.900 So that is the concern.
00:46:45.380 And that may explain why the people developing this
00:46:48.820 and permitting it and profiting from it,
00:46:52.060 everyone involved in this,
00:46:53.560 not one of them has taken any time to make you like it.
00:46:57.000 They don't care what you think about it,
00:46:58.520 Possibly because pretty soon they won't have to.
00:47:02.980 That's not like some dark conspiracy theory.
00:47:06.640 It's just obvious.
00:47:08.280 It's just obvious.
00:47:09.460 How else would you explain this?
00:47:11.440 You're rolling.
00:47:12.080 I mean, even during COVID,
00:47:14.120 they at least tried to tell you
00:47:15.720 that you were wearing the mask
00:47:17.020 and taking the poison shot
00:47:18.560 and jumping on one leg
00:47:20.100 and whatever they were having you do that day
00:47:22.540 because of the science.
00:47:23.660 They tried to convince you
00:47:25.340 that you kind of had to do it
00:47:26.720 because it was good.
00:47:28.300 They're not even trying that now.
00:47:30.360 And so you have to ask why.
00:47:31.940 And again, it may be, just guessing,
00:47:33.760 because they don't have to.
00:47:35.640 We should take this very seriously.
00:47:39.000 Final clip, which may convince you of this,
00:47:41.700 comes from Larry Fink, who runs BlackRock.
00:47:45.020 Now, Larry Fink was, you know,
00:47:47.440 a nemesis of the current president, of course.
00:47:50.600 He's not simply one of the richest men in the world,
00:47:52.420 one of the most important business leaders in the world,
00:47:55.400 certainly one of the most powerful,
00:47:56.680 more powerful than almost all global heads of state.
00:48:00.720 He was also, in effect, the leader of the opposition to Trump.
00:48:04.340 And he was the guy who promulgated anyway the idea of ESG,
00:48:08.380 the idea that you don't just do business,
00:48:10.340 you have to affect political change as you do it.
00:48:14.740 You got to worry about racial equity, too many whites.
00:48:18.020 You got to worry about the climate, CO2.
00:48:20.780 No, you can't have a chainsaw or a wood stove.
00:48:24.300 That's his contribution to the American economy: it's not just economic, it's social.
00:48:32.260 And Trump was standing in opposition to all of that, of course. That's why he got elected twice.
00:48:38.960 But now, all of a sudden, Larry Fink is a close associate of Trump's and is working in close
00:48:45.520 concert with Trump to bring about this AI. So here's Larry Fink speaking to, uh, you know, fellow
00:48:52.880 members of the Epstein class about AI and what this means and his concerns about it.
00:48:59.380 Watch what he's worried about.
00:49:00.680 But even here in the United States, if we're going to be building, let's say, these one
00:49:04.200 gigawatt data centers, how do we make sure we're not protecting those $50 billion, $75
00:49:09.600 billion investments?
00:49:12.100 We have to re-look at everything because of the role of drone warfare.
00:49:16.460 Right now, we're looking at it internationally.
00:49:18.060 But, you know, one of my concerns is, could it be a domestic terrorism using a $3,000 drone?
00:49:25.840 So all of these things are actually opportunities, not problems.
00:49:32.700 They're opportunities, not problems.
00:49:34.640 There are no problems in the Bain Capital HBS world.
00:49:38.260 They're only opportunities in this space.
00:49:40.820 But consider the concern.
00:49:42.300 So if you're rolling out what they're telling you, and clearly they believe, is the single greatest technological change in the history of human life on this planet, it's the biggest thing that's ever happened, ever, then you think one of your main concerns would be, well, does this change the relationship of the powerless to the powerful, the relationship of the citizen to the state, for example?
00:50:07.440 One of your concerns would be: how do we do this and bring about the prosperity that it promises
00:50:13.160 without totally eliminating human rights, making people slaves? Because we're against slavery, right?
00:50:17.660 Aren't we against slavery? We were. How do we do that? But that doesn't seem to be Larry Fink's
00:50:22.800 main concern. His main concern is: how do we protect our investment, this infrastructure,
00:50:26.580 these buildings, these data centers? Someone could drone them. Someone could drone them.
00:50:34.060 Now, who? Foreign actors? Hezbollah? Hamas? No, no. Just people. Larry Fink is concerned that ordinary
00:50:42.720 American citizens, as he just said, will use $3,000 drones to destroy these billion-
00:50:47.440 dollar investments. Why would he be worried about that? We've had electricity for over a hundred
00:50:55.380 years. There are 11,000 power plants in the United States. There are like 50,000 substations,
00:51:03.980 electrical substations.
00:51:05.220 There are 185 million power poles
00:51:07.960 in the United States.
00:51:09.720 185 million.
00:51:11.600 How many Americans have droned a power plant
00:51:15.020 or a substation
00:51:15.940 or a power pole recently?
00:51:18.980 How many acts of domestic terror
00:51:21.100 have been committed against energy infrastructure?
00:51:23.240 Well, some in the 70s.
00:51:24.740 It wasn't that uncommon.
00:51:26.180 It wasn't that successful either.
00:51:27.540 But there are radical splinter groups
00:51:28.800 who are doing stuff like that.
00:51:30.040 But not very many.
00:51:31.440 Most of it's pretty much unprotected
00:51:33.060 because it has no need for protection
00:51:35.240 because people understand the utility of electricity.
00:51:38.800 Without electricity, we live a very different life,
00:51:41.760 a reduced life in a lot of ways.
00:51:43.220 We need electricity.
00:51:44.360 People will starve to death without it.
00:51:46.220 So people aren't blowing up power plants
00:51:47.840 because they're fundamentally,
00:51:50.240 even the climate warriors among us
00:51:53.520 are pretty grateful for energy, for power, for electricity.
00:51:56.640 So you don't have to protect something
00:51:58.300 that people appreciate and think helps them.
00:52:00.520 There's no need.
00:52:03.060 But Larry Fink knows, in fact, he's admitting that people know this is bad for them, so bad
00:52:09.740 that they might be willing to commit an act of terrorism, a felony for which they could
00:52:14.560 be imprisoned for a long time.
00:52:15.980 And trust me, if you attack a data center, you're going to get a lot longer sentence
00:52:19.860 than you would if you, say, raped someone or molested a child on an island in the Caribbean,
00:52:24.200 in which case you're fine.
00:52:25.600 You attack a data center, part of a BlackRock investment?
00:52:28.920 Are you joking?
00:52:29.620 A Spencer Cox property?
00:52:31.180 You're going away, buddy.
00:52:32.520 For real, and not to some Club Fed farm. Like supermax, in the hole. Do not mess with a data
00:52:40.220 center. But again, why would you want to? Why are the kids at graduation at Central Florida State
00:52:45.100 University booing AI? The very name AI makes them boo because people can feel in their gut it's not
00:52:52.060 good for them. This is not a future they want. And they moreover know because it couldn't be
00:52:56.740 more obvious that nobody cares that they don't want it at all. It's being imposed on them as
00:53:02.340 immigrant populations are imposed on them all the time. Hey, meet your new Somali neighbors.
00:53:06.040 Oh, you didn't ask for that? Who cares? Shut up, racist. People are used to living like this.
00:53:12.560 They've been humiliated. Their standard of living has fallen. Their life expectancy has declined
00:53:17.200 in the United States over the past 20 years. And they've said almost nothing about it.
00:53:24.240 They've been sent to wars, committed to wars as a nation, not just the men fighting it,
00:53:28.380 but all of us and their consequences again and again and again against their will.
00:53:34.120 No one's booing the Iran war at a college graduation, but the mere mention of AI sends
00:53:40.700 the graduates into boos because they feel in their guts it's that bad and nobody cares that
00:53:48.100 they feel that way. In fact, they don't just not care. They know it. And their only
00:53:56.440 real concern is protecting the investment from what they consider inevitable drone attacks
00:54:01.540 from a restive population. Man, the point is not that AI is terrible. Certainly there have got to be
00:54:09.440 upsides. It's kind of interesting. Probably can do a lot of repetitive tasks that people don't
00:54:14.520 want to do and shouldn't be doing in the first place. Great. It's great. But the downsides also
00:54:19.260 seem profound. And the refusal of the people developing this and profiting from it to even
00:54:24.120 address those is making this a still more volatile place. For a lot of women, Mother's Day is a
00:54:32.040 joyful day, particularly those who are about to become mothers for the first time. It's the
00:54:35.260 perfect chance to tell the rest of the family the big news. But that's not, sadly, always true.
00:54:41.520 Some women think pregnancies are tragic and should stay secret. They're alone and they're unsure if
00:54:47.480 they want to go through with having those babies. And that's the very reason that pre-born exists.
00:54:52.780 Pre-born provides expecting mothers with ultrasounds so they can hear the child's heartbeat.
00:54:58.920 And once they do, it doubles the chance that child will be born.
00:55:02.040 It doubles the chance they will choose life.
00:55:04.080 This Mother's Day, you can be the reason a woman who's feeling alone becomes a confident and joyful mother.
00:55:10.240 For $28, that ultrasound can change and save a life.
00:55:15.420 Or for $140, five mothers can have that experience.
00:55:18.860 And that leads to more support through maternity care, baby clothes, diapers, counseling, and
00:55:24.040 everything else an expectant mother needs, all through Preborn.
00:55:28.540 To help Preborn, dial pound 250 and say the keyword baby.
00:55:31.880 That's pound 250 baby, or visit preborn.com slash Tucker, preborn.com slash Tucker.
00:55:38.680 We are proud to partner with Preborn.
00:55:40.360 Imagine if every time Inverse Kramer or another top investor bought a stock, you bought it
00:55:45.560 too, in your own brokerage account, automatically.
00:55:48.300 Well, that now exists, and it's called Autopilot, built by the team behind the viral Pelosi stock
00:55:54.420 tracker. Autopilot lets you browse their marketplace of strategies run by proven investors or AI-driven
00:56:00.140 models. Just find one with a high return, connect your personal brokerage account, like a Schwab or
00:56:05.120 Robinhood, and every trade they make, you make automatically. Your money never leaves your own
00:56:11.080 account. Autopilot just mirrors the moves. So now, no more guessing, no more staring at charts, and no
00:56:17.560 more checking the market between meetings. With over $1.3 billion already invested,
00:56:23.160 you can find Autopilot on the App Store or go to joinautopilot.com. That's joinautopilot.com.
00:56:30.580 Investing has risks, like the loss of principal.
00:56:33.120 So we thought now was exactly the time to talk to, I don't know, the guy managing it exactly,
00:56:38.440 but certainly the face of the world's largest data center in Utah, and that'd be Kevin O'Leary,
00:56:45.180 who is famous to people who watch cable television
00:56:47.740 as one of the hosts for many years of Shark Tank.
00:56:50.240 Also, as you'll find out in a minute,
00:56:52.000 enormously nice guy, genial guy.
00:56:55.000 Maybe that's why he's out explaining
00:56:57.300 what the data center is
00:56:58.280 because it's pretty hard to dislike him personally.
00:57:01.100 But as you watch this interview
00:57:02.440 with the guy who's in charge of
00:57:05.520 the world's largest data center,
00:57:08.180 watch carefully and listen.
00:57:10.620 Are any of the questions being answered?
00:57:13.520 Does he know the answers?
00:57:15.180 Does he care about the answers?
00:57:19.360 Watch.
00:57:20.760 Kevin O'Leary, thank you so much for doing this.
00:57:23.440 How did you, I read,
00:57:25.440 first of all, I just want to check the facts here,
00:57:26.800 but I read that you are building
00:57:28.000 the world's largest data center in Utah, 40,000 acres.
00:57:32.820 If that's true, how did you get into the data center business?
00:57:35.220 How did this start?
00:57:36.920 You know, I've always been in real estate
00:57:39.420 and I sort of came up in commercial real estate
00:57:42.700 with climate control storage
00:57:44.120 for pharmaceutical companies. Data center development is very complex, but very similar
00:57:49.200 in some ways. It requires permitting. It requires dark fiber. There are only five tenants in the
00:57:54.220 world. You know them all: Amazon, Google, Microsoft, et cetera. And so the thing
00:58:03.100 that got me motivated, though, was watching in the last two years this narrative in North America
00:58:09.600 about how negative data centers are.
00:58:13.200 It started in Virginia, actually.
00:58:15.360 The idea that they consume a lot of water,
00:58:17.340 that they are very noisy,
00:58:18.880 and all that's true from stuff
00:58:20.440 that was built 15 years ago,
00:58:21.940 but today that's not the case.
00:58:23.500 And yet the narrative kept going.
00:58:25.800 And I thought, who's doing this?
00:58:27.600 Who would not want us to have compute power?
00:58:31.080 Who would not want us to build our power grid out?
00:58:33.300 Because when you build a data center today,
00:58:35.040 you have to develop your own power
00:58:36.180 and you can sell it back to the grid.
00:58:37.420 That's what we're doing in Utah.
00:58:38.180 I'm thinking to myself, the Chinese, they don't want us to do that. And I went through that whole
00:58:45.360 thing with TikTok, as you may recall, and I actually saw the evidence of how the Chinese
00:58:49.460 were manipulating the algorithm. Now they're doing it a different way. And that just kind
00:58:54.160 of pisses me off. So I'm happy to add compute. I'm like, I don't want my kids in 20 years who
00:59:01.200 live in New York being told what to eat for breakfast by the Chinese. So, you know, I'm
00:59:06.980 kind of on a mission here to compete. Okay. So the point you, so you're doing this because,
00:59:13.000 well, because it's a business opportunity, but also because China doesn't want you to do it.
00:59:16.640 Why would China not want data centers built in the United States?
00:59:21.820 Because we're in a competition for AI compute. The nation that has the best AI models will be
00:59:28.100 the winner of future wars. It'll have the most productive economy. We already know now,
00:59:33.620 just looking at earnings that are coming out as we speak. You know, the talk of the earnings this
00:59:38.500 quarter is unprecedented in terms of how good they are, because somehow, very quickly, and this
00:59:44.400 is probably just serendipitous luck for this administration, AI compute is now being implemented
00:59:50.380 in all 11 sectors of our economy, and it's enhancing both productivity and margin. So these
00:59:55.400 companies are making a lot more money than anticipated, because AI is very productive.
00:59:59.820 And people debate about the job shift and everything else, but the fact is the economy is on fire, even at a time when we have conflict.
01:00:07.060 So if I were the Chinese, the last thing I want in America is the five or six tech companies that are competing with me on DeepSeek having more compute capacity.
01:00:17.600 I want to shut down every single proposal for every single data center in every single state, and I want agitators, I want paid protesters, I want environmentalists, I want to shut it all down so that they can't train their models as fast as I can.
01:00:35.780 Meanwhile, the Chinese just built, in terms of new power in the last 18 months, 400 gigawatts of new power, all with coal burning turbines.
01:00:48.080 Coal, you know, they're not worried about the environment.
01:00:50.560 The big guy over there says, build one here and then stick a data center beside it so I can train my DeepSeek model faster.
01:00:56.360 And you've heard it from the Anthropic CEO this week.
01:00:58.920 He came out, Dario, and said, hey, they're going to catch up in six months.
01:01:03.200 So everybody wake up and smell what's going on here. And I just started to dig in this last week, looking at all of these strings that are attacking the Utah proposal and the one I got up in Alberta. I'm going to where I can find power and developing these things. And I want the Chinese to see what we're doing. We're coming at you, buddy.
01:01:23.300 Can I ask, I've got a million questions, but let's just start with the competition between
01:01:28.700 the U.S. and China on AI. If data centers are the key to AI, why has China built so few of them?
01:01:36.100 I think there are about 365 data centers in China, almost 4,000 in the United States. So
01:01:41.020 clearly they haven't prevented us from building data centers. What do you think that is?
01:01:45.220 Okay, let me walk you through the math of the last, you know, this is moving very,
01:01:49.720 very quickly. But the data centers of even 24 months ago were 100 megawatt facilities. And
01:01:57.800 they were useful for cloud storage, like your Excel files and all the rest of that.
01:02:03.100 They were useful for just basic compute and storage of data. But then when these models
01:02:09.820 started emerging, such as we have going with Anthropic or Gemini, and any of them, you know,
01:02:17.360 Grok, all of it, and all of the tools built on top of them, the amount of compute power
01:02:22.720 required grew geometrically, and it was no longer 100 megawatts. The minimum was
01:02:30.640 250 megawatts. Then four months later, the minimum was 500 megawatts. Wait, here it comes.
01:02:40.140 And then two months after that, it was one gigawatt. Now try and get a gigawatt out of the American
01:02:48.080 grid, or the Canadian grid, or the Mexican grid. Not a chance in hell. We're tapped out. So the Chinese
01:02:54.020 were building these one-gigawatt facilities. So it doesn't matter about the number of facilities.
01:03:00.560 It matters how many are gigawatt-plus for training AI models, because you need a lot of compute for
01:03:09.720 the modern-day chip. And this technology is advancing very, very quickly. The large
01:03:16.060 ones are the ones that they're beating us on, not the small hundred
01:03:19.900 megawatts. Nobody cares about those anymore. You can't train anything on that. So
01:03:24.380 we're competing now for campuses, 10,000 acres and more at a time. And that's where the Chinese
01:03:30.240 are kicking our asses. So you said you will go wherever the energy is, but I thought you were
01:03:36.160 you're bringing your own energy to the project. Can you explain how you're going to power this?
01:03:41.260 Sure. That's a good question. You can't just tap into
01:03:48.700 somebody's grid. Take a place like Texas, or Mississippi, or Utah, for that matter, where I
01:03:55.800 am. The electrical bills in that county are going to go up 30%. And that's what pissed off so many
01:04:02.320 people in Virginia. They went out of their minds as the electrical bills kept going up. So you
01:04:07.380 can't do that. You have to bring your own power. So the way you do that is you find low cost
01:04:11.700 stranded natural gas. You acquire the new-technology turbines that burn very, very clean and
01:04:17.840 require a lot less water, because that's a big debate too. And in some cases, no water;
01:04:24.580 they're air cooled and you build those turbines first. So basically the data center game is about
01:04:29.660 power. Now, here's another reason the Chinese would not want this. Forget about data centers. Let's
01:04:35.240 just say we're building new gas turbines that make electricity. The Chinese don't want you to
01:04:41.580 do that either because they know that that's how you're going to solve the grid problem. You know,
01:04:47.860 the people in Utah are telling me, look, is there any way when you build these power facilities
01:04:52.840 for the data centers, you could sell back some of that power to our local grid? And the answer is
01:04:59.280 yes. So now instead of being the evil data center guys, we can be the guys that are actually powering
01:05:05.500 the Utah grid, which, by the way, taps into the national grid. So we want to build as many of
01:05:09.780 these power generators as we can before we build any data centers. So where does the gas come from in this,
01:05:17.420 the world's largest data center, the one you're planning in Utah? Well, mine won't be the largest
01:05:21.440 for years. That's another piece of misinformation being spewed everywhere. And I'm going to tell
01:05:27.120 you how we're doing it. We're going to build 1.5 gigawatts first, so a fraction of the nine gigs proposed,
01:05:35.060 and make sure everything works. And the people there in Box Elder County come inspect it and
01:05:40.140 see the air quality EPA standards we're going to hold up to, the water rights permits that we're
01:05:46.840 going to maintain and be compliant with, and the noise EPA requirements that we have to be compliant with.
01:05:52.760 Get the first one up, a small fraction of it, and show
01:05:57.420 everybody how it works. That's the plan. And then, I didn't need 40,000 acres, but that was
01:06:04.000 the parcel available. That's twice the size of Manhattan. It's called the MIDA designation
01:06:09.100 right beside the Hill Air Force Base. It's three times bigger than Manhattan. Yeah, but I don't,
01:06:15.240 I don't need all that, but that was what was available. I'm going to lease it back to grazers,
01:06:20.040 do some manufacturing maybe.
01:06:21.400 There's plenty of land.
01:06:22.160 We don't need it all.
01:06:23.420 But the power for it, where does that come from?
01:06:26.680 So what we do is there's the Ruby pipeline there.
01:06:30.280 It's only 17% utilized.
01:06:32.580 It's one of the most underutilized pipelines in America.
01:06:35.380 So we buy gas
01:06:36.960 from the Ruby pipeline
01:06:39.520 and we put it into a clean generator
01:06:42.360 and it generates electricity.
01:06:44.920 And we tie that back to the grid if they want it
01:06:47.500 or we power a data center.
01:06:49.580 That's the idea.
01:06:50.640 So we have to build the power first.
01:06:52.940 And the power itself, I think we have a big challenge in every state.
01:06:57.880 Everybody's got to build more power in the American grid right now.
01:07:00.680 And the North American grid, for that matter, Mexico, too, and Canada.
01:07:04.220 So the older you get, the more you realize how fast everything can change.
01:07:07.020 One day everything is fine, it's great, and the next day things are not great at all.
01:07:10.980 That's just the nature of it.
01:07:13.420 But if you're realistic about that, you have to think about your family.
01:07:17.680 if something happened to you, what would happen to them? The problem is getting life insurance
01:07:21.620 is not only kind of depressing, it's a huge hassle. Medical exams, paperwork, waiting weeks for
01:07:26.920 approval. A lot of people just don't want to deal with it, so they don't deal with it. And that's
01:07:30.080 not the right answer at all. You have an obligation to think through what would happen if you're not
01:07:33.040 here. And so we're partnering with Ethos. Ethos makes getting life insurance super quick and very
01:07:39.240 easy. It's 100% online. You get a quote in seconds, you apply in minutes, and you get same-day
01:07:43.920 coverage. There's no medical exam. You just answer a few very simple health questions and
01:07:48.640 you get up to $3 million in coverage. Some policies as low as $30 a month. So 10 minutes to get
01:07:53.960 covered. Life insurance through Ethos. You'll be glad you did. Get a free quote at ethos.com
01:07:58.920 slash Tucker. That's ethos, E-T-H-O-S dot com slash Tucker. Obviously, application times may
01:08:04.060 vary, as may rates, but it's pretty darn easy. I mean, I think that's absolutely right.
01:08:09.860 But just to be clear, this project is totally energy independent.
01:08:14.920 It takes electricity from nobody else.
01:08:16.840 It creates 100 percent of its own power.
01:08:19.440 Yes, that's 100 percent correct.
01:08:21.020 And that was always the proposal.
01:08:23.020 And that is in the contract that we actually developed and got unanimously passed by the
01:08:27.880 three commissioners in Box Elder, where this is the benefit.
01:08:31.640 It's to the benefit of Box Elder's 66,000 people.
01:08:34.500 They're going to get a lot of tax revenue, a lot of jobs, 10,000 construction jobs, 2,000
01:08:38.920 maintenance jobs just for the first one and a half gigawatts. So there's a lot of economic upside.
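[Editor's note: the job figures quoted here can be run through quick arithmetic for scale. A back-of-envelope sketch in Python, using only the numbers mentioned in the conversation; scaling linearly from the 1.5-gigawatt first phase to the nine-gigawatt proposal is an illustrative assumption, not a claim either speaker makes.]

```python
# Back-of-envelope scaling of the quoted job figures.
# Linear scaling from the first phase to the full proposal is an
# illustrative assumption, not something stated in the interview.
phase_gw = 1.5               # "the first one and a half gigawatts"
proposed_gw = 9.0            # "the nine gig proposed"
construction_jobs = 10_000   # quoted for the first phase
maintenance_jobs = 2_000     # quoted long-term jobs for the first phase

scale = proposed_gw / phase_gw  # 6x the first phase
print(f"Full build-out at the same rate: "
      f"{construction_jobs * scale:,.0f} construction jobs, "
      f"{maintenance_jobs * scale:,.0f} permanent jobs")
```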
01:08:44.700 But I must tell you, and this is the first time this has happened to me, I have a pretty big team,
01:08:50.880 as I'm sure you do on social media, managing a network of 12 million plus followers on
01:08:56.520 different platforms, on all of them. And so about four hours after the vote was unanimously passed,
01:09:07.840 on a Monday night at around 6:05 Utah time, I got a phone call from one of the people
01:09:19.840 that watches our network, looks for abnormalities. And he said, there's something going on here,
01:09:28.720 something very unusual on Instagram and on Twitter, which is now X.
01:09:34.140 And I said, what do you mean? He said, the spiking of a bunch of IP addresses that we don't know,
01:09:43.680 and it's spewing out a tremendous amount of information. I said, okay, guys, let's get our
01:09:52.200 team on it. Because I'm very fortunate, I have a bunch of very good data scientists, investigative data
01:09:59.500 scientists, if you want to call them that. Probably one of the best teams privately
01:10:06.140 outside of the government in the Department of Defense. My team's pretty good. You can make the
01:10:14.320 assumption they once worked in these agencies. All right, let me give you some names here, Tucker.
01:10:19.360 The Party for Socialism and Liberation, um, apparently shares offices with the CCP.
01:10:29.400 All over my social media. Okay, wait, let me just stop you. Let me just stop you there. So if you're
01:10:35.660 saying that you're being criticized by agents of a foreign government, let me just add my sympathy,
01:10:41.440 empathy in fact, having been the subject of that, you know, quite a bit. I'm well aware. Right.
01:10:46.660 I'm well aware.
01:10:47.940 Okay, but that doesn't, yeah, you make the case the Chinese don't like it.
01:10:52.540 If you're that concerned about the Chinese,
01:10:54.140 what percentage of the materials in the data center come from China?
01:10:58.640 Less and less all the time.
01:11:00.180 That's one of the things the administration is trying to do.
01:11:02.520 The turbines are European-made or American-made.
01:11:08.180 The pipelines are obviously American-made.
01:11:11.200 The chips are certainly American-made.
01:11:13.660 the actual build out of the facilities, mostly American made.
01:11:18.820 I don't think there's a whole lot of, we're not mining Bitcoin here.
01:11:24.300 Right. No, I mean, I think I read it's about 40 or 41%.
01:11:28.660 So I mean, China is benefiting from most purchases that most of us make
01:11:33.280 because it's the world's largest manufacturer.
01:11:34.960 And that would also include the construction of data centers.
01:11:37.640 But my point is, I'm sure there are lots of people trying to influence this
01:11:41.100 outcome to their own benefit, and that would include foreign countries, maybe even China.
01:11:45.780 But it doesn't change the facts about the project. So I just want to make sure I have
01:11:50.760 these clear. Would you be paying market rate for the gas? Yes, I don't have a choice on that.
01:11:57.040 You have to negotiate a gas deal. I'm pretty well paying market rate for everything.
01:12:01.460 There's a competition between states for these jobs. I know there's a lot of bad press.
01:12:05.960 I wanted to just finish with the list because I don't think, Tucker, you get the enormity of what's going on here.
01:12:13.080 Oh, I do.
01:12:14.400 I do get the enormity.
01:12:15.640 No, no, I have no doubt that, I mean.
01:12:18.060 Well, I want to mention two that are cells inside of Utah, which really stunned me.
01:12:24.540 This I'd never seen before.
01:12:26.980 Alliance for a Better Utah.
01:12:30.000 Elevates strategies.
01:12:31.220 taking the content from the CCP, repurposing it,
01:12:36.160 and jamming it down the throats of people in Utah
01:12:38.820 on my social media feed and lots of other feeds.
01:12:43.760 I have no doubt that's true,
01:12:45.120 but I'm asking questions that I think are relevant
01:12:48.040 to the country.
01:12:49.480 I'm not an agent of the Chinese.
01:12:51.020 I'm not a socialist.
01:12:51.780 So I just want to know, like, where's the water coming from?
01:12:55.500 Because water, as you know,
01:12:57.240 is one of those contested resources in the West.
01:12:59.740 No, no, that's right.
01:13:00.440 That's right.
01:13:01.220 When you buy land in Utah, there are water rights that were granted that land sometimes over 100 years ago.
01:13:11.800 You have to apply for a permit, a usage permit for that water.
01:13:17.200 So if it was once used for one purpose and you want to change the purpose to industrial, let's say, in the case of a data center or power generation, you have to apply for the permit.
01:13:28.300 Usually what happens is, if it was grazing and you could use 100% of it,
01:13:33.060 they may change that and knock it down by 40% to 60%.
01:13:37.380 But there's water on that property already being used right now.
01:13:43.280 We're just repurposing that water for a different purpose.
01:13:46.420 It's not like we're going to draw water from somewhere else.
01:13:50.260 We couldn't use the land if it had no water.
01:13:52.860 We couldn't even have a toilet in the men's room if we didn't have any water.
01:13:56.480 Would you be using more water than, say, a sugar beet farmer would?
01:14:01.320 No, actually, probably not. It depends. Tucker, the way it works is, let's say your prime tenant
01:14:07.580 is Google. They come along and they say, okay, we want to be the tenant for the Utah site.
01:14:16.780 But we have a whole bunch of requirements because they're under pressure too to be
01:14:21.340 a lot more efficient, productive, and more sustainable. They may say, well, let's not use
01:14:28.840 water to cool our turbines. Our spec is going to be air-cooled turbines. So you're going to build
01:14:34.600 out, you're the owner, you're the master developer, here's our spec. And so I expect whether it ends
01:14:41.940 up being Amazon or Microsoft or Tesla or Google, they're going to build in a whole bunch of things.
01:14:50.360 and we can't deliver them water we don't have.
01:14:53.020 So my guess is it's going to be contained cooling
01:14:57.000 for the chips, which is like a radiator in your car.
01:15:00.720 So you're not using a ton of water there.
01:15:02.640 And then for the turbines themselves,
01:15:05.720 it may be air cooled.
01:15:07.420 We don't know yet.
01:15:08.320 It depends what the tenant wants.
01:15:09.600 So what's your, as you game this out,
01:15:14.020 expected water use over the first 10 years?
01:15:16.640 Well, not knowing what my tenant wants yet, probably less, uh, than we originally anticipated in terms of
01:15:25.800 what is given us on the water rights. We've applied for that. I mean, we're anticipating we'll get, you
01:15:31.060 know, whatever we're going to get. I don't think we're going to use it all at all. Because, what
01:15:36.080 do you expect, will water rates in the state of Utah go up? Yeah, not because of us. I just think
01:15:41.280 they go up anyways over time. But if they go up, if they go up at a quicker rate than they
01:15:47.320 have been going up, if there's a spike in water price, would you compensate the state for the
01:15:53.580 difference? In other words, if your business causes water prices to rise, are you on the hook
01:15:59.540 for any of that? Yeah, I think we would be. I mean, we're negotiating that. But I don't
01:16:05.940 think water is the constraint, because most of the technology, which advances every 24 months,
01:16:12.360 is reducing the requirement for water. So I mentioned the air-cooled turbines. Those just
01:16:18.820 came on the scene 18 months ago. So they're the first to actually generate five megawatts at a
01:16:24.440 time, like Lego boxes; put them together and you're just using air. But also, you have
01:16:29.640 EPA regulation about air quality. So you've got to be very careful that when you're using air
01:16:34.840 cooling, you're not breaching the air quality because in Utah, and I studied this when I was
01:16:40.220 back in college, it's a unique geography. They get inversions there because of the mountains.
01:16:45.900 So really, as an old environmental studies guy, that's the one they'd always show us
01:16:50.560 would be Salt Lake City. So I'm well aware of the problem and I know what the new turbines can do.
01:16:55.800 The amount of hype and hysteria over all of this stuff, well, I've already pointed out,
01:17:03.420 I think it's being generated by our adversaries, but it's also misinformation.
01:17:08.640 Everything that's been said about that site is false.
01:17:11.640 Well, I'm grateful that you're able to present your side of this, and thank you for doing it.
01:17:16.540 Things around the world are moving so fast right now, it's impossible to keep up with all of the changes.
01:17:22.620 But we do know that when those changes happen, markets change too,
01:17:27.180 and nothing changes faster than the price of precious metals, gold and silver.
01:17:31.740 It just shifts in an instant because it is a reaction to and against what's happening in the world.
01:17:37.840 So timing is essential.
01:17:39.180 If you're thinking about adding precious metals, and you definitely should, we do.
01:17:43.780 You need to know when prices are going to move and why they're moving.
01:17:47.560 And Battalion Metals makes that all really simple.
01:17:49.580 You can buy the dip when it happens.
01:17:52.120 So if you want real-time alerts sent directly to your inbox when gold and silver prices move,
01:17:57.380 go to battalionmetals.com slash alerts.
01:18:01.600 Markets move fast, so stay ahead of them.
01:18:03.780 So it's battalionmetals.com slash alerts.
01:18:11.580 So let me ask, because I mean, this is just a huge new thing.
01:18:16.400 And I think it's understandable that people would be anxious about it.
01:18:20.480 I don't think they understand it.
01:18:21.480 I don't fully understand it.
01:18:22.620 That's for sure.
01:18:24.040 Let me ask, though, about why taxpayers should have to pay for this if it's a private business
01:18:28.600 and your tenants are some of the richest companies in the world.
01:18:31.280 Why would taxpayers be required, as they now are, to subsidize this?
01:18:36.680 They don't.
01:18:37.640 They don't necessarily have to do that.
01:18:39.280 They just won't win any contracts.
01:18:41.300 It's a competition.
01:18:45.000 But why are you getting tax breaks, is my question.
01:18:47.020 Yeah, everybody, you go back and you say,
01:18:49.080 what incentives can you give us to invest $15 billion in the first 1.5 gigs?
01:18:55.080 That's what it takes.
01:18:56.140 I have to go raise $15 billion.
01:18:59.520 That's just the first point.
01:19:00.280 But anyone who starts a business,
01:19:01.680 why should taxpayers have to pony up for that?
01:19:04.400 They don't.
01:19:05.520 Of course they do.
01:19:06.640 I mean, if you're getting a tax break and they're not,
01:19:09.640 they're making up the difference.
01:19:10.860 There's a state budget.
01:19:11.460 That's no problem.
01:19:12.720 That's no problem.
01:19:13.460 I can build it in Texas.
01:19:14.500 I can build it in Jackson, Mississippi.
01:19:16.440 But why, if it's such a good business,
01:19:18.660 would you be asking taxpayers to help pay for it
01:19:21.000 without giving them equity in the company?
01:19:22.720 Are you giving taxpayers shares?
01:19:25.500 No, the investors get the shares.
01:19:27.160 But here's why they would do it.
01:19:28.360 But why would the taxpayers have to?
01:19:29.620 In other words, if you want to start a business, why am I, as a taxpayer, forced to pay for your business?
01:19:36.140 I don't get it.
01:19:37.420 Well, let's forget about data centers.
01:19:38.980 Let's go any manufacturing.
01:19:40.600 Let's say you're going to build an aluminum sheet manufacturing facility.
01:19:46.920 You go to the government there and say, look, this is going to be a huge CapEx expenditure.
01:19:52.800 I'm going to hire 2,000 people. I'm going to build a community center. I'm going to pay a lot of tax on the profits in your state when I sell the aluminum. And I'm going to hire all these people who they will also pay tax. And we will build a school because our workers need a school. And, and, and, and, and. What can you give me to incentivize me versus the state right beside you, which is willing to give me an incentive package?
01:20:18.240 No, no, I understand. I understand that you're gaming a system in place. You didn't come
01:20:23.060 up with this, but I'm just trying to understand. So the trade typically is jobs. Okay, but these
01:20:29.880 projects don't actually... Well, no, no, it's also jobs and taxes, because you're going to pay taxes. Yeah,
01:20:34.560 but then you're getting a tax break, so that doesn't really make any sense. Only up front. You're...
01:20:40.200 Tucker, welcome to America, buddy. This is how it's gone on for 200 years. Okay, well, I don't know,
01:20:46.540 some bad things go on for a while. But I think at some point it's worth assessing,
01:20:50.640 like, why are we doing this? So you are on the job. You're doing it because there's a competition.
01:20:56.760 Well, I run a couple of businesses and we're not getting any tax breaks. I think they're
01:21:00.700 every bit as virtuous as data centers, but I'm not availing myself of that. And no one's offered
01:21:05.680 and I wouldn't take it anyway because it's not the job of taxpayers to subsidize a private business.
01:21:10.400 It's a fair comment, but my job is to create a data center, create 2,000 jobs for long-term, 10,000 manufacturing at the beginning or construction.
01:21:22.160 And I'm obviously looking at multiple sites, and this won't be the last one I build.
01:21:28.000 May I ask about the 2,000 jobs? Okay, so relative to the physical size of the project, which, as you noted, is multiple times the size of Manhattan, and the power draw at peak: this data center, by your projections, will consume about as much energy as New York City does.
01:21:48.700 But New York City provides almost 5 million jobs.
01:21:52.380 And this project, by your own description, would provide about 2,000 jobs.
01:21:58.380 I don't see the trade here.
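[Editor's note: the comparison Tucker is drawing is simple division on the two figures quoted; a minimal sketch, assuming, as his framing does, that the two power draws are roughly equal.]

```python
# Jobs supported at roughly equal power draw, using the quoted figures:
# "almost 5 million jobs" in New York City versus about 2,000 long-term
# jobs at the data center. Equal energy use is the stated projection.
nyc_jobs = 5_000_000
data_center_jobs = 2_000

ratio = nyc_jobs / data_center_jobs
print(f"Per unit of energy, NYC supports about {ratio:,.0f}x as many jobs")
```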
01:22:00.080 You definitely got that calculation wrong.
01:22:02.280 By building a data center that trains AI, that provides productivity to the entire nation,
01:22:07.520 we create millions of jobs, high-paying jobs.
01:22:12.460 So AI is going to create jobs?
01:22:14.780 Yes.
01:22:14.980 I thought it was going to eliminate jobs, no?
01:22:17.260 Just think about the new technologies we don't even know yet that are going to be built off AI.
01:22:24.480 Everybody thinks when television came, everybody would lose their job in radio.
01:22:28.740 That was complete BS.
01:22:30.460 And the same thing is going to happen here.
01:22:32.760 Everybody's hysteria about losing jobs, making hamburgers or flipping them, being replaced by a robot.
01:22:38.360 That's probably true.
01:22:39.280 But all kinds of new technology will become available over time, including in medical science and biology and all kinds of things where the models can be used.
01:22:48.760 I'm extremely optimistic what I'm doing is creating a whole new opportunity for my children.
01:22:54.540 What kinds of jobs?
01:22:56.520 Well, I mean, again, some of this, of course, is unknowable, and I want to be as fair as I can be because I'm grateful that you're willing to talk about it so openly.
01:23:05.440 But you just said AI will create millions of new jobs.
01:23:09.820 You're a part of the basis of AI.
01:23:12.700 You can't have AI without data centers, at least right now.
01:23:15.260 So what are those millions of jobs?
01:23:17.480 Because we can go through the list that the creators,
01:23:20.960 the people who actually are making AI right now, developing it,
01:23:24.740 we can go through a list of jobs they say it's going to eliminate,
01:23:26.820 which would be like lawyers and financial planners
01:23:29.660 and like the basis of upper middle class America.
01:23:31.800 That's going away.
01:23:32.580 They've said this.
01:23:33.200 What will it be replaced by? It'll be replaced by new science opportunities,
01:23:39.740 new exploration into space, new manufacturing for robotics, for defense as well. Wars in the future
01:23:46.220 probably aren't going to be fought with people getting shot in the flesh. It'll be one set of
01:23:51.780 robots against another. I think the drone technology will advance the manufacturing
01:23:56.280 of surveillance, all that stuff against our enemies, which notably is basically China.
01:24:02.000 So if you think about how, you know, you debate the data center, I think it's fair to do that, what you're doing.
01:24:07.100 But I would be very concerned if I were living in Taiwan that one day my electricity just goes out and I get invaded by basically robotics and high precision ordnance.
01:24:18.240 And that's what China wants. And they want to get there first.
01:24:20.860 Now, if we don't get there first, if we don't develop something better than their AI and our ability to be predictive on where these conflicts are going to happen,
01:24:29.020 I think we'll be in a bad place in 20 years. And so, okay, but I think it's very revealing that I
01:24:35.240 asked you about jobs in the United States, and you went immediately to defending Taiwan.
01:24:38.940 Well, we manufacture the equipment here, Tucker. That's where it's made.
01:24:44.100 Okay, well, actually, a lot of this has been in China, as I know you know. So that's my whole point. I don't
01:24:50.100 want to do that anymore. I want to start making it here. I want to do advanced robotics. I totally
01:24:55.160 agree. But the promise of AI and robotics is that the robots will make the products of the future.
01:25:03.500 So how exactly does that result in American jobs? More specifically, what are those? What are you
01:25:08.980 talking about? Every job is replaced by a machine. I don't buy that. I just don't agree with you
01:25:13.340 because it's never happened. I'm not saying I don't want that to happen. Trust me. I'm really
01:25:17.240 worried about it, almost panicked. But I'm trying to feel better. Okay, then please make me feel better
01:25:23.880 about it. Where are these millions of new jobs? What specifically are you talking about?
01:25:28.260 Well, you know, every time technology advances, it creates new opportunities that were not
01:25:32.620 foreseen prior because you don't know the direction of new tech. You know, think about
01:25:38.960 if you and I, because we were actually around in the late 80s, contemplating what new jobs would
01:25:43.940 be created by the internet and look at what's happened. It's created millions of jobs and
01:25:52.420 advanced all kinds of technologies and changed the way we live for the better. And I would say to you,
01:25:58.260 the same angst we had, the same narrative that was going on in 1992 about how the internet is
01:26:05.740 going to wipe out the economy and it's a bad thing and it's dangerous. Of course, people loathe
01:26:12.320 change. That's the nature of how it is. Do you think that the United States is a happier country
01:26:17.280 than it was in 1992?
01:26:21.300 Define happy.
01:26:22.420 What does that mean?
01:26:23.260 I don't know.
01:26:23.900 How about the suicide rate
01:26:25.180 or the addiction rate
01:26:26.340 or the life expectancy?
01:26:28.500 Those are all upside down.
01:26:31.380 There's lots of social issues
01:26:33.280 and there always has been.
01:26:35.240 I would remind you though.
01:26:37.100 But they've gotten worse
01:26:38.140 at exactly the period in history
01:26:41.560 that the internet was formed
01:26:42.980 and then seemed to infuse
01:26:45.180 every part of our lives.
01:26:46.320 So if you were worried about the effect of the internet in 1992 on America, looking back from the vantage of 2026, you could say, yeah, I had good reason to be worried, couldn't you? Or am I imagining that?
01:26:58.780 You could say that it would have some effect on society.
01:27:01.600 But let me remind you something, and I'm probably the right guy to make this comment because I spend a lot of time all around the world investing all around the world.
01:27:11.840 I don't care where you go, and I feel this way, and I've really learned this over the last 20 years.
01:27:20.460 What is the number one export of America?
01:27:22.940 It's not energy, and it's not technology.
01:27:25.700 It's actually the American dream.
01:27:27.660 For all the faults that America has, everybody I meet in every country, pretty well every single
01:27:36.520 one, would like to figure out how they can get to America, start a business, and be part of the
01:27:41.260 American dream. And I go to some pretty gnarly places where people are willing to risk their
01:27:47.580 lives to go under a river with barbed wire to get in here. So until that changes, which I don't see
01:27:54.860 ever happening, my only concern is China. They're the ones. What's the, what's the,
01:28:02.580 if you could just be more precise, I mean, born here, I love this country. I'm not leaving. So
01:28:07.440 I'm invested in the idea of the American dream, but I'd love to just define it more precisely
01:28:11.040 before we discuss it. What is it? Okay. Um, let's make it really basic. Let's go to Shark Tank.
01:28:17.020 Those people that trot out in front of me and have done for almost 20 years, uh, with an idea
01:28:22.100 to solve someone's problem that they'll get paid for.
01:28:25.560 And 80% fail, but 20% make it.
01:28:30.540 And it creates personal freedom for them
01:28:32.920 because it was just an idea.
01:28:35.260 I mean, I think of some of these things,
01:28:36.960 wicked good cupcakes set two women free
01:28:40.080 for the rest of their lives.
01:28:41.540 Basepaws, cat DNA testing, Anna Skaya,
01:28:45.360 a researcher that came up with this idea,
01:28:47.500 walked away with $105 million cash 36 months later.
01:28:51.820 That's the American dream.
01:28:53.160 And I'm an ambassador for it.
01:28:54.720 So it's freedom and opportunity, I think you just said.
01:28:57.580 Is it fair to someone?
01:28:58.800 It's not about the greed of money.
01:29:00.460 What I've learned is-
01:29:01.420 No, opportunity.
01:29:03.160 It's opportunity.
01:29:04.360 And I see it over and over again.
01:29:06.380 And I'm an ambassador for it all around the world
01:29:08.340 because of the fact that somehow magically this show,
01:29:12.400 and you understand television better than anybody,
01:29:14.740 this thing is on in 54 countries around the world.
01:29:17.320 I can go anywhere and call up the leadership of any place
01:29:20.340 and have a meeting with them
01:29:21.760 because I'm just an ambassador of the American dream.
01:29:24.460 That's pretty interesting.
01:29:26.640 Well, it's a great thing, the American dream.
01:29:28.360 If freedom and opportunity is what the American dream is,
01:29:30.880 then those are things worth dying for, I think.
01:29:34.480 That's what people are doing.
01:29:35.480 So the question is,
01:29:37.280 is AI going to bring us freedom and opportunity?
01:29:41.580 Because AI is the big bet
01:29:44.120 that the entire American economy is making right now.
01:29:47.020 and I want it to be a good bet more than anything,
01:29:51.420 but I wonder if it's going to bring us freedom
01:29:54.360 and opportunity.
01:29:55.080 So let's just start with opportunity.
01:29:56.340 The concern that people have is that their jobs
01:30:00.760 and not just their jobs,
01:30:02.220 but their purpose for being alive
01:30:03.960 will be replaced by machines.
01:30:07.180 To which you say, that's not true.
01:30:09.300 Great things will happen.
01:30:10.520 To which I said, okay, what are those great things
01:30:12.980 to which you talked about Shark Tank?
01:30:14.740 So I just want to be a little more specific
01:30:16.760 about what the upside of AI is because then everyone will calm down and feel happy about
01:30:21.060 your data center. What is the upside? Well, let's start in medicine.
01:30:28.000 Let's use specific cases everybody can understand. I was in Miami two weeks ago
01:30:33.860 and went for a full body scan, the price of which has dropped 80% in the last three years.
01:30:40.400 So in an hour and a half, this machine went right through my body.
01:30:44.180 Two years ago, it would have taken three weeks to get the results, looking for cysts on your liver
01:30:52.780 or whatever it is. With AI, with the agent they had there right at the scan facility,
01:31:00.000 um, 18 minutes later it delivered me a report. It told me I had an infection in my sinus, that I had a
01:31:09.240 cyst on my kidney that hasn't grown since the last body scan four years ago.
01:31:15.020 That was stunning and remarkable. Yeah, it's great. It is great. So that's one use case
01:31:21.720 of something very, very important. And that's just medicine. And so if you think about
01:31:28.260 all the things that this can do, including the arts and in music and film and everything,
01:31:34.800 I think the use cases are going to be created by humans who find ways to use the tool.
01:31:41.180 I view it, and maybe you don't agree, as simply a tool.
01:31:45.340 Another example, because it's great to tell people just examples.
01:31:48.820 I've been a photographer my whole life, including a period when I was a professional commercial photographer and a cinematographer for the networks.
01:31:56.240 I've amassed 590,000 images during my career, half of which are on film.
01:32:03.760 Not shot with a phone where you get all the data on it and it knows where it was shot,
01:32:09.800 when it was shot and everything else, which is easy to index and look for. So call it
01:32:14.300 quarter of a million negatives. How could I possibly catalog that as a human? I couldn't.
01:32:23.820 I don't have enough time left before I'm dead. So I used AI last week with a tool to actually,
01:32:31.000 after scanning all the images to go look at them for two weeks. And now I'm able to just go to this
01:32:39.180 computer and say, find a picture of my wife the day I met her in the gym. Boom. There it is.
01:32:48.940 It's amazing. No, you're kind of making my point though, because the two examples you gave
01:32:52.680 are examples of a machine replacing human labor and doing a better job than any person could do.
01:32:57.960 Oh, Tucker, that's horrible.
01:32:59.260 Which is great.
01:33:00.280 Well, I mean, that's my point.
01:33:02.100 Well, I know it's your point.
01:33:03.420 I know it's your point, and I can see the marvel of it.
01:33:06.360 But it raises the question, which won't go away, which is, okay,
01:33:09.840 now that machines do what people do better than people did, what do people do?
01:33:13.860 Okay.
01:33:14.220 And what is the answer to that?
01:33:15.540 Let me ask you, let me pose a question against that.
01:33:19.540 Would you prefer?
01:33:22.060 I'm giving you one of two options with the uncertainty of both, okay?
01:33:25.220 But I want you to pick one.
01:33:26.260 This will tell me a lot about how you think. Would you prefer, in the next five years,
01:33:33.520 that China have more compute power than we do, even though the use and the outcome and
01:33:40.200 jobs and everything else we're talking about is unknown? Or would you prefer the United States of
01:33:44.780 America and their allies in North America had more compute power to develop AI, which depends-
01:33:52.660 Whichever makes the United States happier, freer, and more prosperous, I guess, is the answer to every question.
01:33:57.460 Do you know what the answer to that is? What do you think it is?
01:33:59.680 I'm not sure that we know.
01:34:00.820 I know that reducing this question to a competition between the United States and China is a very quick way to derail people from the core questions, which revolve around whether or not it's a good idea.
01:34:13.380 So that's always kind of a red flag for me.
01:34:16.080 China has more economic power now than the United States, probably has more military power.
01:34:19.880 I didn't want either of those things to happen.
01:34:21.920 I'm a little bitter at the American business community, which allowed them to happen, but
01:34:25.260 they happened.
01:34:25.980 So here we are.
01:34:27.560 There's not much I can do about it.
01:34:29.260 My focus remains, however, on is it good for the United States?
01:34:33.520 Is it good for us?
01:34:34.200 Does it make people happy?
01:34:35.720 And I don't understand what people are going to do for a living if machines are going to
01:34:42.640 be able to do most things that people currently do better than people do them.
01:34:46.100 Like, what is the answer?
01:34:47.120 What are we- And a lot of people are honest about this stuff and say, nothing. The oligarchs
01:34:53.040 are gonna send out checks to everybody, the bread and circuses, and everyone's just
01:34:57.620 gonna kind of stay home and be obedient. But I don't know that that is a good idea. And I don't
01:35:03.860 know. I think you're raising a great point. It's a wonderful debate, and it's certainly going
01:35:07.620 on. But it's analogous to the narrative that must have been happening back when the Model T Ford
01:35:12.540 rolled off the assembly line and everybody that had a buggy looked at it and said, oh, no,
01:35:17.120 this is going to wipe me out and there'll be no work for me, that the American economy is very
01:35:21.500 resilient as it develops new sectors. What happened after the Model T was
01:35:25.480 rolled out? We had two big events in the, say, 40 years after that. One was the First World War
01:35:31.300 and one was the Second World War. The Industrial Revolution, like all technological change,
01:35:36.040 caused dramatic political and social change, which resulted in the biggest bloodletting in
01:35:41.380 human history. So I'm like, it's worth thinking this stuff through. Like, technological change
01:35:47.120 forces- It's fair enough. But unfortunately, capitalism and the freedom it provides,
01:35:54.720 uh, is volatile, and it has lots of periods, including the Great Depression, where it taxes
01:36:02.340 the people that support it. But it's still better than the alternative anywhere on earth, it seems,
01:36:08.160 200 years later. And so, you know, when you start looking at the American economy and the volatility
01:36:13.540 it's had, it's been unbelievable. But the system, with all its faults, and you're raising some good
01:36:19.080 points, I agree with you. I'm just trying to think what's the alternative. And regarding AI and the
01:36:24.740 ultimate issue around your job issue, and I think it's fair, I'm looking at it differently. I'm
01:36:32.240 looking at it as insurance. I would prefer, even though the future is uncertain and the outcome,
01:36:39.780 you know, and the end is always near, to quote Jim Morrison, and I love that line in his song
01:36:44.840 Roadhouse Blues. Yeah, yeah. "The future's uncertain and the end is always near." Um, you know,
01:36:52.800 words I live by. But it's sort of, if you had to give me the two alternatives, let China do this
01:36:58.440 first or us. I'll take us. And I'm part of that now. And in the controversy of it, we started
01:37:04.660 our conversation. I've never seen a controversy like this. I've never seen an attack like this.
01:37:10.820 I know you have, but I haven't experienced it. And I'm hiring. Well, I mean, if you take money
01:37:16.860 from taxpayers, you should expect them to weigh in, too, and take their complaints seriously and
01:37:22.640 not just dismiss them. And you're right. But this has been going on. Incentives state by state
01:37:28.020 in every sector happen, even real estate.
01:37:30.040 Not incentives, it's a forced transfer of wealth
01:37:33.040 from taxpayers to people who are richer than taxpayers.
01:37:35.620 So like, of course they have-
01:37:38.200 You get a tax break to create 2,000 jobs
01:37:40.460 and the taxpayer wouldn't do it
01:37:42.920 if it wasn't for the long-term outcome.
01:37:44.880 The taxpayer has no choice.
01:37:46.220 The taxpayer has no choice.
01:37:48.160 They can say no, that's not what happened.
01:37:50.520 Well, they tried to say no to this data center
01:37:52.340 and they got laughed at.
01:37:53.480 And then the governor, Spencer Cox of Utah,
01:37:55.860 who really is a kind of living symbol of our ruling class, um, said that the state has an
01:38:02.880 obligation to do this, and basically, as you were doing, waved away concerns as, like, foreign propaganda,
01:38:10.860 when I think there are probably non-crazy people who live in Utah who are like, why is this good for me?
01:38:15.760 Why should I pay for this? It's fair, it's fair. All this is fair. How about we forget the data center,
01:38:20.860 just talk about the power for a second? Because there aren't going to be any data centers in
01:38:24.100 America unless we build power generation. Do you have the same visceral reaction to power generation
01:38:29.660 as you do to data centers? Uh, inversely. I think we need more electricity in the United States, and I
01:38:36.820 think that's what I'm talking about. Speaking of agents of China, in fact, if you're looking for
01:38:40.320 agents of China, it's the people who told you that you could live without fossil fuels, that babies
01:38:45.360 wouldn't die without fossil fuels. Babies will die without fossil fuels. Civilization will collapse
01:38:49.980 without fossil fuel. And that right there is the foreign op telling us that, you know, people are
01:38:56.440 causing global climate change. People are not causing global climate change substantially.
01:39:00.960 We're having global climate change as we always have, but to blame it on natural gas is just a
01:39:07.560 flat out lie. And a lot of rich Americans fell for that. So no, I'm, I'm strongly for energy.
01:39:13.080 I'm just very concerned about replacing people, their purpose for living, which is to create, with machines.
01:39:21.760 That seems like hell to me, literally hell.
01:39:24.760 So I think it's fair to ask, like, how is this not hell?
01:39:28.260 And no one has been able to answer it.
01:39:30.360 Yeah, I think trying to stop advancement of technology with the uncertainty it brings has always been a dilemma in capitalism and in America.
01:39:40.860 And that's a fair comment.
01:39:43.520 How is this capitalism?
01:39:44.940 If taxpayers are paying for some of it, how is that capitalism?
01:39:48.160 I think that's no different than building an apartment building downtown in Austin.
01:39:53.040 You can get incentives for that, too.
01:39:55.000 And that's just part of how it works.
01:39:57.000 You can debate that.
01:39:58.200 How is that capitalism?
01:39:59.440 How is that free market?
01:40:00.540 If politicians are handing you other people's money to do it, how is that capitalism?
01:40:06.300 Well, capitalism is built into the Constitution, which I think was genius,
01:40:10.280 because I couldn't have even thought that far ahead: the idea of competition between states.
01:40:16.340 Why is there an exodus out of New York City to Miami, where I live? Why are people leaving New
01:40:22.540 Jersey and Massachusetts and moving into my building? Nobody in my building is from Florida.
01:40:28.100 They're all from other states. And all of these people moving, why? Because they're reaching
01:40:33.960 retirement age, and the environment in these other states is not conducive to their lifestyle
01:40:41.180 when they're- But I think you're making my point for me. You just listed states that use a higher
01:40:46.900 percentage of taxpayer dollars for private industry, and people are leaving those states.
01:40:51.920 Illinois. Wait a second. New York. New York is inefficiently run. It may be inefficient, but the
01:40:58.400 principle that politicians can take the public's money and hand it to their friends in
01:41:03.640 business. That's a longstanding practice in New York, California, and Illinois, and relatively
01:41:09.540 uncommon in Florida. So people are moving to the free market state, but you're now saying that
01:41:15.120 actually the beauty of capitalism is that politicians hand other people's money to their
01:41:20.520 friends? That doesn't... No, I'm not saying that. That is incorrect. I didn't say that. I said
01:41:27.100 what makes capitalism work is competition, and it does work, and it has worked for hundreds of
01:41:33.260 years. And so the concept, you may not like the policy of giving incentives for manufacturers
01:41:38.780 and real estate. You may not like it. And I understand you don't, but you don't have to do
01:41:44.080 it. A state can stop doing it. And then for a while, maybe they'll be less competitive in terms
01:41:49.400 of getting new projects. I didn't make those rules up. I'm a football player on the field
01:41:55.000 of capitalism. I have to play by the rules. I get it. No, I get it. And I'm not attacking you for it.
01:41:59.860 I'm just wondering, again, occasionally it's worth pausing and asking, are we doing the
01:42:04.240 right thing, even if we've been doing it the same way for a long time?
01:42:07.560 Do you, I'll ask just one final question, I'll stop torturing you, but do you see how
01:42:11.980 other people find it unfair that one of the richest people in America, you, in league
01:42:19.640 with the biggest companies in America, Amazon, Microsoft, etc., the ones you listed, that
01:42:26.100 the richest people in our society would be taking money from people much poorer than
01:42:31.380 them, that seems very unfair.
01:42:36.300 You know, Tucker, I'll speak for myself on this.
01:42:41.720 I'm not motivated by money anymore.
01:42:44.720 That is not why I get up and work 18 hours a day.
01:42:50.040 I want to build something incredible.
01:42:52.280 I want my legacy from my children to be something that they're honored by.
01:42:57.180 I want to compete against the Chinese.
01:42:59.840 I don't like the Chinese government, in case anybody has noticed that over the last five
01:43:04.380 years.
01:43:04.980 I don't like how they treat their people.
01:43:06.900 I don't like how they treat me in business.
01:43:09.100 I simply don't like them.
01:43:11.560 What's motivating me now with all of this barrage, this firestorm, which I understand
01:43:18.460 and you've experienced in the past, is I want to beat them in AI. I don't want them controlling
01:43:25.280 the most advanced models, wherever that's taking us, because you bring up great points
01:43:29.880 about the unknown in jobs and everything else. But I want to be part of the team
01:43:34.860 that makes sure our way of life in North America remains the same for my children.
01:43:41.840 I don't give a shit about money anymore. I don't need any more money. I've already been very
01:43:46.440 successful. I'm very lucky, but I'm not finished. I'm very motivated to win. And if you, if people
01:43:53.620 want to compete with me, I want that. If they don't think my motivation is to beat the Chinese,
01:43:59.080 they're wrong. I am going to beat them. I am going to show them these data centers. They're
01:44:04.500 going to be this shining example of how you do this sustainably because I'm the only guy that
01:44:09.600 graduated out of environmental studies and builds data centers. No one else on earth has
01:44:14.920 done that. I love the idea. I'm for America above all. Do you think that Microsoft and Amazon and
01:44:24.120 Google will do a good job preserving the American way of life? Are they substantially superior
01:44:32.000 morally to the Chinese government? Do you think that those are your times? Yes, I do. And in fact,
01:44:36.500 I deal with that. Tell me how. Tell me how is Google. I'll tell you why. I was talking with
01:44:41.140 the guy yesterday that does, um, is doing, is doing the build outs all around the world for
01:44:47.240 Google's data centers. They are on an incredible sustainability mission. They are looking at every
01:44:54.120 technology to reduce their, um, need for traditional power sources. And, you know,
01:45:02.060 we were talking about the, the, the campus in Utah. They have some very advanced thinking on
01:45:09.900 the use of battery, wind, and solar. And so they want to do that. Why? Because their customers
01:45:16.900 want that. Well, wait a second. China leads the United States in battery, wind, and solar
01:45:21.500 in development of those technologies. Yeah, yeah. It's a great story, but not in data centers,
01:45:29.100 they don't. They're powering. No, but hold on. Jumping around a little bit. I'm just trying to
01:45:33.820 ask, okay, so you say you want to preserve the American way of life. That's why you're building
01:45:37.460 this. My question was, okay, well, your tenants are these big companies that are not really
01:45:42.100 American companies or publicly traded companies that are owned by the sovereign wealth funds of
01:45:46.220 tons of countries around the world. In what sense are they American companies? In what sense do they
01:45:52.620 have American values? Because they operate under the laws and are headquartered in the United
01:45:58.740 States of America under the regulations. The only reason that Google can raise as much capital as
01:46:03.920 it does from the sovereign wealth. Now you're coming into my space. I'm an indexer for sovereign
01:46:08.380 wealth. The only reason 52 cents of every dollar comes to America is the transparency of our laws
01:46:17.040 and the appellate system and our banking system and the regulator. And so that's why you find the
01:46:24.640 largest tech companies on earth housed in America: because the regulatory environment,
01:46:30.880 in the context of the globe, is the most advanced on earth. So you may
01:46:36.840 hate the SEC. You may not like, you know, regulations, but in fact, it gives an infrastructure
01:46:42.020 where most capital comes here. That's what happens because the outcomes, why does France have no
01:46:47.520 Microsoft? Why does Germany not have one? What, what happened to, you know, the European countries
01:46:53.860 and Britain, you know, they don't have any of this tech because they regulated it out of their
01:46:59.680 society. We don't want to change that. We want the innovation here, not the regulation necessarily.
01:47:07.620 I strongly agree, except the extent the regulation protects the public.
01:47:11.680 It's a balance. That's why we trust the government. A lot of people don't like them.
01:47:15.560 We say, look, at the end of the day, you know, the great example that we could use that you both
01:47:19.360 know, I don't think crypto is going to work until they pass the bill about crypto. They got to
01:47:24.480 finish this infrastructure act so that finally crypto can find its place inside of the 11 sectors
01:47:32.160 of our economy. No one's gonna- I'm not investing in crypto until I get that
01:47:36.400 friggin' thing through the House. May I ask, okay, so just to your second point about the American
01:47:41.520 dream. There was opportunity, I think, you know, the ability to be judged on your actions and your
01:47:48.120 behavior, on your character, to be treated equally. And there was also freedom. And so I think many
01:47:55.460 people have deep concerns about AI's effect on freedom. You said yourself, the purpose of AI is
01:48:01.820 in part surveillance. Yeah. What is the point? Right. So what is that? What does that look like?
01:48:10.540 I asked, what jobs does it create? Couldn't get an answer. Now I'm going to ask you a more specific
01:48:13.940 question. Well, you can't get an answer from me, because nobody knows what jobs it's going to
01:48:18.000 create yet. Thank you for saying that. Well, of course, but I've always seen every technology
01:48:23.100 that has been loathed as it's come in always creates new jobs. And I assume the same for
01:48:28.500 this tool. It's just a tool. So, you know, the surveillance is a great point. We have laws
01:48:32.980 regarding surveillance and we have laws that restrict the government and they're built into
01:48:36.520 the constitution and we have free speech. All of these things remain the same. They don't change
01:48:43.280 with AI. Well, they did until last week when the president led the charge to change the law
01:48:50.420 and allow warrantless spying on Americans. Yeah. Okay. Well, wait a second. In direct
01:48:57.200 violation of the U.S. Constitution. So obviously the Constitution doesn't matter to the people in
01:49:01.060 charge. No, it does. Every administration is kept in check by the Supreme Court, including this one.
01:49:06.780 And so that's why the founding fathers created this structure. And I think it works. And I
01:49:12.380 trust it for one. And I'm willing to, you know, have my kids who are Americans grow up here and
01:49:17.140 I want to protect them. Okay. So what kind of surveillance, I mean, it seems like AI will make
01:49:22.320 it possible to know what every American is doing at all times. Correct me if I'm wrong.
01:49:27.920 I would say that's not AI that's going to do that. I would say now with the amount of high
01:49:34.020 resolution video and surveillance that is built into every street corner, that's how they solve
01:49:40.780 a lot of crimes now. If you don't like- Do they solve a lot of crimes? What's the closure rate for
01:49:47.120 murders in our big cities? Like, what percentage of murders in New York City or Chicago or
01:49:51.960 Detroit are solved? I don't, I don't track them. Pretty low, actually. It's lower, uh, in this
01:49:56.940 country. The point is, surveillance is a great issue, but the enhancement of surveillance with AI tools
01:50:02.200 is how you now walk through the Dubai airport, and you also walk through some airports here in the U.S.
01:50:09.940 Facial recognition technology has finally got to the resolution using AI's tools.
01:50:15.260 It's going to run through your data center.
01:50:16.960 There'll be pictures of all of us in your data center.
01:50:18.740 Every data center.
01:50:20.480 Right.
01:50:20.840 But how does that make it freer?
01:50:23.560 You don't have to opt into that program, by the way.
01:50:26.500 I do it for convenience.
01:50:28.160 I'm not hiding anything.
01:50:29.220 Well, it's on street corners, as you said.
01:50:31.380 So anytime you're in public, your face is being surveilled by the U.S. government.
01:50:35.440 It is, but I'm not doing nefarious things.
01:50:39.260 I'm not a criminal and I'm not stealing any money from anybody.
01:50:41.960 So the definition of nefarious changes, as you know, I mean, hundreds of thousands of
01:50:46.340 American citizens were sent to concentration camps by Franklin Roosevelt because they were
01:50:49.720 Japanese.
01:50:50.560 Correct.
01:50:51.320 So in 1940, it was not a crime to be Japanese.
01:50:54.380 In 1941, it was.
01:50:55.620 So the definition does change, as you know.
01:50:58.960 It's true.
01:50:59.720 So you could have those concerns.
01:51:01.620 But I ask you again, Tucker.
01:51:03.140 I do.
01:51:04.460 And I know you do.
01:51:05.940 But I ask you again, and you haven't answered my question.
01:51:08.300 Do you want the insurance policy or not? With all of the nefarious concerns about AI,
01:51:18.100 which outcome do you prefer as an American for your family? Would you prefer all of us that are
01:51:26.440 developing these data centers put down our shovels and stop while the Chinese accelerate theirs?
01:51:34.060 Would you like that? Well, I see a kind of a different question.
01:51:37.380 I see the question, I want to be, let me answer your question in the following way.
01:51:42.600 Do I want to become like China in order that we can, quote, beat China?
01:51:47.400 Not at all.
01:51:48.420 The problem with China, from my perspective, is that it surveils its citizens and it limits
01:51:53.500 their ability to say what they think and to oppose existing power.
01:51:56.760 That's why China is bad, I thought, because there are a lot of things about China that
01:52:00.080 are great.
01:52:00.580 But what I don't like about China is the totalitarian approach it has towards its own
01:52:05.020 population.
01:52:05.840 I agree with you.
01:52:06.800 You're making it possible for our government
01:52:09.300 to have the same approach,
01:52:10.160 and our government already does.
01:52:11.660 They're surveilling us at all times.
01:52:12.920 Most Americans don't even know they're being watched.
01:52:14.960 There are license plate readers,
01:52:16.420 there are facial recognition cameras everywhere.
01:52:19.340 And the question is why?
01:52:20.800 So the crime rate can go down?
01:52:21.880 And yet it hasn't gone down.
01:52:23.340 So, of course, the real reason
01:52:24.620 is to make you obedient, obviously.
01:52:26.400 No, I have a different vision of the future.
01:52:28.940 Why do we still have crime?
01:52:30.760 Now, hear me out.
01:52:32.120 Here's what I think is going to happen,
01:52:33.680 because I'm enjoying this conversation.
01:52:34.960 Let me, as I know, we come to a close soon, but here's where I think we're going to be
01:52:39.680 in 20 years.
01:52:40.400 Okay.
01:52:41.420 I believe in 20 years, and this is my vision of what's going to occur.
01:52:45.120 And I think I'm a great guy to actually do this because I have a foot in both economies.
01:52:49.180 I think, rather than the 20th year, I'm going to make it in 10 years.
01:52:53.300 It may not be with this administration or the administration in Canada.
01:52:56.320 The two economies are going to merge for one reason.
01:53:00.280 Which two economies?
01:53:02.220 Canada has all the raw resources; we're the largest consumer market.
01:53:06.580 I agree.
01:53:07.460 And it's crazy that we're fighting on this stuff.
01:53:11.600 It's crazy.
01:53:12.040 I agree.
01:53:12.380 It makes no sense.
01:53:12.940 Totally agree.
01:53:13.920 And so we should merge these economies.
01:53:17.520 You don't have to merge the countries.
01:53:19.580 You merge the economies to get an EU-type situation where you allow freedom of movement across the border,
01:53:25.720 and you allow no tariffs on all the stuff we need, the rare earth Canada has,
01:53:30.680 the water, the paper, all that stuff. And the only reason we're going to do that is so we can
01:53:35.120 take 5% of our GDP, which will be the largest on earth by around 25%. No one will be close to us.
01:53:42.600 We merge those two together, and the benefits that accrue, for one reason: to tell the Chinese,
01:53:49.040 don't fuck with us. That's basically what's going to happen, because they want to fuck with us big
01:53:56.800 time. And that's my belief. And I think I'm right. I think that's what's going to happen.
01:54:02.220 And AI and compute and all this stuff, that's a side story. We have to keep all of our tech
01:54:08.940 ahead of theirs, including data centers and everything else. But ultimately, they are
01:54:14.680 our adversary on our way of life. Then can I ask: our way of life? Right. So you,
01:54:20.300 I would think, since you disagree with the Chinese way of life, which is pretty civilized in a lot
01:54:24.340 of ways, but the one way in which it's barbaric is that it grants its citizens no real rights,
01:54:29.160 and so I would think that you would be very worried about aping their system, which we are
01:54:34.880 now doing. Like, they have total surveillance. It's a panopticon in China. We should create our own?
01:54:39.820 Let's take 40,000 acres in Utah and make it possible for the government to know where you
01:54:43.260 are at all times and what you're thinking, listening to you on your phone. Like, we have that.
01:54:46.700 But you have to, you have to choose the lesser of two evils in your scenario, and I'm telling you,
01:54:53.380 what you should do is say, I want Kevin O'Leary to succeed. I want him to beat the Chinese in
01:54:59.920 compute power. And then use the laws of the United States to make sure that you keep that compute
01:55:04.860 power in check, whichever way you want. But to not have it available, to put down my shovel,
01:55:10.620 I don't think people want me to do that. Even the people in Box Elder, the majority of them want me
01:55:15.180 to hold my shovel and start digging. And that's basically the debate we're having.
01:55:19.620 How do you know that, that the majority of people—
01:55:22.640 Because they voted for it unanimously before the Chinese guys.
01:55:25.680 There was a referendum?
01:55:27.160 The people in dispatch.
01:55:27.820 Like all this crap that's being spewed out in the last four years.
01:55:29.980 Wait, wait, hold on.
01:55:31.160 Yeah, and I may have fallen for some of it, so you correct me.
01:55:34.160 There was a referendum among citizens, or did some, like, county board vote?
01:55:39.480 We actually went through the whole process that you have to do by their laws and were granted three to zero.
01:55:48.580 The commissioners of the county said, we want to be part of this.
01:55:51.820 Oh, so three people voted.
01:55:53.040 The people of the county voted.
01:55:54.540 They elected officials.
01:55:55.740 That's how you do it.
01:55:56.800 How hard is it for Kevin O'Leary and Amazon and Microsoft and Google to subvert three
01:56:05.060 county commissioners in rural Utah?
01:56:07.200 They asked us to come.
01:56:10.800 They asked us to bring $15 billion.
01:56:13.620 They asked, and they voted, and it was a three to zero vote.
01:56:18.260 That's how it happens.
01:56:19.620 Amazon, Google, and Kevin O'Leary got three county commissioners in rural Utah on their side.
01:56:24.980 Good work.
01:56:26.000 But can I just ask you really quick, like, why don't you have a referendum?
01:56:28.680 Why don't you let all citizens, all taxpayers, the ones who are paying for your project, why don't they get a say, or a vote?
01:56:36.640 Well, listen, the whole idea of expediting it was so the project didn't go to Jackson, Mississippi.
01:56:43.480 That's the competition.
01:56:44.560 And so the government there said, okay, look, we've done, because we haven't lost, we haven't lost our, we have to go through EPA, we have to go through air quality, we have to go through the water permitting, we have to go through the land construction permitting.
01:56:59.260 Nothing's been lost.
01:57:01.160 There's no, there's no circumventing the process.
01:57:06.700 And that's what we're doing now.
01:57:08.380 So if you want to block the water permit or whatever, go ahead.
01:57:12.000 I mean, if you think that's a good idea, if there's a path to do that, we're going to do it the way we've been told by the law.
01:57:18.540 By the way, I got to go soon, but I would love to keep going.
01:57:22.220 Kevin O'Leary, I sure appreciate it.
01:57:24.140 And I hope that when my personal data comes through your data center, you will filter it.
01:57:30.980 Tucker, I think, when people ask me what it was like, I'm going to say, he told me to get my shovel out and keep digging.
01:57:38.780 That's it.
01:57:39.160 I very much did not say that.
01:57:40.760 Not digging for a data center.
01:57:42.460 I don't know.
01:57:43.120 Build like an antibiotics factory.
01:57:44.760 That would be, I'd be for that.
01:57:46.480 Thank you very much.
01:57:47.640 Take care, my friend.
01:57:48.500 You know I'm a huge fan.
01:57:50.540 I really am.
01:57:51.600 I appreciate the-
01:57:52.340 I love the shit you take and the shit you give.
01:57:55.100 I really do.
01:57:56.120 All right.
01:57:56.520 It doesn't bother me at all.
01:57:57.540 Thanks a million.
01:57:57.960 Thank you.
01:57:58.420 Take care.
01:57:58.860 Ciao.
01:57:59.160 Bye-bye.