The Tucker Carlson Show - January 23, 2025


Chamath Palihapitiya: Zuckerberg, Rogan, Musk, and the Incoming “Golden Age” Under Trump


Episode Stats

Length: 2 hours and 9 minutes
Words per Minute: 169.3
Word Count: 21,847
Sentence Count: 1,755
Misogynist Sentences: 23
Hate Speech Sentences: 20


Summary

On this episode of The Tucker Carlson Show, host Tucker Carlson interviews Chamath Palihapitiya. Chamath grew up as a refugee in Canada, became the youngest vice president at AOL, joined Facebook as one of its early executives, and went on to build an investment business and buy a stake in the Golden State Warriors. Chamath and Tucker discuss his rise, the billions he lost after 2022, and the personal reckoning that followed.


Transcript

00:00:00.000 With TD Direct Investing, new and existing clients could get 1% cash back.
00:00:08.360 Great! That's 1% closer to being part of the 1%?
00:00:13.660 Maybe, but definitely 100% closer to getting 1% cash back with TD Direct Investing.
00:00:20.820 Conditions apply. Offer ends January 31st, 2025. Visit td.com slash dioffer to learn more.
00:00:30.000 2020 was an incredibly prolific period for me. I'd wake up out of bed and I was doing deals and it was like I had the world in the palm of my hand, it felt like.
00:00:45.680 I was moving markets every time I communicated publicly. That was incredibly dizzying and it had the exact opposite effect on me that it should have.
00:00:56.000 What I should have done is take a step back and say, hold on, this has nothing to do with me. What is this moment?
00:01:03.900 And the moment would have been, we're at the tail end of zero rates. We had trillions of dollars that the government had basically given to individuals.
00:01:12.220 We had an enormous M2 money supply. And instead, I thought it was me.
00:01:17.180 And then in, you know, 2022, when the war in Ukraine started, the bottom fell out financially, in Silicon Valley, in frankly a lot of things that I was working on.
00:01:29.780 It was such a wake up call and it was the biggest blessing of my life.
00:01:33.580 You know, I never thought I would be in a position to have made that much money.
00:01:37.460 In hindsight, I've never been more blessed than to torch, you know, three or four billion dollars.
00:01:42.520 Welcome to the Tucker Carlson Show. We bring you stories that have not been showcased anywhere else.
00:01:59.620 And they're not censored, of course, because we're not gatekeepers.
00:02:02.640 We are honest brokers here to tell you what we think you need to know and do it honestly.
00:02:07.900 Check out all of our content at TuckerCarlson.com. Here's the episode.
00:02:11.580 Do you think, I mean, there's, you know, we're still, it's still ongoing, the war in Ukraine, but it had, you said, an immediate effect on markets.
00:02:20.900 It was, it was like a pivot point.
00:02:23.120 You could point, you could put it on a map, March of 2022, I'll never forget it.
00:02:27.640 So that war, in my opinion, was, you know, welcomed by many in the West.
00:02:35.220 And I wonder if, whatever they said, it was clear they were for it.
00:02:41.020 Do you think that those two things are connected?
00:02:45.980 It's not clear to me how connected they are, but the first part of what you said I do agree with, which is that we have silently allowed this insidious war machine to take over large parts of the government.
00:02:57.120 Yes.
00:02:58.140 And what's so interesting about this is that it actually, I would have said, was more riddled inside the Republican Party.
00:03:06.020 Yes.
00:03:06.400 But it turns out over these last few years, especially since this MAGA takeover, this hostile takeover that Donald Trump effected, which I think is enormously important in historical context.
00:03:16.760 But it's, it's more within the Democratic Party now.
00:03:22.040 Yeah.
00:03:22.220 There are these neocon warmongers that want to connect the dots between people suffering and their own economic opportunity.
00:03:31.780 And I think that that's very scary.
00:03:33.860 And I think that whenever you see that, you have to push against that.
00:03:37.540 Humans should not be at war.
00:03:40.580 We should not be fighting and killing each other.
00:03:42.780 It's just a simple foundational moral principle that I think we all have to live by.
00:03:48.980 You just cannot go there.
00:03:51.560 And you have to push back on every avenue of people that try to take you there.
00:03:57.060 I couldn't agree more.
00:03:58.860 Sorry to derail what you were saying because it was very interesting.
00:04:01.940 So 2020, everything you touch seems to turn a profit.
00:04:08.200 Two years later, everything changes.
00:04:10.160 Why is that good for you?
00:04:11.600 Losing all the money?
00:04:12.780 Or all that money?
00:04:14.820 Because I had to take a step back and actually figure out how much of this was actually me and my preparation and my process or my dark passenger.
00:04:27.480 And here's what I mean by this because I thought about this a lot.
00:04:29.760 I think we all have a dark passenger.
00:04:31.460 So when you are born, you're kind of like this body that has the capability to do anything.
00:04:40.940 I mean, you know, we talk about this, but we don't say it enough.
00:04:43.660 But the genetic diversity of all humanity is minuscule.
00:04:47.660 So I think I interpret that as the capability of all humanity is pretty incredible.
00:04:54.480 But you have this huge distribution of outcomes.
00:04:56.860 And part of that are the things that happen to you as you're growing up, right?
00:05:01.780 Your lived experience, right?
00:05:03.500 It's the nurture part, not the nature necessarily.
00:05:06.040 And nurture gives some people a very dark passenger, right?
00:05:12.880 Some of them will then commit crimes.
00:05:14.580 Some of them will become murderers.
00:05:15.740 Some of them will become drug addicts.
00:05:17.040 Some of them, you know, will have this litany of things happen to them.
00:05:21.320 In my case, my lived experience gave me this thing where I have always battled this insecurity
00:05:26.540 that I've just felt I'm basically worthless, you know?
00:05:30.780 You're a kid, you come here, you know, you don't really fit in.
00:05:36.640 You know, you try to kind of make a social life, and everything else that you're supposed to do
00:05:44.420 doesn't really work, right?
00:05:46.540 And so that creates a chip on my shoulder.
00:05:48.500 And it was this sort of thing where I always felt I was on the outside looking in.
00:05:53.240 And then when I, you know, worked at Facebook and then left, and all of a sudden in my early 30s,
00:05:58.720 I had, you know, more success and more money, to be honest, than I ever thought I would have.
00:06:06.500 I spent a lot of time, a decade basically, feeding that insecurity, buying things, accumulating things.
00:06:14.200 And to be honest with you, if I'm being really honest with myself,
00:06:16.220 look, I built a really successful investment business by all numerical accounts.
00:06:20.060 But I would say for me, it was a lost decade.
00:06:22.980 I didn't do anything.
00:06:25.200 And I got to a point in 2022 where all of that stuff, so much of it,
00:06:32.420 had to be stripped away.
00:06:33.720 And I had to look at what was left and rebuild from fundamentals.
00:06:39.000 I had an incredible wife.
00:06:41.460 What a blessing.
00:06:42.840 That's for sure.
00:06:43.900 I had incredible kids.
00:06:45.500 What a blessing.
00:06:46.480 Five.
00:06:47.120 Five incredible kids.
00:06:49.380 That's amazing.
00:06:50.620 It's, no, Tucker, it's amazing.
00:06:53.200 Having kids is amazing.
00:06:55.280 I had incredible friends.
00:06:57.460 I have a Thursday night poker game that I will, frankly, no matter where I am in the world,
00:07:02.100 I will go out of my way, planes, trains, and automobiles to fly back for because of what
00:07:07.460 that game gives me with my friends.
00:07:09.200 So I have these little things.
00:07:10.660 Wait, where's the game?
00:07:11.880 In my house.
00:07:12.780 Really?
00:07:13.320 Yeah.
00:07:13.480 And it's great, too, because, by the way, like, you know, poker.
00:07:15.380 How many players?
00:07:16.140 Well, so poker, so in Silicon Valley, there's like a group of us, like some well-known folks
00:07:20.420 that we all get together.
00:07:22.760 And my wife was the one that did this.
00:07:24.920 When she first looked at the game, you'd see all these people, and it was really interesting.
00:07:30.120 But, you know, a lot of my friends have a touch of the tism, and so what happens is,
00:07:33.900 you know, you just end up, like, looking down at your cards for eight hours straight.
00:07:37.480 Nothing was happening.
00:07:38.220 And she said, guys, this is ridiculous.
00:07:40.300 You can play for a few hours, but at 7 p.m., we're going to break.
00:07:43.480 We're going to sit around a table, and we're all going to talk.
00:07:46.820 And you must look at each other in the eyes.
00:07:48.820 And it was so funny.
00:07:50.640 But it's become a ritual.
00:07:52.940 And for all of us, we all feel seen.
00:07:55.120 And then slowly what happens is people start to talk about things that they would have never
00:07:58.420 talked about.
00:07:59.260 And what you see is this repetitive pattern.
00:08:02.520 People with this dark passenger, lots of insecurity.
00:08:06.500 They achieve a lot.
00:08:08.340 But all of that, a lot, is externally validated.
00:08:12.660 But not internally felt.
00:08:15.640 And so there's just this sense that there's an emptiness.
00:08:18.820 And people start to panic.
00:08:21.000 I thought that XYZ would solve the problem.
00:08:23.660 I thought the watches would solve the problem.
00:08:25.700 The boat, the plane, the clothes, the chains.
00:08:29.320 None of it solves it.
00:08:30.500 And everybody, like clockwork, all of my friends, go through it.
00:08:35.720 And so it's funny.
00:08:36.600 I'm sort of at the tail end of this.
00:08:38.540 But that poker game has been almost like this therapy session where we all get to talk.
00:08:42.720 And then as a result, people feel calmer.
00:08:45.320 They feel a little bit more seen about what's going on.
00:08:47.700 My point in telling you this is not sympathy.
00:08:50.260 It's just to state that everybody is going through this struggle.
00:08:54.160 So back to me, I was able to sort of put a finger on what my thing is.
00:08:58.880 What is that big sack that I've been trying to carry up a hill that is totally worthless
00:09:05.200 and not worth my time?
00:09:06.900 It's this idea that I am worthless and I'm not worth anybody's time.
00:09:11.260 And that just comes from the way that I was raised and the things that happened to me.
00:09:15.940 I don't want any sympathy for that except to say that's my thing.
00:09:18.920 But now that I know it, I can try to do things that are more productive in ways where I feel
00:09:24.380 real value.
00:09:25.980 And I think that's a very useful process because it reintroduces.
00:09:30.460 I think it can fix for so many people the thing that is so broken right now.
00:09:35.240 You know, we are completely de-spiritualized.
00:09:39.320 Nobody believes in a higher order, faith, God.
00:09:44.320 And so I think what happens is everybody has this Carl Jung moment.
00:09:49.940 All of this difficulty sits on top of them.
00:09:52.900 And at some point, they may never say it out loud.
00:09:54.740 They think, I am living in a tale told by an idiot, right?
00:09:58.380 That's that famous quote about why spirituality is important.
00:10:01.500 And when you feel that way and you don't have an answer, you start to feel angry and you
00:10:08.180 start to push back and you start to think, tear it all down.
00:10:10.780 None of this is working.
00:10:11.940 It's all BS.
00:10:12.740 So I want to try to solve it for myself.
00:10:17.460 And then as I live and as I just kind of do the things that I'm doing, start new businesses,
00:10:21.320 make new investments, I'm going to try to point this out.
00:10:25.160 Because I think by pointing it out, you have a chance for other people to start questioning
00:10:29.760 things.
00:10:30.960 Do I have a dark passenger?
00:10:32.580 What is it?
00:10:33.060 Why did you get so self-aware about all of this?
00:10:35.440 Most people, when they feel sad, sort of bumble forward and keep doing what they have
00:10:39.720 always been doing to no effect, what stopped you and made you think about what was happening?
00:10:45.700 You know, I am, it's still very much a source of anger for me.
00:10:52.180 Like this idea that I'm basically worthless makes me mad.
00:10:59.200 And I projected it for many years sort of on the people that I, you know, my parents
00:11:05.140 for the most part and all that dysfunction.
00:11:09.220 Because I think about it a lot.
00:11:11.600 And then when I make mistakes, so in that 2022 period, when I started to really write down
00:11:16.260 here are all the mistakes I made, one or two layers.
00:11:19.640 You actually wrote them down.
00:11:20.580 Wrote them down.
00:11:21.160 It's like, okay, I invested a couple hundred million dollars in this thing.
00:11:24.640 It went to zero.
00:11:25.880 What was I thinking?
00:11:28.980 And, you know, your mind is very clever.
00:11:32.540 At first you lie to yourself and you lie to yourself incredibly well.
00:11:36.620 Well, I underwrote it this way.
00:11:38.500 I thought the discounted cash flows, it's all BS.
00:11:43.420 One layer after that was, and then I communicated my thinking to hold myself accountable.
00:11:48.640 Also BS.
00:11:49.220 That communication got me attention and I liked the way it made me feel.
00:11:56.460 And then I said, hold on a second.
00:11:59.820 That feels true.
00:12:01.400 I hate that I can even say it, that it may be true, but that feels true.
00:12:06.060 And so then I go back and I start to think like, how many other decisions that I made in that period were rooted in that?
00:12:13.400 And when I saw four or five of them, I said, this is not what I'm supposed to be doing.
00:12:19.600 And then I went back and I said, how many other decisions in my life have I been making that were rooted in the idea of look at me?
00:12:27.260 And it turned out that there were a lot, the things I bought, the clothes I wore, the things I said, the way that I tried to live a life.
00:12:36.440 And I felt that that was not me.
00:12:39.780 I was ashamed of it.
00:12:42.260 Then I was angry about it.
00:12:43.820 And then I asked my wife, I need you to help me fix it.
00:12:47.080 And this is what I mean by like, you have to have a partner, I think, in crime that can really go through the ups and downs with you.
00:12:54.060 Damn, what did your wife say when you told her this?
00:12:55.920 She's like, I've been telling you this for years.
00:12:58.760 What every wife says.
00:12:59.940 But, you know, I was like, finally, I listened.
00:13:02.660 And, you know, she always makes this joke.
00:13:04.040 She's like, I could tell you the smartest thing in the world.
00:13:05.960 And she's like, you think you need a man to tell you.
00:13:07.600 And I said, well, this time this man was me.
00:13:09.580 And it's a joke that we.
00:13:10.460 No, it's true though.
00:13:11.980 And so, and my father-in-law.
00:13:14.800 And my father-in-law, when I finally like kind of got rid of all the anger that I had.
00:13:21.180 My father-in-law was a father that I wish I had had.
00:13:26.000 And what I mean by this is like, some people may not understand this, Tucker.
00:13:30.340 You may not.
00:13:30.960 You may or may not.
00:13:31.680 But it is extremely discomforting to feel unconditional love from the people around you.
00:13:41.660 If you have not felt it.
00:13:43.720 If you have always felt it, you don't know what it means when somebody says it.
00:13:47.760 Yes.
00:13:48.520 You're like, what does that mean?
00:13:49.560 I don't know.
00:13:51.240 I knew what it did not, what I thought love looked like.
00:13:57.280 And then when you have people that love you in this way, it's tilting.
00:14:01.960 And it made me angry.
00:14:04.880 Because I would go back to, what is this?
00:14:07.120 I would push back on this.
00:14:08.360 And I would go back to that and say, why didn't that look like this?
00:14:11.840 And I was just, I was in an endless loop of just being mad.
00:14:15.000 So when you experienced the way your wife's family loved each other, it made you mad about your childhood?
00:14:20.200 Yeah.
00:14:20.440 And at first I was mad, I was mad at her, you know, meaning subconsciously, like, because she's giving me something that I don't understand.
00:14:28.720 And I thought, there's got to be a catch.
00:14:30.960 Where's the asterisk?
00:14:32.640 There's always a catch.
00:14:34.840 And, you know, for years, she's like, there's no catch.
00:14:38.880 Damn.
00:14:39.280 And then she says to me, this is the best thing that's ever happened to us.
00:14:43.040 She's like, now, this next phase of totally building will be that you and I do it together.
00:14:48.960 And again, I freaked out.
00:14:50.620 And I thought, what do you mean?
00:14:52.040 Where'd you find this woman?
00:14:53.140 Oh, my God.
00:14:53.740 She's, I mean, she's from the heavens.
00:14:56.580 She turns out she's from Milan, but she's from the heavens.
00:15:00.780 And she's like, but we'll do it together.
00:15:03.540 And it's going to be incredible, whatever it is, because we're going to look back and the process will have been the thing.
00:15:09.040 And I, so I just, I needed a guide and I needed to be open to listening.
00:15:16.060 And so I needed that event, right?
00:15:18.460 Because, yeah, like, think about a kid, me.
00:15:21.920 I grew up in Sri Lanka.
00:15:23.520 There's a civil war.
00:15:25.420 We claim refugee status in Canada.
00:15:28.240 Grew up on welfare.
00:15:29.200 I get an engineering degree.
00:15:31.360 And within a year, I'm in the United States.
00:15:33.180 And my career just goes up and to the right.
00:15:36.420 Everything was working.
00:15:37.940 Youngest vice president at AOL, you know, when I was 26 years old, running this big messaging business.
00:15:43.560 Then I get recruited by Zuck.
00:15:45.780 I go to Facebook as one of the early execs.
00:15:48.360 I build that business.
00:15:49.480 All the key things that they look back on now, the network effects, the early monetization, internationalization.
00:15:55.060 That was my team.
00:15:57.000 Most of my team still runs that company.
00:15:59.020 So it was like, then I left and I started an investment business.
00:16:03.480 Those investment returns are really good.
00:16:05.980 And so everything was, quote, unquote, working.
00:16:10.540 So I held myself in really high regard, but for the wrong things, if that makes sense.
00:16:16.660 Of course it does.
00:16:17.220 I could point to a bank account or Twitter followers or all of this stuff.
00:16:21.740 And then when that was undone, right, 2022, 2023, what happened?
00:16:29.000 I thought that this was all supposed to work and it would work forever.
00:16:34.220 It was an opportunity for me to really reset and tell myself the truth, the ugly truth.
00:16:38.100 Hey, man, you are motivated by stupid, inconsequential stuff.
00:16:42.580 Get back to basics.
00:16:44.300 And then when I looked around, it wasn't just me.
00:16:47.360 There's so many of my peers in Silicon Valley who also just wasted the last decade doing nothing.
00:16:52.640 We were all, quote, unquote, wealthier, but we were more broke, you know?
00:16:56.600 I wonder, I do know, and I've certainly seen that a lot, having spent a lot of time around rich people.
00:17:03.340 But I've rarely seen someone address it as honestly as you are now.
00:17:08.500 And do you know other people who've been willing to look at themselves as clearly?
00:17:13.200 I think there are people that may not talk about it, but I think that they've lived it.
00:17:27.080 I know Elon's lived it, where he's always underwritten things based on, like, this is the, so, like, you know, the great thing about being in Silicon Valley is that there are people that I have the honor and the luck of knowing.
00:17:41.440 And I knew him, but I didn't really try to learn from him until 2022, if that makes sense.
00:17:50.640 Meaning, there were all these decisions that he made that I would reduce to: that's a smart business decision.
00:17:58.300 And it had nothing to do with business.
00:18:00.320 He had, he was always leading from, what are my, what are my core moral beliefs and let me act on those.
00:18:06.020 Like, the money never mattered.
00:18:09.180 You know, he never did anything to, to, to live that experience.
00:18:13.580 He loves his friends.
00:18:15.460 He loves his family.
00:18:17.280 It took me a decade of knowing him until I started to listen to that.
00:18:20.860 And to, like, to really hone in on that.
00:18:22.400 So, he's an example.
00:18:24.080 Then there are folks that are a little bit older that have gone through it.
00:18:26.840 My father-in-law has gone through it, ups and downs and ups and downs.
00:18:29.440 And he's built an incredible business.
00:18:32.840 And he's had to face tremendous hardship where he's had to underwrite, well, what is really important?
00:18:38.780 Am I an honorable person?
00:18:40.280 You know, is my word my bond?
00:18:42.120 When you shake my hand and we do a deal, it is what it is.
00:18:45.520 You know, if I can, you know, make X, but it's way too much.
00:18:49.360 And if I can make half of X, but it's more accessible. He makes drugs.
00:18:53.300 You know, life sciences, drugs.
00:18:54.520 Make them more accessible.
00:18:55.500 Should I do that?
00:18:55.980 He makes those decisions.
00:18:57.180 And so, you live, you see his morality play out in his actions.
00:19:01.780 My wife, so I think now I have three or four people in my life, but I had to listen.
00:19:07.860 And before, I chose not to listen because my ego said, hey, man, you're, you know, you're the best.
00:19:14.040 You've heard people say it.
00:19:15.580 If you're not doing anything wrong, why do you care if other people watch what you're doing?
00:19:20.060 Well, because privacy is integral to freedom.
00:19:23.080 No privacy, no freedom.
00:19:25.180 Well, it's true that a VPN could come in handy for those who spend their time browsing the dark web, doing scandalous things.
00:19:33.280 VPNs are not actually for criminals.
00:19:35.360 They're for everybody, particularly for people who are doing nothing wrong.
00:19:39.360 People like you and me who want freedom from the creepy people out there, the big data brokers.
00:19:45.040 You might not know it, but big data brokers make a ton of money by selling your personal data, information about you.
00:19:50.340 And they have access to that because you're not using a VPN.
00:19:54.280 Everything you do online can be seen by them and then sold by them.
00:19:57.200 It's time to change that.
00:19:58.620 And ExpressVPN can help you do it.
00:20:01.120 Our data is protected because we use ExpressVPN.
00:20:04.540 It serves as a lock on our door to keep the internet vultures out.
00:20:08.220 And thanks to a new feature, it can now alert you when someone tries to use your social security number.
00:20:12.780 By the way, if someone's trying to use your social security number, you probably should know about it.
00:20:16.520 ExpressVPN is the total package that's easy to use.
00:20:18.380 Right now, you get an extra four months for free when you use our special link.
00:20:21.800 And here it is.
00:20:22.360 Get a pen.
00:20:23.460 ExpressVPN.com slash Tucker.
00:20:25.320 We use it.
00:20:26.360 We recommend it.
00:20:27.460 Hillsdale College offers many great free online courses, including a recent one on Marxism, socialism, and communism.
00:20:34.560 Today, Marxism goes by different names to make itself seem less dangerous.
00:20:39.000 Names like critical race theory, gender theory, and decolonization.
00:20:42.520 No matter the names, this online course shows it's the same Marxism that works to destroy private property
00:20:48.580 and that will lead to famines, show trials, and gulags.
00:20:52.980 Start learning online for free at Tucker4Hillsdale.com.
00:20:58.460 That's Tucker, F-O-R, Hillsdale.com.
00:21:03.140 As we welcome in a new year, it's time to focus on what matters most.
00:21:19.620 Creating healthier habits, enjoying more moments with family, and spending less money on going out to eat.
00:21:25.240 GoodRanchers.com is here to help you turn those resolutions into solutions.
00:21:29.100 During GoodRanchers New Year New Meat special, you can subscribe to any box of their 100% American meat and wild-caught seafood.
00:21:36.880 And if you use code TUCKER at checkout, you'll get $25 off and your choice of free ground beef, chicken, or salmon in every order for an entire year.
00:21:47.420 By shopping with GoodRanchers, you're supporting local farms across the U.S. and avoiding the chaos of grocery store imports.
00:21:54.140 Most importantly, you'll enjoy stress-free, delicious meals that let you focus on what matters.
00:21:59.640 Quality time with loved ones.
00:22:01.480 Start 2025 with better choices, better meals, and better moments at home.
00:22:06.100 Claim your free meat for a year plus $25 off with code TUCKER at GoodRanchers.com.
00:22:11.820 American meat delivered.
00:22:14.660 This episode is brought to you by Samsung Galaxy.
00:22:17.540 Ever captured a great night video only for it to be ruined by that one noisy talker?
00:22:21.840 With audio erase on the new Samsung Galaxy S25 Ultra, you can reduce or remove unwanted noise and relive your favorite moments without the distractions.
00:22:31.100 And that's not all.
00:22:32.120 New Galaxy AI features like NowBrief will give you personalized insights based on your day schedule so that you're prepared, no matter what.
00:22:39.400 Pre-order the Samsung Galaxy S25 Ultra now at Samsung.com.
00:22:43.440 So you got very rich in your early 30s.
00:22:51.020 Yeah.
00:22:52.000 Yeah.
00:22:52.380 That itself is weird.
00:22:53.800 It's really...
00:22:54.440 I mean, it's not weird where you live.
00:22:55.760 It's not uncommon where you live.
00:22:57.620 But it's...
00:22:58.340 Historically, like, there's not a lot of that.
00:23:00.820 Yeah.
00:23:01.000 What's that...
00:23:01.800 What are the upsides and downsides of that, do you think?
00:23:06.300 The upside, I think, is that you can start to really focus on things that you care about.
00:23:13.040 The downside is if you haven't...
00:23:14.660 If you don't know what those things that you care about really are, you're going to waste a lot of time.
00:23:18.400 Good point.
00:23:19.240 What did you do when...
00:23:21.860 This was after the Facebook IPO.
00:23:23.240 I mean, I bought a piece of the Warriors.
00:23:25.060 That was really cool.
00:23:26.140 You know, I was like a 10% owner of the Warriors.
00:23:28.780 I had a press release from the NBA.
00:23:31.960 And there was like two players on the team at the time that were older than me.
00:23:37.740 That's pretty weird.
00:23:39.100 And I made some amazing friendships with them.
00:23:41.480 When we won the championships, you know, we would go to Vegas.
00:23:43.980 We would kind of like party together.
00:23:46.920 Don't get me wrong.
00:23:47.820 Like an incredible experience, especially for, you know, an ugly nerd that had no social life.
00:23:54.180 Do you know what I mean?
00:23:54.760 So, I took advantage of that because it's like filling a jug of water, right?
00:24:02.060 That jug had always been empty.
00:24:03.760 I was never invited out, right?
00:24:05.560 So, okay.
00:24:06.040 I thought, now I'm invited out.
00:24:08.040 Obviously, it's because, you know, the position.
00:24:10.300 It's not like, you know, we grew up together or whatever.
00:24:12.680 But I would get invited out.
00:24:14.680 Okay.
00:24:15.020 So, every now and then, maybe more often than not, I'd just pay for the dinner.
00:24:17.800 Whatever.
00:24:18.040 It doesn't matter.
00:24:18.600 I'm out.
00:24:19.400 Yeah.
00:24:19.600 I'm in the mix.
00:24:20.400 I'm in the game.
00:24:21.360 But then the jug fills up.
00:24:24.420 And then you're like, wait a minute.
00:24:25.980 Because you think like when the jug is two thirds full.
00:24:27.800 Oh, don't worry.
00:24:28.320 It's that extra third that's just going to fix it all.
00:24:31.500 It's true.
00:24:32.200 So, you just go through these cycles.
00:24:35.200 And so, I mean, that was a cool thing.
00:24:37.220 You know, you do the, you go after material possessions.
00:24:40.860 That doesn't do anything.
00:24:42.660 It really doesn't.
00:24:43.900 It's very, these are all very hollowing things.
00:24:45.860 I think like the thing that, like I said, like, you know, the problem with things like
00:24:50.360 social media, what they do is they glorify these things.
00:24:53.380 We all fall for it.
00:24:54.200 I fell for it.
00:24:54.880 So, if when you have money, you go and you buy these things because you think this is
00:24:58.360 what happiness looks like or, you know, success looks like.
00:25:02.580 And it's not, it's none of that stuff.
00:25:04.520 Now, nobody listening to this will believe it because they'll, everybody wants to live that
00:25:11.660 over and over.
00:25:12.920 Because it's not like a very destructive life lesson.
00:25:16.220 You know what I mean?
00:25:16.800 Like to have like a Loro Piana sweater and not need it.
00:25:20.780 But the bigger message is more important, which is if you have a sense
00:25:26.240 of what's important, you can kind of see the things that are really happening in a much
00:25:30.880 clearer way; they're more in focus.
00:25:34.340 So, like, you know, I would say like now that I'm 48, I'm much more aware of like, what
00:25:41.280 does it mean to be an American?
00:25:42.880 What is my job as an American businessman, as an engineer, as an entrepreneur?
00:25:49.840 It's not all of this other superficial garbage because it adds nothing.
00:25:56.620 It's to actually allow the system that rewarded and benefited me to be just a little bit better
00:26:03.680 in terms of the contributions I give to it before I'm no longer part of the system in
00:26:09.060 50 or 60 years.
00:26:11.280 That's very motivating for me now.
00:26:13.700 And this idea that my kids can go and join Team America and do cool stuff and find happiness,
00:26:20.560 find a great husband, find a great wife, have a bunch of kids and live a good life and know
00:26:26.920 what their dad went through and have a better sense of that.
00:26:30.020 That seems like an additive thing I can add to the system.
00:26:34.280 To be a good example for my friends, when they start to go through their own struggles,
00:26:38.640 that they can kind of course correct a little bit faster than I did.
00:26:42.500 You know, I mean, I went through a divorce, right?
00:26:43.840 So, that's a terrible thing.
00:26:46.380 After you got rich?
00:26:47.400 Yeah, after all that, I went through a divorce.
00:26:49.260 You know, I was very lucky to find my wife.
00:26:51.020 But my point is that my first marriage, when you get a divorce, that's a death in the family.
00:26:56.500 Literally, yeah.
00:26:57.480 Literally a death.
00:26:58.240 And then you are complicit in the commission of that death.
00:27:02.500 You know, it's the husband and the wife.
00:27:04.280 I mean, there are only two people responsible.
00:27:06.040 That's right.
00:27:07.760 And so, I see a bunch of my friends who are, you can see some veering and teetering.
00:27:14.340 And now I can sort of intervene a little bit and just kind of cajole and nudge and, you know, help them.
00:27:20.260 And I'm not saying that these are all really grand, highfaluting things,
00:27:24.440 but they actually address the inner part of what I needed for a very long time.
00:27:29.000 I just didn't realize it.
00:27:31.020 No, those are the most important things.
00:27:32.660 Yeah.
00:27:32.880 If you can help someone save his marriage, I mean, I think that's a lot more important and virtuous than most things that we do.
00:27:40.280 Than, frankly, most everything.
00:27:41.580 Especially when you think of how it compounds to that husband and wife's children.
00:27:46.660 Oh, yeah.
00:27:47.540 I mean.
00:27:47.940 It's the key thing.
00:27:49.140 I completely agree.
00:27:50.700 Yeah.
00:27:50.800 And a happy marriage makes happy children.
00:27:52.620 So.
00:27:52.920 Yeah.
00:27:53.080 Yeah, it redounds through the generations, I would say.
00:27:56.120 Exactly.
00:27:57.160 So, speaking of children, I was having a conversation with one of my children this morning.
00:28:03.240 You know, I know a lot of rich people, obviously.
00:28:05.500 And we're talking about somebody who we know is, you know, a good guy, billionaire,
00:28:10.640 who is totally focused on making more money to the exclusion of.
00:28:14.980 Everything else.
00:28:15.820 Kind of.
00:28:16.220 And one of my children said to me, and not in a judgmental way, but, you know, with affection for this person,
00:28:22.620 but, like, why?
00:28:23.660 Like, what is that exactly?
00:28:24.960 I mean, why?
00:28:25.520 You know, I guess it's just on autopilot to some extent.
00:28:28.160 Like, you know, I make money, I'll make more money.
00:28:30.020 But the drive to make more money that is literally superfluous, like, you will never need that money.
00:28:36.000 What is that?
00:28:37.640 It's an emptiness somewhere else.
00:28:39.540 Yeah.
00:28:39.740 It's a jug that they're trying to fill, and they think that the, you know, the closer they get to filling it,
00:28:46.540 the problem is, like, your mind just switches the jug to an even bigger jug.
00:28:50.920 And then it gets closer to being full, and then it switches again, and it switches again.
00:28:55.940 I think the much better way to think about this problem is what am I doing with my time that actually helps
00:29:03.700 the place that gave me an opportunity be better, and the people that live beside me be better.
00:29:08.720 That is a really morally valuable statement.
00:29:14.580 And then you can kind of, like, look at all the things and all the problems that make everybody mad in the United States
00:29:21.720 as an opportunity to actually do them better.
00:29:24.920 And that is useful.
00:29:26.300 And when you have money, the one thing that you can do is you can accelerate that change much faster
00:29:31.060 than folks that have to take a much more arduous path.
00:29:34.360 For sure.
00:29:34.980 And I think we've lost that.
00:29:36.440 There's not enough people that basically say, okay, you know what?
00:29:39.220 The United States has given me so much.
00:29:42.000 Now, how do I give back, quote-unquote?
00:29:44.660 And you don't necessarily have to give back by going into nonprofit or going into government.
00:29:48.440 You can just acknowledge the problems that are there and go fix them.
00:29:52.040 And you can fix them by starting for-profit companies, which are always the best way.
00:29:55.560 And I wonder, like, why don't more people do it?
00:30:00.080 It's a great question.
00:30:00.420 You can have five decent children, too, which is probably the best thing you could do for any country.
00:30:04.740 But, so thank you for that.
00:30:06.600 But I do notice and have always noticed that some of the people who have benefited most from the United States
00:30:12.600 dislike it the most intensely.
00:30:13.920 And I don't really understand what that is.
00:30:15.640 I think that that is – I think that what's happened – so I'll give you my framework.
00:30:20.780 You can tell me maybe where you agree or disagree.
00:30:23.140 But it is important for all 330 million Americans to take a step back and acknowledge this one truth.
00:30:32.120 And I think that it is completely a canonical statement that is inviolate for being an American.
00:30:37.280 We are the single most important country in existence in the world.
00:30:42.980 We are the most important country today.
00:30:45.240 We must be the most important country tomorrow.
00:30:48.060 Period.
00:30:49.360 If you say that enough times and you believe it, then there are two things that underpin that.
00:30:56.080 And I think only two.
00:30:58.000 We are the single most vibrant economy in the world.
00:31:01.680 And we are the single strongest military in the world.
00:31:07.280 And if you can agree to those two things, which I think should be non-controversial if you say – meaning,
00:31:13.200 like if I said to you, hey, Tucker, we make the best oranges and burritos in America.
00:31:18.640 That does not yield the most important country in the world.
00:31:21.920 If I said we make the best shoes and the best flat panel TVs,
00:31:26.800 that does not equate to the most powerful country in the world.
00:31:28.980 But if I said to you, we have the strongest and most vibrant economy and the strongest and most powerful military,
00:31:37.460 that is the strongest and most important country in the world.
00:31:40.360 And then there is only one thing that gives you both of those two things, which is technological supremacy.
00:31:50.240 So go back to these examples.
00:31:52.080 If I said to you, we write the best books, those books could be incredibly powerful,
00:31:57.460 but it does not give you technical supremacy.
00:31:59.720 If I said to you that we have the most abundant energy, oil fields, nat gas, it's important,
00:32:08.200 but it does not give us technological supremacy.
00:32:12.240 Those that get there will be in a position to create the most vibrant economy.
00:32:19.160 They'll take that money and then create the most powerful military.
00:32:22.080 They'll put those two things together.
00:32:23.580 They'll be the most powerful country.
00:32:24.880 So I think today, sitting here, January of 2025, we are at existential risk of losing our place in the world.
00:32:34.940 And the reason is that we had people, we have people from the inside trying to sabotage our economy effectively
00:32:43.200 and trying to sabotage our military capability.
00:32:46.060 And they do that not explicitly, but they do that because they are in positions of leadership
00:32:51.760 and they fundamentally don't know what they're doing.
00:32:53.720 And this is what needs to get called out.
00:32:57.380 And I think what we need is this wholesale reform of the people that are at the levers
00:33:02.040 and in the controls of these things.
00:33:04.980 The lack of economic judgment, the lack of military judgment,
00:33:08.840 is ruining America's ability to be the most important country in the world.
00:33:15.020 We are, you know, in Silicon Valley, I think it's fair to say that we have had a lost decade.
00:33:20.500 And when you look underneath why, what are the two most or three most or four most incredible technological achievements
00:33:28.660 that the Silicon Valley has created in the last decade?
00:33:32.120 You're hard-pressed to find it.
00:33:34.300 So in one example, you have Elon.
00:33:37.080 He's created reusable rocketry.
00:33:39.500 He's created an entire global mesh of communications infrastructure.
00:33:44.480 He's created electric cars.
00:33:46.140 That's an incredible thing.
00:33:49.880 And he's done that with one hand tied behind his back, meaning fighting the government, local, state, federal,
00:33:55.720 at every single turn over the last 10 or 15 years.
00:33:58.340 What have the rest of us done?
00:33:59.780 We've created AirPods and Instagram reels.
00:34:03.340 Why?
00:34:03.820 I think a lot of people fell into the same lull that I fell into.
00:34:11.380 We had people pushing back constantly.
00:34:14.380 We got distracted.
00:34:16.120 We wasted time.
00:34:17.740 You know, we took an entire cadre.
00:34:19.740 Like, look, engineering is actually very much like professional sports, Tucker.
00:34:24.100 Like, there are Michael Jordans in engineering, okay?
00:34:27.640 And there are many people that are not very close to Michael Jordan, you know, couldn't even make a JV scrub team.
00:34:34.180 Yeah.
00:34:34.920 There is that crazy distribution of capability.
00:34:40.180 And let's just say in Silicon Valley there's 5 million engineers, if you add up all the companies and all the people.
00:34:46.100 Maybe that's a lot, but I don't know.
00:34:48.340 I can tell you that there's at least 25,000 or 50,000 of them that are like Michael Jordan-esque capable.
00:34:54.980 And instead, what we told these people is, hey, don't win six championships in eight years.
00:35:03.000 Don't be the most prolific player ever.
00:35:05.580 What we told them to do was like, hey, you can dribble down the court, but don't dribble too fast because you'll make these other people feel bad.
00:35:10.960 Hey, you know what?
00:35:11.620 You can do a couple of layups, but don't do too many layups because you should actually pass the ball so that these other people we hired because the team photo looks better.
00:35:18.360 You know, give them a chance to score.
00:35:20.280 You did all of these dumb things.
00:35:21.880 Then team management would come down and say, you know what?
00:35:24.160 I actually think the goal should be to play wiffle ball.
00:35:27.760 And then you take Michael Jordan off the basketball court and you make him play wiffle ball.
00:35:31.000 That's what Silicon Valley did.
00:35:33.060 We took all this incredible talent.
00:35:35.460 We got distracted by the money.
00:35:37.540 Yeah.
00:35:38.080 Right?
00:35:38.260 Because what really did happen in the Valley?
00:35:40.480 What did happen is all the billionaires became decabillionaires and centibillionaires.
00:35:46.960 Right?
00:35:47.440 The wealth went through the roof.
00:35:49.420 The innovation went through the floor.
00:35:51.880 So, we got lulled into this economic complacency.
00:35:59.020 My gosh, I'm so much smarter because I'm so much richer.
00:36:01.760 No, you're not.
00:36:03.480 No, you're not.
00:36:05.220 I'll give you a different example.
00:36:06.600 It sounds like just good old-fashioned decadence, kind of.
00:36:08.720 But I think that there was some sabotage, meaning, or maybe sabotage is not the right word, but there were traps that were laid out and we all fell on them.
00:36:20.840 The DEI trap, the woke trap, all this kind of stuff that were distractions to core innovation.
00:36:29.220 I'll give you two examples that paint the picture.
00:36:31.220 I'll bookend it.
00:36:32.700 I'll give you a Silicon Valley bookend.
00:36:34.760 The beginning of the bookend is in the early 2000s.
00:36:37.900 There's an incredible professor, Jennifer Doudna in Berkeley, and she pioneers CRISPR, which is the ability to edit genes.
00:36:45.120 Let's take that off the table, whether you think it's morally right or wrong for a second, okay?
00:36:50.060 It could be a tool.
00:36:51.160 It could be a weapon.
00:36:52.260 I grant that.
00:36:53.100 But it is undeniably a tool that sits in the toolbox that we call technological supremacy.
00:37:02.080 Over the next decade, what Silicon Valley managed to do was embroil themselves in IP lawsuits about who actually owned it.
00:37:11.560 What China did was take the open source awareness of it and pioneer it.
00:37:16.680 Was that smart?
00:37:17.800 Whether you agree or you disagree, should that tool be in our toolbox where we can mete it out, or should it be in China's toolbox where they can decide?
00:37:28.540 Where if all of a sudden there is some disease in the future and it requires this very precise form of gene editing and only they can do it,
00:37:39.000 and now a state-sponsored entity in China is the one that provisions a cure for 8 billion humans around the world,
00:37:45.080 that will give them tremendous economic power.
00:37:52.120 Is that smart for America to have done that?
00:37:55.960 I think not.
00:37:57.380 I'll give you a different example, which is just today as you and I sit here,
00:38:02.480 President Biden issued an EO.
00:38:04.000 And what the EO said is, executive order, AI is going to be critical, and so we want to give the ability for federal lands to be used for AI data centers.
00:38:22.500 Okay, now you're cooking.
00:38:24.460 This sounds smart.
00:38:25.360 Let's go read the fine print.
00:38:26.660 And by the second or third paragraph, what it says is, however, we need to think about the diversity, equity, and inclusion of said data centers.
00:38:37.740 And, and, no, hold on, and, you have to basically give preference to clean power.
00:38:49.040 Well, is that the same clean power that was essentially made impossible because of permitting issues and environmental impact studies?
00:38:57.540 You know, you can't just build solar farms that you want to.
00:39:00.300 You can't build wind farms in America if you want to.
00:39:02.920 You can't build nuclear reactors because they won't let you.
00:39:07.820 These are not technological limitations.
00:39:09.740 These were regulatory limitations.
00:39:12.220 Those are just two examples that just show you.
00:39:14.400 You cannot do what's in America's best interest right now because people have forgot.
00:39:19.640 They've lost the script.
00:39:20.680 They forgot the priorities.
00:39:22.760 Guys, the priorities are we need to remain the most singularly powerful economic and military entity in the world.
00:39:30.160 The way you do that is through technical supremacy, period.
00:39:33.700 I'll give you another two examples just to, you can tell me if these are boring, but.
00:39:38.000 Not at all.
00:39:39.740 Saudi Arabia is doing an incredible job.
00:39:42.060 They're monetizing their oil.
00:39:44.760 They are doing very strategic things with the capital.
00:39:50.420 What they've decided is they're going to allocate money that they take from selling oil to build a global data center infrastructure for AI.
00:39:59.960 That's really smart.
00:40:01.780 And when you look at Saudi Arabia on a map, you think, oh my gosh, this is very smart.
00:40:07.740 Why?
00:40:08.020 Because they sit right in this artery between Asia, Europe, and Africa.
00:40:11.960 This is critically smart.
00:40:17.580 But what is the one thing that they need that they don't have?
00:40:21.580 It's AI chips.
00:40:23.480 Now, AI, just to break it down for, you know, your viewers, think of AI as two buckets.
00:40:31.020 Okay, bucket number one is where you train the brain.
00:40:34.660 Okay, think of AI as a brain.
00:40:36.400 Bucket number one, you train the brain.
00:40:38.460 Bucket number two, you use the brain to make decisions.
00:40:40.600 Okay, the chips that we make to train the brain are under export control, right?
00:40:50.060 We don't want folks to train their own brains necessarily unless we can govern them.
00:40:55.080 That's a Department of Commerce decision on export licensing.
00:40:59.500 The way that we use the brain is a different kind of chip.
00:41:03.480 And we have some export controls there.
00:41:06.060 What will Saudi eventually be forced to do?
00:41:12.640 They're an ally of America.
00:41:14.260 They want to do the right thing.
00:41:16.200 But they have a responsibility to their people to try to become the most incredible, you know, economic and military power in the world.
00:41:24.740 They're going to go and buy the chips from the people that will actually sell it to them.
00:41:28.100 They'll look east, of course.
00:41:29.080 They will look east and they'll find it in China.
00:41:33.480 Yet a different example.
00:41:36.760 If you look at Meta, Meta has poured tens of billions of dollars into training a brain, okay, an AI brain that's called Llama, right?
00:41:46.260 That's Meta's efforts in AI.
00:41:50.440 And it's open source.
00:41:51.940 It's wonderful, actually.
00:41:53.580 My companies use it.
00:41:54.580 It works.
00:41:56.220 It's high quality.
00:41:59.280 OpenAI, which is the private closed-source competitor to Meta and Llama, also has an AI brain that they've trained.
00:42:08.560 You know, GPT, ChatGPT, you've used it probably.
00:42:11.440 They've also spent tens of billions of dollars, in part coming from Microsoft.
00:42:14.780 Meanwhile, in December, a Chinese company open-sourced a model where they spent tens of millions of dollars.
00:42:28.380 And in many cases, that digital brain is smarter than both Meta's and OpenAI's on many dimensions.
00:42:35.580 So, two orders of magnitude cheaper.
00:42:40.440 Well, what do you think that means for the other 182 countries around the world that want to do something in AI?
00:42:45.920 Are they going to take the $10 billion version or are they going to take the $10 million version?
00:42:50.900 They're going to take the $10 million one.
00:42:52.580 And when you unpack, well, why did it cost $10 billion?
00:42:57.140 That was my question.
00:42:58.820 It cost $10 billion because of all the roadblocks that we put in front of companies to make the things that we need to maintain our technical supremacy.
00:43:07.880 So, I'll give you some examples.
00:43:10.880 There is a huge – so, for example, one of the things that China chooses to not do is they don't really respect copyright law.
00:43:17.460 Now, I'm not saying we should violate copyright law, but I think it's important to acknowledge that there is a technical overhang that it creates in training these brains to try to filter out content that the New York Times tags or Fox News tags and says, don't learn on this.
00:43:35.100 You're not allowed unless you have a deal with me.
00:43:38.940 That creates an enormous layer of expense.
00:43:42.260 How do we judge that issue?
00:43:44.680 Today, if you ask somebody, it's a pretty simple conversation.
00:43:50.660 It's not nuanced.
00:43:51.500 It's do you believe in copyright or do you not believe in copyright?
00:43:54.840 I think it's a much more nuanced question.
00:43:58.380 For the sake of training these digital brains, if there was an economic relationship that we could create, isn't it better that our digital brain is smarter than these other ones and that we make it as cheap as possible?
00:44:08.900 Well, if you ask that question, a lot of people would say, gosh, that's a nuanced question.
00:44:15.720 It's neither an easy no or an easy yes, but on the margins, I would say yes, knowing that there are these impacts.
00:44:24.120 Give you a different example.
00:44:25.820 It takes all this energy to build the data centers.
00:44:29.640 Why does it cost so much?
00:44:33.300 It's not that the, let's just say you wanted to use solar panels.
00:44:36.380 Is it that the solar panels are expensive?
00:44:38.740 No.
00:44:39.760 Is it that the ability to do the interconnects are expensive?
00:44:43.640 No.
00:44:44.080 It's that building that facility had a multi-year environmental impact study, umpteen lawsuits, all kinds of indirection and misdirection from all of these independent actors who believed that they were pursuing their own priorities.
00:45:02.420 And there was no release valve that said, I appreciate and respect the smelt that you're trying to protect or the land grouse, but this is bigger than that.
00:45:15.960 We need to make sure we maintain our technical superiority.
00:45:18.840 That data center is going to be used by the NSA to protect America.
00:45:22.520 So it needs to go up in nine months.
00:45:25.760 No ifs, ands, or buts.
00:45:27.840 Right now, we don't have the ability to say that.
00:45:29.800 Or if we do, it's not clear who should say it.
00:45:33.400 And so all of this time costs money.
00:45:36.420 All of this complexity costs money.
00:45:39.420 And the output are practical costs of building these things that are just two orders of magnitude bigger than our competitors.
00:45:46.960 We're not going to use the word revival, but it does seem true that millions of Americans simultaneously are coming to the conclusion that buying things online and going on vacation may not be the sum total purpose of life.
00:45:59.800 Maybe there's something more.
00:46:01.400 And if you're one of those people who's beginning to ask questions, what else is there?
00:46:05.780 One of those people who's become suddenly very interested in God and having a relationship with God, then we cannot recommend an app called Hallow enough.
00:46:15.740 Personally recommend it.
00:46:17.040 It's a frequent topic of conversation in my house.
00:46:19.520 Hallow is the number one prayer app in the world.
00:46:22.540 It offers thousands of guided prayers, meditations, music options, Gregorian chants, the Bible in a year.
00:46:29.920 It's amazing what's on there.
00:46:31.540 It's great.
00:46:32.940 Hallow is fantastic.
00:46:33.760 It's very easy to use.
00:46:34.680 You build a routine.
00:46:35.800 You can join groups of worshipers, set goals, tailor the whole experience to your exact spiritual needs.
00:46:41.560 It really works.
00:46:43.640 Most apps people get excited about and then stop using them.
00:46:46.400 Hallow, people keep using it because it's great.
00:46:50.260 We recently interviewed the founder and CEO of Hallow, and that conversation left us more convinced than ever that this will never become a godless society because people won't let that happen.
00:46:59.460 People need God.
00:47:00.720 Hallow can help connect you with God.
00:47:02.940 With the March for Life approaching, it's a perfect time to download the Hallow app to unlock a vast array of prayers for pregnant mothers, the unborn, civil leaders, and many others in need of prayer.
00:47:13.120 Because prayer actually works.
00:47:15.840 Head over to Hallow.com slash Tucker.
00:47:17.700 Get three months free.
00:47:19.400 Hallow.
00:47:20.100 It's awesome.
00:47:21.980 Get groceries delivered across the GTA from Real Canadian Superstore with PC Express.
00:47:27.220 Shop online for super prices and super savings.
00:47:30.080 Try it today and get up to $75 in PC Optimum Points.
00:47:33.740 Visit Superstore.ca to get started.
00:47:35.780 Breaking news happens anywhere, anytime.
00:47:50.500 Police have warned.
00:47:51.540 The protesters repeatedly get back.
00:47:53.840 CBC News brings the story to you live.
00:47:57.520 Hundreds of wildfires are burning.
00:47:59.300 Be the first to know what's going on and what that means for you and for Canada.
00:48:04.060 This situation has changed very quickly.
00:48:07.660 Helping make sense of the world when it matters most.
00:48:10.800 Stay in the know.
00:48:11.920 Download the free CBC News app or visit cbcnews.ca.
00:48:16.520 I wonder, though, if just the basic economics of AI just force change.
00:48:27.720 For example, the power draw for AI, I was hearing about it
00:48:31.920 at an AI center in another country three days ago.
00:48:34.260 And they were saying they're not exactly sure, you know, what it's going to take to run one.
00:48:39.920 But it could be 10x a normal data center.
00:48:43.460 Which is, that was amazing to me.
00:48:45.740 So how do you power that exactly?
00:48:47.920 So you have these competing imperatives.
00:48:49.680 You've got the climate agenda versus AI, which is clearly the future of the economy of California, for example.
00:48:56.080 Yeah.
00:48:56.360 Can I just say, okay, I have to go on a small diatribe.
00:48:59.340 Please, please do.
00:49:00.360 I'm trying to evoke one.
00:49:01.500 There is nothing that the Western countries can do that will equate in any way, shape, or form the impact of what China and India decides to do.
00:49:16.680 We do not live in small bubbles.
00:49:19.520 There is no glass bubble that sits on top of the United States.
00:49:24.060 Right.
00:49:24.160 We all share an ecology, that ecology is extremely complicated and nuanced, and it's a global one.
00:49:31.920 And the reality is that India and China collectively together have way more impact than what you and I are going to do by becoming vegans.
00:49:42.500 Impact on the climate, on the environment.
00:49:45.080 Yeah.
00:49:46.200 I've noticed, yes.
00:49:47.500 So I think the reason why, for example, like we have, again, so we have lost so much ground.
00:49:53.100 The California wildfires are an example that lays us bare.
00:49:58.460 We should have 50% of the American population on solar panels and battery walls today.
00:50:06.660 The reason, it's nice to have that you do it for climate change, but the reason is resilience.
00:50:13.200 The reason is so that you can power yourself in moments of calamity.
00:50:17.140 The reason is so that you can make sure that you can take care of your family, cook food, you know, desalinate water, whatever it is that you need to do.
00:50:25.220 The prepper approach to energy.
00:50:26.600 I'm with you.
00:50:27.480 It's just like the practical reality.
00:50:29.220 Totally agree.
00:50:29.780 And instead, we make it this sort of like moral blanket that you have to wrap yourself in.
00:50:36.220 Everybody then takes a view and all it does is retard progress.
00:50:39.980 So I guess what I'm saying is we spend way too much time getting distracted on fringe ideas and we need to recalibrate.
00:50:52.180 We need to be and we need to remain the most singular muscular power in the world, which comes from economic and military supremacy, which comes from only one thing, technical supremacy.
00:51:03.900 So you have to find a way of enabling those 50,000 Michael Jordans that exist in America to cook.
00:51:11.640 Let them cook.
00:51:13.980 That's what you need to do.
00:51:15.600 I wonder if the structure of Silicon Valley or-
00:51:18.380 Can I give you a different example?
00:51:19.320 Yeah, of course.
00:51:19.720 Sorry, this is my last one.
00:51:23.620 Do you believe, you know, one of the big pushes in AI is in robotics.
00:51:26.360 Yeah.
00:51:27.260 And just to double click into a robot, a robot moves through these things called actuators, okay?
00:51:33.900 And one of the things that actuators need, one of the ways that they generate mechanical motion, 3D planar mechanical motion, is through the use of magnets, permanent magnets.
00:51:45.680 And permanent magnets are different than the ones you and I play with or what our kids play with.
00:51:49.440 They are made from these things called rare earth metals.
00:51:53.280 Yes.
00:51:54.720 And rare earths are a misnomer.
00:51:56.420 They are not rare; in fact, they are abundantly available in the earth.
00:52:01.140 There is an incredible supply of rare earths in California.
00:52:06.300 You could not, for the life of you, get a permit in California, no matter how clean, how green, to mine those materials, to make sure that we could make magnets, so that we could make the robots, so that we maintained our technical supremacy.
00:52:25.140 Right?
00:52:25.620 That's how that decision, I think, should get made.
00:52:28.880 We want technical supremacy in the next 5 to 10 years.
00:52:31.560 There's going to be a huge wave in robotics.
00:52:33.760 America needs to be at the forefront.
00:52:35.820 America needs to make them.
00:52:37.600 We need to understand how to program them.
00:52:40.280 Our robots need to be smarter.
00:52:42.460 So there's an AI track, right?
00:52:44.040 There's a mechanical engineering track.
00:52:46.000 Okay, well, there's a mining track.
00:52:47.620 Let's get the materials.
00:52:48.660 Let's make sure that we are beholden to nobody, so that we can make them.
00:52:53.080 By the way, these are all great paying jobs, if you are able to actually get them permitted.
00:52:57.260 And when push comes to shove, when it's like, yeah, get the rare earths out of the ground and make the permanent magnets, can't do it.
00:53:03.180 Why?
00:53:04.840 You're going to spend 13 years in permitting hell in California to try to get that done?
00:53:04.840 So, and I know this, the reason I know this is that I started a business to make sure that China does not control the only supply of rare earths.
00:53:24.080 And I initially tried to do it in California.
00:53:26.880 Impossible.
00:53:28.160 I invested in one business that actually had an old mine that we were able to get back online through a bankruptcy process and blah, blah, blah.
00:53:36.380 But it's not nearly enough.
00:53:37.660 So, I went to India, and I was able to get a deal done with the Indian government.
00:53:44.680 And what they were able to see was the strategic rationale of making sure we had access to not only the rare earths, but also to make sure that there were subsidies so that they would compete on the global stage cheaper than China.
00:53:58.580 And now I can bring them back into the United States to make these magnets.
00:54:01.600 Would I like to do that in America?
00:54:04.540 Yes.
00:54:05.040 Can I?
00:54:05.720 Impossible.
00:54:06.080 I'd be sitting around twiddling my thumbs for a decade.
00:54:09.800 The guys in India did that deal with me in less than 18 months.
00:54:14.100 They understand.
00:54:15.760 There's an escalation point where they'll sit down and say, Chamath, what do you need?
00:54:19.500 What makes sense?
00:54:20.380 And I say, well, sir, here's what we're trying to do.
00:54:22.340 Here's why it's important.
00:54:23.700 Here's why having a global supply chain that's independent of China is just important.
00:54:27.720 It's competitive.
00:54:28.560 It's good for everybody.
00:54:29.360 They're like, okay, great.
00:54:31.200 X, Y, and Z.
00:54:31.860 Do the following three things.
00:54:33.300 Build a plant over there.
00:54:34.340 We'll make sure that we support you.
00:54:37.380 Here, we'd just go into a morass and die.
00:54:39.040 I mean, can we generate, do we have the hardware to generate the electricity necessary to remain dominant in the AI?
00:54:49.260 Yes.
00:54:49.540 So, the other crazy thing about that Biden EO is why do you have, like, why couldn't you have just stopped the EO at that first paragraph?
00:55:01.960 Because it wouldn't include the control provisions, which are the whole point of it.
00:55:06.320 We control your behavior.
00:55:07.700 We give you something, but then we're in charge.
00:55:09.240 We're in charge.
00:55:10.520 We have more nat gas and oil in the United States.
00:55:15.200 I don't know if you saw this, Tucker, but there was a piece of data that came out this week, but our reliance on foreign oil is almost entirely gone.
00:55:25.440 Oh, yeah.
00:55:26.380 Meaning, like, almost like to where we don't even need to buy it anymore.
00:55:29.460 Not just that we make more than we import, but where the imports will soon go to zero.
00:55:34.580 Yes.
00:55:34.920 That's an incredible statement about the energy independence of America.
00:55:38.340 It's the single most obvious way, by the way, to guarantee peace.
00:55:42.860 Well, yeah.
00:55:43.440 If you're not fighting over that critical resource, it's going to be very unlikely you're going to go to war, which, if you look past the last four or five wars, the trillions of dollars and the hundreds of thousands of American lives, what were they all about?
00:55:55.980 Oil.
00:55:58.040 And now, you know, we have the ability to power those data centers.
00:56:03.340 So should data centers be built in a fair, predictable way on federal land?
00:56:09.760 And, yeah, because it feeds the technical supremacy we all need.
00:56:13.880 Now, by the way, there is a conversation to be had is, okay, great.
00:56:16.820 When you have all this economic abundance, how do you share it more?
00:56:19.440 I get that.
00:56:20.400 Right.
00:56:20.820 But at least you're in a position to have the conversation.
00:56:24.760 Yeah, that is jumping ahead.
00:56:26.200 So, back to this, though. But, like, you know, if you're able to build these things, I think what Biden should have done is just say, hey, listen, folks, do your best, get the energy, because it exists, use nat gas, use oil, and we will figure it out after we've won.
00:56:41.600 That feels like leadership.
00:56:44.320 Right.
00:56:44.460 But he can't do that, because he's surrounded by people who have already said no hydrocarbons, period.
00:56:51.980 But it's not rooted in any reality.
00:56:54.680 No, I'm aware.
00:56:56.120 I'm aware.
00:56:57.400 It's just—
00:56:58.460 It's rooted in moral grandstanding.
00:57:01.380 Oh, I've noticed.
00:57:02.300 But I just think, I mean, if you've got mandates for net zero, mandates for no hydrocarbons, mandates for EVs, and this economic imperative around AI, and people like air conditioning, I just don't think we're going to have enough electricity unless there's hardware built, like, really soon.
00:57:20.040 It's incredible that you mentioned air conditioning.
00:57:24.840 So back to sort of one of these—back to one of these ways where I was like, you know, after 2022, how can I just get back to basics and do the things that I do well?
00:57:35.340 One of the projects that I started with this incredible mechanical engineer was we set out to rebuild the air conditioner.
00:57:42.420 Yeah, good.
00:57:43.880 And when you look at air conditioners, the heat transfer mechanisms can be made much, much more efficient.
00:57:49.460 And you don't have to use these horrible coolants.
00:57:53.740 And so we have a working version now.
00:57:58.180 It's like our second prototype.
00:57:59.320 We're probably two years away from getting something working.
00:58:01.900 But when that works, by the way, we've started a process to figure out how do we sell it.
00:58:07.280 But this is, again, Tucker, it's like if I tried to sell it to a home, the number of people that want to touch that thing—
00:58:15.300 Oh, I know.
00:58:15.840 It'll take me an extra $150 million and an extra seven years to build it.
00:58:27.500 To comply with code?
00:58:28.820 Yeah.
00:58:29.140 What is it that I'm supposed to do?
00:58:31.400 And by the way, not to be, like, pedantic, but those rules were never passed by the Congress.
00:58:37.820 I don't think those rules were passed, even to give them the benefit of the doubt, by people one degree removed from
00:58:46.460 folks that have an understanding of, like, physics.
00:58:49.140 No, well, of course not.
00:58:50.420 Of course not.
00:58:51.020 So there was probably a very smart, capable lawyer representing some smart organization who knew that this was something that they could write in to slow people down.
00:59:01.720 But what people need to understand is, again, it just slows down our ability to remain number one.
00:59:07.300 But where does this—I think you're exactly right.
00:59:09.560 I think you've described it very nicely.
00:59:11.480 But I'm still baffled by the motive.
00:59:15.020 Why would people in a country as great as ours want to wreck the country?
00:59:19.600 Where does that come from?
00:59:21.660 I don't think that they want to wreck the country as much as I think that they have lost the global context.
00:59:30.200 I don't think they understand that we are in an existential risk of no longer being number one.
00:59:37.900 And I think that they also don't understand the implications if we are not number one.
00:59:45.380 You need to just look at the UK.
00:59:47.180 If you want to have a very simple and visible picture of a white Anglo-Saxon Protestant nation completely losing itself,
01:00:00.260 totally losing the economic war, losing the military war, losing the technology war,
01:00:07.900 and fighting a fringe issue war, you need to just look there.
01:00:12.720 And people should ask themselves, is that what we aspire to?
01:00:16.500 I'd rather live in Pakistan than live in the UK.
01:00:19.260 I'm being serious.
01:00:20.220 I think it's the most depressing country on the planet for the reasons that you just described.
01:00:23.960 There's something about decline that's—it's hard to describe.
01:00:27.880 But when you're there, you feel it.
01:00:29.380 It's terrible.
01:00:29.780 Oh, it's soul-crushing.
01:00:30.720 It's so much worse.
01:00:31.600 You're much better off, to your point, being in a developing nation on the come up than you are being in a—
01:00:37.840 There are over 200,000 Brits in UAE right now.
01:00:41.340 Right.
01:00:41.620 In UAE, which they formerly controlled until not that long ago.
01:00:45.620 And now there's a massive outflow.
01:00:48.260 There are far more Brits in the Emirates than there are Emiratis in Britain.
01:00:52.080 So the thing that we need to do, I think, as a Western set of nations, is we need to understand and agree on the fact that governance over this last decade has totally and miserably failed by focusing on these fringe issues and by losing sight of these global priorities.
01:01:10.780 I'll give you a different example.
01:01:11.960 At the end of the global financial crisis, Canada emerged as the healthiest G7 country out of all of us.
01:01:21.560 They were doing phenomenally well, low debt to GDP, phenomenal growth.
01:01:27.320 And then over those intervening 17 years, they've focused on all kinds of fringe issues.
01:01:33.980 Rampant open immigration, poor allocation of risk capital inside of Canada.
01:01:39.880 They've allowed this incredible flight of human capital to the United States.
01:01:45.520 Oh, yeah.
01:01:46.700 And now today, if Canada were to join the United States, it would be poorer than Alabama on a per capita basis.
01:01:54.740 Oh, yeah.
01:01:55.540 And much more depressing.
01:01:57.440 So what has happened in Canada?
01:01:58.800 There's a vibe shift.
01:01:59.860 When countries decline, it's not simply a matter of GDP moving in the wrong direction.
01:02:04.060 It's the spirit of the country is palpably different and sad.
01:02:08.440 I mean, it's why so many Canadians accept their government's invitation to kill them through the MAID program.
01:02:14.240 Their suicide rate is insane.
01:02:15.760 Why is that?
01:02:16.280 That's not a measure, a sign of health or vigor or ascendance.
01:02:21.240 That's a sign of, you know, terminal decline.
01:02:24.460 It's like shocking.
01:02:25.540 You're from there.
01:02:26.280 When you go back, what does it feel like?
01:02:28.220 You know, my dad passed away 10 years ago.
01:02:30.500 My mom now comes to see us.
01:02:31.980 So I rarely go back.
01:02:33.060 Actually, I'm going to be back in February for a little bit.
01:02:38.000 I've always felt like a fish out of water in Canada.
01:02:41.160 I've loved it for many reasons.
01:02:43.120 I think that it had some principles back then that I think are very legitimate and should exist in the United States.
01:02:50.120 The most important being a capped cost of higher education.
01:02:53.600 I spent $12,000 a year to get an electrical engineering degree from a place called the University of Waterloo, which globally is as good as, frankly better than, MIT.
01:03:05.420 If I had to be in the United States and if I had gotten into MIT, which I probably would not have, but had I been able to, I'd be $100,000 or $200,000 in debt.
01:03:17.840 And so had I not had this lottery ticket for me pay off with an IPO and working at Facebook, I don't know where I would be today.
01:03:25.620 That's crazy.
01:03:26.440 So there are things that Canada, I think, and we should acknowledge that, does really right.
01:03:31.080 That is probably the most important thing that it does right.
01:03:35.420 And I think the idea of a state-sponsored healthcare system, it's implemented horribly poorly, but there are elements of it, especially around capped costs, that I think are important. Meaning, you know, in the United States, a healthcare CEO was telling me, when you look inside of an EMR system, the electronic medical record system.
01:03:57.020 Let's say, Tucker, you're a doctor and you did a knee surgery, you would have 100 different prices attached to you, depending on the plan, depending on the insurer, that's dumb.
01:04:09.980 You are one person, those are one set of hands, that's one surgery, it's one quality.
01:04:16.300 The idea that one person is lucky enough to pay a $1,000 deductible and the other person has to pay $50,000 is an egregious market failure.
01:04:22.900 It's just egregious.
01:04:26.100 So, Tucker Carlson, the best knee surgeon in America, if he charges $2,000, then he charges $2,000 for everybody.
01:04:34.280 That seems fair.
01:04:36.180 It does seem fair, it seems much more efficient.
01:04:38.620 I just can't...
01:04:40.020 Now, everything else in Canada is broken, I think, but those two ideas...
01:04:42.440 Well, their healthcare system is a disaster.
01:04:43.900 Well, the implementation then of the system...
01:04:46.160 So, my question is, you know, it's like one of those things that, I'm not against it in theory, but I can't think of, I mean, national health system doesn't work, the Canadian healthcare system doesn't work.
01:04:55.820 I mean, is there a working national healthcare system?
01:04:58.300 Yes, I think that you need to have competition.
01:05:04.100 I think you need to have...
01:05:06.520 There's a hybrid that the United States could implement that is not NHS or the Canadian system, but it's not just a pure free market free for all.
01:05:18.040 It's a little bit in between, and let me describe what the in between parts are.
01:05:23.280 Medicare is an incredibly important insurance program in the United States.
01:05:27.360 I think it stands to reason that Medicare should have its own PBM, its own pharmacy benefit manager.
01:05:33.500 If Medicare was able to negotiate, you know, an extremely aggressive price for drugs, it sets the boundary for what is allowable for everybody else.
01:05:44.940 And for folks that are 65 and older, then now they have a very viable alternative to use that.
01:05:50.800 It also creates transparency around the variation in pricing, number one.
01:05:57.360 Number two, there needs to be a way where a private insurer can build up some credit for doing things today that may only pay off for that employee in the future when he's no longer an employee.
01:06:13.880 So, for example, you worked at Fox.
01:06:17.060 Should the Fox insurance program have put you on a statin?
01:06:20.160 I'm making this up.
01:06:21.180 Have put you on a statin in your early 40s to help you manage your...
01:06:27.200 I'm not saying you have rising cholesterol, but if you had that, because in 10 or 15 years from now, it would help prevent a potential cardiac arrest or heart attack.
01:06:38.000 A lot of the companies that are faced with this decision today say, we're not going to do that because you may not be an employee in 15 years, so why am I paying now for something where I get the benefit then?
01:06:49.320 But that's a simple healthcare economics market solution.
01:06:53.900 We should have those.
01:06:55.380 We should give private insurers incentives to, you know, in some cases, maybe the right thing to do is to put people on Ozempic and Mounjaro.
01:07:02.720 Do the work now.
01:07:05.060 I understand that that employee may be retired by the time that, you know, they may need that.
01:07:11.820 But it was the right thing to do for that person for having worked for you for 15 or 20 years.
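The incentive gap being described can be put in rough numbers. A minimal sketch, where every figure is hypothetical and chosen only to illustrate the mismatch, not taken from the conversation:

```python
# Rough illustration of the prevention-incentive gap described above.
# Every number here is hypothetical.
annual_drug_cost = 500        # yearly cost of a preventive statin (hypothetical)
years_on_plan = 5             # how much longer the employee stays on this plan (hypothetical)
acute_event_cost = 150_000    # cost of a heart attack 15 years out (hypothetical)
risk_reduction = 0.30         # absolute risk reduction from treating now (hypothetical)

insurer_spend = annual_drug_cost * years_on_plan      # this insurer's bill: 2,500
expected_savings = acute_event_cost * risk_reduction  # system-wide expected benefit: 45,000

# The insurer pays the 2,500 but, because the employee will likely be on someone
# else's plan in 15 years, captures none of the 45,000. That mismatch is why a
# transferable "credit" for prevention would change the decision.
```

Under any plausible numbers the shape is the same: the payer of the prevention and the beneficiary of the avoided event are different parties, which is exactly the credit mechanism being proposed.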
01:07:16.840 Another thing.
01:07:18.900 With AI today, you can read all of these insurance plans.
01:07:24.220 And you should have a standard way of knowing that a condition is going to get approved or not before it starts.
01:07:33.760 And that needs to be auditable.
01:07:36.180 It cannot be where you need like a PhD and five different people to read these insurance plans.
01:07:44.320 And then all of a sudden, a random person can make an economic decision to say no.
01:07:52.020 That's also where, you know, simple technology can get built to build very strict guardrails.
01:07:58.380 What is approved?
01:07:59.400 What is not approved?
01:08:00.500 Have it be known.
01:08:01.660 Have it be auditable.
01:08:03.260 Be in a log so that, you know, people can just simply escalate.
01:08:07.540 Hey, that person got his approved and mine was dinged.
01:08:09.880 Why?
01:08:11.000 So there's all these little things to like make the system better.
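The guardrails described here amount to a predictable rule lookup plus an append-only decision log. A minimal sketch, in which the plan rules, procedure names, and record fields are all hypothetical rather than any real insurer's schema:

```python
# A minimal sketch of the auditable approval flow described above.
# PLAN_RULES, the procedure names, and the Decision fields are all hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone

PLAN_RULES = {"knee_mri": True, "experimental_therapy": False}  # hypothetical plan rules

@dataclass(frozen=True)
class Decision:
    procedure: str
    approved: bool
    rule_applied: str
    timestamp: str

audit_log: list[Decision] = []  # append-only record anyone can escalate from

def decide(procedure: str) -> Decision:
    # The outcome is knowable before care starts: it is a pure lookup against
    # published rules, and every decision lands in the log with the rule
    # that produced it.
    approved = PLAN_RULES.get(procedure, False)
    decision = Decision(
        procedure=procedure,
        approved=approved,
        rule_applied=f"plan_rule:{procedure}",
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    audit_log.append(decision)
    return decision
```

Two patients asking about the same procedure get the same `rule_applied` entry in the log, which is exactly the "his was approved and mine was dinged, why?" escalation trail being asked for.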
01:08:13.740 It needs to be more open.
01:08:15.580 For example, take the case of openness.
01:08:20.320 It is highly unlikely that you've ever tried to get all of your healthcare data.
01:08:24.860 Impossible.
01:08:25.380 Now you would say, Chamath, why would I need that?
01:08:27.740 Maybe there's an AI agent that can actually be your doctor in your pocket that just works for Tucker.
01:08:33.580 You know, it's the person that is constantly reading my every interaction with the healthcare system
01:08:39.780 to tell me what they think.
01:08:41.880 It's my own second opinion.
01:08:44.040 That's not a bad idea.
01:08:45.100 That's a very low cost thing to do.
01:08:47.480 Except you can't get the data.
01:08:49.680 Why can't you get the data?
01:08:50.720 Because there are federal regulations and most people don't listen to them.
01:08:54.340 Federal regulation, TEFCA, says you have to be able to download your data.
01:08:59.440 These companies make it very hard because they know the more open it becomes,
01:09:03.640 it induces competition.
01:09:05.160 Competition that will show up will probably be infinitely better.
01:09:10.160 So I'm going to do everything I can to block it.
01:09:13.200 And, you know, the federal and state governments don't do enough.
01:09:15.700 In America, we do things a little differently and we always have.
01:09:19.180 When the British said, hey, we're going to tax your favorite morning beverage,
01:09:23.120 the revolutionary Sons of Liberty said no.
01:09:25.800 And they poured the entire shipment of tea into Boston Harbor and created a new country.
01:09:30.940 A country based on personal choice and freedom.
01:09:34.040 Well, 251 years later, it is time to throw something else overboard.
01:09:39.480 Your overpriced big wireless contract.
01:09:41.900 You don't need it.
01:09:43.280 Do you need to pay $100 a month just to get a free phone?
01:09:47.040 I don't think so.
01:09:49.240 The cell phone company we use, Pure Talk, says no to all of this.
01:09:53.240 Inflated prices, the BS, the contracts you can't understand.
01:09:57.420 And instead gives you the service that you actually need.
01:10:00.500 With Pure Talk, it's super straightforward.
01:10:01.840 You get unlimited talk, text, 25 gigs of data, mobile hotspot at a fraction of the price.
01:10:09.780 And with a qualifying plan of just $45 a month, they'll throw a free Samsung Galaxy your way.
01:10:15.780 So you get everything and you know exactly what it is.
01:10:18.840 It's not designed to deceive you.
01:10:21.300 We strongly recommend it.
01:10:22.940 Pure Talk provides the same coverage as the other guys.
01:10:26.360 It's just a lot more affordable and a lot more straightforward.
01:10:30.140 You can find it for yourself.
01:10:31.480 Visit puretalk.com slash Tucker.
01:10:34.100 Make the switch today.
01:10:35.100 Pure Talk, America's wireless company.
01:10:51.300 So you said AI can be used to make sense of, you know, like the Talmud of health insurance regulations.
01:11:00.380 What, looking down the next five years, you know, name three or four innovations that you're pretty certain we're going to see life enhancers from AI.
01:11:08.440 Okay.
01:11:14.940 I, so I, I start companies.
01:11:18.140 Yeah.
01:11:18.720 But when I don't see something that I think I can start right away and there's somebody that's a little bit in the lead, I'll just invest and I'll just take a large piece so that I can help guide them.
01:11:26.260 One of those businesses where I'm the largest shareholder has been working on breast cancer surgery.
01:11:34.380 And today across America, for every 10 women that go and get diagnosed with breast cancer, three of the surgeries leave cancer behind.
01:11:48.260 Ouch.
01:11:49.260 So, and the way that it works is you go in for what's called, so there's two different kinds of breast cancer surgery.
01:11:53.280 There's a lumpectomy, which is take out the lump or a mastectomy, which is take out the entire breast.
01:11:59.620 And in the lumpectomy, you visualize with your own eyes whether you think most of the cancer is gone, you close up the woman, and you take that sample and give it to a pathologist.
01:12:12.540 And typically between seven and 11 days, which is how long it takes because they're clogged up and there's backlog, they'll look under a microscope and visually inspect and say, actually, sorry, 30% of the time they say you left some cancer behind.
01:12:26.260 So, now that woman has to, those three women have to go back, they get another surgery, but again, 30% error rate, they do it again.
01:12:36.160 And then one of those women gets a ding.
01:12:37.980 So, now one woman has had three breast cancer surgeries.
01:12:40.500 Oh.
01:12:41.600 That is happening today.
01:12:42.680 Now, if you go to a really, really good teaching hospital, that error rate will be 5% because the docs are incredible.
01:12:52.920 If you go to an overly zealous doctor, that error rate will also be low, but you'll come in for a lumpectomy, you'll end up with a mastectomy kind of a thing.
01:13:01.800 They'll just take out so much of that and that creates a disfiguration.
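The re-excision arithmetic a few lines up can be checked with a quick sketch, assuming (hypothetically) that each lumpectomy fails independently at the same 30% rate:

```python
# Quick check of the compounding re-excision numbers described above,
# assuming each surgery independently leaves cancer behind 30% of the time.
positive_margin_rate = 0.30
diagnosed = 10

need_second_surgery = diagnosed * positive_margin_rate           # ~3 of the 10 women
need_third_surgery = need_second_surgery * positive_margin_rate  # ~0.9, i.e. about one woman

# If failures kept repeating at the same rate, expected surgeries per patient
# would follow the geometric series 1 + r + r^2 + ... = 1 / (1 - r):
expected_surgeries_per_patient = 1 / (1 - positive_margin_rate)  # about 1.43
```

This reproduces the exchange: three women go back for a second surgery, and roughly one of those ends up with a third.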
01:13:05.920 So, this company basically says, hold on a second, I'll just use AI.
01:13:12.300 I'll look right down to the granular microscopic level.
01:13:16.120 I'll take an extremely high-res picture.
01:13:19.140 My brain will be trained on only this one task.
01:13:22.600 Is there cancer or is there no cancer?
01:13:24.960 So, we built it and it's in a bunch of the leading hospitals in America.
01:13:29.360 But we had to file with the FDA to be allowed to tell the doctor.
01:13:38.240 Right?
01:13:38.460 So, I just want to be clear.
01:13:39.220 Like, we know.
01:13:41.480 We can see it.
01:13:43.800 And so, we ran an 18-month trial that, you know, cost us 10 or 15 million bucks.
01:13:50.300 And we met our endpoint in November.
01:13:54.220 However, we can now absolutely be sure that the cancer was removed or not removed.
01:14:01.760 Now, we have to package all of that up.
01:14:04.160 We've filed it with the FDA.
01:14:06.640 We've been told we'll get approval by June.
01:14:09.780 And then we'll be able to sell.
01:14:12.540 Meaning sell the software upgrade.
01:14:14.940 So, that essentially in the operating room, instead of having to wait for the pathologist 7 to 11 days later,
01:14:20.840 the doctor will do the lumpectomy, put it in the machine, and instantly you'll say,
01:14:26.460 Tucker, you need to take out a little bit more.
01:14:27.920 The margins are not right.
01:14:29.220 Tucker, perfect job.
01:14:30.100 Close her up.
01:14:32.640 That's using AI.
01:14:33.720 While the patient is sedated.
01:14:35.340 Yeah.
01:14:36.080 Wow.
01:14:36.400 That is using AI.
01:14:37.380 And that can get breast cancer surgeries to be so prolifically good that the error rate goes to zero.
01:14:44.700 The impact and the quality of life of those women, and then by extension, their families, their kids that have to deal with that stress as well, can go away.
01:14:58.180 So, that's a profound impact of AI that you're going to see in the next year.
01:15:02.180 We could quibble that this should have been faster, could have cost a lot less money.
01:15:06.420 Meaning, it cost me the same amount of money to get this trial done as it cost that Chinese company to build a digital brain that's as good as OpenAI or Facebook.
01:15:14.080 That's crazy.
01:15:15.300 But whatever.
01:15:16.240 Okay, put a pin in it.
01:15:17.380 We did it.
01:15:18.280 We'll file.
01:15:19.980 And we'll move forward.
01:15:21.540 You know, we play by the rules that are on the field, even if the rules make no sense.
01:15:24.940 And it's hard to tell anybody, change the rules.
01:15:26.680 But it is what it is.
01:15:28.660 That's an example.
01:15:29.480 Elon published some data yesterday, which is incredibly profound, which is he has an AI brain inside of the Tesla.
01:15:41.240 And the ability to drive safely on the highway became 7.5x better than his previous version.
01:15:50.000 And his previous version was already 10 times better than a normal car.
01:15:54.020 Which means, when you get on the highway, you engage FSD, it's called, his version of autopilot for the car.
01:16:04.160 And it was already 10 times better.
01:16:06.440 Now it's 7.5 times better on top of that 10 times at not getting into an accident.
01:16:12.060 So, the idea about using AI now to just eliminate all the unnecessary deaths that happen because of traffic mishaps, there's the potential where that goes to zero.
01:16:25.840 That's today as well.
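Taken at face value, the two multipliers compound. A quick sanity check, assuming "N times better" means N times fewer accidents per mile (an assumption on our part; Tesla's published figure is the reciprocal framing, miles between accidents, but the ratio works out the same):

```python
# Sanity check of the compounded safety claim described above.
human_rate = 1.0                    # normalized human-driver accident rate (assumption)
prev_fsd_rate = human_rate / 10     # "already 10 times better than a normal car"
new_fsd_rate = prev_fsd_rate / 7.5  # "7.5x better than his previous version"

# 10 * 7.5 = 75x fewer accidents than the human baseline, under these assumptions.
overall_improvement = human_rate / new_fsd_rate
```

So the claim, read literally, is roughly a 75x improvement over an unassisted driver.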
01:16:27.860 Again, his issue isn't technological.
01:16:30.660 His issue is going to be regulatory.
01:16:32.300 At what speed will people be able to be comfortable letting him take people from point A to point B?
01:16:44.660 In yet a different example, there are all of these small airplane companies.
01:16:50.900 They're called eVTOL.
01:16:52.020 So, it's not a plane, not a helicopter, but it's kind of this hybrid thing in between.
01:16:56.400 And they have this autopilot where it's like taking off in a no-pilot configuration, piloting it in the air, which is much simpler, by the way, than driving on the ground, and then landing.
01:17:11.200 And you have the ability to now just create a level of transportation which increases GDP.
01:17:16.500 And that is an AI brain that's calculating all the system variables around itself, being able to fly safely, getting from point A to point B.
01:17:26.080 You don't have to drive seven hours.
01:17:27.460 It's a 40-minute hop, skip, and a jump now.
01:17:29.820 So, now go and do your job and come back.
01:17:32.760 All of these things are happening right now.
01:17:35.000 We are gated on the regulatory machine, which is unfortunately not filled with enough actual technologists.
01:17:43.700 It's mostly bureaucrats. So it's teaching them, getting them comfortable with, you know, what do we intend with all of this?
01:17:51.280 We intend to reverse this sort of lost decade.
01:17:54.080 We just want to get to work and build a thing so that America kicks ass.
01:17:58.020 David Sachs, your co-host, is now the AI czar, also crypto.
01:18:01.920 Amazing.
01:18:02.780 Well, it is amazing, and I'll just say that.
01:18:04.420 He is an amazing human being.
01:18:06.700 No, he really is.
01:18:07.640 Like, he is a guy.
01:18:08.440 I agree.
01:18:09.760 I had to teach Sachs how to hug properly.
01:18:12.840 How did it, can you show me how you did that?
01:18:15.420 You know, Sachs was a side hugger, which I can't.
01:18:18.720 I can see that.
01:18:19.500 It's like, I'm not a side hugger.
01:18:20.900 Yeah.
01:18:21.240 You're going to give me a hug.
01:18:22.080 And we've been, you know, really, really close friends for 20 years now.
01:18:25.680 So, I was like, bro, learn to hug.
01:18:27.480 And we have all of these videos where, like, over the course of, like, four years, you
01:18:33.520 know, I had to take his arm and teach him how to.
01:18:35.380 You did video training.
01:18:36.520 Oh, yeah.
01:18:36.840 Like, I had to teach him how to put, no, with him and me, you know, his wife would take a
01:18:40.100 video.
01:18:41.260 And it's like, Sachs, you got to put your arm this way.
01:18:44.160 And then he's like, okay, now you have to, you know, put your head over here.
01:18:48.120 He is, he's a brilliant guy.
01:18:51.100 So, you taught Sachs how to hug.
01:18:53.520 Yeah.
01:18:53.900 I mean, he taught me everything else, but I taught him how to hug.
01:18:57.280 So, your show, All In.
01:19:00.160 I did it last year.
01:19:01.860 I did it because David asked me, and I love David.
01:19:05.620 I had no sense of its penetration.
01:19:07.940 I had no idea.
01:19:08.680 I'm not in business.
01:19:09.360 I'm not in technology.
01:19:11.060 You know, I drive a stick shift.
01:19:12.180 I'm kind of the opposite of that.
01:19:13.680 And I just didn't realize.
01:19:16.300 Everybody watches.
01:19:17.200 It's unbelievable.
01:19:18.040 It's unbelievable.
01:19:19.160 I heard from people I hadn't seen since early childhood.
01:19:23.040 I heard from everyone.
01:19:24.400 I was stunned by it.
01:19:26.240 So, what's that like for you?
01:19:28.300 You're obviously well-known in Silicon Valley, Facebook, and your investment business.
01:19:35.240 But, I mean, it's a much larger audience in Silicon Valley.
01:19:38.060 So, what has that been like?
01:19:40.220 I mean, I think it's made me much cooler for my older kids, my teenagers.
01:19:47.060 You know, they use these words, which I just can't stand.
01:19:50.540 But they're like, you know, if somebody comes up and says, hi, and we take a picture, they're
01:19:54.020 like, dad, that was some good rizz.
01:19:55.840 Or, you know, they'll be like, yeah, that was, okay, good aura, dad, good aura.
01:20:00.460 And so, we make fun of it.
01:20:01.900 Like, you know, I will turn to them, and I'll be like, in the car, I'll be like, guys, shut
01:20:05.660 up.
01:20:06.300 You know, they'll be making a lot of noise.
01:20:07.780 They're like, why?
01:20:09.140 And then I'll turn up, because I'm a very important podcaster.
01:20:12.300 Did you expect that when you, so like most of the people on the podcast, maybe all are,
01:20:18.840 you know, don't need to be doing podcasts.
01:20:20.120 You're not doing it for the money, obviously.
01:20:21.540 So, this starts as like, for fun?
01:20:24.820 Or why did you, how did you end up doing that?
01:20:26.680 It actually started because our poker game, me, Sachs, J. Cal, Freeberg, the four co-hosts,
01:20:34.260 we were four of the kind of like the regular everyday players.
01:20:38.160 When the COVID shelter-in-place happened, we weren't allowed to go out, and so we started
01:20:44.800 to Zoom each other, and then we just started to record, and then we just threw it up on
01:20:50.160 YouTube, because we would, you know, we would ask Freeberg to teach us about the disease
01:20:54.400 and the science of it all.
01:20:56.100 Then, you know, we would all take turns riffing on how frustrated we were at Newsom, and then
01:21:01.280 we would kind of post it, and then it just took on a life of its own.
01:21:04.680 And yeah, fast forward four years, I mean, we've done, you know, an episode a week.
01:21:12.640 I think it had a, we got very lucky because it happened in a moment.
01:21:16.160 I think you and Joe Rogan are two of the more leading examples of this, but
01:21:22.240 the traditional media is totally dead.
01:21:26.320 Yes.
01:21:26.740 There was a stat today, the Washington Post's traffic in the last four years has gone from
01:21:31.540 20 and a half million viewers to three.
01:21:34.980 Isn't that incredible?
01:21:36.260 That's right.
01:21:36.520 20 and a half million monthly users to three million.
01:21:41.840 So the question is, in that vacuum, it's not as if the American population shrank.
01:21:48.000 It's grown.
01:21:49.600 What is the sense-making?
01:21:51.600 Or news stopped happening.
01:21:53.280 Actually, much more is happening.
01:21:54.160 Or did news, yeah, exactly.
01:21:55.420 More news is happening.
01:21:56.340 So I think it's important to rebuild the sense-making.
01:22:00.260 And I think we accidentally found ourselves making sense of things that are happening to
01:22:06.600 people.
01:22:07.100 Like, explain this technology thing, explain this business thing, explain this political
01:22:13.140 thing.
01:22:13.540 Not necessarily so deeply, but connecting the dots so that you could have a slightly better
01:22:19.580 worldview.
01:22:20.840 And I think that that's been helpful for people.
01:22:23.240 It's been really fun for me.
01:22:24.580 Well, apparently, and then Trump comes on.
01:22:28.180 I mean, he is, he is really incredible.
01:22:33.080 You know, I didn't know what to expect.
01:22:37.240 And then David and I threw a fundraiser for him.
01:22:39.380 I mean, I think part of where all of this started was David called me in June.
01:22:43.140 And so, actually, taking a step back, he and I had always been active in politics.
01:22:49.080 He was a, you know, dyed-in-the-wool, tried-and-true conservative.
01:22:55.020 Yes.
01:22:55.300 From the beginning.
01:22:55.860 I was a little bit more promiscuous.
01:22:58.820 I didn't really understand how to make my political beliefs fit into one of these vessels.
01:23:05.580 But for the most part, in the last few years, like everybody else, I got lulled into buying
01:23:12.120 into a lot of the Democratic lines.
01:23:15.120 And I was a large donor to the Democrats.
01:23:16.900 And it all started to unravel for me in 2020 because I didn't understand COVID.
01:23:27.380 Then I got the, you know, vaccine, only to realize that it was, you know, not a vaccine.
01:23:34.640 And I was really upset.
01:23:36.040 A couple of my kids really struggled with what happened to them by not being able to
01:23:42.140 go to school.
01:23:43.880 And I was like, why hasn't the government intervened and gotten my kids back in school?
01:23:49.440 All of these things just made me super frustrated.
01:23:51.580 So, I started to question, maybe I wasn't seeing things right.
01:23:54.800 And maybe I was, you know, reacting more with my ego.
01:23:58.420 And the answer was, yeah, actually, like, you know, the Democrats are extremely charming.
01:24:03.320 And they can play a very sophisticated game.
01:24:06.180 Where they make you feel exclusive.
01:24:08.720 Oh, for sure.
01:24:09.060 And that's part of the shtick of how they get money from you.
01:24:11.520 I mean, I remember, and I would tell these stories as a point of pride.
01:24:15.060 Now, I tell it as a warning to myself.
01:24:17.620 I was sitting at dinner with Obama the day of Brexit.
01:24:21.800 And you know that very famous kind of moment where, like, you know, Andrew Card passes that
01:24:25.300 piece of paper to George Bush?
01:24:26.580 Yes.
01:24:26.880 There was a version of that moment, obviously much less important.
01:24:30.140 But where somebody.
01:24:31.300 Summer of 16.
01:24:32.600 Yeah, where somebody, like, passes a note to Obama and he goes, oh, my.
01:24:35.500 This was like, we were in San Francisco.
01:24:37.640 It was like at 8 or 9 p.m.
01:24:39.080 And he's like, oh, wow.
01:24:39.920 The UK just voted to leave.
01:24:42.920 So I remember where I was that day.
01:24:45.140 But those kinds of moments made me feel that maybe I was doing the right thing.
01:24:51.120 Right?
01:24:51.400 And then all of my businesses just kept running up against these brick walls over and over
01:24:58.360 and over again.
01:24:59.120 Every time I tried to do something good that I thought was valuable, it would bump up against
01:25:04.300 all of this stupid regulation and get slowed down.
01:25:09.040 And every time I looked at who they were, they were all supposedly on the same team that
01:25:13.580 I was in.
01:25:14.060 So I said, forget this.
01:25:16.600 I have to start from first principles.
01:25:18.580 So in 2023, Sachs and I decided we were going to start throwing fundraisers for a whole
01:25:27.380 variety of candidates.
01:25:29.120 So the first one was at his house.
01:25:30.840 We threw a fundraiser for RFK.
01:25:32.660 We got to meet Bobby.
01:25:34.340 It was exhilarating.
01:25:35.380 And, you know, for the first time, you had this person speaking truth to power.
01:25:41.620 Then at my house, we threw a fundraiser for Vivek.
01:25:44.740 David and I did that together.
01:25:46.620 And then David called me and said, let's do a fundraiser for Trump.
01:25:49.580 And I said, absolutely.
01:25:51.160 And I think that that started the...
01:25:52.840 Did you check with your wife first?
01:25:54.300 Yeah.
01:25:54.440 That's quite a statement.
01:25:55.460 She's very supportive.
01:25:57.060 Yeah.
01:25:57.220 You know, look, my wife, she's in the life sciences business.
01:26:02.540 Um, but, you know, she's always been like, you have to make these decisions, not from
01:26:11.540 how other people will perceive you, but what do you feel?
01:26:15.000 And she's like, explain to me, Bobby Kennedy, explain to me, Vivek, explain to me, Donald
01:26:20.500 Trump, in your own words, that isn't tied to, well, here's what other people will think
01:26:25.180 of me if I do this.
01:26:26.360 And then when I did that, she was like, okay, let's do it.
01:26:29.060 Let's do it.
01:26:29.600 And, I mean, Trump was unbelievably true.
01:26:33.240 But at the time, just for, because I sort of understand the cultural context.
01:26:37.480 No, it was very, it was a very tough moment.
01:26:39.260 Kind of ballsy.
01:26:40.220 It was very ballsy.
01:26:41.440 People, I got the amount of like hate text messages.
01:26:44.040 I should have kept some of them.
01:26:45.100 They're probably still on my phone, actually.
01:26:47.060 Um, but people were very mad.
01:26:50.760 And part of, I think, why they were mad is they were afraid.
01:26:53.580 Meaning, there was a lie that Biden was sharp as a tack, and myself and David had not really
01:27:02.780 dismantled the lie beyond just saying it looks like a lie.
01:27:07.340 And I think that they were worried that it would create, I think what Peter Thiel calls
01:27:11.160 this, which I really agree with, is a preference cascade.
01:27:13.600 Yes, that's right.
01:27:14.400 You know, like Peter is a genius, but he is on an island many years ahead of the rest
01:27:18.760 of us.
01:27:20.140 Then there are other people, you know, like me and David to some degree, who are also
01:27:24.440 sort of like, we can kind of see the patterns not nearly as fast as a Peter.
01:27:29.120 But then we do a decent job of translating it for other folks.
01:27:32.120 So, I think people were afraid, if these guys translate their interactions with Donald Trump
01:27:38.280 into the truth, it's going to tip a lot of people.
01:27:41.800 And they were right.
01:27:43.180 Because when Trump came, wow, I mean, like.
01:27:47.960 Had you met him before, by the way?
01:27:50.140 He had called me.
01:27:51.520 So, I'd had a telephone conversation.
01:27:53.280 So, I knew what he was like on the phone.
01:27:55.960 That's when I knew I'd made a mistake before about like, believing what the press was saying.
01:28:00.740 That's where I went back and I looked at the Charlottesville press conference.
01:28:04.720 And I looked at all of them again after my phone call.
01:28:08.800 When the president called me, I was like, hold on a second.
01:28:12.800 This man was incredibly charming, polite, kind.
01:28:20.420 He just like, what I honestly thought, Tucker, I called my wife.
01:28:23.720 I was like, he was raised by really good parents.
01:28:26.400 That's what I said to them.
01:28:27.800 That's what I said to my wife.
01:28:28.760 Because you can tell in your kids.
You know, like when you see kids around, and there are some, they are polite, they're kind, they make eye contact.
01:28:38.020 There's these, I don't want to call them simple, but there are these building blocks of being a human being that you need to be taught by your parents.
01:28:48.340 He was taught.
01:28:49.800 And I have tremendous respect for that.
01:28:51.720 So then when I saw him at David's house, I mean, it's like, it's pretty incredible.
01:28:58.880 Like, it's like, it's a larger than life figure.
01:29:02.880 And he's extemporaneous.
01:29:04.440 He talks for an hour and a half.
01:29:05.880 He's going all over the place.
01:29:07.100 He's doing the weave.
01:29:08.900 You know, it's incredible.
01:29:11.860 And so I was-
01:29:12.860 And he's hilarious too.
01:29:13.720 He's very funny.
01:29:14.660 Yes.
01:29:15.040 Which is really hard to be, actually.
01:29:16.840 Yes.
01:29:17.140 And so I walked away thinking, wow, I had got it totally wrong.
01:29:23.480 And I was lied to.
01:29:25.060 And I believed at a very superficial level what the mainstream media was saying.
01:29:31.060 And then I did even more research.
01:29:33.140 You know, I read the lawsuits.
01:29:34.360 And it was just the amount of contortion that people were going through to try to prevent this man from getting into power made me want him to be in power even more.
01:29:46.780 Yes, right.
01:29:47.320 Because I thought, they're afraid of something.
01:29:50.400 That's something they're not going to say out loud.
But that is the thing that we need to excise from the U.S. government.
01:29:55.820 Yes.
01:29:56.280 That thing that they want to protect.
01:29:58.020 It's so nicely put.
01:29:58.540 That's right.
01:29:59.480 It's not even now clear exactly.
01:30:00.800 It's disclosure of some kind.
01:30:02.840 They fear being, you know, revealed.
01:30:06.120 But, yeah, that's as far as I've gotten.
And then, you know, different from his first time around, the caliber of the people around him, it's like the '92 Barcelona, you know, Dream Team.
01:30:18.540 As far as I can tell, I mean, like, my gosh.
01:30:20.980 Like, you get Elon.
01:30:22.780 You get Vivek.
01:30:23.560 You get RFK.
01:30:24.340 You get Tulsi.
01:30:25.480 You get all of these people.
01:30:27.140 You know, Howard Lutnick.
01:30:28.480 David Sachs.
01:30:29.160 David Sachs.
01:30:30.320 Scott Bessent.
01:30:31.080 This is amazing.
01:30:32.860 I want to tell you about an amazing documentary series from our friend Sean Stone called All the President's Men, the Conspiracy Against Trump.
01:30:41.200 It is a series of interviews with people at the very heart of the first Trump term, many of whom are close to the heart of the second Trump term.
01:30:50.460 This is their stories about what Permanent Washington tried to do to them, in many cases send them to prison, for the crime of supporting Donald Trump.
01:30:58.600 Their words have never been more relevant than they are now.
01:31:01.440 Steve Bannon, Kash Patel.
01:31:02.840 I'm in there even.
All the President's Men, the Conspiracy Against Trump, and you will find it only on TCN, TuckerCarlson.com.
01:31:10.180 Highly recommend it.
01:31:10.920 So what is the view?
01:31:29.080 I remember vividly the day that you all had that fundraiser for Trump and thinking, I never thought I would live to see this.
01:31:37.060 This is like a major change, major, major change.
01:31:39.680 Was there a major change in attitudes in Silicon Valley toward Trump after that?
01:31:43.500 After that, yeah, because it gave permission.
01:31:45.360 David is very complimentary to me in that because I was a little bit more promiscuous and not necessarily just a straight hardline Republican, it was a little bit more valuable, I think, for that cohort of people because they could kind of say, oh, well, if it's, you know, Chamath said it, oh, then yeah, sure.
01:32:09.260 But I think that he probably took a lot of heat as well.
01:32:14.300 And more than that, what he did was he then said, you know, I'm really going to put my foot on the gas here.
01:32:19.900 And he's great at this, which is just like, I'm going to tip the preference cascade.
01:32:24.420 So then he went to the Republican Convention.
01:32:26.320 He spoke.
01:32:27.260 He, you know, he did a lot.
01:32:29.460 And his wife did a lot.
01:32:31.200 That's a real dynamic power couple, those two.
01:32:33.780 And then Marc Andreessen came out.
01:32:35.680 And then Andreessen came out.
01:32:39.260 And it was funny, like the Andreessen thing was so a little bit interesting.
01:32:43.880 I don't think the full detail.
01:32:45.200 So I'm just speculating.
01:32:46.160 But, you know, it did turn out that his partner also gave like 50 or 100 million bucks to Kamala, which I suspect then that Marc probably gave 50 or 100 million to Trump.
01:32:56.020 It looked like they were splitting it.
01:32:57.100 Yeah.
01:32:59.200 But.
01:33:00.260 So what was it like, though, living there, living in, you know, among the people you work with and have worked with?
01:33:07.760 Like, how did people.
01:33:08.980 Oh, there was like, there was an initial part where people do this, you know?
01:33:12.160 Yeah.
01:33:12.940 It's like the.
01:33:14.580 And I felt that because I'm very sensitive to that.
01:33:16.840 Yes.
01:33:17.300 You know, and my feelings get hurt.
01:33:20.520 Yes.
01:33:20.820 And I get super annoyed by it.
01:33:23.160 You know, I think you call it the private equity wives for me.
01:33:25.440 It was like the private equity husbands as well.
01:33:27.380 You know, they all.
01:33:28.280 That's really shameful.
01:33:29.420 They all just like kind of like a scowl.
01:33:32.180 Well, and then after the election, it was more like a, and I'm like, what changed?
01:33:38.180 Well, what that's, I've felt that across the country.
01:33:41.020 What is, how would you describe that?
01:33:42.100 I think a lot of people.
01:33:43.000 People are not necessarily fighting a moral or ideological battle, Tucker.
01:33:52.640 I think people are just trying to get from one day to the next day.
01:33:55.680 For sure.
01:33:56.840 And, and I respect that.
01:34:00.480 At first I was hurt by it because I didn't understand it.
01:34:03.120 And I thought, again, it goes back to, oh, I'm worthless.
01:34:05.840 Like, is there something about me?
01:34:07.000 Is that why you're rejecting me?
01:34:08.100 And then I realized it has nothing to do with me.
01:34:10.080 This person is just, today is a day, tomorrow is a day.
01:34:13.900 And they're not fighting these battles.
01:34:17.240 Doesn't make them better nor worse.
01:34:19.080 It's just a different.
01:34:20.940 For me, I'm caught up in my own head about having to do something a little bit more ideological
01:34:30.140 and morally rooted because I need that so that I feel like there's purpose.
01:34:35.180 Of course.
01:34:35.660 Not everybody's built that way.
01:34:36.700 And I wasn't built that way.
01:34:38.800 Well, people are distracted by their day-to-day concerns also.
01:34:41.940 And I think that that's very fair.
01:34:43.300 So, you know, I kind of leave it to them as what they are.
01:34:45.760 It's, I think it's good that their minds are open and their eyes are open.
01:34:49.040 So, the, the way that this is going to be really tested now is what's happening in California.
01:34:55.080 There are, if you look at the Palisades, which is an incredibly affluent area.
01:34:59.700 Yeah, beautiful place.
01:35:00.720 Beautiful place.
01:35:01.940 And has just been decimated.
01:35:03.720 And several of my very good friends lost their homes there.
01:35:06.700 They were tried and true Democrats.
01:35:11.440 It didn't matter what the candidate's name was beside the box in the ballot.
01:35:17.220 They checked it and they moved on.
01:35:19.480 And now the real test for them is what do they do?
01:35:22.500 Will they look through the label and actually look at the policies and ask, did these help or hurt me?
01:35:30.040 All of these things that I give money to, all these NGOs and nonprofits, did they help or hurt me?
01:35:38.720 California has a $322 billion a year budget.
01:35:43.140 Do I, as a taxpayer who pays some of the highest taxes in the nation, have a right to ask where that money is being spent?
01:35:50.980 And if the answer to all those questions is, it doesn't matter, check the box, then we deserve what we get in California.
01:36:00.120 But it can be much better.
01:36:02.920 Because today the state is totally broken.
01:36:05.720 We have lost the script.
01:36:07.560 We don't prioritize California's economic supremacy, nor its technological supremacy.
01:36:17.400 That is a huge mistake when you look at the number of companies that have left the state.
01:36:22.300 Oracle, Tesla, Chevron.
01:36:25.020 I mean, this is not like just fringe businesses.
01:36:28.140 These are the businesses that matter.
01:36:29.660 And they're voting with their feet.
01:36:33.660 I mean, it does feel like the tipping point is here, though.
01:36:36.980 I mean, you said that we're—
01:36:38.620 I would not underestimate the democratic machine.
01:36:42.140 It's a cartel that runs that state.
01:36:43.760 Oh, I know.
01:36:44.360 And Gavin Newsom is extremely charismatic.
01:36:48.600 And he is able to convince people of things that are just not true.
01:36:54.360 And so, he is fighting for his political life.
01:37:02.060 He definitely wants to be on the national stage.
01:37:06.200 And so, the real question is, is he able to convince enough people that this was the environment and that these winds came out of nowhere?
01:37:14.780 Or will it be laid bare at his feet that it was just a lack of skill, intellectual rigor, and distraction, and negligence, and incompetence that caused this fire?
01:37:30.020 I mean, I think you make a pretty strong case for the latter.
01:37:32.200 You know, I found that there were three bills that started in the legislature that were either approved by the legislature and were then vetoed by Newsom or were then pulled down by the legislature itself.
Again, this is like Democrat-on-Democrat violence.
01:37:50.620 To just give a waiver to all of these local municipalities to go and clear the brush.
01:37:57.720 Just that.
01:37:58.540 Clear the brush.
01:38:00.060 Now, would that have stopped the fire?
01:38:01.660 No, I get that it wouldn't have stopped the fire.
01:38:03.420 But, you know, the intensity of a fire is directly proportional to the energy load that you give it.
01:38:08.720 To the fuel, of course.
01:38:09.820 So, take away some of the fuel and the fire will be less.
01:38:12.940 And if we can't even admit that plain truth, this is what I mean by if we're going to continue to just lie to ourselves because we care so much about this failed ideology, then California is going to just continue to just degrade.
01:38:24.820 Right.
01:38:24.900 But there's a point, I mean, you see the same thing in New York City where if the, you know, the engines of the economy leave because it is, you know, a country with 50 states, you know, there's a point where the math doesn't work and things just decline so quickly that it's hard to recover.
01:38:41.380 The thing, it is true, but the thing that is important to keep in mind about California is sometimes you get lucky.
01:38:49.980 And we have to acknowledge our luck but not fritter it away.
01:38:53.800 Specifically in California, there is a critical mass of these 50,000 Michael Jordan engineers.
01:39:00.380 Yes.
01:39:00.680 You cannot rebuild Silicon Valley by casting it to the four winds.
Some people go to Peoria, Illinois.
01:39:08.700 Some people end up in Boston.
01:39:10.020 Some are in Miami.
01:39:11.360 Some are in Las Vegas.
01:39:12.860 Some are in Austin.
01:39:14.240 All fine and good.
01:39:15.740 But that is not the place or the way to create the technical vibrancy we need to generate technical supremacy.
01:39:23.000 You need these people close and around you so that they're cross-pollinating from each other.
01:39:27.980 And this is, again, where why hasn't the state realized that?
01:39:31.660 That is an incredibly critical resource.
01:39:35.240 Well, at this point with, you know, the decline of ag and the entertainment business and aerospace, I don't really know what drives the economy of California other than technology.
01:39:46.540 Well, right now, if you look at California's employment, the employment picture is quite scary because it is government jobs that are convoluting how healthy the actual state is.
01:39:56.920 Right.
01:39:57.320 So if you take away the government jobs, but at the end of the day, what do government bureaucrats do?
01:40:03.040 They want to govern.
01:40:04.680 How do they do that?
01:40:05.740 They're going to legislate or they're going to regulate.
01:40:09.020 Where is that felt?
01:40:10.340 It's felt on private industry and private citizens.
01:40:12.620 But eventually, at the limit, the amount of regulation goes to infinity and they're legislating themselves into oblivion.
01:40:19.960 Nobody will be there to legislate, to be governed by all of this insanity because everybody will leave.
01:40:27.080 And first, the people that leave are the people that can leave.
01:40:31.260 Then the people that leave are the people that have to leave.
01:40:34.500 And we're in the first part.
01:40:36.720 It's clear the people that-
01:40:37.660 Do you know people who've left?
01:40:38.640 Many, many, many, many.
01:40:39.900 Have you thought about it?
01:40:41.160 Yeah.
01:40:41.560 I mean, look, my, again, I'll tell you in terms of my poker game, half of that, half of those people that have been my lifelong friends for 20 years have left.
01:40:49.680 Half.
01:40:50.680 Where'd they go?
01:40:51.400 Austin, Miami.
01:40:55.180 Texas and Florida, yeah.
01:40:56.180 Yeah, Texas and Florida.
01:40:58.720 And it's just so frustrating.
01:41:01.200 And they've left because they can't build things.
01:41:05.060 The taxes were like secondary.
01:41:07.500 One of my friends left because, you know, one of their children went through, you know, a bit of an identity crisis, if you want to call it that.
01:41:16.340 Yes.
01:41:16.580 And the police showed up and wanted to, you know, take the kid away, you know, so that the kid could go through like a transition that the kid ultimately decided they didn't want to go through.
01:41:30.160 The police showed up?
01:41:31.260 Yeah.
01:41:32.340 You know, that's the law in California.
01:41:33.660 So, like, encourage it.
01:41:35.220 I'm not sure if it's to encourage it, but my understanding of the way this works is if you, if you as a child have these issues and you escalate that, there is a requirement for the school to basically call Child Protective Services, who may or may not call the police, who may or may not come to your house and try to take the kid away.
01:41:55.200 You know, so.
01:41:55.680 So, they're not your children in California is basically what they're saying.
01:41:58.940 Well, they're your children to the point where the government believes that they know better.
01:42:02.240 Man, that's just, I mean, any government that presumes to know better than parents is an out-of-control, scary government.
01:42:10.260 I mean, look, I, I try to think of myself as a good parent.
01:42:13.360 I try to do the best I can.
01:42:14.520 Yeah.
01:42:14.880 Do I make mistakes?
01:42:15.760 Yeah, sure.
01:42:16.820 But am I generally more, you know, better for my kids than some other random adult?
01:42:22.920 Also, yeah.
01:42:24.280 Yeah, strongly.
01:42:25.020 Right, and so, like, the idea that just some random person in an office somewhere can read some piece of paper and all of a sudden take your kids away, that's very scary.
01:42:34.820 It's very scary.
01:42:35.780 It makes me feel very insecure, that idea.
01:42:37.540 What's keeping you there?
01:42:41.820 I'm very stubborn.
01:42:42.940 And I feel very grateful.
01:42:47.380 I feel very grateful to America.
01:42:49.420 I feel very grateful to California.
01:42:52.920 It gave me a path that I would not have had anywhere else.
01:42:59.540 And so, I have a responsibility to stay and fix it.
01:43:04.900 Good for you.
01:43:06.480 Yeah.
01:43:06.780 I feel guilty that I left.
01:43:07.880 Because it's the prettiest state.
01:43:11.040 I mean, nothing comes close.
01:43:12.480 So, to get driven out of where you grew up and where your ancestors lived is pretty, it's bitter, but I did it.
01:43:19.540 So, does it change?
01:43:20.860 Does the state change?
01:43:21.640 It's a one-party state.
01:43:22.800 Clearly, it's enormously corrupt, as you know, and wasteful.
01:43:27.280 And now it's fallen down on its most basic obligation, which is to keep your house from burning down.
01:43:32.120 So, does that force change politically?
01:43:37.880 I think people need to force the change now.
01:43:41.700 It has to force change.
01:43:43.920 People have to understand that these labels mean nothing.
01:43:47.020 Meaning, you could have the smartest person in the world and the dumbest person in the world.
01:43:51.860 You cannot vote for the dumbest person in the world just because the label beside them is something you've been told is wrong.
01:43:57.860 Exactly.
01:43:58.580 That is just the height of stupidity.
01:44:00.800 It's not what you're allowed to do as an adult.
01:44:02.880 Right.
01:44:03.960 Adults aren't allowed to do that.
01:44:05.340 Your kids can do that.
01:44:06.640 Right.
01:44:06.960 And you're supposed to teach your kids that that's not how you make decisions when you're an adult.
01:44:11.640 On the basis of labels, brands.
01:44:13.240 It's so stupid.
01:44:14.740 It's childish.
01:44:16.020 There are some really competent people that are in that state that will try now to come out of the woodwork to do the right thing, to deregulate California.
01:44:28.760 California has 60,000 regulations on the books.
It was 10,000, less than a decade ago.
01:44:35.300 Go back to 10,000.
01:44:36.700 Go to 5,000.
01:44:38.480 What are we afraid of?
01:44:39.700 Are we afraid of the breast cancer thing that could get to market faster?
01:44:44.440 Are you afraid that Elon's autopilot can now save people's lives more?
01:44:48.700 Are you afraid that we can catch rockets and then send them back to the stars and the heavens and Mars?
01:44:57.580 Why is that bad?
01:44:59.320 Well, the point of regulation is to encourage, guarantee, health and safety.
01:45:06.580 And growing up in California, it was a healthy state.
01:45:09.900 It was the healthiest state, as far as I could tell.
01:45:11.980 And it was a very safe state.
01:45:14.380 And it's neither one of those things.
01:45:15.540 It's one of the least healthy states.
01:45:16.960 Actually, there's more poverty in California than any state.
01:45:19.220 Autism rates are the highest in the country.
01:45:21.660 And it's dangerous in a lot of ways.
01:45:23.820 So, like, it's not working.
01:45:25.120 I guess that's what I would say.
01:45:26.000 The regulations aren't working.
01:45:26.960 It's not working.
01:45:27.580 So, how long do, you know, well-heeled Democratic voters need to see their state run over, need to see their lives ruined, and now their kids' lives ruined?
01:45:37.820 And maybe this is the thing that actually—
01:45:40.160 Well, you tell me you live among them.
01:45:41.500 Like, do you see the change coming?
01:45:42.660 Um, so, not where I live, because the only thing that—the only damage that has ever happened in Silicon Valley is economic damage, but then it's righted itself.
01:45:57.460 Right.
01:45:58.360 And so, people just go and move along blindly in the orthodoxy.
01:46:03.820 It's only when your business is at risk where you'll flip.
So, you know, you've seen kind of like Meta have a total, you know, come to Jesus moment.
01:46:11.740 Yeah, what's that?
01:46:13.040 Have you talked to Zuckerberg about it?
No, Mark and I haven't.
01:46:16.200 You know, I gave this speech at Stanford in 2016, which went pretty viral, which kind of laid bare what was going on in social media, and he and I have not spoken since.
01:46:27.280 Wow.
01:46:28.140 Yeah.
01:46:28.400 Well, knowing him as you do and for as long as you have, what do you make of his Joe Rogan appearance?
01:46:34.760 I think it's a calculation to manage the conditions on the field.
01:46:41.000 Meaning, what I said on the pod is all you need to know are two things.
01:46:48.400 One is there's an incredible picture in Donald Trump's book, which is a picture of Zuck sitting in the oval.
01:46:57.200 And, by the way, this is the same book which you have to get.
01:47:01.120 But it's the book where, like, you know, he talks about, you know, Trudeau's mom kind of like casting about with the Rolling Stones and like, you know.
01:47:08.840 Not just casting about, rolling about.
01:47:10.620 Yeah, rolling about.
01:47:11.500 And like, you know, he's like Fidel Castro's love child.
01:47:13.960 Anyways, but in that book is a picture of Zuck, and it says, you know, Zuck was the nicest guy to my face, but then would, you know, work against me to turn over the election.
01:47:21.460 And, you know, I've made it very clear that if he tries to do anything like that again, he'll go to prison for the rest of his life.
01:47:28.380 That's what Trump says in the caption.
And then he was asked about that last week, the exact same day that Meta changed their policies.
01:47:37.920 And he said, do you think it was in response to what you said?
01:47:40.860 And the president said, probably.
01:47:43.200 So, I think it's a very smart but necessary set of calculations, but I think they are calculations.
01:47:49.360 So, what are his highest values?
01:47:51.500 What are his first principles?
01:47:53.500 Zuck?
01:47:53.940 Yep.
01:47:54.920 I mean, I'd only be guessing.
01:47:57.400 But an informed guess since you know him.
01:47:59.620 You know, I think that he's a very big fan of the Roman Empire.
If I had to translate and guess how that manifests in his day-to-day decision-making, I think he thinks of things imperially, meaning, like, as an empire.
01:48:19.000 And he has an empire.
01:48:20.860 You know, he has Facebook, WhatsApp, Instagram, you know, Messenger.
01:48:26.680 These are institutional worldwide products.
01:48:31.300 In the digital sphere, he is Rome.
01:48:35.700 And so, I think that he cares about the propagation of that empire probably more than he cares about any philosophical trends, per se.
01:48:48.860 Because, you know, the empire endures if you can, you know, manage the vicissitudes of trends.
01:48:56.280 That's right.
01:48:56.620 So, when the trend was under the Democrats to build a censorship machine, you know, you do that.
01:49:03.560 And then when the trend is to do the opposite, you'll do the opposite.
01:49:07.380 But he said to me a number of times over a number of years, you know, off camera, but has said, you know, basically I'm a kind of 70s liberal and I really believe in free speech.
01:49:20.020 You think that's true?
01:49:22.280 Sure.
01:49:22.560 I think that that's not the question.
01:49:26.580 Apparently not.
01:49:28.400 He has, you know, he controls Facebook with an iron fist.
01:49:32.180 He has complete voting control.
01:49:34.400 He has less economic participation, but, you know, he has absolute power.
01:49:41.860 And that was the justification in some ways of explaining how he made these changes now.
01:49:47.600 It doesn't explain why it veered in the other direction then.
01:49:52.560 And I think it's important to just probably get the answer to that and then you'll know where he stands.
01:49:57.640 You know, meaning the economic power has ebbed and flowed, but the absolute power has never changed.
01:50:02.120 So, you know, the philosophy that he believed in then, if it's the same now, then the question is, well, why did the manifestation of that power change?
01:50:11.220 Well, he's basically said, you know, he's bullied by the national security state.
01:50:14.640 I can believe that too, by the way.
01:50:15.860 I'm sympathetic to that.
01:50:16.680 I mean, that's got to feel like a lot of pressure when you're a young guy on the come up and a bunch of these well-heeled politicos show up at your office and say, you know, bend the knee.
01:50:25.900 Why don't you go fuck yourself?
01:50:27.100 That's what I would say.
01:50:28.380 Maybe that's why I'm not a billionaire.
01:50:30.580 But it's like unbelievable that happens in this country.
01:50:34.420 It's like shocking.
01:50:35.140 There are billionaires that have said, go fuck yourself.
01:50:37.340 Yeah, I know.
01:50:37.760 And they've become, you know, even more successful as a result.
I mean, the Richard Bransons of the world did that.
01:50:44.360 So what is his role in Elon's role in the new administration, do you think?
01:50:48.740 I think he's the heartbeat.
01:50:50.740 Here's what I mean by that.
01:50:52.300 I've thought about this.
01:50:54.140 It is the most important temperature check that we are going to make the changes that Donald Trump wants.
01:51:02.060 I think the president, in some ways, is the most powerful job, but I think this second term, he's more of a vessel in the sense that he's got all of these great field generals who can now run the place.
01:51:18.180 And of all the field generals, the one that has just an incremental more degree of freedom to also communicate openly and continuously with the public is him.
01:51:28.500 So you'll get a sense of whether there's an arrhythmia by just monitoring his X feed.
01:51:33.940 You'll also get a sense of whether there's like a, you know, like if the drumbeat is building, I think you get a sense of that from Elon as well.
01:51:42.820 I think he is the, he's the heartbeat.
01:51:46.000 So do you think that Elon's X feed is a kind of pretty accurate window into what he's thinking?
01:51:50.800 Yeah, and that's what's so powerful.
01:51:52.420 And this is why I think, you know, when you compare and contrast Zuck with Elon, it's, I think Zuck is like every other CEO.
01:52:02.180 And Elon is just a complete singular outlier in the sense that there is no book that would have told you just tell the truth grounded in your morality from day one and just burn the boats.
01:52:18.780 You know, that old Cortez line, right?
01:52:20.280 Of course.
01:52:20.720 And people would have said, what?
01:52:24.440 And so whatever he's felt he's shared in complete truth and candor, no other CEO has ever done that.
And probably no one ever will because you can't do it as a strategy and you can't implement it on day 17 or day 1004.
01:52:41.840 It's you either are or you're not.
01:52:44.060 And so he sets a way of behavior that future CEOs should emulate.
01:52:51.780 But we also have to a little bit give a break and cut some slack to everybody else because they're just never going to do as good a job as him in doing that.
How is he seen in Silicon Valley?
01:53:02.040 Oh, he's a star.
01:53:03.700 But he's a kind guy.
01:53:05.180 Like, I mean, you know him as well.
01:53:06.380 But like, I think that people should know this.
01:53:10.920 Like, this is a kind.
01:53:13.180 He's a kind guy.
01:53:14.300 He's like a beautiful person.
01:53:16.540 He's a kind guy.
01:53:18.320 I don't know what to say.
01:53:18.920 Has any American ever had this much power?
01:53:23.100 Gosh.
01:53:31.340 Meaning like the intersection of private and public industry kind of thing?
01:53:35.360 Yeah, I just I can't think of I mean, of course, the president has the power to launch nuclear weapons.
01:53:39.160 So that that trumps all power.
01:53:40.980 But but as a private citizen, non a non president, I can't think of any time in 250 years where an American has had as much power as Elon Musk has in the sense that, you know, he's the most successful businessman by many measures.
01:53:55.100 He runs the most powerful media outlet in the world, which is X.
01:54:00.820 And he has, you know, this mandate from the newly elected president of the United States to change the government.
01:54:06.240 I mean, nothing like that has ever happened that I know of.
01:54:09.160 Yeah.
01:54:11.780 It's like, yeah, I mean, I guess so.
01:54:14.800 You know, I guess it hasn't happened.
01:54:16.420 Why did he buy X?
01:54:21.740 Because as a business decision, it's kind of hard to justify.
01:54:24.940 Well, there's like it happened in a moment where I think that there was the intersection of a lot of free speech issues.
01:54:32.660 And I just think like there was like some, you know, stuff I'm not going to get into, but like personal stuff, I think probably in his own life.
01:54:45.280 And it's just like made him question, like, what are these philosophies?
01:54:49.340 What are these ideologies?
01:54:50.480 And how destructive are they if we can't question things and talk out loud?
01:54:57.380 And so we all got very lucky.
01:55:00.860 He paid the personal.
01:55:01.840 This is what I mean.
01:55:02.560 But like, like everybody can be like, wow, it's so easy.
01:55:05.560 It must be amazing.
01:55:06.260 It's not so, that, that's like a, I would crumble under the weight of that.
01:55:10.340 I don't think I'm capable of that.
01:55:11.820 I know that about myself.
01:55:12.480 It doesn't look easy at all.
01:55:13.500 My God.
01:55:14.100 And like the price you pay, the personal price you pay.
01:55:19.340 I couldn't do it.
01:55:20.180 I think it's a, he's a unique person that way.
01:55:24.180 His capacity for pain, his capacity for just the sheer drudgery, like just, it's incredible.
01:55:35.420 I've never seen anything.
01:55:36.460 I've never seen anything like it.
01:55:37.960 I mean, I guess in this context, you know, I've been around two of these incredible figures.
01:55:43.500 Mark, I worked in the trenches with.
01:55:45.080 Elon, I've, you know, seen now for 15 plus years.
01:55:48.120 And I, and when you think about those two, um, yeah, with Elon, there's just an incredible intellectual curiosity, an incredible amount of ability to suffer pain and suffering.
01:56:03.920 And yet he's incredibly kind.
01:56:05.940 I don't know how, how, how.
01:56:08.240 He also seems to, I don't understand how he does everything.
01:56:13.120 How do you tweet that much, run that many companies, also have that many children, also, you know, he's a gamer, I guess.
01:56:23.060 Like what, like, how does that work?
01:56:25.500 Yeah, I agree.
01:56:26.100 Like, I mean.
01:56:26.280 As a scheduling matter.
01:56:27.340 I know, exactly.
01:56:28.260 I, look, I got here last night, ate at the pink elephant, went to bed and I felt guilty.
01:56:35.760 I thought, you know, shouldn't I be doing more yesterday?
01:56:40.660 I really thought that.
01:56:42.380 You know, and I saw my email kind of piling up and I, and I used you as an excuse.
01:56:48.620 Oh, I have to be fresh for Tucker.
01:56:49.780 Good night.
01:56:51.080 That's what I did.
01:56:53.680 So you're right.
01:56:54.460 But, and then I think, okay, I'm operating across an investment portfolio and one company that I run very intensely.
01:57:05.160 How do you do that multiply by seven and all of these other, I don't know how you do it.
01:57:09.360 I really don't know, Tucker.
01:57:10.220 I wish that there was an answer, but I also think it's not the right thing to answer in the sense that we're all going to give some glib answer.
01:57:18.940 Right.
01:57:19.020 And everybody's going to try to run and copy it.
01:57:20.760 And I think what you forget is he is the product of 20 years of preparation.
01:57:30.400 He started with one company.
01:57:32.660 Then he started two companies.
01:57:34.440 Do you know what I mean?
01:57:35.080 I do.
01:57:36.040 So these are reps upon reps upon reps upon reps over decades.
01:57:44.560 And I think it's important to keep that in mind.
01:57:46.420 Like that is a level of skill.
01:57:49.380 Like he is demonstrating human peak level performance.
01:57:53.480 It's easier to observe it in maybe an athlete or something else, but that's what he's demonstrating.
01:57:59.360 He is at the peak of human intellect.
01:58:03.660 That is the brain unencumbered by all the other stuff that maybe a lot of us get caught up in.
01:58:09.600 Well, that's the other thing is he's given up possessions effectively, you know.
01:58:13.280 Great idea.
01:58:14.760 Yeah, it is.
01:58:15.500 I mean, it's not, I've never.
01:58:17.020 He tweeted out like, like there's all these things I've never, I never tell him these things,
01:58:20.560 but like there are things that he's, that's why I follow him because part of it is like,
01:58:24.540 I get things from him in Twitter that really profoundly affect my life.
01:58:30.100 So when I was going through all of that turbulence in 2022, you know, he tweeted out, I think it was in 21.
01:58:36.200 I can't remember when it was, but he's like, I've just sold all my homes and possessions.
01:58:39.540 And I went back, immediately I thought of, that is the one thing that I was taught as a Buddhist when I was raised Buddhist, you know.
01:58:48.540 And despite all the stuff and all the anger that I had, and I never thought that Buddhism was all that effective or useful for me.
01:58:55.460 I took one lesson, detach yourself from the physical world.
01:58:59.460 Yes.
01:58:59.800 And I saw it and I connected that dot to myself and I said, I am the opposite of that.
01:59:06.580 It's totally true.
01:59:08.080 I am totally, totally, totally attached to the physical world.
01:59:11.400 I keep a verse on my phone.
01:59:12.920 I'm not a great Bible scholar, hardly, but I do keep this verse because I think it actually, I mean, it's 1 John 3.
01:59:24.100 Do not love this world or the things it offers you.
01:59:26.400 For when you love the world, you don't have the love of the Father in you.
01:59:29.220 For the world offers only a craving for physical pleasure, a craving for everything we see, and a pride in our achievements and possessions.
01:59:36.860 Right.
01:59:37.060 These are from the world and the world is fading away along with everything that people crave.
01:59:44.300 There you go.
01:59:45.200 Christian, not Buddhist, but same idea and true.
01:59:47.200 Same idea, same idea.
01:59:48.700 And so he tweets that out.
01:59:50.560 He sold all of his houses, you know, he sells all of his possessions.
01:59:56.840 And I'm like, what am I doing?
01:59:59.220 I'm buying more.
02:00:00.560 Yeah.
02:00:01.000 I'm like, oh, I'm going to take delivery of this thing and this other thing.
02:00:04.020 And now I have to hire people to manage the things.
02:00:06.220 And I'm like, what am I, what am I doing?
02:00:08.180 Yes.
02:00:10.680 So that helped me a lot.
02:00:12.380 I've never told him that, but yeah, these things.
02:00:14.920 So he is an incredible, I think, guidepost on how to clarify your own intentions.
02:00:23.800 If you're willing to see through it all and clarify that message, that would be a thing that I think he's an incredible exemplar of.
02:00:32.640 Find the thing that you care about and then just double and triple and quadruple and quintuple down on that thing.
02:00:38.400 That is exactly right.
02:00:39.340 That is exactly right.
02:00:40.340 So speaking of bookends, we began this conversation with your description of your, I hate the word journey, but it is a journey to like a much higher level of self-awareness and peace.
02:00:51.200 And you're ending by describing, you know, with admiration, Elon's decision to detach himself from the world, even as he engages in the things that he really loves.
02:01:01.080 Do you think that there is a greater spiritual awareness, a greater hunger in Silicon Valley where it matters because of the richest people in the country than there was?
02:01:11.780 You said, you know, no one believes in God.
02:01:13.120 Do you think that's changing?
02:01:13.820 No, but I think we need to make it more fashionable to be spiritual.
02:01:18.140 Oh, you don't think it's changing?
02:01:19.460 I don't think it's changing yet.
02:01:21.680 I think that a lot of this extreme wealth was created in people that are in their 30s, some in their 20s, many of them are in their 40s.
02:01:31.200 And I think that over this next 10 or 15 years, there's going to be a crisis of identity when they realize that, you know, they were playing dumb games focused on superficial things beyond the companies themselves.
02:01:44.240 Right.
02:01:44.440 And that there's a bigger purpose, right?
02:01:47.860 There needs to be a loyalty to country, loyalty to the state, loyalty to the people around you that you don't even know.
02:01:54.260 Not to virtue signal, but to actually just do the hard practical work.
02:01:58.040 Part of that is rooted, I think, in a spirituality.
02:02:00.440 That's not there yet.
02:02:03.000 But I think that it will become cool again.
02:02:08.180 Sorry.
02:02:09.080 It can become cool again.
02:02:13.820 I believe in God.
02:02:15.060 And the reason I believe in God, and if I were to tell my friends in Silicon Valley, the way that I would incept this idea would be purely from a scientific lens.
02:02:28.220 Which is, if you believe it is true, which I think most people do.
02:02:35.560 That there is a finite moment in which the universe began.
02:02:43.320 Explain not the process, but explain the moment.
02:02:47.300 And you cannot.
02:02:50.720 And it is an endless rabbit hole that if you spend your time, and again, when I was feeling very empty, that's where I started.
02:02:59.280 And I, you know, I read all about, I read about Islam, I read about Christianity, I read about Judaism, I reread, you know, about Buddhism.
02:03:06.520 And the only explanation is God.
02:03:13.400 And I think that that's a way where people there can, you know, let their guard down.
02:03:20.200 Because if you start a conversation about physics and cosmology, people are very open-minded.
02:03:24.940 Yeah.
02:03:25.120 And then you go, well, how did it happen?
02:03:26.940 How did T equals zero happen?
02:03:28.940 And what does T minus one look like?
02:03:30.440 And their brains explode.
02:03:31.840 Why?
02:03:32.180 Their brains truly explode.
02:03:33.180 Because people that are very good technologically in that way are very good at getting to explanations.
02:03:39.960 And they're very good at kind of like breaking things down.
02:03:42.300 I know it's used a lot now, but into first principles.
02:03:44.580 Yes.
02:03:46.280 There is no first principles explanation for how the world, for how the universe began.
02:03:50.160 There is none.
02:03:51.080 Don't tell me about the Big Bang Theory.
02:03:53.400 It doesn't work.
02:03:54.080 Don't tell me about general relativity.
02:03:55.760 Because it all breaks.
02:03:57.200 We have to make these profound assumptions in math and physics to make it all hang together.
02:04:04.140 Because you cannot tell me what T equals zero right at that moment.
02:04:11.040 How?
02:04:12.520 Nobody can answer the how.
02:04:13.800 No.
02:04:13.940 And so then if you look at this entire lived world around you, I don't know, I just, I'm filled with this like immense gratitude.
02:04:26.480 And then I think it must be God.
02:04:29.020 And that gives me like, it gives me something.
02:04:32.680 I didn't have that before.
02:04:34.880 And so I take that.
02:04:36.160 I'm not trying to push that on other people, but that works for me.
02:04:39.080 You know, it makes me a better husband, makes me a better dad, it makes me a better friend.
02:04:44.800 I do things now that I, you know, when the fires were happening, you know, my friends, I just round robin.
02:04:53.380 I just kept calling them every day to check in on them.
02:04:57.440 It's a small thing, right?
02:04:58.540 Um, but these are values that I had lost somewhere along the way where, you know, just like, just calling people, caring about people.
02:05:09.760 Um, one of my friends lost his home.
02:05:12.580 I spent an entire afternoon, not an entire afternoon, sorry.
02:05:16.060 Again, I don't want to overplay it.
02:05:17.620 I spent like an hour or so going through all my Google photos, clicking through and finding all the photos of him, our friends, so that we can make him an album.
02:05:26.040 Um, because he lost everything and yeah, you get the clothes, whatever, but you know, these picture albums, you know, they, they mean a lot.
02:05:34.740 Um, and I was happy and I felt really like a useful, good person at the end of that.
02:05:41.940 Yes.
02:05:42.760 I don't know.
02:05:43.300 That's, that's, I got that from believing in God.
02:05:46.240 And my final question is, are you hopeful since we are at a moment of real change in the country?
02:05:56.460 Are you hopeful for the future?
02:05:58.820 Yeah.
02:06:00.140 I think we had a fever and that fever has broken.
02:06:06.840 And what has to lie in its place are examples of how all of these things that we thought we were not allowed to do when we do them actually work.
02:06:20.720 Meaning we're actually just going to focus on merit and get incredible people.
02:06:25.240 And it'll turn out that you'll get your, you know, diversity wish, but you're not going to get it by mandating it and forcing it down our throats.
02:06:33.900 We're just going to get the best people.
02:06:36.060 And then the best people are just going to go and kick ass together.
02:06:38.600 You know, we're going to get an ecology that we protect and love because we want to be out there hunting, fishing, camping, living it, skiing it, whatever it is.
02:06:50.580 But we're going to get there because we actually manage it and take care of it and clean up all of the, you know, stuff that would otherwise burn it to the ground.
02:07:00.800 So we're going to test all of this stuff.
02:07:07.900 And I think it's going to work.
02:07:10.260 And then the other thing we need to do is we need to cut down all of these things that are these little ropes that are pulling us all back.
02:07:18.600 Meaning this is a stretch, but I'll, I'll use myself in this example.
02:07:24.260 I think I'm one of those 50,000 people that can go and ram and jam for Team USA.
02:07:29.320 I can't.
02:07:30.080 I've been, you know, there are moments where I've been really dunking on people.
02:07:33.340 You know, if this is a basketball analogy, I want to do more of that.
02:07:37.780 I want Team USA to kick ass.
02:07:40.580 I just want it to be a little bit easier.
02:07:43.660 And so I hope that we figure out a way to, instead of having 60,000 little regulations and people that want to lord over us, just give us 10,000 and just trust us that we're trying to do the right thing.
02:07:55.700 And if we don't, fine, trust, but verify, you know, and if we screw up, fine.
02:07:59.760 Hold us accountable.
02:08:01.540 But just give us a chance so that we can just make sure that USA, Team America, that idea is the singular organizing function for America.
02:08:11.940 Not everybody's own little pet project.
02:08:14.580 Right.
02:08:14.680 That's what that's, so I think that that's possible.
02:08:18.220 But I mean, time will tell.
02:08:19.200 These next four years will be super critical.
02:08:21.740 Chamath, thank you.
02:08:22.960 Thanks, Tucker.
02:08:23.340 Thanks for listening to the Tucker Carlson Show.
02:08:27.360 If you enjoyed it, you can go to TuckerCarlson.com to see everything that we have made.
02:08:32.020 The complete library.
02:08:34.120 TuckerCarlson.com.
02:08:35.260 Bye.