Verdict with Ted Cruz - February 28, 2024


How We Re-Capture Big Tech & Universities: One-on-One with Tech Entrepreneur Joe Lonsdale


Episode Stats

Length

35 minutes

Words per Minute

208.6

Word Count

7,421

Sentence Count

585

Misogynist Sentences

4

Hate Speech Sentences

4


Summary


Transcript

00:00:00.000 This is an iHeart Podcast.
00:00:02.700 Guaranteed human.
00:00:05.900 Welcome.
00:00:06.460 It is Verdict with Senator Ted Cruz.
00:00:08.440 Ben Ferguson with you.
00:00:09.540 Senator, this is really fun.
00:00:10.820 We're getting to do back-to-back evenings with a live audience,
00:00:13.560 which is really exciting.
00:00:14.940 And we've got a dear friend of yours and a guest with us tonight
00:00:17.880 getting to talk about some really cool things,
00:00:20.240 especially when it comes to education.
00:00:22.800 Imagine starting a university.
00:00:24.600 That's a pretty cool idea.
00:00:25.800 And we have someone that's done that.
00:00:27.280 I'll let you do the intro.
00:00:28.140 Well, we are very proud to be in Austin, Texas tonight
00:00:31.440 with a very good friend of mine
00:00:33.360 and someone who I say without hyperbole
00:00:36.140 is one of the smartest people on planet Earth.
00:00:38.880 We are with Joe Lonsdale.
00:00:40.620 Joe Lonsdale is a big tech entrepreneur.
00:00:44.200 He is a venture capitalist.
00:00:46.140 He runs a major venture capital fund
00:00:48.440 that invests in tech companies across the country.
00:00:51.960 He was the CEO of Palantir.
00:00:53.820 He led the big tech exodus from California to Austin, Texas.
00:01:01.260 And one of the amazing things we're seeing is Austin, Texas is becoming a mecca
00:01:05.940 for people in tech who are not insane socialists.
00:01:11.540 Amen to that.
00:01:12.560 And Joe came as sort of the search party.
00:01:17.300 He was sent first.
00:01:19.340 And I think probably Silicon Valley was curious the reception he would get.
00:01:24.020 And they found the cannibals did not eat him.
00:01:26.920 And so he wired back, come to Texas.
00:01:29.920 There's freedom here.
00:01:31.060 There's sanity here.
00:01:32.020 Joe, it is great to be with you.
00:01:33.160 Welcome to Verdict.
00:01:34.000 Thanks for being here, Ted.
00:01:35.040 And thank you very much for including me.
00:01:36.440 That's very kind of you to say for one of the smartest senators.
00:01:38.320 I appreciate the line there.
00:01:39.340 Let me tell you about Freedom Gold USA.
00:01:42.020 If you're like me and you like to make sure that you protect your hard-earned dollars,
00:01:47.160 if you're close to retirement, if you're in retirement,
00:01:50.520 and you're watching what's happening with inflation,
00:01:53.420 you're watching your savings erode away, your purchasing power erode away,
00:01:58.100 well, then it may be time for you to take a look at gold and silver
00:02:01.700 as part of diversifying your financial portfolio.
00:02:05.420 It is something that I've done, and I use Freedom Gold.
00:02:08.480 In fact, right now in my hand, I have an ounce of silver that is part of my portfolio.
00:02:14.980 And I have this because no matter what happens,
00:02:17.520 we've got a debt that's above $34 trillion with a T.
00:02:22.180 And we've seen the push for central bank digital currencies.
00:02:26.160 Our financial freedom is at risk.
00:02:28.760 That is why I want you to know about Freedom Gold USA.
00:02:32.220 They are ready to help you preserve your wealth
00:02:34.160 and provide you stability in uncertain times.
00:02:37.160 Now, here's the big perk.
00:02:38.960 If you call them right now, you can see if you qualify for up to $10,000 in free silver.
00:02:47.080 That's right.
00:02:47.760 Learn how to add gold or silver to your IRA or have it shipped directly to your home
00:02:52.600 and safeguard your wealth with physical gold and silver.
00:02:55.880 1-800-655-8843.
00:03:00.800 That's 1-800-655-8843.
00:03:05.440 Use the company that I use.
00:03:07.320 Go to freedomgoldusa.com slash verdict.
00:03:11.740 That's freedomgoldusa.com slash verdict.
00:03:15.880 And talk to them to see if you qualify for up to $10,000 in free silver.
00:03:20.580 1-800-655-8843.
00:03:23.240 You've got something that I think is amazing, and it deals with education.
00:03:27.640 I want to start with that first, because not only did you leave California,
00:03:30.220 but you also had this insane idea of, hey, I want to start a university.
00:03:36.380 Where did that come from, and what is the goal and mission?
00:03:39.720 Oh, you know, we've been talking about the universities for a while.
00:03:42.100 I mean, all of us have seen over the years how they've kind of gone the wrong direction.
00:03:45.060 But, you know, when I was in university, I was there in 2004.
00:03:49.840 There were problems.
00:03:50.700 There was, you know, you'd get in trouble for being politically incorrect.
00:03:53.840 You'd be told not to talk about things.
00:03:55.080 I got my first B for trying to defend John Locke in a humanities course,
00:03:58.200 because you're not supposed to do that.
00:04:00.420 You framed that one, by the way.
00:04:01.760 Yeah, I know.
00:04:02.780 But it wasn't like...
00:04:03.700 Joe, that's called a humble brag.
00:04:05.820 First B.
00:04:06.900 That's true.
00:04:07.980 It was freshman year.
00:04:08.860 It was, you know, but, you know, it wasn't totally crazy.
00:04:13.300 And I think a lot of people who have not been at universities for a while
00:04:15.980 don't realize, like, just how crazy these places have become
00:04:19.400 over the last, you know, five, six, seven years.
00:04:21.680 I think it was like a shift in our society.
00:04:23.760 Maybe it was around 2014, 2015.
00:04:26.360 But, I mean, the Stanford Review, which Peter Thiel started,
00:04:28.760 and I was involved with, which is a libertarian and conservative paper,
00:04:31.800 like a lot of the kids there, it's like you can't even admit you write for it anymore.
00:04:35.140 It has to be pseudonymous because it would ruin your social life
00:04:37.380 and you'd be attacked for it.
00:04:38.320 And, like, these departments have just gotten so radicalized and so broken.
00:04:42.980 And so there's, you know, the administrators over the last 20 years,
00:04:45.900 you've tripled the size of the administration.
00:04:47.260 There's more administrators now at Yale than there are students.
00:04:49.600 About as many administrators at Harvard.
00:04:51.300 This DEI thing has come in.
00:04:52.820 It's anti-merit.
00:04:54.080 It's just so broken.
00:04:55.300 So I think people don't realize that, you know,
00:04:57.160 I actually believe universities played a really important role
00:04:59.300 in our society the last 100 years.
00:05:00.980 And, you know, I'm really lucky to have friends such as Niall Ferguson,
00:05:04.280 the great historian, and Bari Weiss, you know,
00:05:06.700 who I think runs one of the most important media companies in the U.S. today,
00:05:09.780 as my two co-founders.
00:05:10.960 And all of us realize, you know, American institutions are breaking.
00:05:13.840 They're failing.
00:05:14.380 It's bad for our country.
00:05:16.020 You know what you do in America when these things are broken?
00:05:17.760 You build new ones.
00:05:18.660 And, Joe, what you're doing here is incredibly important.
00:05:21.020 And so you founded the University of Austin.
00:05:24.020 Tell listeners of the podcast what the University of Austin is
00:05:27.420 and what's the vision.
00:05:28.280 What is it trying to accomplish?
00:05:30.340 You know, we're trying to build a new great university in America
00:05:34.360 that competes with Harvard, Yale, Princeton, Stanford, MIT.
00:05:37.160 It competes for the best and brightest.
00:05:39.000 And we want to have one of these places where it actually pursues truth,
00:05:42.460 where it doesn't defer to who's offended,
00:05:44.720 doesn't defer to a crazy ideology.
00:05:46.860 You don't have a bunch of, you know, administrators hounding you down.
00:05:49.660 You actually are focused on teaching young people to be courageous,
00:05:53.980 to basically learn how to speak up,
00:05:56.120 learn how to confront solutions in our society.
00:05:57.760 If you can have even, like, a small number of the people who go on to run our elite
00:06:01.720 to learn how to speak out, learn how to call out nonsense.
00:06:03.940 And where is the university and the journey of its founding
00:06:07.020 and becoming established and growing?
00:06:09.180 So, you know, it turns out, like a lot of other industries,
00:06:11.220 there's a big cartel for starting these things.
00:06:13.120 We had to do 2,000 pages of regulation.
00:06:14.740 We had to, you know, go through all sorts of things.
00:06:17.040 But we were officially an operating university,
00:06:19.340 the first new private university in Texas in over 60 years.
00:06:22.380 We've done all sorts of different events and seminars.
00:06:24.820 And we have our first founding undergraduate class joining right now.
00:06:28.380 There's going to be 100 students to join in the fall.
00:06:30.360 You know, we've had over 5,000 professors apply.
00:06:31.600 Now, are they all freshmen that are starting?
00:06:33.660 Or how does it work in terms of the class coming in?
00:06:36.600 We're defining them as all freshmen,
00:06:38.140 although we may admit a few people who've gone to a couple of the top schools
00:06:41.300 and are fleeing them.
00:06:42.100 And so, you know, we can include them with us instead, of course.
00:06:44.640 Now, are there particular majors that are being offered to start with?
00:06:47.660 What are the students going to be studying?
00:06:49.480 Well, you know, we want all the students to have sort of intellectual foundations
00:06:53.140 of, like, the great debates of Western civilization,
00:06:56.200 kind of like a core, what's called a classical liberal core.
00:06:59.140 But then you have different centers.
00:07:00.120 We have a center for economics and history and politics.
00:07:02.840 And we have people who've given up tenure at places like University of Chicago
00:07:05.520 and Columbia and other places like that to teach there.
00:07:08.560 We have a center for STEM.
00:07:09.620 Our friend Elon Musk built a lot of things here.
00:07:12.000 And people who help run SpaceX and Boring Company are helping us shape some of the STEM
00:07:15.620 to make sure these students are people they want to partner with.
00:07:17.740 So we have multiple different electives as well, just like any other college.
00:07:21.440 Raising money is something that obviously is vitally important.
00:07:24.960 And you have moments where you can see where probably instead of you going to people,
00:07:28.600 telling them the story, they start coming to you.
00:07:32.020 Did that just happen when we saw so much anti-Israel rhetoric on college campuses
00:07:36.780 where then people that you know came to you and said,
00:07:39.360 hey, maybe I do want to get involved in this idea.
00:07:42.020 Maybe you're on to something here.
00:07:44.100 Yeah, you know, a lot of people, like, haven't realized just how broken universities are.
00:07:47.960 They always thought it was something.
00:07:49.460 Guys like Ted and I like to argue with the crazy people on the far left,
00:07:52.760 and we've probably always called these places out.
00:07:54.540 But they've gotten a lot worse since we were arguing with them 20 years ago.
00:07:57.640 Yeah, and so a lot of these people finally woke up after October 7th
00:08:01.740 and after they saw, obviously, the presidents of Harvard and MIT and whatnot,
00:08:06.180 and Penn, going and making total fools of themselves.
00:08:08.700 And they started to look a little more closely.
00:08:10.480 And then, of course, like the plagiarism scandal comes out.
00:08:12.760 And then if you've been paying attention, it turns out that it's not just Claudine Gay.
00:08:15.880 It's all the leadership of the DEI and the Title IX office all plagiarize all their stuff.
00:08:20.140 Because guess what?
00:08:20.600 If you have a philosophy that's anti-merit,
00:08:22.480 maybe you yourself aren't doing things that are meritocratic, you know?
00:08:24.920 Look, I don't know about you, Joe, but I, for one, was really inspired
00:08:28.740 when former Harvard President Claudine Gay wrote the immortal words,
00:08:32.540 we have nothing to fear but fear itself.
00:08:37.300 First time she's ever written that. It was amazing.
00:08:39.520 You know, then she said, I have a dream.
00:08:41.680 Yes.
00:08:43.320 And then, e pluribus unum.
00:08:47.380 It's really brilliant, let's just be honest.
00:08:49.580 And it shows you how rotten these places are.
00:08:51.800 They've now done an investigation.
00:08:53.120 The board of Harvard didn't even look into her scholarship before making her president.
00:08:57.100 They didn't even look into it at all.
00:08:58.480 And it's clear, because she was not hired for being a great scholar.
00:09:00.780 She was hired for, obviously, other reasons, for her gender and her sex and for being political.
00:09:04.640 So how are you finding your faculty to assemble a new university?
00:09:08.440 You've got people like Niall Ferguson.
00:09:09.960 You've got Bari Weiss, who I would note.
00:09:11.740 If you have not read Bari Weiss's resignation letter from the editorial board of the New York Times,
00:09:19.120 it is one of the most important things written in the past decade.
00:09:24.900 And it is the most concise and effective indictment of the corruption of corporate journalism that I've read anywhere.
00:09:31.940 So the two of them, how did you team up with them, and how did you find the other members of your faculty?
00:09:37.020 You know, I have to give Marc Andreessen credit for introducing me to Bari.
00:09:39.680 We both were talking to him about the need to rebuild our broken institutions in the U.S.,
00:09:43.160 whether it's media, whether it's universities, whether there's so much else we need to fix.
00:09:46.880 And I'm so inspired by her.
00:09:49.120 Niall's been her friend for a very long time.
00:09:50.660 I think he's the greatest living historian.
00:09:52.720 And a lot of other people are attracted to working with people like that.
00:09:57.040 So, you know, when we announced this, we had, I think, the first few months,
00:10:00.620 5,000 professors send us notes to try to inquire about working with us.
00:10:03.640 So it's not been hard to find them.
00:10:05.060 This is a place people want to be part of.
00:10:06.560 So 5,000 professors.
00:10:08.680 That's worth underscoring.
00:10:10.100 Look, there are, and I think this is true in every one of our institutions that is corrupted and captured by the left,
00:10:15.960 there are people trapped within them who have not lost their minds, but they're scared.
00:10:22.820 They still want to earn a living.
00:10:23.920 They want a job.
00:10:24.800 They know if they open their mouth, they risk being canceled, being fired, being thrown out.
00:10:30.000 But I think that's true at universities.
00:10:31.900 I think that is true in entertainment.
00:10:33.480 I think that is true in journalism.
00:10:35.200 I think that's true in big tech.
00:10:36.500 Let me shift.
00:10:38.300 You know the world of big tech well.
00:10:41.040 You know, I think back to big tech 15 years ago.
00:10:43.940 And I think at the time, big tech was really at a fork in the road.
00:10:48.680 And it could have gone down the road of embracing a libertarian utopia of saying,
00:10:54.340 leave us the hell alone.
00:10:55.680 We're going to be entrepreneurs.
00:10:56.920 We're going to invent a new world.
00:10:59.340 Or it could have gone down the road they chose instead, which is nanny state totalitarianism.
00:11:05.880 We have the power, and we will use the power to silence anyone who dares speak out.
00:11:11.340 Do you agree with that assessment?
00:11:13.100 And if so, why did they choose road number two?
00:11:17.660 I do agree with that.
00:11:19.340 And to tie it back to what we're just talking about, Ted, these cultures come from our universities.
00:11:24.900 Google is hiring thousands of PhDs out of these universities who have just grown up in that culture their entire life.
00:11:31.400 Amazon, Microsoft, Facebook.
00:11:33.380 These tech cultures and university cultures, they're one and the same.
00:11:36.620 And, you know, the university culture is that raw.
00:11:38.960 And that's what these kids have brought up in their whole life.
00:11:41.100 And it's interesting because you kind of learn these places.
00:11:43.620 Like, rather than a university that teaches you to be courageous and to speak up, you learn, shut up, virtue signal, go along, or you're going to get in trouble.
00:11:50.140 And you learn there's going to be a 5% of crazy people on the far left.
00:11:53.600 And when they shout, you obey because that's how you stay out of trouble.
00:11:56.840 And that's the way these companies are run.
00:11:58.580 Now, is there a tipping point?
00:12:00.860 There have been a handful of people who have shown real courage in the tech space.
00:12:05.880 There's you.
00:12:06.700 There's Elon Musk.
00:12:07.980 There's Peter Thiel.
00:12:09.160 There's Palmer Luckey.
00:12:10.300 There's Larry Ellison.
00:12:11.540 There are a few.
00:12:12.440 How many others are there, and do you see a tipping point where others will feel like, wait, I can speak out in support of free enterprise.
00:12:22.480 I can speak out in support of free speech.
00:12:24.700 I can stand up to the Borg, the collective mentality of Silicon Valley.
00:12:31.500 You know, I think a lot of people are scared, and they're scared for good reason.
00:12:35.000 This is where I actually have a little bit of empathy towards a lot of these friends of mine.
00:12:38.260 And I get texts every time that we put something online that you and I like and we're speaking out and we're being strong.
00:12:44.380 I get texts from people who run multi-billion dollar companies.
00:12:46.740 I get texts from people whose companies support hundreds of thousands of other companies.
00:12:50.840 And they're terrified, if they're supporting hundreds of thousands of companies, that if they become... you know, we're drinking vodka with the name of a friend in Austin on it.
00:12:58.000 He does not do politics because he knows that if he becomes controversial, it could hurt him.
00:13:02.200 So there's a lot of fear right now in the community, and the far left's very good at demonizing people who speak out.
00:13:06.580 You moved a company from California to Texas.
00:13:09.860 There's a lot of people that love that, but they also, they're saying, don't California, my Texas.
00:13:15.480 When people move here, I'm assuming you had people that didn't agree with your conservative values that came.
00:13:21.060 Do they see life differently?
00:13:22.380 I mean, it's been a couple years now.
00:13:23.680 Do they come in and go, hey, it's actually a better way of life, and they like freedom, and they're starting to come around to it?
00:13:29.240 Or do they just move here because they say, okay, well, it's more freedom during COVID, and I pay less taxes, but I'm still the same person voting?
00:13:35.340 You know, I want to give you a statistic you probably know, and obviously I'm a huge fan of the senator.
00:13:40.240 The last race you ran was closer than it should have been.
00:13:42.100 If it wasn't for people who had moved to Texas, the race would have gone the other way by the numbers.
00:13:46.980 And so it turns out that on the whole, the people who choose to move here tend to be even more on the side of liberty, more on the side of freedom, even than the people who were born here.
00:13:55.960 So I think it's fair to be really worried about these crazy people coming in, but you should know the people who choose to come here, we're fleeing something that's broken, and we're coming here because if America falls, we're screwed.
00:14:08.920 And a lot of my friends, by the way, have given up.
00:14:10.840 I have billionaire friends who are living in Switzerland, who are living in Singapore, who are saying, Joe, the woke guys are in charge.
00:14:15.340 You're done.
00:14:15.840 My wife and I are here in Texas because we are making a stand here for America, and those are our values, and those are a lot of our friends' values.
00:14:20.820 You've got kids.
00:14:23.460 How much of that was your decision as well?
00:14:25.640 I mean, when you've got young kids, you and I are actually the same age, graduated the same year, and I know for me that—
00:14:32.060 By the way, Ben, he's made gazillions of dollars.
00:14:34.240 He's been a major CEO.
00:14:35.380 What the hell have you done with your life, man?
00:14:36.840 I'm a co-host with Senator Ted Cruz.
00:14:38.580 I've got that going, right?
00:14:39.900 You know.
00:14:43.400 But how much was it you're—
00:14:45.020 I'll come back to my thought in a moment.
00:14:48.500 And I will tell you, Joe does have a basketball court, and we are going to shortly play hoops.
00:14:52.940 I'm told Joe has a pretty serious hoops game.
00:14:56.660 Ben, in high school, was a center and, you know, has got some mass.
00:15:00.640 So we're going to see how things play out.
00:15:04.480 He's a little bigger guy than you.
00:15:05.920 I wouldn't talk too much trash right before our game here.
00:15:07.980 Somebody's going to be limping out of here tonight.
00:15:09.540 Joe, this is the only part of my game is trash talking.
00:15:12.840 If I give up on trash talking, I have zero game left.
00:15:16.740 Ben, we're loving raising our four daughters here.
00:15:20.700 I think this is a great place to raise kids.
00:15:22.640 I think it's a lot more tolerant place of a lot of different views.
00:15:24.800 It's funny to say, it used to be San Francisco was quote-unquote tolerant, but it actually is not.
00:15:28.720 Like, if you speak out and on the other side, you're in trouble there.
00:15:31.080 I think here is very accepting where we live in Texas.
00:15:34.040 And, yeah, I mean, listen, you guys have all heard the stories.
00:15:36.420 I had friends' kids who were in first grade at the local private school that we had heard was the best one.
00:15:40.780 And they came to us, and they were distraught, and they said, you know, the teacher lined the kids up today and told them to line up by gender.
00:15:46.360 And then she yelled at them for half an hour about how there's not only two genders, and they're confused.
00:15:49.860 And then it's like, I mean, I'm not obsessed with this stuff myself, but if the teachers are obsessed with it, that's kind of a weird place to raise kids.
00:15:55.240 Canadian women are looking for more, more out of themselves, their businesses, their elected leaders, and the world around them.
00:16:02.200 And that's why we're thrilled to introduce the Honest Talk podcast.
00:16:05.920 I'm Jennifer Stewart.
00:16:07.100 And I'm Catherine Clark.
00:16:08.340 And in this podcast, we interview Canada's most inspiring women.
00:16:12.100 Entrepreneurs, artists, athletes, politicians, and newsmakers, all at different stages of their journey.
00:16:17.820 So if you're looking to connect, then we hope you'll join us.
00:16:20.820 Listen to the Honest Talk podcast on iHeartRadio or wherever you listen to your podcasts.
00:16:25.240 What's the path to take them back?
00:16:29.080 So with University of Austin, you were fighting to try to take back universities.
00:16:32.840 I think that's incredibly important.
00:16:34.880 You know, I got to say, as a parent, you know, you sit here and think, do you spend hundreds of thousands of dollars to send your kid to a school that will try to brainwash them to hate America and hate you?
00:16:47.980 Exactly.
00:16:48.420 And it's hard to know what to do.
00:16:50.780 But at the same time, you want your kids to do well.
00:16:52.480 And education has been the key to success so often in America that I know a lot of parents that are just almost paralyzed.
00:17:00.420 What do I do?
00:17:01.020 You're fighting to take that institution back.
00:17:03.640 Do you have hope we can take universities back?
00:17:07.080 And then I'm going to ask you the same question on big tech.
00:17:08.920 You know, on universities, we've got to build some new ones.
00:17:11.560 They're going to influence the broken ones to be better.
00:17:14.140 I think it's going to take multiple that we build.
00:17:16.720 And, yes, I think we can shift them back.
00:17:18.780 We're not going to reconquer Harvard and Yale.
00:17:21.040 I mean, you have to basically realize these places.
00:17:23.500 It's the administrators.
00:17:24.440 It's the departments who do their own hiring.
00:17:26.300 It's the lawyers who are in charge.
00:17:28.340 It's the board who's in charge.
00:17:29.340 So you're not going to reconquer those schools.
00:17:31.380 But you can influence them to be better and you can build better ones.
00:17:33.700 I actually think we have a better chance in K-12 than we do in universities.
00:17:37.180 And that's thanks to school choice if we can get that done.
00:17:39.840 And this is something I wish, though.
00:17:41.200 I think a lot of Republicans are on our side now in Texas.
00:17:43.520 They get it.
00:17:44.160 They get how bad it is.
00:17:45.820 I think a lot of people in the rural areas, they know the teachers.
00:17:48.780 They love them.
00:17:49.600 And they're confused.
00:17:50.180 They don't realize there's tons of schools, especially in our cities in Texas, even here, that are brainwashing our kids.
00:17:55.020 And we desperately need to give parents the right to get out of those schools.
00:17:57.620 I want to tell you about our friends over at Patriot Mobile.
00:18:00.660 If you are sick and tired of giving your money to woke companies that literally hate your values, hate your family values, hate your faith,
00:18:07.680 it is time for you to vote with your dollars and switch to a company that stands by what you believe in.
00:18:13.320 That is Patriot Mobile.
00:18:14.560 When I look down at my phone, I see the word Patriot in the top left.
00:18:18.100 Why?
00:18:18.760 Because I switched to Patriot Mobile.
00:18:20.360 Now, I get the same great service that I had with Big Mobile.
00:18:23.480 But the biggest difference is every time I make a call,
00:18:25.860 every time I send a text, and every time I pay my bill,
00:18:28.540 I know I'm standing with a company that's actually fighting for my values.
00:18:32.500 Patriot Mobile offers you dependable nationwide coverage,
00:18:35.960 giving you the ability to access all the major network towers,
00:18:39.520 which means you get the same coverage you've been accustomed to without funding that left.
00:18:44.240 And when you switch to Patriot, you're sending a message,
00:18:46.460 because 5% of your bill every month is given back, and no charge to you,
00:18:51.640 to causes that you help choose to support.
00:18:54.220 We're talking about supporting free speech, religious freedom,
00:18:57.860 the sanctity of life, our Second Amendment,
00:19:00.520 as well as supporting our military, our veterans,
00:19:03.640 our first responder heroes, and our wounded warriors.
00:19:07.280 How easy is it to switch?
00:19:08.860 Just go to PatriotMobile.com slash verdict.
00:19:12.020 That's PatriotMobile.com slash verdict.
00:19:14.320 Or call them 972-PATRIOT.
00:19:17.620 Make the switch and make a difference with that bill every month.
00:19:20.720 Free activation when you use the offer code VERDICT.
00:19:23.860 That's 972-PATRIOT.
00:19:26.380 972-PATRIOT.
00:19:27.240 Or PatriotMobile.com slash verdict.
00:19:29.480 You seem to be doing something that is really cool and fun.
00:19:33.720 A lot of people do legacy later in life.
00:19:36.160 They think about their country.
00:19:38.060 They think about their grandkids.
00:19:39.720 And then they kind of shift what they're doing.
00:19:42.060 You seem to be in the fight right now.
00:19:44.140 And it includes you working on legislation through a think tank.
00:19:48.720 You even had an op-ed that just came out this last week.
00:19:51.260 Tell people about that aspect of what you're doing.
00:19:53.280 Because you seem to be all in on the fight, again, at a young age, in your early 40s.
00:19:57.660 Yeah, you know, I'm still building companies and running my funds.
00:19:59.920 So it would be really nice just to be able to wait until I was 60 or 70.
00:20:03.340 I think maybe if this was 20, 30 years ago, I might have done that.
00:20:06.600 It feels like this is a really critical time for our country.
00:20:08.860 And we can't just wait 20 or 30 years.
00:20:11.280 Absolutely.
00:20:12.220 Yeah, so given that we might not have a country left if we don't all fight for it right now,
00:20:16.840 it's my job to do it.
00:20:18.140 And, you know, at Cicero Institute, we have teams in 20 states.
00:20:21.460 We're fighting for liberty and accountability and going after all sorts of nonsense there.
00:20:25.420 Yeah, the op-ed was fun this week.
00:20:26.960 I felt pretty strongly.
00:20:28.760 I'm friends with Elon Musk.
00:20:30.560 Watching what's happening to him and the president with the weaponized courts.
00:20:33.320 And, you know, our country, one of the reasons it's exceptional is we have, you know,
00:20:36.460 equality of justice under the law.
00:20:37.960 And I'd be very against Republicans weaponizing courts to attack the left.
00:20:41.140 And I was also very against the left weaponizing courts to attack them.
00:20:44.420 I think it was great.
00:20:44.940 Jeb Bush spoke up about that as well.
00:20:46.360 Yeah, look, it is absolutely grotesque, the weaponization of our justice system.
00:20:50.780 We're seeing it against Donald Trump with four indictments all over the country,
00:20:54.480 with a ridiculous civil verdict in New York.
00:20:57.900 And we're seeing it against Elon Musk.
00:20:59.560 I will say the Biden administration, watching the Biden administration weaponize every single
00:21:06.000 federal agency against Elon Musk.
00:21:08.860 And as you know, Elon, until like 12 minutes ago, wasn't a Republican.
00:21:14.960 Elon had never voted Republican until just over two years ago.
00:21:19.080 Elon voted for Hillary Clinton and for Joe Biden.
00:21:22.060 And he said publicly the first Republican he ever voted for was Mayra Flores here in Texas
00:21:26.920 just a couple of years ago.
00:21:29.020 And the fact that he dared speak out, and especially the fact that he bought Twitter
00:21:35.700 and has allowed free speech, the left has decided he must be destroyed.
00:21:41.780 And even for someone with vast resources, having the federal government come after you is a
00:21:49.000 daunting proposition.
00:21:51.100 Yeah, it's really disgusting to watch, like how open they are.
00:21:54.820 It feels like a third world country, Ted.
00:21:56.180 And, you know, I think you step back a little bit.
00:21:58.600 There's this battle in our civilization for truth and justice.
00:22:02.120 And it's very clear truth and justice have been losing a lot the last 20, 30 years.
00:22:06.220 I think buying Twitter, and hopefully what we're trying to do with some of these
00:22:10.000 institutions, can start to turn it around.
00:22:11.900 I still think we're losing a little bit, but I think we're going to start turning things around.
00:22:14.880 And if more of us can get into the fight, more people like Elon, I think we have a chance to win.
00:22:18.400 So how about big tech?
00:22:19.680 Is there hope for turning big tech around?
00:22:23.520 Are there, you mentioned 5,000 professors wanting to get out.
00:22:27.020 That's a really encouraging stat to me.
00:22:28.740 Do you think there are likewise people in big tech that are quietly wanting some semblance of sanity
00:22:37.120 but are afraid and is there a way that they can come out of hiding?
00:22:41.880 You know, there's definitely a lot of them.
00:22:43.280 And I'll tell you what, the way that this works, thank goodness, is there are market forces.
00:22:48.100 And, you know, Google would have been way ahead of everyone else if they didn't have a completely corrupt culture.
00:22:53.040 I mean, it's a joke online, but it's like they put out these things they saw this week
00:22:56.140 where they couldn't even do a picture of a white person.
00:22:57.900 You'd ask it, show me a couple from 1820s America.
00:23:00.900 And it's like a black guy and a Japanese woman.
00:23:04.480 And that's nice, but it's probably not, you know.
00:23:05.840 I don't know if you've seen, it's actually very complex game theory.
00:23:08.900 But if you ask Google to create a chessboard, it has only black pieces.
00:23:13.440 And it's very hard to know how to win or lose at that point.
00:23:17.140 The funny part is they only got in trouble.
00:23:18.740 I want to make the point.
00:23:19.520 But the funny part is they only got in trouble for this because somebody thought to say,
00:23:22.620 show me a German Nazi soldier from the 1930s.
00:23:24.920 And it showed this Han Chinese woman, this Native American guy in Nazi uniform.
00:23:29.120 And that was too far.
00:23:30.800 It was amazing.
00:23:31.500 That was too far for the New York Times.
00:23:32.700 It was like, okay, now we actually know we need to fix this.
00:23:35.040 Listen, the Chinese Nazis, that was a real problem.
00:23:37.500 It was a huge problem.
00:23:38.460 It's not covered in history class.
00:23:40.320 Ms. Gay, she actually covered that in a paper.
00:23:42.380 I don't know if you've read that one yet.
00:23:43.740 Here's the great thing about markets and about innovation is that when you start to focus so much on nonsense
00:23:48.920 that you start to lose and you start to not attract the best people,
00:23:51.820 other people defeat you in the market.
00:23:53.080 And those new things are very often, you know, if you look at fast-growing startups versus these tech monopolies,
00:23:59.140 the fast-growing startups are far less woke because they have to be focused on competence.
00:24:02.400 And a lot of people who are joining them are fleeing these crazy broken places.
00:24:05.340 So I do think it's going in the right direction.
00:24:06.520 All right, let me ask a business question.
00:24:08.560 You know tech better than most people alive.
00:24:12.480 Where are things going in terms of innovation 10 years from now?
00:24:16.860 What should we know now that we don't know?
00:24:20.260 And how will the world be different in a decade?
00:24:22.360 Well, the really positive thing that's happening right now, and I was never a huge crypto guy.
00:24:27.440 I don't love fiat currencies.
00:24:28.620 I think there's a good use against, like, you know, corrupt governments.
00:24:30.920 But I was never that into crypto.
00:24:32.840 AI, to me, is actually very real.
00:24:36.380 The way to think about it, we could talk about all sorts of complicated things,
00:24:39.420 but the simple thing to think about is productivity is just really key in our economy.
00:24:43.020 The reason we have more wealth is we do more with less.
00:24:45.040 And there's all these industries in our economy where this AI combined with operations
00:24:49.960 can do things much more affordably, much cheaper.
00:24:52.800 And so if you look at this, like healthcare billing, for example,
00:24:54.780 we spend probably over a quarter trillion dollars a year on healthcare billing,
00:24:57.860 and you can probably cut that by a third over the next five or six years.
00:25:01.080 There's tons of areas like that.
00:25:02.140 So there are lots of Cassandras painting stories of impending doom from AI.
00:25:08.020 Is AI going to destroy us all?
00:25:10.140 And do you know what year Skynet goes online?
00:25:13.500 Well, I do work a lot in defense, so I'm working on it, Ted.
00:25:16.020 But there's – now we can control all of you.
00:25:19.760 No, listen.
00:25:24.440 There are two different conversations with AI.
00:25:24.440 Yes, my master.
00:25:25.460 Thank you.
00:25:26.240 No, I'm getting in trouble there.
00:25:28.240 Ted's in charge.
00:25:29.080 There are two different conversations with AI.
00:25:33.040 One of them is productivity and wealth creation,
00:25:35.600 and it's actually extremely positive, and that's really good.
00:25:37.840 The other conversation with AI, it's very funny.
00:25:39.520 A lot of people in the tech world are not religious.
00:25:41.480 They've given up their religion.
00:25:42.820 And so this is kind of like a form of their religion,
00:25:45.480 the singularity, the taking over the world of AI.
00:25:47.940 And it's very funny.
00:25:48.720 It's a very messianic vision.
00:25:49.920 It's very much like revelations in Judaism and Christianity
00:25:52.960 where this thing comes and it changes everything,
00:25:55.520 and it's effectively a new God because once it improves itself,
00:25:57.620 it keeps getting better.
00:25:58.980 And so it's like it's a secular religion in Silicon Valley.
00:26:01.940 People are obsessed with it.
00:26:03.380 They talk about end of times with it all the time.
00:26:05.920 And it's funny because America's had a lot of other religious revival movements
00:26:08.700 over the last 200 years where people were convinced end of times was coming very soon.
00:26:12.460 This is quite a weird one based in Silicon Valley.
00:26:14.640 All right, so we're going to wrap up momentarily,
00:26:16.300 but I want to ask – so you are very engaged in policy.
00:26:20.220 A policy question Washington is wrestling with right now.
00:26:23.360 So as you know, I'm the ranking member on the Senate Commerce Committee,
00:26:27.020 and AI is squarely within our jurisdiction.
00:26:29.520 In fact, back in 2015, I chaired the first ever congressional hearing on AI
00:26:34.040 and have been focused on it for a long time.
00:26:36.320 Now, there are a lot of voices in Washington, most notably Chuck Schumer,
00:26:40.540 but also including some Republicans that are eager for a very heavy hand of government
00:26:45.840 when it comes to AI.
00:26:47.100 And Schumer and Democrats are literally proposing prior government approval for innovations in AI.
00:26:54.920 I've been very vocal in saying that it's catastrophically stupid,
00:26:58.840 and if we put government in the position of prior approval,
00:27:02.420 we will cede leadership of AI to our enemies, to China and other countries,
00:27:07.400 and we will kill American leadership.
00:27:09.800 I'm interested in your views because this policy discussion –
00:27:12.920 and I've got to tell you, a lot of big tech, the Googles and Facebooks of the world,
00:27:17.140 are saying, yes, yes, regulate us,
00:27:19.140 because they believe they can capture the government and use it to shut everyone down.
00:27:23.120 And what's your take on how government should approach AI,
00:27:26.880 because this is as hot as any question in Washington right now?
00:27:31.080 Well, you know, Mr. Senator, I 100% agree with you.
00:27:34.080 I'm really glad you're taking that tack.
00:27:36.540 As you know, the big companies, a lot of them know they're losing some of their best talent.
00:27:40.080 They know it's going to be hard to compete.
00:27:41.540 But you know what they have?
00:27:42.480 Like if I want to start a competitor, for example, to BlackRock right now in New York,
00:27:46.580 I have to spend $100 million a year on lawyers, even just to do what they do.
00:27:49.300 They love the fact that there's tons of rules and regulations.
00:27:51.700 These big companies would love it to make it impossible to compete against them in AI.
00:27:55.520 So number one, 100%, keep the regulations as small as possible.
00:27:58.800 Now, the thing I will give them, and we have to be very careful,
00:28:02.020 because this is not why they're doing it.
00:28:03.320 The thing I will give them is there probably are ways
00:28:05.460 that some people could figure out how to use AI and bioterror and other areas.
00:28:09.300 And so we have to watch it.
00:28:10.200 We have to be very careful.
00:28:11.400 We have to see as it goes along.
00:28:12.360 But let's not give them the ability to make the whole thing crony and break it.
00:28:15.680 Well, and look, there is no doubt there will need to be regulations applied to AI,
00:28:20.040 like to any other industry.
00:28:21.120 Now, many of our existing laws can apply.
00:28:23.340 So are there risks of fraud?
00:28:24.680 Are there risks of deception?
00:28:26.240 Yes.
00:28:26.600 So you see things like Taylor Swift had AI fake porn put out,
00:28:31.740 and because she was Taylor Swift and had such prominence,
00:28:34.800 she was able to get it pulled down.
00:28:36.920 Well, what happens if that's your kid?
00:28:38.780 Well, if it's you.
00:28:39.860 Yeah.
00:28:40.840 And nobody would watch that.
00:28:42.040 That's all right.
00:28:42.460 The market forces would take care of that all on its own.
00:28:47.020 I was so ready to get in there, so ready.
00:28:50.440 That was my moment, and you knew it, and you jumped in beforehand.
00:28:53.520 Oh, man.
00:28:56.980 Okay, keep going.
00:28:58.400 That was all me, folks.
00:29:00.160 Go ahead.
00:29:00.660 But there's no doubt there is going to be a need to apply laws and rules,
00:29:05.980 whether on fraud or deception.
00:29:07.380 The legal system will have to be applied.
00:29:11.160 But I think we should move slowly and understand what we're doing,
00:29:15.440 because the productivity benefits potentially are massive.
00:29:20.740 And I will say, you talked a minute ago about how the big tech companies want barriers to entry,
00:29:28.380 and that is all too common.
00:29:29.660 One of the great lies of politics is the idea that conservatives are pro-big business.
00:29:38.400 The reality is big business loves big government.
00:29:42.560 Big business usually gets in bed with big government.
00:29:46.620 And big business loves when government puts barriers to entry to stop the next generation of entrepreneurs.
00:29:53.260 And I'll say this.
00:29:54.540 Look, I have nothing for or against big business,
00:29:57.200 but I am interested in the little guys, the next group of entrepreneurs,
00:30:02.120 what the economist Joseph Schumpeter called creative destruction.
00:30:05.480 And one of my favorite images on the Internet is a picture of the founders of Microsoft in 1978.
00:30:13.000 And you have Paul Allen with long hair and a beard, and he looks like one of the Bee Gees.
00:30:16.640 You've got Bill Gates with glasses the size of hubcaps.
00:30:21.780 And it's just that picture of a bunch of college dropouts,
00:30:26.000 and it just asks, would you invest money with these guys?
00:30:30.120 And that is, and they were taking on IBM Big Blue, the giant behemoth,
00:30:34.260 and they were the creative destruction.
00:30:36.140 Now they're the giant.
00:30:37.460 And I will say, let's do this to wrap up.
00:30:40.400 Talk about the importance of disruptors, of innovation,
00:30:43.920 of the next generation driving tech, driving productivity, driving our economy.
00:30:48.840 I mean, this is 100% how America works, as you say.
00:30:51.840 And by the way, it's our biggest advantage against China as our adversary.
00:30:54.940 In China right now, the CCP, aside from just having killed a bunch of our billionaire Chinese tech friends,
00:30:59.780 everyone's terrified to build more tech if you're already successful in China.
00:31:02.840 The other thing they have going against them...
00:31:04.660 Hold on. Say that again.
00:31:06.340 A lot of our tech friends died and or fled in the last five years out of China.
00:31:11.020 And a lot of them were taken away and disappeared,
00:31:12.840 then came back, and they won't talk about it anymore.
00:31:14.480 So do we know names of people who were killed?
00:31:17.300 Because I don't.
00:31:18.880 I'll give you a friend.
00:31:19.780 Andy Tian, Asia Innovations Group, 47 years old,
00:31:23.020 about to go public last year after working hard for 11 years.
00:31:25.900 And they told him they wanted to do things differently with the data in China.
00:31:29.760 He said, I'm going to go talk to them in Beijing.
00:31:31.980 Next thing I heard, he died in his sleep that night at 47 years old.
00:31:34.600 Wow.
00:31:35.240 And there's a lot of stories like this.
00:31:36.760 There's a lot of guys who built a lot of it, who fled,
00:31:38.960 and who are very scared of Xi Jinping.
00:31:40.320 But I'll tell you the other big advantage we have, though, against them,
00:31:42.900 other than them screwing that up, is basically all this productivity coming from AI,
00:31:48.080 it's going to disrupt health care.
00:31:49.600 It's going to change how health care works.
00:31:50.480 It's going to change how logistics works.
00:31:52.080 It's going to change how all these industries work.
00:31:53.120 In China, the government people and their cronies, they own those industries.
00:31:56.280 They are not going to allow those to be disrupted.
00:31:57.960 The question is, in America, are we still able to disrupt things?
00:32:01.360 Are we still going to be allowed by our government to go in and change how those things work?
00:32:04.800 And it's going to be a fight, because we have regulatory agencies
00:32:06.760 that also want to slow it down along with the big companies.
00:32:08.800 But I still believe in America, with the right leadership,
00:32:11.320 we actually can disrupt these things and we can grow.
00:32:13.560 Well, look, when AI replaces podcasts,
00:32:15.620 I hope that the computer that takes my place does a really fine job.
00:32:19.080 I want to talk to you about how you start your morning off.
00:32:21.940 If you're like me and you're a coffee drinker, I get up early.
00:32:25.120 I get on the radio at 7 a.m.
00:32:27.120 And I have got to have not just a cup of coffee, a really good cup of coffee.
00:32:31.940 And I have a 2024 New Year's resolution.
00:32:34.860 I am not giving my money to woke coffee companies.
00:32:37.820 That is something I have gotten rid of.
00:32:41.200 Blackout coffee is the coffee that I drink.
00:32:43.760 It is amazing.
00:32:45.460 This is 100% America and 0% woke coffee.
00:32:51.460 Blackout coffee is 100% committed to conservative values as a company.
00:32:56.020 From sourcing their beans to the roasting process.
00:32:59.540 Customer support and shipping, they embody true American values
00:33:04.340 and they accept no compromise on premium taste and premium quality.
00:33:09.660 If you want a great cup of coffee, not good, not kind of good, not pretty good, but amazing,
00:33:17.260 you need to go to blackoutcoffee.com slash verdict.
00:33:21.460 Now, here's the cool part.
00:33:22.920 Use the promo code verdict for 20% off your first order.
00:33:27.620 So try it.
00:33:29.080 You're going to be hooked like I am and you'll never go back to those other woke brands.
00:33:34.640 Blackoutcoffee.com slash verdict.
00:33:36.840 Be awake, not woke.
00:33:38.760 That's blackoutcoffee.com slash verdict.
00:33:41.860 Promo code verdict for 20% off your first order.
00:33:46.560 Final question for you.
00:33:49.100 And I want to go back to the university because there's going to be a lot of kids that listen to this,
00:33:52.500 a lot of parents, grandparents, and maybe even professors that may want to reach out.
00:33:59.000 What does next year's class look like?
00:34:00.700 Is there a cap on that?
00:34:01.740 If someone says, I want more information, if there's a professor that's listening to this and says,
00:34:05.680 hey, I want to leave this great institution that I'm at because of being stifled or silenced,
00:34:10.920 I want to talk to you.
00:34:11.860 How can they do that?
00:34:13.460 So we're building our first class right now.
00:34:15.640 This is just as competitive to get into as the other top 10 schools.
00:34:18.100 But if you have a really bright young student who's a founding personality,
00:34:22.380 entrepreneurial personality, it's pretty much one of the coolest places you can go.
00:34:25.020 We have 100 of my top tech friends who put their names on and advising it.
00:34:28.300 We have all these top academics.
00:34:29.620 It's going to be very competitive to get in.
00:34:31.140 But yes, please, please, please apply.
00:34:32.960 You can go to uaustin.org and search University of Austin online.
00:34:36.640 Professors, they're welcome to email.
00:34:38.400 Obviously, if they're amazing, we'd love to talk to them.
00:34:40.340 We have a pretty big line of people trying to get in as professors right now.
00:34:43.220 But obviously, we're very interested in meeting great people.
00:34:45.680 Thank you for coming on the podcast.
00:34:46.900 Thanks for having us here.
00:34:47.720 Thank you to everybody that's here in the audience as well.
00:34:50.240 Don't forget, we do this show Monday, Wednesday, Friday.
00:34:52.560 Hit that subscribe, auto-download button.
00:34:55.320 And don't forget the Saturday Week in Review.
00:34:57.700 Anything you may have missed during the week with the Senator.
00:34:59.180 And I will see you back here in a couple of days.
00:35:02.140 This is an iHeart Podcast.
00:35:04.940 Guaranteed Human.