The Tucker Carlson Show - September 25, 2025


Tucker Carlson LIVE: The End of Free Speech


Episode Stats

Length: 1 hour and 51 minutes
Words per Minute: 177.7
Word Count: 19,901
Sentence Count: 1,185
Misogynist Sentences: 17
Hate Speech Sentences: 42


Summary


Transcript

00:00:00.080 Hey, it's Tucker Carlson. Charlie Kirk was assassinated two weeks ago today, in an event that clearly is going to change American history, and that changed a lot of people inside.
00:00:11.500 And there was a moment in the first week where you thought to yourself, this is going to have effects. A lot of them are going to be bad, but some of them are probably going to be good because Charlie's life was itself so good.
00:00:23.180 Charlie Kirk spent his life above all trying to live the Christian gospel and trying to live the principle of free speech, which is to say he talked and he also listened.
00:00:34.040 He was most famous for traveling from college campus to college campus and asking people who disagreed with him to confront him.
00:00:41.000 Ask me anything, he said, and he sat there patiently as they did, and they often attacked him.
00:00:45.860 They almost always expressed views he found repugnant, and almost always he took those views seriously and answered the questions put to him.
00:00:52.580 as crisply and honestly as he could. That's what he spent his life doing, and in fact, he was assassinated while doing that.
00:01:00.220 So if there's any lesson from Charlie Kirk's life, well, the first lesson would probably be sincere Christians tend to be really decent people.
00:01:08.600 Maybe we should have more of them.
00:01:10.600 But the more secular temporal lesson is that free speech is a virtue.
00:01:15.900 It is, in fact, the foundation of this country, not only its laws, but its culture, and that we should protect it.
00:01:22.220 And maybe if we seek to honor Charlie Kirk, we should emulate it.
00:01:26.000 Maybe we should begin by asking our politicians to do what Charlie Kirk spent his life doing, which is to answer the question.
00:01:32.880 Just calmly answer the question. We'll ask you anything, and then you go ahead and answer it to the best of your ability.
00:01:37.740 Like, for example, who blew up the Nord Stream pipeline?
00:01:40.500 What happened to all the money we sent to Ukraine?
00:01:42.480 Why haven't you released all the JFK files?
00:01:44.400 Et cetera, et cetera, et cetera.
00:01:46.320 All the questions on your mind that slowly drive you crazy because no one will address them.
00:01:50.660 Why don't we just ask them directly to our leaders, and they get to answer.
00:01:54.640 Nothing, nothing would honor Charlie Kirk's memory more than that.
00:01:58.560 That is free speech in action.
00:02:01.560 But nothing like that happened.
00:02:02.940 Instead, the only real conversation we've had about free speech has been about Jimmy Kimmel, who is hardly a champion of free speech.
00:02:11.080 In fact, just the opposite.
00:02:12.220 He's a nasty little censor, talentless, a person who has many times on camera over the years chuckled and applauded as other people, his political enemies, have been silenced.
00:02:22.660 A guy who has so little influence in American society and so little audience, he was on his way out anyway, has the job only as a result of some kind of weird political affirmative action where people who agree with studio heads get to have late night jobs.
00:02:35.700 He is hardly the person who should be taking up the cause of free speech or become a symbol of it because, of course, he's the symbol of censorship and has been for most of his career.
00:02:45.900 And the other thing that we saw, maybe even more distressing than that, was politicians turn not only against free speech, but actively and openly announced efforts to censor the American population and use the memory of Charlie Kirk to do it as their justification.
00:03:06.640 There are many examples we could pick.
00:03:08.420 Here's a particularly raw one.
00:03:10.640 This is from Congressman Moskowitz, just in the House of Representatives, eight days after Charlie Kirk died.
00:03:17.080 Here it is.
00:03:18.180 It's crazy what's going on on the social media platforms.
00:03:23.460 There are so many conspiracy theories on what's going on with Charlie Kirk.
00:03:27.640 Israel assassinated him, right?
00:03:30.300 There are conspiracy theories about your personal social life all day.
00:03:35.100 It is totally rampant.
00:03:37.620 Big names on the right.
00:03:40.640 Candace Owens, right?
00:03:43.720 Talking about how what's been released, as far as the dialogue between the perpetrator and his roommate, is manufactured by the FBI, manufactured by the administration.
00:03:55.580 It is totally rampant, allowing foreign governments to just penetrate these platforms, all of these bots, all of the time, to weaponize Americans.
00:04:06.640 And so if we want to do something, then we should talk about Section 230.
00:04:11.060 We should talk about how we're going to make sure that we don't let foreign governments poison our children's minds.
00:04:19.060 And so I will work with you on that, director.
00:04:21.280 I'll work with you on 230 any day.
00:04:23.080 So there is the congressman talking to the FBI director, and there's a lot there, and we'll unpack it.
00:04:29.840 But the most telling line came right in the middle, and he turns to the FBI director, and he says, they're criticizing your personal life.
00:04:35.260 They're airing conspiracy theories about your personal life.
00:04:37.960 Now, speaking for myself, I have literally no idea what the congressman was talking about.
00:04:42.940 I haven't seen that.
00:04:43.960 Doubtless it exists.
00:04:44.820 There are conspiracy theories.
00:04:45.820 Conspiracy theories about everybody and everybody's personal life.
00:04:48.200 If you're in public, people are theorizing about you on the Internet, kind of the nature of the Internet and kind of the nature of having authority.
00:04:53.800 But you'll notice that the congressman thinks this will be a compelling argument for the FBI director.
00:04:59.600 He basically just says, they're criticizing you and me, and they're not allowed to do that.
00:05:03.840 But he's not even pretending that the purpose of censoring speech, and that's what he's saying, we need to censor the speech, that the purpose of that would be to protect any vulnerable group.
00:05:14.380 Vulnerable groups, no.
00:05:16.280 They're criticizing us.
00:05:17.360 They can't do that.
00:05:18.960 And then, of course, he goes on to blame unseen foreign actors.
00:05:21.860 And by the way, that's something that I think most Americans would get behind, but that the congressman is not behind at all.
00:05:26.400 If we want to take the influence of foreign nations out of our politics, most Americans would applaud.
00:05:30.960 And that would start with not taking money from their lobbies.
00:05:34.000 That would be a welcome change.
00:05:36.060 But the idea that we need to censor what you say because the people who run everything don't want to be called out or have their personal lives made the subject of conspiracy theories, well, that's not really reform as much as it's just kind of classic old-fashioned tyranny, isn't it?
00:05:53.660 Shut up!
00:05:54.680 We have guns you don't, and we're going to make you.
00:05:57.140 And how are we going to make you?
00:05:58.060 That's the question.
00:05:58.720 So you may remember that last week the attorney general came out right after Charlie Kirk's death and said there is a distinction between speech, constitutionally protected, famously in the First Amendment and the Bill of Rights, and something called hate speech, a category that doesn't, strictly speaking, exist under the law but which a lot of people seem to believe exists.
00:06:18.620 And hate speech is never defined, like most of the most powerful words that we use to punish people, terrorism, for example, racism.
00:06:27.660 It's never actually defined.
00:06:29.320 What is that exactly?
00:06:30.080 We don't know.
00:06:30.720 And most people who understand the American story, who understand our government, who understand our culture, who care about continuing all of those things, reacted with outrage when the attorney general said that. You can't pass a law that will strip from us our God-given right to say what we think is true.
00:06:58.140 She addressed it in such a ham-handed way that it was obvious to everybody exactly what she was talking about, and they reacted.
00:07:05.180 We did too.
00:07:06.640 But in real life, that will not happen.
00:07:09.780 There will not be, I can say confidently, in my lifetime, a law in the Congress that says, explicitly, any American who says something our leaders don't like, anybody who traffics in conspiracy theories about our personal lives, will be shut down, fined, imprisoned.
00:07:26.760 An open, transparent censorship law will not pass through the House of Representatives or the United States Senate.
00:07:34.800 It will not be signed by the president.
00:07:36.200 Why?
00:07:36.520 Because it's just too obvious.
00:07:39.380 So instead, because censorship is coming if these people get their way, instead, they will invoke something called Section 230.
00:07:48.560 And you're going to hear a lot more about this without question.
00:07:51.300 It's never, again, explained very well.
00:07:53.800 And the reason it's not explained is because they don't want you to know exactly what they're doing.
00:07:56.860 So let me just give you the Cliff Notes version of what Section 230 is.
00:07:59.720 Section 230 is a section, 230, within the 1996 Communications Decency Act, and it is the piece of legislation often credited for creating the Internet.
00:08:10.580 It's the framework that Congress came up with at the dawn of the Internet to put parameters around what this is, to protect companies as they grew, to set laws around this new technology.
00:08:22.940 And one of the laws that they made, Section 230, shields Internet platforms from lawsuits.
00:08:32.300 It gives them legal immunity from lawsuits on the basis of slander, obscenity, things that are on their platforms that they didn't create.
00:08:44.580 In other words, it creates a distinction between a publisher, like a newspaper, a magazine, a television network, and a platform, Google, Facebook, X.
00:08:55.620 And the distinction allows the platforms to let other people post whatever they want without getting sued for it.
00:09:04.040 They cannot be held liable.
00:09:05.740 These big companies cannot be held liable for slander, hate speech, anything, really, on their platforms.
00:09:14.860 And as a result of this law, those platforms have come to dominate news and information globally.
00:09:21.640 In fact, when we talk about censorship, nobody's talking about censoring the New York Times, the Washington Post, NBC News, because nobody cares.
00:09:27.460 All meaningful information and all meaningful social movements are influenced by social media.
00:09:34.260 So if you want to get people whipped into a frenzy, if you want to change your government, for example, you're not going to take out an ad in the New York Times.
00:09:40.480 Of course not.
00:09:41.540 You're going to get something going on the social media platforms.
00:09:44.700 So they are huge.
00:09:46.540 They are completely dominant.
00:09:48.280 Information flows almost exclusively on them.
00:09:51.900 And all of this is possible because of Section 230.
00:09:54.900 Now, there's been a pretty vigorous debate for the last 20 years over whether this is a good idea.
00:10:00.440 And there are arguments against it.
00:10:01.920 One of them is, why would Google get a liability exemption when I don't have one?
00:10:06.840 You run a business.
00:10:08.320 You're just an American citizen.
00:10:09.660 You can be sued at any time under our famously loose and destructive tort laws.
00:10:14.820 And you can go out of business.
00:10:15.800 You can be bankrupted.
00:10:16.520 You can be destroyed.
00:10:18.060 They can do to you what they did to Alex Jones, for example.
00:10:20.020 The FBI can join up with some activist group and take your business away, wreck your life.
00:10:25.420 So why should these big tech companies be exempt?
00:10:28.900 Now, that's a real argument.
00:10:31.020 It's similar to the argument about the pharma companies.
00:10:33.880 Why should vaccine makers get a shield from lawsuits?
00:10:38.400 If I make playground equipment, I'm vulnerable.
00:10:40.680 If I make the COVID vaccine, I'm not.
00:10:42.780 That's a principled argument.
00:10:44.060 But what's interesting about the 230 debate is that both parties have been on both sides of it at various times.
00:10:52.120 The Republicans, for years, were mad at the big platforms because they were censoring conservatives, which they were.
00:10:58.280 And so they often muttered about revoking 230 shield protection unless they opened their platforms to all points of view.
00:11:05.940 In other words, they wanted to use Section 230 to end censorship.
00:11:10.320 There's no reason you should get a special carve out from the U.S. government, from the Congress, if you don't treat people equally, if there's not fairness and neutrality in the way you allow opinions to be broadcast on your platform.
00:11:21.640 That seemed like a fairly reasonable position.
00:11:24.120 But things have changed.
00:11:25.520 Now you're seeing Republicans invoke Section 230, pick up the cudgel that they hold over these huge tech companies and say, unless you censor, we will revoke Section 230.
00:11:42.380 And by the way, they are following in the footsteps of the leftward edge of the Democratic Party in doing this.
00:11:49.160 In 2020, Beto O'Rourke of Texas ran one of his many doomed campaigns for office.
00:11:54.740 That one, I think, was for president.
00:11:55.820 And he said, unless they get hate speech off the platforms, we're going to revoke Section 230 and put these people out of business.
00:12:03.920 By the way, the threat is enough.
00:12:06.320 That was the hope.
00:12:07.260 If we threaten them, then we don't have to do the censoring; we'll make Google, Facebook, Meta, and X do the censoring for us.
00:12:14.820 That was the idea, that no one can accuse us of violating the First Amendment or being for speech codes.
00:12:21.660 We'll make someone else do it.
00:12:23.760 He lost.
00:12:24.900 But then Joe Biden, that same year, 2020, said, actually, yes, we should use this threat to force the big tech platforms to censor in ways that we like.
00:12:36.440 And by the way, they did, they did throughout the Biden presidency, Facebook, then Twitter, Google, all censored opinions the Biden administration didn't like.
00:12:47.440 And they did this ultimately because they feared having their legal protection revoked.
00:12:54.020 That's why they did that.
00:12:55.060 That's what got them to act.
00:12:56.040 And Republicans, the sensible ones, looked at this and said, this is completely wrong.
00:13:00.600 It's totally immoral.
00:13:01.680 It's illegal for the U.S. government to be imposing censorship on its citizens.
00:13:05.440 It's against the Constitution of the United States.
00:13:07.440 And it's against, more important, natural law.
00:13:10.480 These are not rights we were given by the Biden administration.
00:13:13.320 These are rights we were born with.
00:13:14.520 And when you take them away from us, you are the criminal.
00:13:17.080 And they made that point.
00:13:18.160 All of a sudden, you are seeing Republicans take the position that Beto O'Rourke and Joe Biden took just five years ago.
00:13:28.900 Here, for example, and this will come as no surprise to you at all.
00:13:31.780 You will not be shocked to hear this.
00:13:33.580 Here is Senator Lindsey Graham of South Carolina running for reelection, making exactly the same case that Beto O'Rourke made.
00:13:41.980 Watch.
00:13:42.760 Section 230 needs to be repealed.
00:13:44.880 If you're mad at social media companies that radicalize our nation, you should be for a bill that will allow you to sue these people.
00:13:51.800 They're immune from lawsuits.
00:13:54.440 Oh, it should be repealed.
00:13:57.440 Everyone thinks, I'm very careful online.
00:14:00.200 And you probably already go out of your way to avoid sketchy websites and obvious scammers.
00:14:04.060 You're not giving money to Nigerian princes.
00:14:05.940 Good for you.
00:14:07.240 But is that enough?
00:14:08.300 No, it's not.
00:14:09.500 Big tech, invasive advertisers, even politicians can track what you do online.
00:14:14.880 From standard browsing.
00:14:16.100 And they get rich from it.
00:14:17.220 And they control you with the information that they glean.
00:14:19.800 And this whole ugly process begins with data brokers.
00:14:23.080 Should be illegal.
00:14:23.600 It's not.
00:14:24.540 These digital predators track everything that you do online.
00:14:26.900 Every click.
00:14:27.720 Every scroll.
00:14:28.560 Every search.
00:14:29.900 Then they search what you're doing.
00:14:32.420 Collect it.
00:14:33.260 Sell it to the highest bidder who exploits it for profit.
00:14:36.600 And control.
00:14:37.800 Again, it should all be illegal.
00:14:39.260 But it's not.
00:14:39.720 So in the meantime, ExpressVPN can help you fight back.
00:14:43.000 ExpressVPN reroutes all your online traffic through secure encrypted servers.
00:14:47.220 And that makes it impossible for data brokers to track you.
00:14:50.380 It's super easy to use.
00:14:51.680 It's rated number one by CNET and The Verge.
00:14:53.820 It can work on up to eight devices simultaneously.
00:14:56.740 It's protection for everyone in your house.
00:14:58.900 We use ExpressVPN.
00:15:00.000 It's chained to our company.
00:15:01.160 You should use it too.
00:15:03.480 Secure an extra four months for free when you use our link.
00:15:06.200 Scan the QR code on the screen or go to expressvpn.com slash tucker and you get four extra months
00:15:11.040 of ExpressVPN.
00:15:12.680 Expressvpn.com slash tucker.
00:15:15.240 Tulsa is my home now.
00:15:17.220 Academy Award nominee Sylvester Stallone stars in the Paramount Plus original series, Tulsa King.
00:15:22.980 This distillery is a very interesting business.
00:15:26.420 And we got to know the enemy.
00:15:28.320 From Taylor Sheridan, co-creator of Landman.
00:15:31.300 What are you saying?
00:15:32.620 I'm a rat!
00:15:33.160 If you think you're going to take me out, it's going to be really difficult.
00:15:39.640 Tulsa King.
00:15:40.680 New season.
00:15:41.620 Now streaming.
00:15:42.520 Exclusively on Paramount Plus.
00:15:44.880 You may have noticed this is a great country with bad food.
00:15:49.060 Our food supply is rotten.
00:15:51.200 It didn't used to be this way.
00:15:52.800 Take chips, for example.
00:15:54.460 You may recall a time when crushing a bag of chips didn't make you feel hungover.
00:15:59.760 Like you couldn't get out of bed the next day.
00:16:02.300 And the change, of course, is chemicals.
00:16:05.040 There's all kinds of crap they're putting in this food that should not be in your body.
00:16:08.760 Seed oils, for example.
00:16:10.580 Now even one serving of your standard American chip brand can make you feel bloated, fat, totally passive, and out of it.
00:16:20.740 But there is a better way.
00:16:21.820 It's called masa chips.
00:16:23.200 They're delicious.
00:16:24.380 Got a whole garage full of them.
00:16:25.720 They're healthy, they taste great, and they have three simple ingredients.
00:16:30.200 Corn, salt, and 100% grass-fed beef tallow.
00:16:34.880 No garbage, no seed oils.
00:16:37.240 What a relief, and you feel the difference when you eat them, as we often do.
00:16:41.260 Snacking on masa chips is not like eating the garbage that you buy at convenience stores.
00:16:46.120 You feel satisfied, light, energetic, not sluggish.
00:16:50.500 Tens of thousands of happy people eat masa chips.
00:16:54.500 It's endorsed by people who understand health.
00:16:56.760 It's well worth a try.
00:16:58.280 Go to masa, M-A-S-A, chips.com slash Tucker.
00:17:01.160 Use the code Tucker for 25% off your first order.
00:17:03.800 That's masachips.com slash Tucker.
00:17:07.720 Code Tucker for 25% off your first order.
00:17:11.160 Highly recommended.
00:17:12.540 Now, it's not clear from that clip exactly why Lindsey Graham is calling for the repeal of Section 230.
00:17:18.220 Why is he threatening the tech platforms?
00:17:20.320 And by the way, the pretext always changes.
00:17:22.560 They'll tell you, well, we're against child sex trafficking, as if anyone is for it.
00:17:26.680 We're against terrorism, a term, once again, they never define and don't have to.
00:17:31.180 We're against drugs.
00:17:33.000 We're against foreign influence.
00:17:34.620 We're against bigotry.
00:17:35.740 Whatever.
00:17:36.520 They will always give you an excuse, and that excuse will make them sound like the virtuous party, like the good guys.
00:17:43.700 We're here to save the vulnerable.
00:17:45.920 But that's never the real reason.
00:17:49.840 Censorship always and everywhere is imposed with the intent and always has the effect of shielding the powerful.
00:17:58.040 They're the ones who don't want to be exposed.
00:18:01.920 Free speech, by contrast, and this is the reason it's in our Bill of Rights, is the one great power that the powerless have.
00:18:10.960 Especially in a world where your vote may or may not matter, all you have is your voice.
00:18:16.760 All you have is your opinion.
00:18:18.840 And that's infuriating to Lindsey Graham's donors.
00:18:21.740 And make no mistake, when he's calling for invoking Section 230 and taking it away, threatening the big platforms,
00:18:28.760 he's doing that on behalf of his donors who feel criticized by random accounts on the Internet.
00:18:33.780 And they hate it, because the people in charge always hate to be called out.
00:18:41.680 Censorship has one goal, and that's to preserve secrecy.
00:18:45.760 And secrecy has one purpose, and that's to abet wrongdoing.
00:18:50.060 So people who are doing nothing wrong are transparent.
00:18:53.920 People who are committing evil hide.
00:18:56.160 And censorship allows them to hide.
00:18:58.060 It's literally that simple.
00:18:59.240 And those people are the most powerful people in the country.
00:19:01.820 So who's encouraging this?
00:19:05.620 The donors, whoever they are.
00:19:07.680 But there are lots of lobby groups, all of them on the left, pushing the Republican-led Congress to get behind censorship initiatives,
00:19:17.160 using the cover of the Section 230 debate to get it done.
00:19:21.960 To pressure the tech companies into making you shut up.
00:19:24.880 Into taking your opinions off the Internet.
00:19:27.320 Using algorithms designed to censor you without even a human being entering into the equation.
00:19:33.040 No person will decide that your opinion is offensive and pull it off.
00:19:36.320 The computer will decide that.
00:19:37.700 And it will be aided by the massive exponential growth in computing power that is at the very center of tech right now.
00:19:45.860 AI.
00:19:46.120 That is the goal, to make certain that opinions that are disruptive to the people in charge never see the light of day.
00:19:55.180 What's amazing, and what's especially infuriating, is that many in the Republican Party,
00:20:01.280 the party that controls all branches of government right now, are completely for this.
00:20:06.580 Strongly for it.
00:20:08.380 Where did they get this idea?
00:20:09.680 Is this a betrayal?
00:20:11.920 Oh, it's a betrayal.
00:20:13.180 How profound a betrayal?
00:20:15.680 Listen to Congressman Don Bacon of Nebraska, a former Air Force general, describe who he's been talking to about censorship.
00:20:23.780 And I appreciate, John, the agreement, what you and the ADL stand for.
00:20:29.060 I know you made us better with your feedback and ideas and recommendations, and it's been a treat to get to know you.
00:20:35.080 We want to be in a country that makes clear that anti-Semitism or any kind of racism is repugnant, unacceptable, not allowed in my space, and with a zero tolerance for it.
00:20:48.000 So we need to hold these companies accountable and work with them to take it off the airwaves.
00:20:53.400 It's hard to believe that's a real clip. We actually checked.
00:20:55.620 Is that real?
00:20:57.040 Congressman Don Bacon of Nebraska, a great and sensible state with tons of normal people, a former Air Force general.
00:21:03.820 Is he really colluding with Jonathan Greenblatt of the ADL to take away your right to say what you think?
00:21:12.600 Oh, you bet.
00:21:14.320 That's exactly what he's doing.
00:21:16.540 And make no mistake, the ADL is not an anti-defamation organization.
00:21:22.060 The ADL practices defamation and slander and bullying, not in service of protecting a marginalized group, but in service of accruing power and advancing its goals, which are ideological.
00:21:36.560 And if you don't believe that, go on the ADL's website and take a look at what the ADL considers hate speech.
00:21:43.080 Hate speech, another one of those terms never quite defined, but the ADL has actually taken the time to define it.
00:21:48.180 What do they consider hate speech?
00:21:49.980 Well, among other things, complaining about drag queen story hour is hate speech, according to the ADL.
00:21:57.080 Huh.
00:21:57.340 Not being enthusiastic about the COVID vax, that's hate speech, and it's dangerous.
00:22:03.680 Noticing that the American population has changed completely in the past 30 years thanks to immigration, that's dangerous hate speech.
00:22:11.420 You should be punished for that, for noticing, in the country that you were born in.
00:22:16.720 No noticing.
00:22:17.680 You can't notice that it looks completely different because of decisions that someone who never consulted you made without your knowledge.
00:22:24.180 Shut up, says the ADL.
00:22:26.800 You not only don't have the right to speak, we're going to scream at you and call you a Nazi and imply that you are the dangerous one.
00:22:32.980 The people who opened up the borders to 50 million foreigners.
00:22:36.800 But you're the dangerous one.
00:22:38.620 Sure.
00:22:38.940 You know what else is hate speech, by the way?
00:22:40.660 Reading the gospels of Matthew or Mark or Luke or John, the gospel itself, Christianity itself is hate speech.
00:22:46.180 I know that because three nights ago, I recounted the Christian story in its essence over like five minutes and was immediately denounced by the ADL as someone who was dangerous and inspiring murder.
00:22:58.940 But I'm not the only one.
00:23:01.780 The ADL has actively attacked the Christian gospel for years, has gotten behind a definition of hate speech that includes the Christian story.
00:23:11.380 That's not an exaggeration.
00:23:13.140 That's not a fervid conspiracy theory.
00:23:15.940 That's a fact and you can look it up.
00:23:18.340 So this is the guy?
00:23:20.280 That's the guy, Jonathan Greenblatt, the most aggressively left-wing, democratic-aligned, but much more important than that, lunatic, anti-human, anti-American group, the ADL, completely corrupt.
00:23:34.380 He's consulting that guy to decide how much speech you should have because there are ugly opinions on the internet?
00:23:44.620 Yeah.
00:23:45.920 That's your Republican Party.
00:23:47.380 Was he denounced by his fellow Republicans in the House?
00:23:50.440 Was he denounced by the Speaker of the House?
00:23:53.340 Speaker Johnson?
00:23:54.260 No, he wasn't.
00:23:55.540 They barely even noticed because they have the same views.
00:23:57.960 Not all of them, but an awful lot of them.
00:24:01.040 It's unbelievable and it's counterproductive.
00:24:04.380 Because, once again, censorship is never enacted to help the powerless.
00:24:09.400 It is always and everywhere an effort to shield the powerful.
00:24:13.200 Always.
00:24:14.160 And, in fact, it has a counterproductive effect on the people it is supposedly designed to help.
00:24:19.700 How would you feel about any person you're not allowed to criticize?
00:24:24.060 Would that make you like the person more?
00:24:25.700 No.
00:24:26.080 It would make you resentful and suspicious.
00:24:29.760 And it would give you the well-deserved opinion that this is not an egalitarian society in which we're all citizens.
00:24:36.980 It's a hierarchical society in which the government has decided some people have more rights than others.
00:24:41.640 So, if you find out you're not allowed to criticize someone else, maybe the first question you might ask is,
00:24:46.600 Well, then, why are people allowed to criticize me?
00:24:49.420 And the answer is because some people have more power in our society.
00:24:52.720 Or are being used to pit different groups against each other.
00:24:55.340 Or who knows what's going on.
00:24:56.540 But none of it is consistent with the core promise of this country, which is we're all citizens under our government.
00:25:03.820 And we're all equal before our God who made us.
00:25:06.480 It says that.
00:25:08.620 But increasingly, that's not the country we live in.
00:25:10.880 We live in a country where some people have more rights than others.
00:25:13.820 And that's exactly the kind of message you would send if you wanted to foment a revolution against your government.
00:25:18.460 Because it enrages people.
00:25:20.000 And it divides them from each other.
00:25:21.800 Oh, we have to protect this group.
00:25:23.080 What does everyone else think of that?
00:25:24.580 They secretly don't like the group.
00:25:27.500 You're not ending bigotry by enacting censorship.
00:25:31.800 You're creating it, dumbo.
00:25:34.820 And this is specifically aimed at Congressman Bacon, who was somehow an Air Force general.
00:25:40.660 He can't be dumb.
00:25:42.160 But he's obviously not very thoughtful because this is very obvious.
00:25:46.400 People don't like other people who get special treatment.
00:25:52.320 Were you never a child?
00:25:53.200 Did you never learn that?
00:25:55.580 Who knows what the purpose is here?
00:25:57.140 It doesn't even matter.
00:25:59.780 It is happening right before us.
00:26:02.580 The people who are elected to protect us, who say they're our friends, are selling us out.
00:26:08.200 And you can theorize as to why.
00:26:10.500 And by the way, all of that theorizing is itself unhealthy.
00:26:13.700 Where do conspiracy theories come from?
00:26:15.500 Where do you think they come from?
00:26:16.500 They come from living in a country where the government will never explain anything and lies constantly.
00:26:21.400 So the next time you see someone in power complain about malicious conspiracy theories, stop him in mid-sentence and say they exist because of you.
00:26:28.040 If you just tell the truth, if you would live like Charlie Kirk and answer the question politely, reasonably, fully, there wouldn't be a vacuum into which lunatics would rush.
00:26:42.060 We would have a plausible answer to basic questions like, what the hell is going on?
00:26:47.420 But because you haven't provided that, what do you think is going to happen?
00:26:52.440 People are going to have some pretty far out explanations.
00:26:55.160 And maybe some of them are true, by the way.
00:26:56.940 We don't know.
00:26:58.240 Your behavior is so suspicious because you can't answer any question straight.
00:27:03.940 Any.
00:27:04.260 And you're spending your time talking to Jonathan Greenblatt, one of the darkest, most corrupt people in our society, truly a divisive figure.
00:27:13.680 Speaking of divisive, how many Americans have made fellow Americans hate each other more consistently over the years than Jonathan Greenblatt?
00:27:21.660 Very, very few.
00:27:22.680 Very few.
00:27:24.220 And you're talking to him?
00:27:25.420 So if this sounds like a paranoid rant, like, oh, that could never happen, well, you should know that it is happening right now in the state of California.
00:27:36.680 The state of California, like, two weeks ago, ten days ago, eight days ago, something like that, has passed a law in the state legislature, both chambers of the state legislature.
00:27:49.580 It awaits a signature from Gavin Newsom that would ban hate speech on the Internet in California.
00:27:58.940 Hate speech.
00:27:59.640 Now, how do they define hate speech?
00:28:00.960 I actually have the definition.
00:28:02.060 I actually wrote it down because I was so shocked by it that this is happening.
00:28:05.720 The state of California, if Gavin Newsom signs this law, and he has until October 13th to do it,
00:28:10.980 people will be fined if the censors determine that speech constitutes, and we're quoting now, violence, intimidation, or coercion.
00:28:21.340 What's intimidation or coercion, right?
00:28:24.080 Or coercion based on race, religion, gender, sexual orientation, immigration status, or other protected characteristics.
00:28:32.040 Obviously, white Christian men are not covered under that, and so the society becomes ever more hierarchical with a Brahmin class and untouchables at the bottom,
00:28:43.960 the opposite of the country all of us over 50 grew up in that had an egalitarian spirit where some were rich, some were poor, some were smart, some were dumb, some had good jobs, others were unemployed,
00:28:52.980 but all of us were considered equal under the law and equal in the eyes of God.
00:28:58.000 And that concept is the basis of a stable society, any stable society, and it was the basis of stability in this country.
00:29:04.980 And laws like this and the attitudes that give birth to them, to laws like this, have made it wildly unstable.
00:29:13.660 Wobbly, it's so unstable.
00:29:16.440 So, as of October 13th, that could become law.
00:29:19.740 Now, that's a censorship law.
00:29:21.260 Now, they'll say, no, no, no, we're just, we're actually getting the platforms to censor.
00:29:24.320 Well, right, you're getting someone else to do the job for you, but if you hire a hitman and he carries out the hit, you're the murderer.
00:29:33.320 He participated in it, but you hired him.
00:29:35.660 And that's exactly what's going on here.
00:29:37.280 The state of California, under Gavin Newsom, is about to, we think, censor the opinions of Americans.
00:29:45.220 Not to protect anybody, but to shield themselves from criticism so they can continue to do what they want to do in secret.
00:29:51.800 Jonathan Greenblatt, the head of the ADL, applauds this.
00:30:04.080 And in case you're not familiar with Jonathan Greenblatt, and in case you want a sense of what he's like and what he considers hate speech,
00:30:10.480 let's just go right to the tape so you know that we're not exaggerating.
00:30:14.940 This is Jonathan Greenblatt of ADL.
00:30:16.260 When you look at the prevalence of anti-vaxxer accounts that have been amplified and spread across Facebook,
00:30:22.800 they don't show up on your network, but they show up every day to billions of people
00:30:27.060 because Facebook profits from amplifying these voices, which are literally killing people.
00:30:33.260 And freedom to express your opinion isn't the freedom to incite violence.
00:30:37.400 But, but for Facebook, it is, and that needs to change.
00:30:41.300 That's all. It's simple.
00:30:43.240 There's nothing wrong with keeping all of us safe from violent white supremacists or hateful people.
00:30:51.460 So criticizing the COVID vax is tantamount to murder.
00:30:55.920 There's never been, I mean, obviously that's prima facie insane.
00:30:59.100 It's untrue.
00:31:00.240 It's a deranged perspective.
00:31:01.820 But more than anything, you're seeing who Jonathan Greenblatt really is.
00:31:07.260 He is a faithful Praetorian guard for the people in charge.
00:31:12.200 This is not someone who's ever challenged actual power, not once in his life.
00:31:16.380 That's who he works for.
00:31:17.960 That's who he takes money from.
00:31:22.220 That's what hate speech looks like.
00:31:23.980 Anybody in charge can make you shut up when you criticize them or stand in the way of their aims.
00:31:34.080 So in case you don't think this can come to the United States, one final clip, and it's a sad one, and it comes from the UK.
00:31:42.200 Now the UK, obviously the country that gave birth to ours, a cousin, a country so similar to ours and so close, six hours by plane overnight, that we don't really think of it as fully foreign.
00:31:52.420 It's not like going to Malaysia or Burundi or even France.
00:31:56.760 It's an English-speaking country whose customs are recognizable,
00:32:01.000 whose government and common law form the basis of our government and our law.
00:32:07.480 Everything about England seems like home but three degrees off.
00:32:13.280 And yet the UK has become a police state.
00:32:17.800 And if you don't believe that, if you think that's just hyperbole designed to whip you into a frenzy, here's a stat that we checked, and it's hard even to believe this is true, but this is true.
00:32:29.660 2023. So like a year and a half ago. How many people do you think were arrested in the United Kingdom for speech violations, arrested by the police, handcuffed and brought to jail in 2023? A couple dozen, you know, the ones you see on X, the ones Fox News talks about. How many people in that year were arrested for saying things the government didn't want them to say? What's your guess? Is it more than 12,000? Because that's the answer. More than 12,000.
00:33:00.180 Wow, that seems like a lot. Is that a lot? I mean, it's kind of hard to know, right? Okay.
00:33:05.500 Well, let's compare it to the number, the widely agreed upon number, from the most totalitarian country in the world. A country so lacking in basic freedom. A country run by a madman. A country that's so evil, we're literally at war with that country right now, just on principle. Because we so disapprove of how they treat their people. And that country, of course, is Russia under Vladimir Putin.
00:33:30.220 So if the UK handcuffed 12,000, more than 12,000 people in one year for saying things the government didn't like, how many were arrested in Russia? A country with twice the population of the UK. Oh, we happen to have the number. 3,319.
00:33:49.700 So to restate, more than 12,000 people arrested in the United Kingdom, England, in one year for speech code violations, 3,300 arrested in Russia, a country with twice the population.
00:34:05.580 So that tells you, you don't think totalitarianism can come to the Anglosphere? Oh, it already has. We haven't even touched on Australia, New Zealand, Canada. In some ways, even worse.
00:34:16.780 But what does it look like? What is the face of hate crime prosecution? What does it actually look like when a citizen is arrested for saying something the government doesn't like?
00:34:27.660 This video is not from China. It's from the United Kingdom. This is a British veteran being arrested for offending the government. Watch.
00:34:35.280 The scripture police would realise how ridiculous this is. It is. It is. It didn't need to come to this. What did it need to come to? Tell us why you escalated it to this level. Because I don't understand. I posted something that he posted. You come to arrest me, you don't arrest him. Why has it come to this? Why am I in cuffs? Because it's something he shared, then I shared.
00:34:57.020 Because someone has been caused obviously anxiety based upon your social media page. That's why you've been arrested.
00:35:07.300 Oh yes. The velvet-wrapped jackboot of British fascism. You're being arrested because someone has been caused anxiety by your views.
00:35:16.480 Notice that someone is never identified, and of course the answer is someone in power. Someone in the government, or someone who funds the government, someone close to the government, someone who has a lot more power than you, didn't like what you were saying, felt anxious about what you were saying, and so unfortunately we're going to have to handcuff you and bring you to jail.
00:35:31.740 Ho, ho, ho, ho, ho, ho.
00:35:34.100 That happened this year. That happened in, that's from January, and it happens every single day. More than 12,000 people arrested every single year for criticizing their government in the UK.
00:35:42.940 Our closest ally, with whom we share intelligence on every level. British intelligence, I know everyone's spun up about Mossad, very close to Mossad, we're closer to British intelligence. That's the country we're partnering with to spy on our respective populations.
00:36:00.280 Yeah. So it's really, really simple. If a government, if your government is willing to arrest you for saying things that they don't like, if your government is arresting you for criticizing them, one way or another, you need a new government.
00:36:18.900 If there is any justification for revolution, it's that. That's unacceptable. That's tyranny. A government that does that is not a legitimate government. It has absolutely no right to do that, and it should be stopped from doing that immediately. That's the red line right there.
00:36:34.820 So the First Amendment is the one truly distinctive thing that makes America, America. It makes this country great. You are a citizen. That means you can speak openly and honestly, without fear, about what you actually believe. The government doesn't own you. You own the government. That's the premise, and for 250 years we've lived it. We hope to keep living it.
00:36:56.120 Our sponsor, Pure Talk, understands how important and how central it is. So if you want to support brands that defend freedom and American values, we recommend switching your wireless service to Pure Talk, which is way cheaper and uses the same towers the other guys use. It's the best.
00:37:10.160 We know what you're thinking. Of course, giving business to companies that share your values sounds nice, but at the end of the day, you don't want to spend more for the privilege of buying products from a company that loves America. Well, you don't have to. Pure Talk's plans start at just $25 a month, $25 for the same 5G coverage the other companies provide, literally the same cell towers. And you support a business that believes in this country and creates jobs here in this country.
00:37:37.220 If you're interested, visit puretalk.com slash Tucker to switch to the wireless company that we use, Pure Talk. Right now, you save an additional 50% off your first month. Again, puretalk.com slash Tucker.
00:37:48.380 Ontario, the wait is over. The gold standard of online casinos has arrived. Golden Nugget Online Casino is live, bringing Vegas-style excitement and a world-class gaming experience right to your fingertips.
00:38:01.200 Whether you're a seasoned player or just starting, signing up is fast and simple. And in just a few clicks, you can have access to our exclusive library of the best slots and top-tier table games. Make the most of your downtime with unbeatable promotions and jackpots that can turn any mundane moment into a golden opportunity at Golden Nugget Online Casino.
00:38:21.380 Take a spin on the slots, challenge yourself at the tables, or join a live dealer game to feel the thrill of real-time action, all from the comfort of your own devices. Why settle for less when you can go for the gold at Golden Nugget Online Casino?
00:38:35.340 Gambling problem? Call Connex Ontario 1-866-531-2600. 19 and over. Physically present in Ontario. Eligibility restrictions apply. See GoldenNuggetCasino.com for details. Please play responsibly.
00:38:48.400 So you probably got Verizon, AT&T, T-Mobile. That means you are definitely way overpaying for wireless service. And we're not just saying that. It happens for a reason. When you join a massive cell phone company, you get charged to support everything that their operation is doing. And that's a lot. Big corporate programs, huge HR departments, thousands of retail stores you're never going to visit. You think your money is going toward getting better cell service, 5G service, but it's not.
00:39:16.220 So the wireless company we use, PureTalk, is very different. They use the exact same cell network as the companies we just mentioned, but they don't do any of the other garbage. And for you, that means $25 a month for your phone's data plans. Actually, $25 a month. You'll be amazed.
00:39:34.260 Switch to PureTalk. It's super easy. Visit puretalk.com slash Tucker and you save an additional 50% off your first month. PureTalk.com slash Tucker. It literally takes minutes. It's America's wireless company.
00:39:47.360 Michael Schellenberger is one of the great reporters in the United States, a friend of ours, and someone who, as a former liberal, has probably thought about speech for more years and with more clarity than probably anyone I know. And so we're so grateful to have him on to assess the state of free speech in the United States two weeks after Charlie Kirk's assassination. Mike, thanks very much for coming on. Are you worried?
00:40:14.200 I'm very worried. I mean, I think what's maybe undersaid recently is that, you know, assassination is the ultimate form of censorship. You know, and it comes from the same place. You know, I think that's what everybody senses about it, is that there had been efforts, you know, to censor, you know, and they had, I think, censored Charlie Kirk, obviously. I mean, the Twitter files, we discovered that he was on a blacklist. There were obviously huge attempts to keep him out of universities. He had already had, you know, many death threats, which is not a direct form of government censorship, but these are societal demands that he be silenced.
00:40:51.360 And then at a global institutional level, yeah, it's a very disturbing trend that we're seeing.
00:40:57.140 I mean, I think there's two things happening. There's both an organic kind of demand from powerful people, like the kind that you were describing, where a politician just really, you know, I think it was the Moskovits, where you just kind of can't stand something and they just want to see something taken down, and you'll see that from groups and politicians.
00:41:16.380 And then there's more of an inorganic demand for censorship, which we've labeled the censorship industrial complex. You can call it the censorship industry, and that is in place in the European Union, in Britain, in Brazil. California would like to have that. Basically, all the Five Eyes countries are pursuing that, and their strategy, I think, is pretty clear at this point: it is to encircle the United States and to make our tech platforms censor along the lines that they would like, so they can achieve censorship through the back door.
00:41:51.040 And this has always been their strategy, because they know that the First Amendment is a major obstacle for them, since it requires that the people have really radical levels of free speech that no country has ever come close to.
00:42:04.220 And as you said, based on this idea of natural rights that we are granted by our creator, not given to us by the government, the speech comes before the government. The speech is how we constitute our government. Whereas in Europe and everywhere else, the government had gradually let people say certain things. You have to petition the king. Oh, king, can we criticize you for sleeping with Anne Boleyn? And the king would decide whether that would be okay or not. And that's how it would occur.
00:42:30.540 And the creators of this amazing country, and as you work on free speech, I've only worked on it now for really two and a half years. I'm a newbie to the issue.
00:42:39.520 But one of the things you just really appreciate is really how radical and powerful and strong that commitment to the First Amendment was. It's not just hype. You might just think, is that just patriotic hype from Americans? It's not. When you read the history of free speech over 2,500 years, going back to Socrates, who was put to death for things that he said, also an act of censorship, then you realize just how radical, and how beautiful, the First Amendment is. Because the Americans that created our country said, we don't want to have a country, we don't want to have a government, if we can't have full free speech, with some very narrow exceptions.
00:43:15.720 And so the exceptions, now they sort of say, well, the Internet changed everything. That's what you hear from, that's what I hear from my progressive friends. Heard it on Martha's Vineyard, of all places. It's all changed with the Internet. It's too dangerous to allow this high level of free speech. We have to change things.
00:43:30.720 Well, Tucker, to put in context how crazy that is, our Supreme Court has ruled not once but twice that Nazis can march through neighborhoods not only of Jewish Americans but of Holocaust survivors, and that the line that gets crossed between speech and actual violence is when I say, go kill that person there, or go light that house on fire. It's when the speech becomes part of the action, or coordinating an assassination or something. Of course, language in that context has to be illegal, because it's part of breaking the law.
00:44:09.800 But marching through a neighborhood with the most vile ideology is something that the Supreme Court has twice upheld.
00:44:16.360 Well, now we're supposed to believe that some racist comments on a Facebook post, or, as you said, it's really political. It's really about stifling the conversation around migration, gender, climate. I mean, it's actually been less on race, a huge amount on trans issues.
00:44:36.920 I mean, we just saw a British gentleman, Graham Linehan, a famous television comedy writer, get arrested when he landed in Britain for having urged biological women to defend themselves from biological males that come into their bathrooms. So it is a very serious threat, Tucker.
00:44:57.020 I think that the thing to keep our eye on is they've been trying to basically get governments to empower a mostly secret group of so-called NGOs that would be financed by the government and who would be telling the social media companies what to take down. In some places, they're more subtle with it than in others. That's the big threat.
00:45:16.460 The Trump administration has done some great things to defund that. It was all going to work through NSF, and then Congress would have to bless it, and that was the way that they were going to do it. So the Trump administration has done a great job defunding that, and also holding a strong line on both Europe and Brazil, demanding that free speech protections be there.
00:45:36.740 But obviously, we see some backsliding and some behaviors over the last couple of weeks that were lamentable. Certainly the Attorney General's comments, which she then later kind of went back on and said she didn't mean what she had said. And then obviously, or maybe not obviously, a dust-up over the FCC chair and his comments around Jimmy Kimmel, which, I got to say, as the days go by and you look in retrospect, just seems absurd. I mean, you had Democrats trying to create this elaborate censorship system, and then you had some bad-mouthing of Jimmy Kimmel. It wasn't really, it wasn't great. I think it was inappropriate.
00:46:16.500 But I think there were a lot of other complicating factors too, and there were, you know, economic concerns around Jimmy Kimmel, and just sort of this demand that, you know, that he have to be carried on every television station. I think in retrospect, we won't look back on it as a great moment. I don't think the Trump administration covered itself in glory over the last couple of weeks on that. But on the European side, in holding strong on free speech, and also in standing up to Brazil, I do give the Trump administration credit.
00:46:45.840 All it did was save Jimmy Kimmel. I mean, Jimmy Kimmel is on his way out. Nobody watches that. It's crap. I mean, it has no effect on American society. It's just masturbatory. It's really, he's playing for an audience of one, himself, and this kind of allowed him to pose as a free speech defender.
00:47:01.180 I got to say, the TikTok thing, I think I sort of missed that. I don't know why I wasn't paying attention. I should have been, but TikTok was banned by the Congress, forced to sell. It was a Chinese-owned company, ByteDance owned TikTok, and the argument in the Congress was, well, we can't have foreign ownership of, you know, a critical service like social media in this country, and so it has to be owned at least 51% by Americans.
00:47:28.780 Okay, I just, I don't know why I missed the significance of this. Then it turns out, and they said this openly, it had nothing to do with China, and members of Congress said this.
00:47:37.460 I'm, a lot of Republicans, I'm voting to shut down TikTok because people are starting to like Hamas when they watch it. Now, I'm not endorsing Hamas, obviously, I'm not pro-Hamas at all, but Americans have a right to like anything they want, and to come to their own conclusions about some foreign conflict, or even domestic conflict, any conclusion they want, because they're not slaves.
00:48:01.740 So, is it okay for the Congress to decide, I don't like the effects, the radicalization, of this one social media platform, so I'm going to shut it down? I mean, is that allowed? Can they do that?
00:48:15.440 Well, this gets to, let me, let me come to it by addressing one of the things that you, I thought rightly, discussed in your monologue, which is this very wonky but important issue of this law called Section 230, and the nature of these platforms that we have. And I think it's helpful to think of these platforms at this point as utilities; they're monopoly utilities.
00:48:37.640 Right.
00:48:37.880 You could say
00:48:38.320 there's some competition
00:48:39.220 between Instagram
00:48:40.220 and TikTok
00:48:40.780 and X
00:48:41.900 and there's truth
00:48:42.540 to that
00:48:42.900 but there are often
00:48:43.960 situations
00:48:44.600 in monopoly environments
00:48:46.100 where there's
00:48:46.920 some competition
00:48:47.660 but they really do
00:48:49.360 operate,
00:48:50.220 I think,
00:48:50.460 functionally
00:48:50.960 as monopolies
00:48:52.260 and they're
00:48:52.720 already regulated
00:48:54.340 monopolies
00:48:54.940 by Section 230.
00:48:56.020 It's already
00:48:56.400 saying to them
00:48:57.180 you're a different
00:48:57.760 category of business,
00:48:59.560 you're not liable.
00:49:00.760 If you take down
00:49:02.020 illegal content,
00:49:04.520 you can't be sued
00:49:05.460 for having had
00:49:06.500 that content
00:49:07.140 on your website.
00:49:07.880 It still requires
00:49:08.560 them to take it down.
00:49:10.260 I think my view,
00:49:11.680 and I've published
00:49:12.600 a couple of white papers
00:49:13.640 on it,
00:49:14.200 I've testified on it,
00:49:15.480 hasn't exactly
00:49:16.580 caught fire,
00:49:17.680 my proposal.
00:49:19.140 I've got a lot of views
00:49:19.900 like that too.
00:49:21.740 Yeah,
00:49:24.640 look,
00:49:24.840 one thing you have
00:49:25.200 to understand
00:49:25.420 about these big tech
00:49:26.720 companies,
00:49:27.120 they're so powerful.
00:49:28.700 I was shocked
00:49:29.660 when I learned that
00:49:30.180 like Facebook
00:49:30.720 has a different
00:49:31.400 lobbyist for House
00:49:32.660 Republicans
00:49:33.220 than for House
00:49:34.440 Democrats
00:49:35.040 and a separate
00:49:35.460 lobbyist for Senate
00:49:36.740 Republicans,
00:49:37.440 Senate Democrats.
00:49:38.280 I mean,
00:49:38.480 these guys really
00:49:39.360 put a lot of money
00:49:40.400 into having that
00:49:41.200 control over Congress.
00:49:42.100 So that's a little
00:49:43.620 bit like the electric utilities'
00:49:44.560 power at the state
00:49:49.300 level.
00:49:49.680 So it's a regulated
00:49:50.540 environment,
00:49:51.100 but I do think
00:49:51.800 public interest voices
00:49:52.760 like yourself
00:49:53.520 and Joe Rogan
00:49:54.380 and others
00:49:54.960 out there carrying
00:49:56.900 this message
00:49:57.400 are really important
00:49:58.260 because I think
00:49:59.560 what's in the
00:49:59.940 public interest
00:50:00.700 is that we
00:50:01.700 actually do
00:50:02.380 keep 230
00:50:03.420 but make it
00:50:04.880 contingent
00:50:05.700 on allowing
00:50:06.540 all adult
00:50:07.640 users
00:50:08.180 to filter
00:50:10.240 their own
00:50:10.980 content,
00:50:11.940 their own
00:50:12.360 legal content.
00:50:13.640 So in other
00:50:14.020 words,
00:50:14.900 all the child
00:50:15.640 exploitation stuff
00:50:16.740 obviously still
00:50:17.820 being policed
00:50:18.540 as we do now.
00:50:19.980 All of the,
00:50:21.120 you know,
00:50:21.640 copyright violation,
00:50:22.640 all that stuff
00:50:23.240 still being policed
00:50:24.240 as is now.
00:50:25.140 But when you
00:50:25.480 would go into
00:50:25.940 a social media platform,
00:50:26.880 you'd have a chance
00:50:27.480 to basically decide
00:50:28.500 what you wanted to see
00:50:29.280 and what you didn't
00:50:29.820 want to see.
00:50:30.440 And there was
00:50:30.780 even some talk
00:50:31.480 among Republicans
00:50:34.080 who I respect
00:50:34.760 but disagree with
00:50:35.320 on this issue,
00:50:36.180 who were upset
00:50:36.180 that the video
00:50:37.540 of Charlie Kirk
00:50:38.220 being assassinated
00:50:39.180 was on X.
00:50:40.840 I mean,
00:50:41.020 it was quite shocking.
00:50:42.060 I have to agree
00:50:42.780 with that,
00:50:43.420 but I don't think
00:50:43.920 the solution
00:50:44.340 is to necessarily
00:50:45.120 take it down.
00:50:45.880 But you could
00:50:46.160 certainly create
00:50:46.760 your own filter
00:50:47.340 that I want to
00:50:47.920 filter out any scenes
00:50:49.120 of people being
00:50:49.980 physically harmed.
00:50:51.280 You could have
00:50:51.560 a lot of different
00:50:52.140 filters.
00:50:52.560 You could have
00:50:52.820 the Tucker Carlson
00:50:53.620 filter.
00:50:54.460 You could have
00:50:54.720 the Greta Thunberg
00:50:55.460 filter.
00:50:56.620 But the users
00:50:57.660 would be able
00:50:58.300 to do that.
00:50:58.980 And then,
00:50:59.240 of course,
00:50:59.420 the platforms,
00:50:59.920 as X already does,
00:51:00.560 can feed you their own
00:51:02.800 separate stream.
00:51:04.320 I think Elon
00:51:05.280 has gotten
00:51:05.960 pretty close
00:51:06.880 to that at X.
00:51:08.160 Not as close
00:51:09.040 as I would love
00:51:09.680 to see it,
00:51:10.680 but we're so
00:51:11.340 grateful because,
00:51:12.640 I mean,
00:51:12.960 the impact
00:51:13.560 that he's had
00:51:14.460 has been so
00:51:14.960 enormous in terms
00:51:16.280 of ending
00:51:17.080 this censorship
00:51:18.820 fact-checking
00:51:20.160 mafia.
00:51:21.620 Mark Zuckerberg
00:51:22.480 at Meta
00:51:23.160 earlier this year
00:51:23.960 decided he was
00:51:24.560 going to copy
00:51:25.100 the Elon Musk
00:51:26.340 model of
00:51:26.920 crowdsourcing,
00:51:28.200 which is what
00:51:28.680 the spirit
00:51:29.260 of the First
00:51:29.680 Amendment is.
00:51:30.200 We crowdsource
00:51:30.800 truth with the
00:51:31.460 First Amendment.
00:51:32.580 And then we just
00:51:33.080 saw Google
00:51:33.760 yesterday in a
00:51:35.480 letter to
00:51:36.200 Chairman Jim
00:51:36.940 Jordan in the
00:51:37.540 House said that
00:51:38.660 they would move
00:51:39.240 to something
00:51:39.740 more like that.
00:51:41.120 So those are
00:51:41.700 good directions,
00:51:42.860 but for me,
00:51:43.900 the only goal
00:51:45.120 of Section 230
00:51:45.840 reform should be to
00:51:47.000 actually expand
00:51:49.040 the speech that's
00:51:50.480 allowed, not
00:51:51.440 restrict it.
00:51:52.300 What they want
00:51:52.860 to do is they
00:51:53.640 want to basically
00:51:54.360 give the deep
00:51:55.960 state, for lack
00:51:56.900 of a better word,
00:51:57.780 DHS, NSF,
00:51:59.860 DOD, the power
00:52:01.420 to kind of choose
00:52:02.460 the people that
00:52:03.620 will decide what
00:52:05.040 the truth is via
00:52:06.080 these NGOs, who
00:52:06.080 would then get NSF
00:52:07.080 money, which is
00:52:08.040 public money,
00:52:08.800 National Science
00:52:09.200 Foundation money,
00:52:10.380 and that they
00:52:10.880 would then get
00:52:11.360 special access to
00:52:12.400 the data.
00:52:12.980 I mean, this was
00:52:13.340 their whole vision
00:52:13.960 and they were
00:52:14.260 close to achieving
00:52:15.100 it.
00:52:15.300 We only really
00:52:15.760 discovered it
00:52:16.360 with the Twitter
00:52:16.820 files.
00:52:18.020 That's their
00:52:18.660 holy grail, to be
00:52:19.460 able to control
00:52:19.980 it that way.
00:52:21.360 They're set back
00:52:22.300 in the United
00:52:22.700 States, but they
00:52:23.660 are moving for
00:52:24.520 sure in that
00:52:25.500 direction in
00:52:26.400 Europe, Britain,
00:52:28.280 Brazil, and
00:52:29.060 certainly California
00:52:29.820 would like to do
00:52:30.420 the same.
00:52:30.940 I hope that if
00:52:31.620 Gavin does sign
00:52:32.420 that atrocious
00:52:33.080 legislation that you
00:52:34.060 were describing,
00:52:34.880 Tucker, I hope
00:52:36.180 that the Supreme
00:52:36.720 Court, or that
00:52:37.480 the courts strike it
00:52:38.520 down.
00:52:39.200 They struck down
00:52:39.960 the last California
00:52:41.180 censorship initiative,
00:52:42.300 which was aiming
00:52:42.860 at banning AI
00:52:45.800 parodies, and that
00:52:48.100 got struck down
00:52:48.700 by a judge in a
00:52:49.480 very eloquent
00:52:50.220 decision.
00:52:51.520 But that's kind
00:52:52.000 of where we're
00:52:52.360 at and why I
00:52:53.240 think your special
00:52:55.000 on this is so
00:52:55.760 important, because
00:52:56.300 we're on a knife's
00:52:57.000 edge.
00:52:57.220 On the one hand, I
00:52:57.760 think we're making
00:52:58.200 some good progress
00:52:58.980 here in the United
00:52:59.520 States in exposing
00:53:00.700 the censorship and
00:53:02.220 defunding it, but I
00:53:04.260 think worldwide the
00:53:05.160 trends are in the
00:53:05.780 wrong direction, and
00:53:06.800 on college campuses
00:53:07.820 with young people,
00:53:08.920 unfortunately, we've
00:53:10.480 seen an increase of
00:53:12.320 support for
00:53:12.320 censorship, really
00:53:13.200 every generation,
00:53:14.420 from baby boomers
00:53:15.180 to Gen Xers to
00:53:16.040 millennials to
00:53:16.720 Zoomers, that we
00:53:18.240 see support for
00:53:19.000 censorship going up,
00:53:20.060 and it's exactly like
00:53:21.180 what you were
00:53:21.580 saying.
00:53:22.060 It's about protecting
00:53:23.380 feelings.
00:53:24.420 It's this, you know,
00:53:25.600 to use a bit of
00:53:26.160 jargon, it's this
00:53:26.960 expressive individualism,
00:53:28.440 which is that my
00:53:29.040 feelings are like the
00:53:29.980 most important thing
00:53:30.780 in the world, and
00:53:31.400 if my feelings are
00:53:32.220 hurt, then somebody
00:53:32.960 has to pay a price.
00:53:34.680 That, sadly, is the
00:53:36.040 ideology, and that's
00:53:37.040 why I think the
00:53:37.540 censorship is part of
00:53:38.880 this broader, you
00:53:40.960 know, cultural
00:53:41.640 decline, you know,
00:53:42.820 where it's like the
00:53:43.820 intolerance and the
00:53:45.700 entitlement that people
00:53:47.100 feel are these big
00:53:48.620 forces that have been,
00:53:49.860 I think, driving those
00:53:50.740 demands for
00:53:51.220 censorship.
00:53:52.260 I think everything
00:53:53.340 you've said is
00:53:53.880 absolutely true.
00:53:55.040 I think it may even
00:53:56.280 be more insidious than
00:53:57.320 that, however.
00:53:57.900 I think that there
00:53:58.560 are decent people
00:54:00.200 who've had their best
00:54:01.640 impulses hijacked by
00:54:03.500 totalitarians and
00:54:05.320 used against them.
00:54:06.120 In other words, I
00:54:06.660 think there are
00:54:07.280 good people,
00:54:08.380 Americans, mostly
00:54:09.080 women, to be totally
00:54:10.340 blunt about it, who
00:54:11.660 are like, oh, we
00:54:12.500 can't be mean to this
00:54:13.560 or that group.
00:54:14.200 Well, I think that's a
00:54:14.900 good impulse.
00:54:15.920 We shouldn't be mean
00:54:16.500 to any group, and the
00:54:18.040 weaker they are, the
00:54:19.220 more careful we should
00:54:20.020 be about being mean to
00:54:20.840 them because you don't
00:54:21.300 want to be a bully,
00:54:21.920 right?
00:54:22.440 That's a good impulse.
00:54:24.060 But that impulse is
00:54:25.120 hijacked by the censors
00:54:26.660 who are acting on their
00:54:27.480 own behalf, not on
00:54:29.060 behalf of whatever
00:54:29.720 marginalized group they
00:54:30.840 claim they're acting on
00:54:31.540 behalf of.
00:54:32.160 They don't care about
00:54:32.740 those groups,
00:54:33.560 obviously.
00:54:34.520 The lives of black
00:54:35.540 people in inner
00:54:36.040 cities did not improve
00:54:36.860 after the George Floyd
00:54:37.680 riots, obviously.
00:54:39.420 But they hijack that
00:54:41.580 and they say, if you
00:54:42.540 care about the
00:54:43.420 weakest among us, you
00:54:45.840 will get on board with
00:54:46.620 censorship.
00:54:47.780 I think that's really
00:54:49.040 clever and insidious and
00:54:51.840 evil, but effective.
00:54:52.880 Yeah, I think that's
00:54:55.880 right.
00:54:56.220 I think it's a
00:54:57.980 manipulation.
00:54:59.620 It just shows how
00:55:00.700 emotional the culture
00:55:02.420 has gotten, that you
00:55:03.900 can appeal to those
00:55:05.500 feelings.
00:55:06.640 I mean, Tucker, it's so
00:55:07.520 easy to show how much
00:55:09.340 of an abuse of power you
00:55:10.420 can get with these hate
00:55:11.500 speech laws.
00:55:12.380 I mean, it's worth
00:55:12.820 considering, for example,
00:55:14.260 you'll get people that
00:55:14.920 will be like, well, are
00:55:16.000 you defending the right
00:55:17.220 of people to call for
00:55:18.900 genocide?
00:55:19.980 And you kind of go, oh,
00:55:20.500 my God, well, no, I mean,
00:55:21.240 that's horrible.
00:55:21.700 I don't want people to
00:55:23.440 do that, so we'll
00:55:24.800 carve that out.
00:55:25.640 But then you kind of go,
00:55:26.560 well, wait a second.
00:55:26.980 So the same people that
00:55:28.480 are saying that to you
00:55:29.520 are the ones that point
00:55:31.320 out as soon as you talk
00:55:32.680 about the American
00:55:33.600 experiment of the 17th
00:55:35.680 and 18th and 19th
00:55:36.720 centuries, that the
00:55:38.140 European settlers
00:55:39.400 committed a genocide to
00:55:40.980 create the United
00:55:41.560 States of America.
00:55:42.560 So how hard would it be
00:55:44.240 to criticize somebody
00:55:45.900 that, say, praised
00:55:47.100 America, praised the
00:55:49.200 Western expansion,
00:55:50.320 praised the West,
00:55:51.060 the West opening up
00:55:52.540 and the European
00:55:53.060 settlers, frankly,
00:55:56.020 taking over this land
00:55:57.120 from indigenous people,
00:55:58.640 someone could say
00:55:59.280 that's defending genocide.
00:56:00.540 So you see how easily
00:56:01.720 and quickly it comes.
00:56:02.800 Of course.
00:56:03.040 I'll give you another
00:56:03.440 example.
00:56:04.440 Institute for Strategic
00:56:05.560 Dialogue, which is one
00:56:06.800 of these deeply sinister groups,
00:56:08.520 I mean, they would not
00:56:09.180 return any phone calls
00:56:10.200 or whatever, and they
00:56:10.860 personally smeared me
00:56:13.760 and a lot of others.
00:56:14.680 Wait, wait, wait, can I
00:56:16.280 ask you, so
00:56:17.880 the Center for Strategic
00:56:19.060 Dialogue didn't want
00:56:20.280 any dialogue?
00:56:22.260 Yeah, the Institute for
00:56:23.420 Strategic Dialogue
00:56:24.120 refused to have dialogue.
00:56:26.100 I'm not surprised.
00:56:27.160 Creepy, creepy group
00:56:28.760 close to British
00:56:29.660 intelligence.
00:56:30.640 I mean, it's just like,
00:56:32.740 I mean, close to, I'm
00:56:33.920 being generous, you know,
00:56:35.040 like clearly an
00:56:35.760 intermediary with deep
00:56:38.020 state British
00:56:38.660 organizations that is
00:56:39.780 very interested in
00:56:40.560 censoring Americans.
00:56:41.240 And we see this dynamic
00:56:42.220 a lot where the British groups pick
00:56:46.520 on Americans because,
00:56:48.680 you know, the U.S.
00:56:49.520 intelligence community
00:56:51.240 can't directly go after
00:56:52.860 Americans.
00:56:53.240 So they get their
00:56:53.820 British allies to do
00:56:56.340 it.
00:56:56.580 And this is a group that
00:56:57.540 labeled, along with the
00:56:59.400 Center for Countering
00:56:59.980 Digital Hate, which is
00:57:01.060 equally sinister
00:57:01.920 organization, very
00:57:02.860 connected to the Labour
00:57:03.580 Party of Britain, they
00:57:05.020 label criticism of
00:57:06.460 George Soros as
00:57:08.860 anti-Semitic.
00:57:10.040 And Tucker, I'm not
00:57:10.940 talking about criticism of
00:57:12.260 George Soros that
00:57:13.720 even noticed that
00:57:15.120 he's Jewish.
00:57:16.140 It wasn't even that.
00:57:17.020 It wasn't like they
00:57:17.580 said the Jewish
00:57:18.580 philanthropist George
00:57:19.660 Soros.
00:57:19.940 They'd be like, just
00:57:20.840 criticizing George
00:57:22.400 Soros was anti-Semitic.
00:57:24.640 That's how crazy it
00:57:26.720 has gotten.
00:57:29.120 And, you know, you
00:57:29.580 look at the number of
00:57:30.100 institutions that have
00:57:30.900 been pushing this.
00:57:31.480 It's the European
00:57:32.000 Union.
00:57:32.740 It's NATO.
00:57:33.720 It's the United
00:57:34.600 Nations.
00:57:35.520 It's Germany, France,
00:57:37.160 Britain.
00:57:38.060 The United States is
00:57:38.680 really powerful.
00:57:39.300 And I think that the
00:57:40.820 president did a good
00:57:41.580 job pushing back
00:57:42.640 against those types of
00:57:43.720 people around the
00:57:44.460 world.
00:57:44.900 But it is important to
00:57:46.040 remember that the
00:57:46.520 European economy is
00:57:47.580 bigger than the United
00:57:48.500 States economy.
00:57:49.240 And certainly when you
00:57:50.180 go and kind of look at a
00:57:51.380 world with this
00:57:52.760 incredible economic
00:57:54.000 power of China and the
00:57:55.720 gravity it exercises and
00:57:57.280 all these other
00:57:57.720 countries in the world,
00:57:58.420 including Europe,
00:57:59.900 including Brazil.
00:58:00.920 I mean, it was notable
00:58:01.800 that when Trump punished
00:58:02.980 Brazil with tariffs for
00:58:05.080 its censorship and
00:58:06.200 banning their opposition
00:58:08.600 party leader and the
00:58:09.860 leading presidential
00:58:10.520 candidate, Bolsonaro,
00:58:12.060 that China made up the
00:58:13.200 difference in the loss of
00:58:14.480 trade.
00:58:15.120 So you sort of start to
00:58:16.100 see the world, you know,
00:58:18.180 and particularly this
00:58:19.160 kind of organic, you
00:58:20.560 know, decline of real
00:58:21.660 belief in and support for
00:58:22.860 free speech, with a kind of
00:58:24.440 global move towards this
00:58:26.040 censorship industrial complex
00:58:27.960 system of censorship by
00:58:30.160 proxy. It is disturbing
00:58:31.760 because they can exercise
00:58:32.900 economic power over our
00:58:34.240 platforms.
00:58:34.720 And, you know, I mean, I
00:58:36.340 think Elon has shown good
00:58:37.860 reason that we can, you
00:58:39.020 know, basically trust what
00:58:40.420 he's done.
00:58:40.980 I think he's made great
00:58:41.720 decisions for the most
00:58:43.860 part since he's taken over
00:58:45.160 the platform.
00:58:45.680 But, you know, if anything
00:58:46.660 were to happen to Elon, I
00:58:47.820 mean, these companies, Mark
00:58:49.040 Zuckerberg, Google, Sundar
00:58:51.120 Pichai, they've shown
00:58:52.220 themselves to be quite
00:58:53.140 cowardly.
00:58:54.660 Facebook was worried about
00:58:56.640 the lack of help from the
00:58:58.440 Biden administration.
00:58:59.140 It was enough to get
00:59:00.120 Facebook to censor because
00:59:01.880 they were worried about
00:59:02.840 not having enough help
00:59:04.160 from the Biden
00:59:04.700 administration to retrieve
00:59:06.200 their very valuable user
00:59:07.720 data from Europe, which
00:59:08.920 their laws require.
00:59:10.340 And that's why they
00:59:11.520 agreed to do censorship
00:59:13.140 that even their own
00:59:14.180 social scientists within
00:59:15.440 Facebook said would
00:59:17.060 backfire because they
00:59:18.300 were like, look, if you
00:59:18.940 go censor mothers
00:59:20.620 sharing information about
00:59:22.420 the COVID vaccine side
00:59:24.200 effects, it will make
00:59:25.300 mothers more nervous.
00:59:26.080 That's what the internal
00:59:27.260 people at Facebook said
00:59:29.080 and the Facebook execs
00:59:30.240 were like, we better just
00:59:30.940 give the Biden
00:59:31.520 administration what they
00:59:32.280 want, otherwise they're
00:59:32.980 not going to help us
00:59:33.540 with our data in Europe.
00:59:34.620 So it doesn't, I mean,
00:59:36.840 it's not hard to imagine,
00:59:38.800 you know, the power that
00:59:39.840 these states can exercise
00:59:40.880 on these platforms and I
00:59:42.020 don't think that that
00:59:43.000 threat has gone away.
00:59:45.640 It doesn't seem like a
00:59:46.540 good system if one, you
00:59:49.500 know, South African born
00:59:50.780 naturalized American is the
00:59:52.920 only thing standing
00:59:54.000 between us and
00:59:54.860 tyranny.
00:59:55.780 I mean, I work in
00:59:59.920 the media, my whole life
01:00:00.820 I've worked in the media.
01:00:01.800 Elon Musk did this.
01:00:03.240 Elon Musk did all this
01:00:04.980 and he did it because I
01:00:06.860 think he says, because he
01:00:08.300 believes in it, whatever
01:00:09.320 the cause, he did it.
01:00:11.040 He opened up everything.
01:00:12.200 Yeah.
01:00:12.720 So, but that, which I
01:00:14.480 will never stop being
01:00:15.440 grateful for, obviously.
01:00:16.900 However, that's a pretty
01:00:18.400 thin thread kind of holding
01:00:20.860 your civilization aloft, no?
01:00:24.580 Yeah.
01:00:25.120 I mean, the things I
01:00:26.200 really worry about are
01:00:27.520 those numbers. The
01:00:37.640 share of college students
01:00:39.080 that said that violence may
01:00:40.580 sometimes be necessary to
01:00:42.660 stop a campus speaker was
01:00:45.100 under 20% in the year 2020.
01:00:47.140 It's 34% right now.
01:00:50.640 That means one third of
01:00:52.000 college students think that
01:00:53.080 violence might be necessary
01:00:54.300 to stop a campus speaker.
01:00:57.020 That is, I mean, that's
01:00:58.720 pathological.
01:00:59.500 I don't know if there's
01:00:59.900 another word, but that's
01:01:00.620 bonkers, crazy, scary
01:01:03.280 behavior.
01:01:04.140 And so, you know, remember
01:01:05.740 George Orwell, he was,
01:01:07.660 you know, a leftist, and he wrote
01:01:10.860 1984 because he had read
01:01:13.020 James Burnham, who's this
01:01:14.380 very famous, you know,
01:01:15.460 former Trotsky guy who
01:01:16.460 becomes a conservative and
01:01:17.700 writes, you know, this book
01:01:19.800 about the managerial state,
01:01:21.380 which is basically about how
01:01:22.820 this totalitarianism would
01:01:25.000 kind of emerge out of the
01:01:26.860 society and out of the
01:01:28.440 state in these, in these
01:01:30.940 very specific safetyist, you
01:01:33.720 know, harm reduction demands
01:01:35.980 that, that you would sort of
01:01:37.320 get a whole kind of state of
01:01:38.980 busy body, you know, nanny
01:01:41.640 state people that wanted to
01:01:43.120 police the speech.
01:01:44.200 I mean, that was his
01:01:44.880 prediction in 19, whatever
01:01:46.140 that was, 1947, I think, or,
01:01:48.620 you know, when 1984 came out.
01:01:51.160 That was so brilliant.
01:01:53.020 I mean, it's terrifyingly,
01:01:53.880 brilliantly prescient,
01:01:54.940 because that's what I worry
01:01:56.400 about.
01:01:56.800 And I think, you know, I was
01:01:58.540 obviously very moved by
01:02:00.160 the response to the Charlie
01:02:03.240 Kirk assassination and the desire to
01:02:05.320 go into universities.
01:02:06.200 I think we need to figure out
01:02:08.500 how to move that number down
01:02:10.280 so that young people, and
01:02:13.320 everybody, become more
01:02:14.700 comfortable with, yeah, I mean,
01:02:17.080 look, I mean, Charlie really
01:02:18.200 was inspiring in the way that
01:02:21.020 he would go into places.
01:02:22.120 And of course, the sign said,
01:02:23.140 prove me wrong.
01:02:24.300 He was saying, look, I'm open
01:02:25.800 to debate.
01:02:26.520 I mean, it's
01:02:28.760 so, I don't know, ironic is not
01:02:30.220 the right word.
01:02:30.860 It's so powerful that the
01:02:34.340 person that was assassinated
01:02:36.760 was the person doing what we
01:02:38.880 need the most of, that the
01:02:40.980 person that was killed was the
01:02:42.440 person who was doing what we
01:02:44.260 need to see much, much more of
01:02:46.220 at the high schools and the
01:02:47.640 colleges, which is getting
01:02:49.040 people very comfortable with
01:02:51.300 having difficult conversations
01:02:52.940 and with having conversations
01:02:54.200 with people whose values you
01:02:55.560 don't share and who believe
01:02:56.860 things that you find
01:02:57.660 reprehensible.
01:02:59.180 And that is at the heart of
01:03:00.320 it.
01:03:00.400 And I don't know.
01:03:01.140 I mean, it's sort of, what's
01:03:02.100 terrifying is that, you know,
01:03:04.700 those numbers of intolerance
01:03:06.580 kept increasing over the last
01:03:08.500 10 years.
01:03:09.260 I hope that we've hit an
01:03:10.320 inflection point.
01:03:11.440 I will say, Tucker, one number
01:03:12.980 that did change, that gave
01:03:14.540 me some heart,
01:03:15.660 was that Pew had found that
01:03:17.860 the share of Democrats that
01:03:19.940 thought the government should be
01:03:20.960 involved in censoring
01:03:21.760 misinformation online had
01:03:23.020 risen from 40% in 2018 to
01:03:25.700 70% in 2023.
01:03:28.620 They asked the same
01:03:30.120 question earlier this year,
01:03:31.640 and it's now declined to 58%.
01:03:33.620 So I do feel like there is a
01:03:34.840 chance, I mean,
01:03:36.380 it's still terrible, but
01:03:37.840 there's a sense in
01:03:39.320 which that trance has broken,
01:03:40.860 you know, that hypnotic, we have
01:03:42.980 to fight misinformation, that
01:03:44.260 just bonkers, anti-American,
01:03:47.020 un-American impulse.
01:03:48.440 It feels like it was broken, but I
01:03:50.640 still think there's a lot of that,
01:03:52.020 you know, cultural work that we
01:03:53.360 need to do to really educate
01:03:54.740 kids because I just don't think
01:03:56.740 free speech is intuitive.
01:03:57.680 I mean, you go to a playground
01:03:58.840 and you see little kids playing
01:04:00.100 and they just are yelling, shut
01:04:01.920 up, shut up at each other all
01:04:03.280 the time.
01:04:04.040 It's our natural instinct.
01:04:05.500 You hear something you don't
01:04:06.440 like, you want to shut them up
01:04:07.840 and the alternative to listen to
01:04:09.900 somebody and actively disagree
01:04:11.660 with them and maybe think about
01:04:13.500 how to respond or just figure out
01:04:14.840 if you agree or disagree.
01:04:16.120 It's like a muscle.
01:04:17.540 It just takes practice.
01:04:18.420 And I think we have to teach the
01:04:19.780 kids, you know, both the K through
01:04:22.040 12 and the college students how to
01:04:24.180 do that and that doing that is a
01:04:25.960 core value that will be rewarded
01:04:29.060 in life and we should be
01:04:30.880 celebrating rather than the
01:04:32.340 opposite, which is this desire to
01:04:33.940 silence and shut down.
01:04:35.540 Are you concerned that
01:04:36.460 technological advances that
01:04:38.400 we're in the middle of really
01:04:39.820 will be harnessed to affect
01:04:43.020 censorship without people even
01:04:44.800 knowing it?
01:04:45.320 I mean, you did the Twitter files
01:04:46.700 with Matt Taibbi and found what
01:04:49.700 I don't think anyone knew.
01:04:50.620 There was this vast censorship
01:04:52.360 program at Twitter and but most
01:04:55.080 users had the sense that, you
01:04:56.720 know, it's a liberal website,
01:04:57.660 whatever, they're taking the
01:04:58.380 conservatives off, but had no
01:04:59.880 idea about the specifics until
01:05:02.960 you brought them to light.
01:05:04.260 Does AI increase the power of the
01:05:06.860 platforms to take information
01:05:09.520 off the site without anyone even
01:05:11.180 knowing it's been taken off?
01:05:14.180 Yeah, I mean, just to answer that
01:05:15.800 question, I'll preface it by saying,
01:05:17.260 of course, I watched very closely
01:05:18.940 your interview with Sam Altman,
01:05:20.820 where you asked, I think, some
01:05:22.160 very important questions, which
01:05:23.900 is what is the moral framework
01:05:25.520 that his AI will follow?
01:05:29.060 That is the right question.
01:05:31.560 And of course, it remains an
01:05:33.180 ever-present question.
01:05:34.400 It's not like it will never go
01:05:35.780 away.
01:05:36.900 Ultimately, these decisions about
01:05:38.720 what gets censored and what the AI
01:05:40.580 censors for us are made by people.
01:05:43.500 And so you look at the worst
01:05:44.780 episodes of censorship over the
01:05:46.500 last five to 10 years.
01:05:49.060 You can find the people that were
01:05:50.580 demanding the censorship.
01:05:51.880 You can find the groups they
01:05:53.080 created to demand it.
01:05:55.140 It was human decisions.
01:05:57.540 And, in fact, at the company
01:05:59.660 level in the Twitter files, there
01:06:02.040 was a huge amount of debate
01:06:03.380 around, I mean, not enough, a
01:06:05.380 huge amount of debate around
01:06:06.460 deplatforming the president of the
01:06:08.520 United States, like removing the
01:06:10.540 account of the president of the
01:06:11.560 United States, which is so
01:06:12.320 insane.
01:06:13.820 It was a big deal, obviously.
01:06:15.220 It was talked about.
01:06:15.940 And then, of course, you could see
01:06:16.800 it.
01:06:17.460 Same thing with the Hunter Biden
01:06:19.180 laptop, where the FBI ran a
01:06:20.900 deception operation against the
01:06:23.140 social media platforms illegally
01:06:25.020 in an illegal conspiracy that
01:06:28.040 included spreading disinformation
01:06:29.460 about the laptop.
01:06:31.260 That was obviously a very elaborate
01:06:33.400 thing that a lot of people could
01:06:34.780 see and were getting kind of
01:06:35.960 glimpses into.
01:06:37.320 There was also just the humdrum or
01:06:39.800 the ordinary kind of de-amplification.
01:06:43.040 So you remember Twitter famously
01:06:44.320 said, oh, we don't shadow ban.
01:06:46.160 That was the language that people
01:06:47.500 had used.
01:06:48.480 Well, of course they did.
01:06:49.900 They called it something
01:06:50.860 different.
01:06:51.460 It was called like, you know, do
01:06:53.280 not amplify lists, for example,
01:06:55.900 it's like a kind of blacklist that
01:06:57.120 they ran or a trends blacklist.
01:06:59.020 Don't let them show up on the
01:06:59.920 trends thing.
01:07:00.940 So there's just all a million
01:07:02.020 dials, of course, as you know,
01:07:04.140 Tucker, to like kind of turn these
01:07:05.280 things up and down.
01:07:06.220 Yeah.
01:07:06.340 The AI can help, but sometimes,
01:07:08.360 you know, like they wanted to go
01:07:09.260 after a QAnon conspiracy.
01:07:10.780 At one point, I reported this in
01:07:12.520 my Twitter files on the decision
01:07:14.040 to de-platform Trump.
01:07:15.660 And there was something around
01:07:16.740 like the Kraken, which I guess is
01:07:18.480 like a giant like squid in the
01:07:20.420 ocean.
01:07:20.780 I think, you know, they said
01:07:22.220 the Kraken was somehow tied
01:07:23.440 into QAnon conspiracy theories.
01:07:25.580 And they wanted to censor that,
01:07:27.120 which is also insane.
01:07:28.320 Like they wanted to literally stop
01:07:29.740 people from talking about Kraken,
01:07:31.460 like on Twitter.
01:07:32.920 And then somebody figured out that the Seattle hockey team, I think, is the Kraken, and that all these tweets about hockey were getting swept up in it.
01:07:41.580 So I worry about it, but ultimately it's not a bunch of censors in the Philippines or even, I think, Palo Alto; the worst forms of censorship are being decided at the executive level.
01:07:53.080 But as I said, my view is that if you have Section 230, which is what gives you the power to operate as a functional natural monopoly, literally the permit to do so, then you should have to give the adult user complete control over all legal content, and you can censor the illegal content.
01:08:17.840 And there's a whole separate thing on kids, which I think is complicated, because they're using the kids right now in Australia. They're literally using the kids in Australia to create digital identification as a way to censor, which I think we should be alarmed about.
01:08:31.420 But nonetheless, as a father who's seen the impact of social media on adolescents, I do worry about it. Still, if you're going to have Section 230, that should be the agreement.
01:08:42.800 Yeah.
01:08:43.260 And I thank you for describing it
01:08:44.820 as using the kids because it is
01:08:46.620 the most obviously transparently
01:08:49.160 cynical attempt to censor
01:08:51.940 political speech by using the
01:08:54.060 suffering of children about whom
01:08:55.460 they care nothing.
01:08:56.020 Obviously, there's no demonstrated care for kids. Look at how the schools are; they just don't care.
01:09:02.700 Oh, um, right.
01:09:03.800 And any pretext will do.
01:09:05.280 I mean, the terrorism thing was huge, as you know, under the Bush administration. It's terrorism. What is that exactly? Can you define it for me? No, they can't.
01:09:14.800 I'm wondering, though, what's the recourse? These are decisions you just described that are being made at the highest level of global society.
01:09:24.200 I mean, the richest man in the
01:09:25.380 world decided to restore free
01:09:26.980 speech to the United States.
01:09:29.180 The president of the United States helped him, and federal judges rule on these things. But let's say we have a different president, there's no Elon or his commitment changes, and there's a different Supreme Court. Where's the power to fight back against this? Can you imagine a kind of civil disobedience that people could use to regain their speech? I'm trying to think through what that would look like.
01:09:57.480 Well, look, at the top of my list is that we are in NATO, and it has a treaty that requires that we only have free democracies as members. We are only going to defend countries that allow free speech and that allow people to choose the candidates of their choice.
01:10:20.280 Currently that is absolutely under
01:10:23.140 attack in Europe.
01:10:24.500 Romania, as you covered in your interview, has already prevented their presidential front-runner from running. Now France is about to ban their presidential front-runner, Marine Le Pen, on a completely trumped-up charge that the last prime minister was already guilty of, and he still came into office.
01:10:44.940 In Germany, I just interviewed a mayoral candidate who was banned for made-up reasons, because he liked Lord of the Rings. I'm not even kidding. And he went to a book fair where there were some people the intelligence services didn't like. And that was the basis of the election council preventing him from running.
01:11:02.000 And then they have these elaborate
01:11:03.260 censorship industrial complexes.
01:11:05.020 We're part of NATO.
01:11:06.540 Everybody knows that we're the main
01:11:08.100 event.
01:11:08.600 We subsidize it to the tune of, you
01:11:11.400 know, hundreds of billions of
01:11:12.940 dollars a year.
01:11:14.400 Like you, Tucker, I'm willing to die for free speech and democracy. I am with the Spartan slave boy in the great Seneca passage: I would rather die a free man than live as a slave. That is what we are willing to die for: freedom.
01:11:31.520 And I think we all care a lot about Western civilization in Europe, but I don't want to put my life on the line to defend authoritarian, censorial autocracies like France, Germany, and Romania, and potentially Britain.
01:11:47.240 So I think the president has been pretty strong on it. They're still negotiating this right now, but I just think that the public, certainly MAGA, but also whatever leftists are still in favor of free speech out there, including, as you mentioned, a lot of the pro-Palestinian folks who felt censored on TikTok and elsewhere and have been censored in other ways, should all make very clear that we don't want to be part of a military treaty that has us risking our lives for illiberal autocracies. That's got to be at the top of the list. Same thing with Brazil.
01:12:21.180 I think Americans need to know we should pay more for orange juice if it means protecting our freedom of speech. Our freedom of speech is not a small thing. It's the main event. It's the reason why America is the greatest country that's ever existed.
01:12:38.280 And it is certainly still the greatest country in the world, despite all of our problems, and the country that everybody wants to live in, because of the First Amendment and free speech. So it just has to be an absolute non-negotiable. As I said, this is the number one issue. If you don't have free speech, you don't have anything. You don't have democracy. You don't have your dignity. You don't have prosperity. Infrastructure can't work. Everything depends on free speech. And so it's got to be an absolute issue for the administration in these negotiations.
01:13:07.840 And yeah, I think civil disobedience, when things get to that level, should always be an option, particularly for defending something as essential and sacred as the First Amendment.
01:13:20.460 Do you have any guesses or theories as
01:13:23.560 to what happened to Great Britain, which
01:13:25.480 of all countries is the closest to
01:13:27.080 ours, has the deepest historical ties, and
01:13:30.280 is now arresting more than 12,000 people
01:13:32.320 a year for saying things the government
01:13:34.260 doesn't like?
01:13:35.480 It's hard for me even to digest that. But I'm also confused by it. How did that happen?
01:13:42.500 Yeah, look, you had Christopher Caldwell on the other week, and I thought he did an incredible job explaining what's happened to Europe. But I think it's fair to say that we're at the end of an 80-year cycle that began in 1945 with the end of World War II.
01:14:01.080 And the United States had the role of being the country at the center of this new empire. You can call it the American empire, whatever you want to call it.
01:14:12.280 And we were trying to prevent another war in Europe, and we pushed out an ideology that R.R. Reno calls the open society ideology.
01:14:21.340 And at first it made sense, when you're denazifying Germany, moderating Japan, and trying to usher in a liberal democratic Western order. It made sense for a few decades.
01:14:36.780 Probably didn't make sense after 1990.
01:14:39.720 And it went too far.
01:14:41.300 And obviously, we decimated our industries by exporting them to China. And, with the help of George Soros, we created this elaborate NGO sector that basically pushed two things at the same time.
01:14:52.980 Because I think the only way you can
01:14:54.100 understand the censorship and the demand
01:14:56.620 for totalitarian censorship in the kind of
01:14:59.000 mental space is to just also understand the
01:15:01.920 total disorder that they're creating in the
01:15:04.920 physical world.
01:15:06.080 As you're saying: the unchecked mass migration, the collapse of borders, people in boats, people with 14 criminal prosecutions still let out on the street, despite their schizophrenia, to commit murder against refugees.
01:15:22.200 I don't think it's a coincidence that that disorder and the censorship are unleashed by the same people at the same time.
01:15:31.060 I mean, the Soros Foundation wants censorship.
01:15:33.180 They also want disorder and anarchy and lawlessness, you
01:15:38.440 know, at the street level, at the city level.
01:15:40.540 So I think the contradictions of their own ideology, the guilt around the past and the construction of these singularities of evil, the Holocaust, slavery, indigenous genocide, those became the new original sins for this new woke religion.
01:16:02.380 And it's interesting: everyone's seen the data that illegal migration to the United States really wasn't nearly as out of control before Trump campaigned on it in 2016, but it really gets out of control as a kind of reaction by Biden and the blob elites after 2020.
01:16:23.100 Europe's a slightly different story, but I think it's just what it looks like: this woke religion has absolutely displaced the older story that we had of Western civilization, which is
01:16:35.420 that Christianity gave way to the enlightenment, the
01:16:38.560 enlightenment secularized a whole bunch of Christian values,
01:16:41.160 including the idea that we're all born with dignity and rights.
01:16:46.120 That old story just got replaced by this really ugly story: that humans are a cancer on the earth, that Western civilization is just genocidal. It's the opposite of what Western civilization has been historically, which is a massively liberating phenomenon. And we got stuck in this awful story; it got taught in the schools, it got taught in the universities, and that beautiful open society vision from 1946 became its complete totalitarian opposite.
01:17:16.460 And I think Britain really exemplifies that.
01:17:18.760 And it's worth knowing, by the way, too, because I think you've done such
01:17:20.540 a good job here, Tucker, of pointing out the left and right origins of
01:17:23.860 this, certainly like what we call the foreign policy establishment, the
01:17:26.360 blob.
01:17:26.980 They were behind the Online Safety Act in Britain that passed in 2023.
01:17:32.100 It was the conservative government that got it done.
01:17:35.440 But it was the same foreign policy blob that was behind our censorship
01:17:38.760 industrial complex.
01:17:40.400 And that clearly emerged out of this effort to govern the American empire, and then was reacting to massive populist unrest over out-of-control migration policies and energy policies that were aimed at creating scarcity and high prices.
01:17:55.580 The trans madness is just one where, if you really look at it, it's like a David Cronenberg movie: these physical atrocities that you're then not allowed to talk about. They wanted censorship on all of it. On Twitter, they censored Meghan Murphy for saying that a man is not a woman. That's what she said. And they de-platformed her.
01:18:22.320 You talk about a terrifying scenario: we're in one where hideous medical experiments are being conducted on the bodies of adolescents and mentally ill people.
01:18:32.600 And they were then trying to censor people talking about it and demanding that
01:18:36.620 you believe that it's possible to change your sex.
01:18:39.880 That's how far gone we've gotten as a civilization.
01:18:44.720 We convinced ourselves that you could perform biological alchemy.
01:18:49.660 And then we wanted to silence and suppress anybody who told the truth about it.
01:18:54.400 So there's a black pill moment where one could say that we're pretty far gone, if we're already at this place. But thanks to the opening of the platforms, to the success of people like you and Joe Rogan, and to the creation of this alternative media universe, I do think we have a chance to remake that case, not just for free speech, but for this amazing, tiny moment in history where everybody that's a citizen of a country actually got to be free.
01:19:27.760 And that's a beautiful, wonderful thing.
01:19:30.900 And anybody that's ever traveled outside the United States, I think, can see that and
01:19:34.980 appreciate it.
01:19:35.700 It's the greatest thing that we have.
01:19:38.040 And the reason we have it is because we've reminded ourselves generationally, because
01:19:41.900 we told the story of it, this is the greatest thing that we have.
01:19:44.420 And I can't think of a greater tragedy, a more perverse tragedy, than the assassination
01:19:49.820 of Charlie Kirk being leveraged by people in order to construct a world that he hated and
01:19:55.960 fought against for his entire short life.
01:19:58.080 To use Charlie's assassination as a pretext for censorship, I mean, the mind struggles
01:20:04.640 even to understand that.
01:20:07.020 But that's how brazen people are.
01:20:08.400 So I hate to ask you this, but you've thought so deeply about it.
01:20:11.700 Maybe you have an answer.
01:20:12.520 I don't.
01:20:13.320 What's the motive for all this?
01:20:14.800 Like, why would you want to conduct hideous medical experiments on children?
01:20:19.780 It doesn't benefit you.
01:20:20.560 It doesn't benefit them.
01:20:21.420 Like, what is this, actually?
01:20:22.860 Yeah, as you probably remember, in my last book, on the homeless crisis, I put a lot of emphasis on this desire from the left to be compassionate and to think of ourselves as good people. And it's really this immediate, emotive thing. Like with addiction: people are crying out, saying, I'm fine, I'm living in my waste and being sexually assaulted every night, but I'm just fine, let me just smoke some more fentanyl. Everybody should know that that person needs an intervention so that they stop harming themselves in public.
01:21:00.560 But there's the emotionalism and the sentimentality, that immediate appeal of: oh no, it's somehow cruel to enforce laws and mandate care for people.
01:21:14.160 So on one hand, it does seem like this empathy of, oh, we have to protect people. But I also think there is something darker, more selfish, and frankly more hedonistic than that, which is, as you were saying, that the ability to censor somebody is an incredible act of power and domination.
01:21:32.540 It's not something the weak can do; the weak can't censor people.
01:21:35.800 I mean, you look at any movement for human liberation, like the weak don't have the power
01:21:40.340 to censor.
01:21:40.840 So the censorship comes from these really arrogant, overly empowered, overly powerful, entitled
01:21:47.260 elites displaying traits of, frankly, antisocial disorder with no empathy for the people that
01:21:54.380 they're censoring.
01:21:55.640 My view is that there are certainly plenty of people who feel that they're being empathic. But for a lot of other people, it's will to power and nothing besides, to paraphrase Nietzsche; it's actually the pleasure of just controlling what people can say online.
01:22:14.480 These censors, we've profiled them now. We haven't published all of it, but we've tried to understand the people doing this very deeply, at a psychological level.
01:22:27.180 And they're just absolutely power hungry.
01:22:30.140 And they're completely arrogant, closed-minded, and frankly not very smart.
01:22:35.080 I mean, that's the thing you forget about the totalitarians.
01:22:38.040 You know, it's depressing, because there's a story getting told. I won't say by who, but there's somebody on the right telling a story about how terrible democracy is, and how if we had an autocracy, it would be run by somebody competent, like Elon Musk, and everything would work great.
01:22:54.220 Actually, the history of totalitarianism is that it's the incompetent, awful, idiotic
01:23:00.880 bureaucrats that are running things.
01:23:03.380 It's not Mozart and Goethe and Nietzsche who are running things. It's these very crude, dumb people.
01:23:11.640 And so you see someone like Nina Jankowicz or Renée DiResta.
01:23:15.600 These are really power hungry, very petty, small people.
01:23:19.760 There's almost a kind of neediness there.
01:23:24.140 You see it in some of them, a neediness for people to tell them how good they are and how
01:23:28.620 much they care.
01:23:30.040 If you remember the movie Misery, the Kathy Bates character, I see a lot of that Kathy Bates energy: I'm going to take care of you, but it's actually, I'm going to dominate you.
01:23:39.900 So people sometimes say it's suicidal empathy or pathological altruism, and I know what they mean, but I think what's underneath it is something darker and more nihilistic, something feeding hedonistically off the power of dominating and censoring and persecuting others. The foundational spiritualities and philosophies of the West have aimed at power being used in service of beautiful values. This is not in service of that.
01:24:18.500 It's just in service of their own individual expression of power.
01:24:22.000 And like you said, whether it's censoring you on COVID or antisemitism or trans or migration or the Ukraine war, they don't care. They're always looking for new ways to censor, because it's coming from something so deep and, frankly, pathological inside of them.
01:24:39.960 The war impulse is so similar, with killing people being the ultimate expression of power.
01:24:45.120 You know, you can't create a life, but you can end it.
01:24:47.700 And there, there are people, and I would say Lindsey Graham is one of them, but there are
01:24:50.940 many who just derive pleasure from the idea of killing people, not just because they're
01:24:57.660 cruel, though, obviously they are, but because it makes them feel alive.
01:25:01.840 And I think there's something to that. You see it in school administrators.
01:25:03.980 So I really feel like we're on the cusp of something great.
01:25:09.360 Charlie Kirk's memorial on Sunday made me feel that way.
01:25:12.300 I feel like it's not all darkness and like, don't take the black pill.
01:25:15.440 You know, there is light there.
01:25:17.680 So I feel that way, but then you see video of Don Bacon, the Republican congressman from Nebraska, the most normal state of the 50, saying, oh yeah, I'm talking to Jonathan Greenblatt, who's like a gargoyle from the ADL, which is the most anti-human organization I've ever dealt with in my life. And you feel like, wow, if Don Bacon is taking orders from the ADL and Jonathan Greenblatt, then the fix is in; it's a bipartisan conspiracy to strip people of their most basic freedom.
01:25:54.120 Yeah.
01:25:54.520 I mean, what comes up for me also, Tucker, and I know this is something you're concerned about too, is that, as you were saying before, there are actually so many sides to the totalitarianism. There's the censorship, and there's the disinformation and dehumanization that the state or these parastatal censorship proxy entities engage in.
01:26:18.420 And then there's the secrecy.
01:26:20.320 And again, I've said very clearly that I praise the Trump administration. They've actually been helpful in my own case in Brazil, where I'm under criminal investigation for the Twitter Files Brazil.
01:26:32.780 And so I'm very grateful to the Trump administration.
01:26:35.100 I hope that's clear.
01:26:35.920 But nonetheless, I think we can see that there are clearly some things that they really don't want us to know about. And the Jeffrey Epstein situation is easily the most explosive and most famous one, where everybody knows these files exist, everybody knows the FBI and DOJ have them, everybody knows there are no legal barriers to releasing them and that there are all these excuses, and everybody knows it's not just Jeffrey Epstein's own personal pornography collection and that the story has kept shifting.
01:27:09.560 But, you know, it looks like there may now be enough votes in the House pretty soon to
01:27:14.220 force a vote on it.
01:27:15.500 I think Speaker Johnson could still try to stop that.
01:27:17.980 But I'm heartened that the MAGA movement has remained true to following that issue through to the end.
01:27:26.940 But I think we've seen that there's, frankly, a secret government. People will say that sounds conspiratorial. But realize what the Epstein files are: they were covering up, very likely, a sex blackmail operation.
01:27:45.700 And by the way, we didn't even have proof of the hidden cameras until a couple of weeks
01:27:49.520 ago when the New York Times published two photos of the hidden cameras, one of them pointing
01:27:54.480 right at a bed in Epstein's New York apartment.
01:27:57.740 And Congressman Massie was in Congress last week, and he said there are 20 names in the files, and he knows who they are. He gave us one of them, the CEO of Barclays Bank, and then he listed who the other ones were: a Hollywood producer, a rock star, a magician.
01:28:15.440 So we know all these things. I think it's a really important test. It's really important that all of us who are sympathetic to things the Trump administration has done continue to not let the Epstein issue go.
01:28:29.460 And then I think the other issue, Tucker, that I know you care a lot about is the UAP
01:28:33.720 issue.
01:28:34.720 The president said, after the mostly unidentified drones over New Jersey, that we were going to be able to find out what they were.
01:28:45.360 I have a list of the key UAP documents that exist, provided to me by John Greenwald, many of which have been released but have been so heavily redacted.
01:28:57.280 They need to release these things.
01:28:59.320 They need to stop hiding this.
01:29:02.100 And I'll just end by saying, to culminate it all: look, the elephant in the room here is the CIA.
01:29:08.360 You know, you've got this wonderful reform leader in Tulsi Gabbard, who is a unifying leader.
01:29:14.240 She has so much trust from people that were on the left.
01:29:16.940 She has so much trust from the MAGA community.
01:29:19.440 She's obviously a good person.
01:29:21.420 Like anybody that has ever met her or seen her.
01:29:24.800 That's correct.
01:29:25.260 And by law, Congress after 9-11 made this law: she is the boss of the intelligence community.
01:29:33.540 That is what the law requires.
01:29:35.260 But we have this recalcitrant CIA where, I mean, come on guys.
01:29:41.040 Like we have not seen significant change to personnel.
01:29:44.500 Apparently only two of the people that worked on the bogus intelligence community assessment about Russian interference in the 2016 elections are gone.
01:29:53.840 And the response from the CIA to us: I frankly found what they told us patronizing to the point of offensive. It was basically: trust us, bro. It's all good now.
01:30:10.580 The CIA is fine.
01:30:11.860 The CIA is not fine.
01:30:13.620 The CIA is hiding information that the American people paid for and have a right to know on
01:30:19.860 a lot of issues.
01:30:21.360 A lot.
01:30:22.280 UAP, Epstein. Congressman Massie revealed that there is a CIA file on Epstein that needs to come out. And look, maybe the CIA shouldn't exist.
01:30:32.880 I mean, Senator Moynihan before he died, and Kennedy's historian, why am I blanking on his name?
01:30:41.020 Dr. Schlesinger.
01:30:41.440 Schlesinger.
01:30:42.180 Yep.
01:30:42.420 Schlesinger.
01:30:43.580 There's been various proposals to break up the CIA.
01:30:47.140 You know, frankly, it's been a paramilitary organization ever since 9-11. It wasn't supposed to be; Truman wanted an intel organization.
01:30:54.420 We need good intel.
01:30:55.840 By the way, congratulations on your brilliant documentary.
01:30:58.360 I saw the first part of it last night.
01:30:59.700 So now it appears, if I'm understanding correctly, that the CIA was probably behind the 9-11 attacks. It was a botched CIA operation.
01:31:09.220 At least it sounds like it; I haven't finished your series yet. But here you have an organization that's responsible for just the worst: regime change coups followed by dictators who tortured people; a CIA that infiltrated American student groups and used labor unions to engage in regime change; that spawned off people who were involved in the censorship industrial complex and lawfare; and that, it sounds like you're saying, was behind, or at least didn't stop or contributed to, the 9-11 attacks.
01:31:45.620 And then they did the torture after 9-11, which not only doesn't work, it creates bad information and is a stain on the moral character of the United States.
01:31:57.540 At a certain point, you're like, what is this dog of an organization doing, just unreformed and trampling on all of our basic freedoms? So I think we just need to tell people that we don't really govern ourselves as long as we have this mess of an institution called the CIA, where a bunch of analysts appear to run the world.
01:32:21.780 As long as that organization remains unreformed and we don't really get true disclosure about
01:32:26.640 all the things that we know are going on, then I think we should be pretty unhappy and
01:32:31.260 pretty demanding of much more significant reforms than it appears the Trump
01:32:36.120 administration is going to pursue.
01:32:37.880 I'd settle for real oversight. Instead, Tom Cotton, who runs the Senate
01:32:42.740 Intelligence Committee, is basically just an apologist for the CIA.
01:32:46.500 There's no oversight at all.
01:32:47.800 He carries water for the agency in ways that hurt this country.
01:32:52.660 And I'm not exactly sure why. Like, what is that?
01:32:55.800 And I don't know the answer.
01:32:57.380 People can speculate all they want.
01:32:59.200 I do want to just go back and thank you for what you said.
01:33:02.300 It's all true.
01:33:03.680 It's true.
01:33:04.560 Okay.
01:33:04.800 Um, good.
01:33:05.680 I haven't seen the end of it, but I saw the first part.
01:33:07.520 It's amazing, by the way.
01:33:08.580 No, no, I'm not talking about our documentary series.
01:33:11.620 I was just talking about your analysis of the CIA.
01:33:14.220 I mean, how many people do you think in the White House right now know what the actual
01:33:18.240 CIA budget is?
01:33:20.320 You know, I'd be surprised if you could find someone.
01:33:21.920 I've never met anyone who can actually tell you,
01:33:25.720 because it's classified.
01:33:26.460 And I assume, supposedly, the House and Senate Intel committee chairmen know what
01:33:32.640 the full budget is, but I would be shocked if they actually did.
01:33:36.480 I mean, it's its own country.
01:33:37.480 It's autonomous.
01:33:38.120 It doesn't have oversight.
01:33:40.820 It doesn't have committee control structures.
01:33:42.860 It just kind of does what it wants.
01:33:44.480 It lies about what it does.
01:33:45.660 There's no way to know for a fact.
01:33:47.520 I mean, it's a separate government within our borders, just like
01:33:53.720 they had in Portland, Oregon at the height of George Floyd. But I just
01:33:58.860 want to ask you about something you said about the drone sightings, or the
01:34:03.480 lights in the sky, over New Jersey last year.
01:34:07.180 And, you know, there were so many sightings that there's really no dispute that it happened.
01:34:10.820 The question is, what is it?
01:34:12.140 And the president said that he would tell us; we've never heard.
01:34:16.360 What was that,
01:34:17.440 do you think?
01:34:19.860 I don't know.
01:34:20.580 And, you know, they're also having very similar drone sightings now over in Denmark that
01:34:25.460 actually shut down both Danish airports on Sunday.
01:34:29.920 I mean, yeah.
01:34:33.360 And why can't we know about it?
01:34:35.120 But what's your sense?
01:34:36.980 I know that you've done a lot on this.
01:34:39.120 I've talked to you off camera about it, because you're one of
01:34:43.860 the few people whose judgment on this I trust. There's so much deception on this question.
01:34:48.680 I think parts of it, what's the term they use?
01:34:51.280 It's an op.
01:34:52.400 I think that's part of the explanation. But at its core are physical phenomena
01:34:58.120 that have been recorded in such volume, at such scale, that something real is happening.
01:35:03.680 And I know you don't really have the final answer on that, but what is your sense?
01:35:09.120 Yeah, I mean, I think, look, first of all, the government is engaged in extensive
01:35:14.220 disinformation on this topic, and that's just all confirmed.
01:35:19.300 Like, it's been well documented what they've done.
01:35:22.500 I mean, there's an alien crash retrieval manual that,
01:35:29.780 according to the official story, is a total fake, a total fabrication.
01:35:33.900 But when you look at it, it is extraordinary in its quality for a fake,
01:35:41.020 complete with the names of the people who checked it out, and those people having
01:35:44.480 been checked out. Who would do that?
01:35:47.580 Like, why would you do that?
01:35:48.520 Well, one story is that it was used as passage material to identify counterintelligence
01:35:53.400 spies in the U.S. intelligence community.
01:35:56.140 But nonetheless, there has been so much government misinformation.
01:36:00.760 There are also secret technology
01:36:06.220 projects.
01:36:06.820 I mean, one of the guys that testified at the UAP hearing last week just said that
01:36:12.200 he has seen successful reverse engineering of technologies.
01:36:15.580 You know, there's a whole kind of Pentagon technological side of this that many other
01:36:23.000 people have done so much better work and reporting on than me, Jesse Michels being
01:36:27.440 one of the leaders in kind of unearthing it.
01:36:30.500 I will say, I don't think it could all be reduced to hard military hardware,
01:36:36.060 either ours or somebody else's.
01:36:37.540 I'm very confident that there are just way too many cases that don't fit that.
01:36:41.580 I also think that Jacques Vallée has done really some of the most important
01:36:47.500 scholarship on this,
01:36:48.840 and he just gave a presentation on it.
01:36:51.720 He's the model for the French character played by François Truffaut in Close Encounters of
01:36:56.280 the Third Kind, the Steven Spielberg film. A French researcher who's just sort of an international
01:37:01.920 treasure of UFO cases.
01:37:05.580 And, you know, he's actually gone in a very similar direction to the one you've gone in.
01:37:08.740 And I find myself going there a little bit too, which is that there is a spiritual
01:37:13.160 element to this that I don't think is just purely attributable to technology, because the
01:37:19.580 issue is such a gestalt issue. You know, in a classic
01:37:25.440 gestalt image, is it an old woman?
01:37:26.640 Is it a young woman?
01:37:27.280 Is this a spiritual issue manifesting as some high-tech hardware, or is it
01:37:35.200 some high-tech civilization manifesting as something spiritual?
01:37:40.640 I find myself really gravitating towards these cases, which is also where Vallée encouraged
01:37:47.120 a lot of new research.
01:37:48.980 One of my favorites is this English woman in the countryside who had a UFO sighting
01:37:53.900 in the fifties.
01:37:54.700 And I would dare anyone to watch that.
01:37:58.240 She's a very working-class English woman,
01:38:02.720 and it's a beautiful interview with her, done by the BBC or somebody.
01:38:07.420 And they just said, describe what you saw.
01:38:08.900 You know, she said she hears this noise.
01:38:10.440 Her two boys are in the front yard.
01:38:12.080 She sees a huge UFO over her house.
01:38:14.520 They asked her to describe it.
01:38:15.680 And she said, what can I say?
01:38:16.980 It was like a Mexican hat, you know, a typical flying saucer with
01:38:22.220 a dome.
01:38:23.000 Her kids were there seeing it.
01:38:25.140 She swears she saw it.
01:38:26.420 She says there were two people inside, and they were beautiful people with long blonde hair
01:38:31.500 and slightly bigger foreheads, sort of looking at her.
01:38:36.200 And it ends so interestingly.
01:38:38.440 She says, you know, we told people about this and then we were ridiculed.
01:38:41.900 And then she said, but it's okay, because I know it happened.
01:38:44.420 It's true.
01:38:45.020 And I dare people to watch that and come away thinking that she was lying.
01:38:50.140 I don't think she was lying.
01:38:52.020 Yes.
01:38:52.540 I also, as you know, have interviewed a fair number of psychotic people living tragically
01:38:57.900 on the street, and for people in psychotic states, that's not the kind of story they tell.
01:39:04.020 In fact, I have homeless people who I've been interviewing
01:39:07.700 who are in meth-induced psychosis, talking about aliens.
01:39:11.500 And it's just a lot of word salad, all garbled.
01:39:14.860 It's like talking to somebody trying to explain a dream they had.
01:39:17.500 It doesn't make sense.
01:39:18.620 Yes.
01:39:18.940 So I don't think she lied.
01:39:21.300 I don't think she's capable of it.
01:39:22.380 I think that most actors are bad actors.
01:39:24.140 I don't think she's capable of having invented that and then persuading her children to lie
01:39:28.960 with her.
01:39:29.400 I think that she had that experience.
01:39:31.720 I don't think she's psychotic.
01:39:33.640 I don't really know if anybody knows whether those beings come from
01:39:40.680 a different planet, or they're interdimensional, or they're spiritual, or if they have some
01:39:45.060 other form and they're just manifesting and holograming like that. I don't know.
01:39:49.220 But I think that the conversation, thanks to, again, people like you
01:39:56.620 and Joe and others, has just widened, so that we can see just what a big lie
01:40:02.600 it's been that science has really properly accounted for reality.
01:40:07.400 You know, Science magazine did a survey of scientists, including natural
01:40:13.620 scientists.
01:40:14.060 And I think it was somewhere around 60 to 80% of natural scientists,
01:40:18.060 I'm talking physics and biology and chemistry, who were not able to replicate famous studies
01:40:24.740 in their field.
01:40:25.980 They admitted this in a survey. And then they were asked, do you still trust your field
01:40:29.200 of science?
01:40:29.640 And they all said yes. But they can't replicate basic scientific experiments.
01:40:33.900 They keep changing their minds on the creation stories, on the Big Bang.
01:40:37.820 I think there are sufficient doubts about human origins.
01:40:43.180 And that was so taboo.
01:40:45.940 You couldn't talk about that in polite society.
01:40:49.140 But I do think now we are able to have those conversations.
01:40:52.380 And I do think it's really notable that with this political shift, there is, I think, a
01:40:57.120 spiritual movement.
01:40:59.360 I mean, I'm obviously really into it.
01:41:01.300 Other people in my life are not as excited about it.
01:41:06.240 But for me, I think these experiences, the evidence, the spiritual
01:41:14.700 side of it, the government cover-up, are just huge areas that we should be doing
01:41:20.080 so much more research, investigation, and journalism on.
01:41:23.180 And I get a little frustrated, because I think the conversation right
01:41:29.300 now in the podcast world is just a lot of people repeating and speculating
01:41:35.020 about stories that we've kind of heard before or sort of know about.
01:41:39.060 But we haven't put nearly enough pressure on the government to release or unredact the
01:41:44.700 documents that we know exist, to come clean about what they appear to know and are unwilling
01:41:50.100 to tell. There should be a real movement around this.
01:41:53.020 And there should be consequences for members of Congress, because that's information
01:41:58.440 that belongs to all of us.
01:42:00.480 And there's some evidence of non-human intelligence, or a lot of evidence. I'm very
01:42:07.260 confident that there are thousands of high-quality videos, photos, sensor data, radar data,
01:42:14.020 a lot that the military is keeping from us and the CIA is keeping from us.
01:42:20.680 And we should be really upset about that.
01:42:22.680 And I think that, for me, there have just been a lot of conversations
01:42:28.260 where people go around and around with the data that we know.
01:42:31.680 But what we're missing is the fact that the government is sitting on so
01:42:35.420 much more of it.
01:42:36.240 And I find myself wanting to do more to force it out, and I'm getting frustrated. But,
01:42:42.800 you know, I think, as you've seen, I'm on the one hand very grateful to
01:42:46.840 this administration for the strong things it's done on free speech and the
01:42:51.100 disclosure it's done, certainly disclosing so much more than the last administration.
01:42:55.040 But we still need a lot more.
01:42:56.720 There's still so much that needs to be released on Epstein, on COVID origins, the whole COVID
01:43:03.240 pandemic response, on the weaponization of the FBI, the continuing rot.
01:43:07.940 I mean, someone at the CIA told us there's pathological rot at the CIA.
01:43:13.100 And we need to know what's going on with the UAPs.
01:43:16.020 Like you were saying before, it would be irresponsible not to engage
01:43:22.060 in conspiracy theorizing and speculation, given how little information they give to us.
01:43:26.680 And if they were really so concerned about conspiracy theories and speculation and misinformation,
01:43:31.320 then they should be releasing those documents.
01:43:33.980 Well, of course, they foment conspiracy theories and race hatred because it's a distraction from
01:43:40.280 what they're doing.
01:43:40.820 I mean, when I was younger, living in Washington, I began to understand that the government
01:43:44.840 was systematically lying across agencies about a couple of things, probably a lot of things,
01:43:49.340 but UAPs were definitely one of them.
01:43:51.680 That became obvious a while ago.
01:43:53.540 And I remember asking, you know, like, what is this?
01:43:57.520 And never getting a straight answer, except people would say, look, it's destabilizing.
01:44:03.320 It would be destabilizing if the public knew, and, like, who wants an unstable country?
01:44:08.360 You know, there are some things that people just aren't ready for, whatever the euphemism they
01:44:11.080 used.
01:44:11.240 But that was the explanation.
01:44:11.880 As I got older, I began to, you know, talk to other people and have other thoughts.
01:44:17.020 And one of them was that it's totally possible the government really does have something
01:44:21.000 to hide, is participating in things that people would not approve of or would be shocked to learn.
01:44:25.360 And all of that gets to a question that's never occurred to me till right now.
01:44:29.100 But like, who named America's military headquarters after a pentagram?
01:44:35.740 Like, who thought that was a good idea?
01:44:38.060 And I know you've done a lot of research on that period, the war period.
01:44:41.240 Like, what was that?
01:44:44.260 Well, yeah, I mean, I haven't seen it yet,
01:44:46.940 so I can't evaluate it yet,
01:44:48.180 and I don't know a lot about it.
01:44:49.360 But yeah, I mean, there's some, there's a real darkness to the whole area.
01:44:54.180 Yeah, let's call it, let's call the building that controls nuclear weapons, the Pentagon.
01:44:59.440 Huh?
01:44:59.960 I mean, it's sort of like right in your face, right?
01:45:02.120 Or no?
01:45:04.080 Yeah, I mean, I just haven't looked into it that much.
01:45:06.740 I do know that a lot of the UFO stuff is very tied in with the occult.
01:45:11.620 Yeah.
01:45:11.700 And apparently Jesse Michels did a new documentary on occult
01:45:20.440 behaviors within NASA. I mean, I'm not vouching for it, I don't know much about it, but I like Jesse.
01:45:26.020 So, very concerning.
01:45:29.260 I don't know what it means.
01:45:30.700 You know, I'm shocked by how little curiosity there is at a society-wide level.
01:45:38.240 I think that the intellectual life of this country, by which I mean not just the universities, but also the newspapers and the big media companies, that is how censorship was done over the last 80 years.
01:45:51.180 The internet is almost a return to a pre-radio, pre-broadcast period, when people were really free to just print whatever they wanted. The internet is not there yet, but it's a lot closer to it.
01:46:03.200 We finally get to kind of learn that actually there are all these anomalies around human evolution, around human history, around archaeological sites where things don't seem to add up.
01:46:17.580 Right.
01:46:17.860 And you start to get people who were called pseudo-archaeologists starting to kind of win some arguments publicly.
01:46:26.660 I mean, there's one happening right now around Göbekli Tepe with this guy Jimmy Corsetti, where he's shown that the people who are supposed to be excavating the site are destroying it, planting trees whose roots will destroy these ancient sites,
01:46:41.920 and also building these really grotesque roofing structures in ways that destroy the site.
01:46:47.480 It's very weird and suspicious.
01:46:49.780 And, you know, we know that a lot of the Tesla information went missing, information that should have shown some very interesting things.
01:46:58.820 And then, yeah, I think the relationship with nuclear is one of the most interesting parts of this, because these UAPs show up around nuclear sites.
01:47:09.240 I used to work on nuclear a lot, and there would be these drones, these unidentified, they seemed like objects, but unidentified phenomena, around nuclear power sites.
01:47:23.320 The people working at them were often very concerned about public perception of danger,
01:47:29.100 and so they often didn't talk about them. But they'd certainly been over the Diablo Canyon nuclear plant in California.
01:47:36.440 But when the drones happened in New Jersey, well, we caught them in an open lie.
01:47:40.880 I mean, John Kirby at one point said something like they had evaluated 3,000 cases of drone sightings in 48 hours, which is absurd.
01:47:49.460 There's no possibility they did it.
01:47:51.320 And then a set of people started looking, and you discover that in fact there have been these drone swarms around U.S. military bases.
01:48:00.680 I mean, not a couple, either.
01:48:02.160 I think it was somewhere around two or three dozen military bases.
01:48:05.900 And there's a lot of evidence that those drones are circling around at moments when there are nuclear weapons in the area.
01:48:15.040 So, you know, I'm skeptical that it's Chinese or Russian, because the drones are engaged in behaviors that I think are very difficult for anybody to do.
01:48:26.000 But if these objects are behaving in ways that do appear to use a different kind of propulsion, or anti-gravity, I'm very skeptical that that's ours, in the sense that it takes a lot.
01:48:38.860 It took a huge effort to create the Manhattan Project and to create nuclear weapons.
01:48:42.920 It was a massive, massive endeavor.
01:48:44.720 And so to somehow easily get, or to be able to easily hide, reverse engineering, I don't know how you do that.
01:48:52.080 I'm really skeptical that we have it, but it's absurd that we have to just sit around and speculate about it.
01:48:58.580 There's basically no transparency. Instead, we have a DOD organization called AARO, which as far as I can tell is part of a deception operation consistent with the CIA's recommendations through the Robertson Panel in the 1950s:
01:49:15.740 that the main thing the U.S. government should do is debunk the UFOs and ridicule the people who see them and research them.
01:49:26.900 And worse, there are a lot of threats made to people in this field.
01:49:31.400 I personally find it one of the scariest issues. Trans and UAPs are, paradoxically, the two scariest issues,
01:49:42.400 because it just seems like a lot of people really don't want us to know what's going on with it.
01:49:48.960 And President Trump sort of made noises like he was going to reveal something, and Tulsi Gabbard just made some noises that she wanted to get to the bottom of it.
01:49:57.720 But otherwise, Tucker, it just seems like they want to Jeffrey Epstein the UAP files.
01:50:07.620 Yeah.
01:50:07.980 And if you're wondering whether there's a spiritual component to the whole thing: if it were just about technology and, I don't know, Martians,
01:50:17.120 right,
01:50:17.820 there probably wouldn't be this kind of response to it.
01:50:23.180 I mean, this just glows with intensity.
01:50:27.140 Again, it's the Pentagon.
01:50:29.460 So yeah, there's a spiritual component to it, I would say.
01:50:32.300 I've been scared off too.
01:50:33.640 It's like, I don't even want to deal with it. But I'm grateful for you,
01:50:36.280 Mike Shellenberger, really. I'm so grateful you went into journalism.
01:50:40.940 You could be doing a lot of other things.
01:50:44.260 There are not that many super smart people in journalism with true principles, and you're definitely one of the very few.
01:50:51.180 And so I'm always grateful to talk to you, and I'm grateful you're doing what you're doing.
01:50:54.600 So thanks, Tucker.
01:50:55.520 Thank you, and back at you, and congratulations on coming back to your famous monologues.
01:51:01.840 And I was really delighted that you did it on free speech, and I hope you keep doing a weekly monologue.
01:51:07.160 I think it's, uh.
01:51:08.340 Getting all spun up.
01:51:09.300 Yeah, I enjoy it.
01:51:10.020 Thank you.
01:51:12.300 Great.
01:51:12.500 I hope we can have dinner soon.
01:51:13.240 Great to see you.
01:51:14.640 The great Mike Shellenberger.
01:51:16.660 We'll be back next week.
01:51:18.760 Good night.
01:51:23.520 Alp is a pretty new company, about a year old, but we have a surprisingly deep, and I mean subterranean, flavor vault.
01:51:31.140 We have a massive index, a library, if you will, of archived flavors, all of which have been approved by the federal government.
01:51:37.140 It's all totally legal.
01:51:38.020 And so our archivists went down to the flavor vault last week and came up with a kind of sexy flavor, something I never would have thought of myself.
01:51:45.220 They call it Spearmint.
01:51:47.480 Introducing Alp Spearmint from the flavor vault.
01:51:50.600 It's incredible.
01:51:51.660 It's like a spear, right to the heart of the flavor zone, wherever that is.
01:51:55.960 Available now, alppouch.com.