The Joe Rogan Experience - March 30, 2023


Joe Rogan Experience #1963 - Michael Shellenberger


Episode Stats

Length

2 hours and 47 minutes

Words per Minute

173.2

Word Count

29,089

Sentence Count

2,468

Misogynist Sentences

42


Summary

In this episode of The Joe Rogan Experience, journalist Michael Shellenberger talks with Joe Rogan about what it was like to get access to the Twitter Files. Shellenberger, along with Matt Taibbi and Bari Weiss, was given access to a cache of internal Twitter emails and Slack messages documenting the company's decision to deplatform Donald Trump after January 6 and its censorship of the New York Post's Hunter Biden laptop story. He describes how the story shifted from one about progressive bias in content moderation to one about a sprawling operation by US government officials, government contractors, and government-funded NGOs demanding censorship from social media companies, and why he believes the propaganda and disinformation apparatus built during the war on terror has been turned against the American people.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 Michael, good to see you again.
00:00:14.000 Thanks for having me back.
00:00:15.000 My pleasure.
00:00:15.000 Appreciate it.
00:00:16.000 So, first of all, what was it like to get a hold of the Twitter files?
00:00:23.000 What was that experience like?
00:00:25.000 How did that go down?
00:00:27.000 Exciting as hell, man.
00:00:28.000 I mean, seriously.
00:00:35.000 We're good to go.
00:00:43.000 And yeah, it was incredible.
00:00:46.000 I'd never met Elon before.
00:00:48.000 I'd met him at the coffee station, just making himself a cup of coffee.
00:00:51.000 He had no idea who I was.
00:00:53.000 And yeah, we just got into it.
00:00:56.000 I was sort of the least known of the big three journalists that were there.
00:01:01.000 It was Bari Weiss and Matt Taibbi who was on.
00:01:04.000 And they'd already started thinking about how to kind of what to go after.
00:01:07.000 And Matt had done a story on the Hunter Biden laptop already.
00:01:12.000 And then we were starting to look at January 6, because Trump gets deplatformed on January 8. And because I'm like the junior member of that threesome, so to speak, they gave me January 7. So one of the first things we did was just to look at how they made the decision to pull Trump off the platform.
00:01:31.000 And it turned out that the seventh was an important day because that was when they started to rationalize this decision to de-platform Trump, even though their own people inside had decided that he had not violated their terms of service.
00:01:43.000 So they were sort of stuck making up a reason to de-platform him.
00:01:47.000 And that was an important theme was that they just kept changing the rules basically to do what they wanted to do.
00:01:53.000 And that was the same thing on the Hunter Biden laptop.
00:01:55.000 The New York Post story that they censored also had not violated their terms of service.
00:02:01.000 So, I mean, look, it was crazy.
00:02:02.000 I mean, it was, you know, people always ask questions about the files themselves, but, you know, the experience was we would ask for these searches and we'd just get back huge amounts of data.
00:02:13.000 It was lots of thousands and thousands of emails, thousands of internal messages on their Slack messaging system.
00:02:20.000 And so, yeah, I mean, a lot of it was, you know, some of it was very boring because you have to just read tons and tons of stuff.
00:02:25.000 But, you know, I think the big theme was we start by seeing a real, you know, super progressive.
00:02:32.000 It's like 99 percent of campaign contributions from Twitter staff are going to Democrats.
00:02:37.000 You know, the head of safety at Twitter, this guy named Yoel Roth, who, you know, said there's actual Nazis in the White House when Trump came in, is very progressive.
00:02:48.000 But over time, we just kept finding, like, this weird, like, FBI wants us to do this.
00:02:53.000 There's these other government agencies.
00:02:55.000 Oh, you know, all these people used to work at the FBI. The CIA shows up, Department of Homeland Security.
00:03:01.000 And we're kind of like, what the hell is going on?
00:03:04.000 The story quickly shifted from us sort of – and I think what Elon thought, which was that it was just very progressive people being biased in their content moderation and their censoring to there is a huge operation by US government officials,
00:03:20.000 US government contractors and all of these super sketchy NGOs getting money from who knows where.
00:03:28.000 Basically demanding that Twitter start censoring people.
00:03:30.000 At that moment, the story shifted for all of us.
00:03:33.000 And that was, I think, where Taibbi became particularly important and sort of the lead because he had had so much experience on sort of looking at how the U.S. government during the war on terror had waged disinformation campaigns, propaganda campaigns.
00:03:47.000 And it became clear to us over time that the U.S. government had turned its propaganda and disinformation campaigns that had been waging abroad.
00:03:58.000 It turned them against the American people.
00:04:00.000 And that was where you just sort of get chills up your spine and you were like, this is something seriously sinister is going on.
00:04:07.000 Do we know when this began?
00:04:10.000 Like, when did they infiltrate these organizations?
00:04:13.000 Because I'm sure it's not just Twitter, right?
00:04:15.000 I'm sure it's...
00:04:16.000 Oh, no, absolutely not.
00:04:18.000 That's part of what was so terrifying is that it was all of the social media companies, including Wikipedia, by the way, which we don't talk enough about, but also all of the mainstream news organizations are all being organized.
00:04:30.000 So when does it start?
00:04:32.000 What you're looking at is the apparatus that was created by the war on terror over the last 20 years, starting after 9-11.
00:04:40.000 Then there was a battle against ISIS because ISIS was successfully recruiting on social media.
00:04:45.000 So there was sort of a counter-ISIS recruiting campaign that occurred.
00:04:48.000 Then you get the big event: Brexit in 2016, Trump's election in 2016, and the establishment just freaks out, absolutely freaks out.
00:04:59.000 And there's a lot of different motivations here.
00:05:01.000 So one of the motivations is just to blame Facebook, blame social media for Trump's victory.
00:05:07.000 It was never true.
00:05:09.000 I don't really think anybody really believed it.
00:05:11.000 There's just – for a variety of reasons we can talk about, there was never any good evidence that whatever Russians did had much of any influence, any measurable influence on the outcome of the campaign.
00:05:24.000 But they started to scapegoat the social media companies as a way to get control over them.
00:05:30.000 And so then in 2017 they set up – well, two things happen.
00:05:33.000 Many things happen.
00:05:34.000 The Department of Homeland Security just declares protecting election infrastructure to be part of their mission.
00:05:44.000 And that meant protecting the media environment.
00:05:46.000 Protecting.
00:05:47.000 Protecting.
00:05:48.000 Put that in quotes.
00:05:49.000 It's creepy.
00:05:50.000 It's patronizing.
00:05:52.000 It's a power move.
00:05:54.000 So that's the first thing that happens.
00:05:56.000 They create something called the Cybersecurity and Infrastructure Security Agency within the Department of Homeland Security to supposedly protect the media environment from foreign influence.
00:06:10.000 They create something called the Foreign Influence Task Force with the FBI to basically start policing domestic speech on these platforms.
00:06:19.000 They start organizing all the social media companies to participate in these meetings.
00:06:22.000 So you had Mark Zuckerberg, the CEO of Facebook, in here.
00:06:25.000 And he says to you, there's this critical moment where you ask about the Hunter Biden laptop.
00:06:29.000 And he goes, well, yeah, you know, in the summer of 2020, all these FBI guys come to us saying there's going to be a hack and leak operation involving Hunter Biden, which is super suspicious because, as everybody now knows, the FBI had Hunter Biden's laptop in December 2019. What freaked me out – and I was – I had – so by the way,
00:06:53.000 I was a victim of the Hunter Biden laptop disinformation.
00:06:56.000 I thought that – I voted for Biden.
00:06:58.000 I thought that it was a – I thought that that laptop was Russian disinformation.
00:07:02.000 I just bought the whole thing.
00:07:03.000 And this is from somebody who – You're a journalist.
00:07:06.000 I'm supposedly a journalist, right?
00:07:09.000 So-called journalist.
00:07:11.000 I bought it.
00:07:11.000 You know, I'm still a big liberal in so many ways.
00:07:14.000 And everybody I knew was like, oh, you know, besides Trump, it was just he's so – for all the reasons that progressives bought that the laptop was fake, I bought that it was fake.
00:07:23.000 So then when you realize that it was real and that everything in that New York Post story on October 14th, 2020 was accurate – I started seeing stuff in the emails.
00:07:33.000 The thing that really freaked me out was this thing that Aspen Institute, it's called a tabletop exercise and it was actually a Zoom call.
00:07:42.000 I think we're good to go.
00:08:04.000 Basically, they are training or brainwashing all these journalists.
00:08:08.000 I mean, it's CNN, New York Times, Washington Post, Wikimedia Foundation, the Wikipedia folks, the networks, all of the social media companies, all coming together to be like, okay, well, if something is leaked,
00:08:23.000 then we should not cover it in the way that journalists have traditionally covered it.
00:08:29.000 Meanwhile, Stanford University a few months earlier had put out a report saying reporters should no longer follow the Pentagon Papers principle.
00:08:39.000 Well, the Pentagon Papers, of course, is this famous episode.
00:08:42.000 It was – Steven Spielberg made a whole movie about it where the Washington Post and New York Times published these internal Pentagon documents showing that the US government was losing the war in Vietnam, right?
00:08:52.000 This is Daniel Ellsberg and he just releases it.
00:08:55.000 He steals these documents.
00:08:56.000 He breaks the law, steals these documents, gives them to the newspapers.
00:08:59.000 The newspapers publish them.
00:09:01.000 It's this kind of incredible moment in American journalism where we are like the First Amendment gives these newspapers the right to publish.
00:09:09.000 Not hacked, so-called hacked, but leaked information.
00:09:12.000 And here you have Stanford University, Aspen Institute saying, oh, no, no, no.
00:09:17.000 That's all.
00:09:17.000 We should stop doing that.
00:09:18.000 Journalists should no longer write about leaked information in that way.
00:09:23.000 Instead, we should focus on the person who leaked it.
00:09:26.000 So it really sent chills down my spine.
00:09:29.000 It was the creepiest thing I'd ever seen.
00:09:32.000 And this is, of course, you've got to remember, Aspen is funded by the U.S. government.
00:09:35.000 Stanford is funded by the U.S. government.
00:09:37.000 So this is – people go, oh, well, you're just – one of the responses we've got is they go, oh, you're just talking about content moderation by private companies.
00:09:43.000 No.
00:09:44.000 We're talking about U.S. government-funded organizations.
00:09:49.000 You can't – if the US government is censoring information, that's obviously a violation of the First Amendment.
00:09:54.000 But if the US government is funding somebody else to censor information, that's also a violation of the First Amendment.
00:09:59.000 You can't indirectly – it's still a violation if you're funding somebody to demand censorship.
00:10:05.000 So – That was quite a steeplechase, but there's a lot here.
00:10:08.000 I mean, it's a lot of people, a lot of institutions, a lot to unpack, and that was part of the reason I wanted to reach out and be like, I need a Joe Rogan session to just kind of go through it all.
00:10:16.000 Yeah, well, I'm very happy to provide that.
00:10:18.000 Here's the question.
00:10:21.000 Obviously, the laptop would harm...
00:10:25.000 The Hunter Biden laptop would harm Joe Biden, obviously.
00:10:29.000 And if that story got out, who knows how many people would have voted the other way.
00:10:33.000 Is this a direct result of the things that Trump said when he was in office that went against the intelligence community?
00:10:41.000 Like, how did they decide?
00:10:43.000 I would always assume that the so-called deep state is essentially bipartisan, that they wouldn't necessarily side with the Democrats or the Republicans.
00:10:53.000 They're really, you know, they're just in charge of, they're supposed to be gathering information to protect the country.
00:11:01.000 So how did they decide specifically to either stop information or propagate misinformation that would aid Joe Biden?
00:11:14.000 Yeah.
00:11:16.000 Yes, that is exactly the right question.
00:11:18.000 So, I mean, I think the thing you have to understand is that Trump was viewed by the deep state, by, you know, CIA, FBI, Pentagon, you know, I mean, all of the elites.
00:11:31.000 And you're right.
00:11:33.000 It's bipartisan in the sense that it's both never Trump Republicans and Democrats.
00:11:38.000 What freaked them out the most about Trump is that he was threatening to pull the US out of NATO. I don't think that that was – I just think that was bluster.
00:11:47.000 Like, that's insane.
00:11:48.000 And by the way, I should say I actually – I support what we call the Western alliance.
00:11:52.000 I support providing military security for our allies in Asia and in Europe.
00:11:57.000 I'm not a – I mean there's parts of economic nationalism that I respect but I'm also – I don't think we should pull out of NATO. I think NATO has provided peace in the world and mostly been a good thing.
00:12:09.000 It's obviously had some crazy abuses like Iraq.
00:12:13.000 This whole experience has made me rethink my support for Ukraine.
00:12:17.000 But I think it's important to understand that Trump terrified the deep state and the national security establishment.
00:12:24.000 So did Brexit.
00:12:25.000 There's a sense in which you had a guy on here named Peter Zeihan who wrote this book called – this really apocalyptic book about how the world is going to fall apart.
00:12:32.000 And his whole argument, which I don't agree with – I think he's brilliant, but the book is – I think the argument is wrong.
00:12:38.000 His whole argument is based on the idea that the United States is going to stop providing – Military security to our allies in Asia and Europe.
00:12:44.000 It's all just based on this assumption that Trump is the beginning of some – the US withdrawing from its traditional role since World War II. There's a bunch of people who obviously their ideology, their livelihoods, their identity, just their whole way of life is tied up with providing – the United States providing this protection for Europe and Asia and they view Trump as threatening that.
00:13:08.000 I also think they just really hated the guy.
00:13:10.000 They looked down on him.
00:13:12.000 He was crude and all the things that people don't like about him.
00:13:16.000 And he spoke disparagingly about the intelligence community.
00:13:19.000 Yeah.
00:13:20.000 Which is crazy.
00:13:21.000 Absolutely.
00:13:22.000 He was against the war in Iraq.
00:13:23.000 He was different.
00:13:25.000 He was a nationalist.
00:13:28.000 And what's so interesting is that if you read people like – people on the left like Noam Chomsky and others who have been critics of US or Glenn Greenwald who are critics of – and I think Matt Taibbi, critics of US government military invasions around the world since World War II. I mean we've overthrown many governments,
00:13:46.000 right?
00:13:47.000 You know, Iran, Chile, Guatemala, you know, and what the pattern is, is that these are places where nationalists, sometimes socialists, but often just nationalists who are trying to control their economies and they didn't want foreign interference,
00:14:03.000 were coming to power.
00:14:05.000 And the US government would see that as a threat to providing, you know, having this liberal global order, as it's called.
00:14:12.000 And so they saw what – they saw Trump as an existential threat to this post-war liberal order and they needed to – and they viewed social media as the means to his power, which I think was exaggerated.
00:14:27.000 So on the one hand, they saw a threat.
00:14:28.000 I think they also saw an opportunity.
00:14:31.000 The war on terror, we won.
00:14:33.000 I mean like just – I mean huge victory.
00:14:36.000 I mean it's shocking how successful it was in some way.
00:14:38.000 So you have a bunch of people that suddenly need something to do.
00:14:41.000 Yeah.
00:14:42.000 So there's a lot of motivations there.
00:14:44.000 And then you also have the guys that lost the Hillary campaign, John Podesta.
00:14:48.000 He was the chair of her campaign.
00:14:49.000 He runs the most powerful progressive, frankly, propaganda organization in the world or at least in the United States, the Center for American Progress.
00:14:59.000 They were also looking for someone to blame for their own failures, for the dislikability of Hillary.
00:15:07.000 And so there was just a lot of motivations to try to get control over social media platforms.
00:15:13.000 They felt like they had lost control of them.
00:15:15.000 And what was the attitude of these social media platforms when they were exchanging emails back and forth with these intelligence agencies?
00:15:25.000 Was there any understanding of the implications of allowing this web of influence to infiltrate and control narratives and how kind of creepy and dangerous that is?
00:15:40.000 Did they understand how other people would perceive that?
00:15:43.000 Because I would assume this is all...
00:15:46.000 The emails were exchanged and there was Slack messages and all this stuff is recorded, right?
00:15:52.000 So there's a record of it.
00:15:54.000 Yeah.
00:15:55.000 Did they have an understanding of how other people would view this?
00:15:58.000 Yeah, I mean, just to back up even further, so there's two interesting dynamics going on.
00:16:03.000 You know, the first is that the Internet itself is created by the U.S. Department of Defense, and Google is a spinoff of Defense Department projects.
00:16:12.000 You know, so on the one hand...
00:16:15.000 The internet is a function of the U.S. military.
00:16:18.000 I mean, it's a spinoff of the U.S. military.
00:16:19.000 It's a great one.
00:16:20.000 We're glad to have it.
00:16:21.000 But I think the U.S. military and the deep state and whatever, they felt like they had control over the internet until Trump, basically, or really maybe until ISIS around 2014, 2015. That's the first dynamic.
00:16:33.000 The second dynamic is culturally Silicon Valley is libertarian, right?
00:16:37.000 So you have the Electronic Frontier Foundation.
00:16:41.000 You have a libertarian ethos.
00:16:44.000 Jack Dorsey, the founder of Twitter, is very much a manifestation of that libertarian ethos.
00:16:51.000 Mark Zuckerberg, less.
00:16:53.000 But even Mark Zuckerberg, after the 2016 elections, when everyone is accusing him of throwing the election to Trump, he's like, this is ridiculous.
00:16:59.000 He's like, our own data doesn't support it.
00:17:02.000 There just wasn't enough.
00:17:03.000 The Russians clearly did not have this influence.
00:17:06.000 They just beat the crap out of him so much and threatened to take away their ability to operate, which is known as Section 230, which is this huge liability protection in the law that passed in 1996, which allows Google, Facebook, Twitter to exist.
00:17:21.000 Can I stop you there?
00:17:21.000 When you say they threatened to take it, like, in what way?
00:17:26.000 Directly.
00:17:26.000 Directly.
00:17:27.000 Including Biden himself.
00:17:28.000 I mean, but basically, Democratic politicians, they would just say, you know, we're going to remove your Section 230 status.
00:17:36.000 That's just like saying we're going to destroy your company.
00:17:39.000 I mean, it's just – it's not – And they were saying this because their assertion was that Russian disinformation and propaganda led to Donald Trump being elected.
00:17:51.000 And there was no evidence of this.
00:17:55.000 No, I mean, there was some evidence of it, but nothing.
00:18:01.000 Well, there was certainly evidence of, like, these troll farms.
00:18:05.000 Yes.
00:18:06.000 Right?
00:18:06.000 Yes.
00:18:06.000 We know they exist.
00:18:08.000 Yes.
00:18:08.000 Yeah, but it's trivial.
00:18:10.000 Yeah.
00:18:10.000 I mean, it was—they would exaggerate—they would say things like, you know, 140—I think it was like 146 million Americans had Russian propaganda in their newsfeeds.
00:18:20.000 That's not the same as saying 146 million people saw the ads.
00:18:24.000 Right.
00:18:24.000 Because it's like your feed.
00:18:25.000 Of course.
00:18:27.000 Facebook has changed.
00:18:28.000 So yeah, I mean, look, there's three big...
00:18:46.000 I think we're good to go.
00:18:57.000 Yeah.
00:18:59.000 Yeah.
00:19:12.000 But basically you have active disinformation campaigns being run by the US government and US government contractors against the American people on these issues at the same time that they're demanding censorship.
00:19:22.000 So you have propaganda on the one hand and censorship on the other.
00:19:26.000 Well, here's what appears to be dangerous to me.
00:19:28.000 There doesn't seem to be any repercussions for doing these things.
00:19:32.000 This is scary because it's shifting a narrative.
00:19:35.000 So in step one, or example one rather, the Hunter Biden laptop: no one's in trouble.
00:19:45.000 No.
00:19:45.000 No one's in trouble.
00:19:46.000 No one from the FBI is in trouble.
00:19:48.000 No one loses their job.
00:19:49.000 No one gets reprimanded.
00:19:51.000 No one gets, you know, brought before the American people and said, you failed us.
00:19:56.000 Not only did you fail us, you betrayed us because you knew this was not true.
00:20:00.000 And you allowed someone whose son has deep ties to both Ukrainian and Chinese companies.
00:20:09.000 That were paying him for influence.
00:20:12.000 And it appears, at least by some of these emails, that some of that money went to the actual Vice President of the United States.
00:20:22.000 Which is fucking wild.
00:20:24.000 Yes.
00:20:24.000 That no one is...
00:20:26.000 And then, the crazy thing is, one of the things about having a right and a left...
00:20:33.000 Is that whenever there's information that's inconveniently bad for that one side, particularly the left, you don't hear a fucking peep about it on the media.
00:20:44.000 Right.
00:20:45.000 It's dismissed.
00:20:47.000 It's like, you know, they'll talk about it like...
00:20:51.000 Someone said...
00:20:54.000 Someone...
00:20:55.000 We talked about the Hunter Biden laptop and said it was like half fake.
00:20:59.000 That was like a term.
00:21:00.000 AOC. AOC said it.
00:21:01.000 Half fake.
00:21:02.000 That's right.
00:21:03.000 That is a...
00:21:05.000 That is such a horrible violation of the trust that the people who elected you put in you.
00:21:14.000 You have access to all the information.
00:21:18.000 You have access to that actual fucking laptop.
00:21:21.000 By the way, I have access to it, too.
00:21:23.000 A lot of people have access to it.
00:21:24.000 If you wanted to, I said I don't want to look at it.
00:21:27.000 I was like, I don't want to look at that fucking thing.
00:21:29.000 I don't want to see this guy getting foot jobs from hookers in Vietnam, smoking street crack.
00:21:33.000 It's crazy.
00:21:34.000 Whatever he did.
00:21:35.000 But the fact that someone would say that's half fake.
00:21:38.000 That itself is disinformation.
00:21:40.000 That is a lie and it's disinformation.
00:21:43.000 But you're just saying it because if you can say it's half fake, you muddy the water and now anybody that's looking at that could go, oh yeah, but that's half fake, according to my side.
00:21:56.000 This is like the same people.
00:21:57.000 There's still people that say that Trump was in bed with the Russians, which is how he won in 2016. There's people that still parrot that.
00:22:06.000 Oh, yeah.
00:22:06.000 Oh, yeah.
00:22:07.000 Very close friends and family say that.
00:22:09.000 Oh, yeah.
00:22:10.000 Might as well.
00:22:11.000 I know.
00:22:11.000 Also, I was happy to do your show because literally even my very close friends and family don't understand what I'm talking about.
00:22:19.000 And I'm like, I want to go on Joe Rogan and just unpack this for them, how serious this is.
00:22:24.000 I mean, you have to remember, and I'll put it on myself, I was so biased.
00:22:28.000 The New York Post published the subpoena, which is a kind of receipt from the FBI, showing they had taken Hunter Biden's laptop from this computer repair store owner in Delaware.
00:22:39.000 It was published in the New York Post.
00:22:41.000 They also published the receipt that has Hunter Biden's signature on it, saying that he had not only left the laptop there, but also that it gave the computer repair owner the rights to it if he abandoned it.
00:22:53.000 Hunter Biden never said it wasn't his.
00:22:56.000 He never denied that it was his laptop.
00:22:59.000 Well, and subsequently, at least recently, he sued that guy for releasing the information, which is the dumbest thing he could have ever done, because now all this half-fake shit gets thrown out the window.
00:23:11.000 Now he's saying it's his.
00:23:13.000 Right.
00:23:13.000 Well, they're always – look, I mean I think the other thing I want to also emphasize here because I think that when you uncover the level of coordination and the sophistication of the disinformation and censorship campaign, it's easy to also sort of say they're perfect but they're not.
00:23:27.000 They're always making stuff up as they're going along.
00:23:29.000 Right.
00:23:30.000 But I think the other interesting thing that's important to know here about that laptop story is that within Twitter, they look at that New York Post article, Yoel Roth, the head of safety, and his team, they look at it and they go, yeah, I mean, it's legit.
00:23:44.000 It doesn't violate our internal – it doesn't violate our terms of service.
00:23:50.000 And at that moment, I mean, it has a Manchurian quality, a Manchurian candidate quality to it, where the former chief legal counsel to the FBI, a guy named Jim Baker, who was central to beginning the Russiagate probe of Trump.
00:24:06.000 He's now at Twitter as deputy general counsel.
00:24:09.000 This is what we're discovering.
00:24:10.000 This is what I was discovering in the Twitter files: he is just vociferously attacking this thing.
00:24:16.000 It's like this looks like misinformation, disinformation.
00:24:19.000 We shouldn't trust it.
00:24:20.000 It looks like it violates Twitter's policies.
00:24:22.000 He was – he just – I mean like multiple – I think it was at least four messages and emails of him pushing to the executives.
00:24:28.000 And of course it doesn't – we can't see the phone calls that – which is really where a lot of the dirty work happens.
00:24:34.000 Pushing to just get this thing censored by Twitter.
00:24:38.000 Sure enough, a few hours later, Yoel Roth says, well, okay, you know, we think that it could very well have been a Russian hack where somehow they put the...
00:24:48.000 I mean, it was this crazy thing where they're like, well, we think he was hacked and then put on the laptop.
00:24:53.000 It was just bizarre.
00:24:56.000 Yoel Roth, like, there's moments where I respect him because he was enough of a truth teller internally.
00:25:02.000 It's why he got to the position he was in, which is a very powerful position to be like, hey, this is bullshit, like internally, he would say.
00:25:09.000 But he was also a company man.
00:25:10.000 So when powerful superiors in the organization, including former FBI people, and Jim Baker wasn't the only one.
00:25:18.000 When he gets worked, he just bends.
00:25:20.000 And he just was like, OK, yeah, I think we've decided that it violates our hacked materials policy and we're going to censor it.
00:25:26.000 The other thing I want to point out about this, it's not just that they censored the article because people always go, well, you know, it only lasted for a few days or whatever.
00:25:35.000 It was the discrediting of it, the censoring.
00:25:39.000 Censorship is a disinformation strategy.
00:25:42.000 If you censor that article – in other words, all the headlines where Twitter and Facebook say they're going to restrict the dissemination of this material, or that there's something funny about it – all that publicity is really what mattered.
00:25:56.000 So in terms of like – in my defense and other people that bought the idea that it was somehow a fake, we were being told by the media that everybody had looked at this and was kind of like, look, it looks like it's hacked and there's something funny about it.
00:26:10.000 So I think that, you know, I think that there's so many shocking things about it, but I think it's the level of coordination and conformity within these social media companies.
00:26:21.000 It was the pre-bunking in advance, and it was the complete, total, you know, just the complete news media blackout and unanimity. And it was just all of them.
00:26:36.000 I mean it was like all the networks, all the newspapers, they all just repeated this idea that there was something wrong about the laptop and there wasn't.
00:26:42.000 It's so creepy.
00:26:44.000 And it's so creepy that there's no repercussions.
00:26:47.000 It's essentially lying and using taxpayer dollars to promote propaganda that they know to be untrue.
00:26:57.000 But there is a chance.
00:26:58.000 I mean, so we do.
00:26:59.000 The attorneys general of Louisiana and Missouri are moving forward in the courts, suing the Biden administration for violating the First Amendment.
00:27:12.000 You know, this is, of course, this Hunter Biden laptop thing is one of many things.
00:27:15.000 I mean, the other crazy thing, maybe some of the craziest stuff of all, is that Facebook censored accurate COVID vaccine side effect information.
00:27:27.000 Because it didn't want to promote vaccine hesitancy.
00:27:30.000 In other words, the White House is, like, just pressuring them.
00:27:32.000 I mean, this guy, Andy Slavitt, in particular, is just this malign actor, just pressuring, pressuring, threatening them.
00:27:38.000 They're nasty in these emails.
00:27:40.000 The White House, nasty.
00:27:41.000 In what way?
00:27:42.000 Oh, just being—just basically, you know, it's a—I mean, Biden does it publicly.
00:27:49.000 They're killing people.
00:27:50.000 They're basically accusing people of—I mean, these guys, they don't—the gloves are off.
00:27:55.000 I mean, they're just like, you're killing people by letting this information out.
00:27:57.000 I mean, the information is people telling their own story of vaccine side effects.
00:28:01.000 We always point out, like, it was one of the great public interest progressive victories in recent memory that the drug companies have to name the side effects of their drugs in their TV ads.
00:28:13.000 Like, that's a big part of it, right?
00:28:15.000 It's like a running joke.
00:28:16.000 You have to name the side effects.
00:28:18.000 In the TV ads.
00:28:19.000 Well, here were ordinary people trying to tell stories of the side effects that they had from the vaccine on Facebook and Twitter, and the White House is demanding that Facebook and Twitter censor that stuff.
00:28:30.000 This is just the worst...
00:28:34.000 I mean that is – I mean that's just Soviet Chinese-style censorship like full on.
00:28:39.000 I mean so it's not over and I think that we've already seen – there's other things going on like that agency I mentioned, that cyber – that part of the Department of Homeland Security, the Cybersecurity and Infrastructure Security Agency.
00:28:51.000 They changed their website over the last few months to remove references to domestic – Yeah.
00:29:14.000 Let's talk about that because I had her on and what she essentially was talking about was all these Russian troll farms and how interesting it is That they created all these funny memes and they used all these resources to try to shift the narrative And change public opinion on certain things and that it was very effective.
00:29:35.000 Yeah.
00:29:36.000 Well, so let's just start with Renee.
00:29:38.000 Yeah.
00:29:39.000 So first of all, Renee is somebody who I came across because she's actually kind of moderate on a bunch of the stuff that I'm moderate on, like dealing with homelessness, COVID. She's actually like a moderate voice.
00:29:51.000 She's not super woke or anything.
00:29:53.000 And she's critical of like she moved out of San Francisco because it's just too crazy.
00:29:59.000 So she and I had this conversation.
00:30:00.000 We were talking about this even before.
00:30:02.000 I started talking to her right when I started looking at the Twitter files.
00:30:05.000 And we did this long interview.
00:30:06.000 I was on Sam Harris's podcast with her.
00:30:09.000 But then she starts showing up in the Twitter files in all these weird ways.
00:30:12.000 And we start looking into her.
00:30:14.000 It's a very – so she's also – the reason why she's so important is like when you read the – when you follow the meetings or watch the YouTube videos or whatever, she's like one of the smartest people.
00:30:24.000 Like there's something going on with her.
00:30:26.000 She's like a real leader.
00:30:26.000 She's always sort of the number two.
00:30:29.000 The other thing about these people is that they move around a lot.
00:30:31.000 They move in between organizations and she's always sort of the number two but she always seems a bit smarter than the person that she's reporting to.
00:30:38.000 But so she's somebody that she goes to, she gets a computer science degree from State University of New York at Stony Brook.
00:30:45.000 That happens to be a major recruiting place for the NSA. She then goes and she gets a job at Jane Street, which is like one of the great trading firms, it's like up there with Goldman or maybe better, it's where SBF from FTX was at.
00:31:01.000 She was there, then she had a couple of companies that did like logistics and cyber.
00:31:06.000 Very high-powered, successful executive.
00:31:09.000 And then, according to her story and the public story, she gets obsessed with anti-vaxxers.
00:31:16.000 She's got young kids.
00:31:17.000 She's obsessed with anti-vaxxers, spreading anti-vax misinformation.
00:31:20.000 This is long before COVID. I think it's around 2014, 2015. Next thing you know, she's like advising President Obama on counter-ISIS disinformation strategy in the White House and advising on the expansion of something called the Global Engagement Center,
00:31:39.000 which is the State Department's counter-disinformation arm.
00:31:41.000 So suddenly she's like the senior person.
00:31:44.000 It's very suspicious, very rapid rise.
00:31:47.000 If you know anything about those communities, they're very hierarchical and like you have to work your way up over many years.
00:31:52.000 She's instantly like at the top.
00:31:55.000 In 2017, she is at a consulting firm called New Knowledge that is then caught doing disinformation against an Alabama Trumpian Republican candidate named Roy Moore.
00:32:11.000 They are caught doing fake Facebook pages, accusing Roy Moore of wanting to basically restrict alcohol consumption in Alabama, which is a deeply unpopular position.
00:32:23.000 It was false.
00:32:24.000 And also creating the perception of Russian bots supporting Roy Moore.
00:32:30.000 Her firm runs that campaign.
00:32:33.000 But afterwards, she sort of tries to distance herself from it, suggests that she wasn't involved, even though when you read The Washington Post and New York Times articles about her – about that – about the scandal, she sort of – it makes it clear that she was actually the person that brought the funding in to run the program and also kind of conceived much of the strategy.
00:32:55.000 After that, she becomes the top researcher to the Senate Intelligence Report of 2018 on Russian disinformation in the 2016 election.
00:33:04.000 So she's not – not only is she not punished for her role in it, she's rewarded by the Democrats with this incredibly powerful position.
00:33:12.000 So she becomes like the lead witness, the lead author for Senate Democrats, Adam Schiff.
00:33:17.000 In promoting the whole narrative that somehow Russians swung the election to Trump.
00:33:22.000 And there's no repercussions for promoting this false information?
00:33:26.000 No.
00:33:26.000 I mean she's rewarded for it.
00:33:28.000 And no one talks about it?
00:33:29.000 It's never— Well, I mean we're starting to, right?
00:33:33.000 But – I mean I'll point out a couple other things.
00:33:35.000 But before the Twitter files – I'm sorry to interrupt.
00:33:37.000 But you didn't even know, right?
00:33:39.000 So most people don't know.
00:33:41.000 No.
00:33:42.000 There's one guy we discovered.
00:33:44.000 Matt Taibbi discovers him.
00:33:47.000 I only – like whatever, like a week or two before my testimony in Congress, which was a couple – a few weeks ago, not the one I did yesterday.
00:33:57.000 We discovered this guy who was the head of cyber at the State Department, a senior guy named Mike Benz, and he is super deep into this stuff.
00:34:07.000 He's amazing.
00:34:08.000 I highly recommend him coming on.
00:34:10.000 But he basically leaves State Department and starts something called the Foundation for Freedom Online, and he has been documenting this more than anybody.
00:34:19.000 So he had it but he's not – he's just really in the weeds.
00:34:22.000 Like it's really detailed.
00:34:24.000 You have to really – it was hard to understand.
00:34:25.000 You have to really go through it and unpack it.
00:34:27.000 I used a bunch of it in my testimony.
00:34:28.000 I talked to him.
00:34:29.000 I interviewed him a lot.
00:34:30.000 But I mean basically a media blackout on all of this stuff.
00:34:35.000 Renee DiResta, who then moves from New Knowledge to the Stanford Internet Observatory. That organization and three others, the Atlantic Council, Graphika, and a think tank at the University of Washington, work on this.
00:34:49.000 They get government funding and they run something called the Election Integrity Project in 2020 to basically demand censorship.
00:34:57.000 By the way, if I just read the Election Integrity Committee, I get super suspicious.
00:35:02.000 Oh, yeah.
00:35:03.000 Just the name of that.
00:35:05.000 I mean, Joe, they basically would flag hundreds of millions of tweets.
00:35:13.000 I believe that their database, they had over a billion social media posts, Facebook, Twitter, that they flagged, and tens of millions of them were censored.
00:35:25.000 That's insane.
00:35:26.000 By the social media companies.
00:35:27.000 Are they running some sort of a program that allows them to find those tweets?
00:35:31.000 Yeah.
00:35:31.000 Yeah.
00:35:33.000 You see it a lot.
00:35:34.000 They do these maps.
00:35:36.000 They have these maps where they just – they locate the super spreaders.
00:35:39.000 So like you and me would be super spreaders.
00:35:43.000 Jordan – I'm in a – I mean they attack me in this disinformation – this little – these censors.
00:35:49.000 Really?
00:35:49.000 They have reports.
00:35:50.000 They put like me, Jordan Peterson, Bjorn Lomborg, you, Ty.
00:35:55.000 I mean they put us in there.
00:35:56.000 So anybody that has a social media – a big social media following, they call super spreaders.
00:36:02.000 And then they try to get us censored.
00:36:04.000 And they did for me.
00:36:05.000 They got Facebook to censor me.
00:36:08.000 How so?
00:36:09.000 Well, when my book on the environment came out, Apocalypse Never, in 2020, I wrote an article that sort of summarized the book as one does.
00:36:17.000 It went super viral.
00:36:19.000 Then one of these...
00:36:22.000 Shady organizations attacked it, not for anything being wrong with it, but for it being misleading.
00:36:33.000 It's the same way that they attacked the vaccine side effects stuff.
00:36:36.000 They go, well, you know, it's accurate, it's true, but it leads people to draw the wrong conclusions.
00:36:43.000 Right.
00:36:44.000 The wrong conclusion being that climate change is real but not the end of the world or vaccines.
00:36:50.000 The wrong conclusion would be maybe don't get the vaccine or maybe if you're whatever, under 18 or you're a young man or 18 or if you've had whatever.
00:36:58.000 I mean whatever it might be, you don't need to be triple vaxxed.
00:37:01.000 So they're basically using an opinion.
00:37:05.000 Which is you should get the vaccine or you should think of climate change as apocalyptic as a way to – and then they kind of go to the back door and say anything that's being used to propagate that narrative should be counted as misinformation.
00:37:18.000 Jesus.
00:37:19.000 So just to back up, so this little cluster, the censorship industrial complex – Does this, quote unquote, election integrity project in 2020, they censor tens of millions of social media posts.
00:37:34.000 And by censor, do you mean they remove them?
00:37:36.000 So by censor, I'm going to use the definition that everybody uses, which is you can remove, you can reduce, or you can...
00:37:46.000 They call it Inform.
00:37:47.000 You can put a flag on it.
00:37:50.000 That's what they do.
00:37:50.000 Everything I do for Facebook now, almost everything I do has a warning on it.
00:37:55.000 You know, here's how to get accurate information about climate change.
00:37:57.000 Go to the Facebook Climate Change Center.
00:37:59.000 Even my stuff on homelessness and drugs, they'll be like, here's how to get accurate information on climate change.
00:38:03.000 That's how you know that I'm on some list.
00:38:06.000 I'm on some blacklist at Facebook.
00:38:08.000 So yeah, so it's those three things.
00:38:10.000 Those are all forms of censorship.
00:38:13.000 These groups, which are U.S. government-funded organizations.
00:38:16.000 This is very important to stress.
00:38:18.000 This is not some private actors.
00:38:20.000 U.S. government-funded organizations pressuring the social media companies to censor these posts and these people.
00:38:27.000 And they do it in 2020. And then Renee, who does this little video, it's like one of the creepiest videos that we've discovered, there's little videos that they do.
00:38:36.000 She's sort of describing, you know, well, and then we realized that we needed to keep going on COVID. And so then in 2021, the Election Integrity Project turns into something called the Virality Project.
00:38:49.000 And that's where they then go and wage censorship on COVID information that they don't like.
00:38:55.000 I refuse to use their language.
00:38:57.000 And again, it's tens of millions of people.
00:38:59.000 And so you see it at all levels.
00:39:01.000 It's these guys doing it.
00:39:03.000 The censorship industrial complex is the right, I think, description of what we're talking about.
00:39:09.000 It's a phrase that, of course, came from Dwight Eisenhower's famous farewell address.
00:39:14.000 That he gives.
00:39:15.000 He goes, look, you got to worry.
00:39:16.000 The DOD is funding all these private military contractors.
00:39:19.000 These private military contractors have a financial interest in war.
00:39:23.000 This is Eisenhower, the guy that won World War II. I mean, he's like Mr. Credible on this issue.
00:39:27.000 And he was like this amazing, beautiful.
00:39:30.000 I mean it's really the best of what a president can be.
00:39:33.000 He warns against this.
00:39:35.000 It's the – so it's this complex.
00:39:39.000 It's this kind of clustering of government agencies and government-funded groups.
00:39:45.000 So I mean it's – in the case of the censorship industrial complex, it's the Department of Defense.
00:39:50.000 It's the State Department.
00:39:52.000 It's FBI. It's CIA. It's Department of Homeland Security.
00:39:57.000 Funding these so-called think tanks, and sometimes they're at universities or sometimes they're standalones.
00:40:02.000 Some of them are in Britain, by the way.
00:40:04.000 There's a very special, that special relationship with the U.S. and Britain.
00:40:07.000 Often the U.S. will, the U.K. think tanks right now are attacking me, trying to discredit me.
00:40:13.000 So sometimes they'll go that way.
00:40:15.000 They'll try to like...
00:40:16.000 Attacking you how so?
00:40:17.000 Well, they just put out a report.
00:40:20.000 These guys are the worst.
00:40:21.000 They put out this, like, long report describing climate disinformation.
00:40:26.000 And, like, I was like, as soon as I opened it up, I was like, fuck, I bet I'm in this.
00:40:30.000 And I just do, like, Command-F, and I just search Shellenberger, and sure enough, it's like, whatever, like, multiple results.
00:40:35.000 I'm like, crap.
00:40:38.000 You know, often these are reports that they don't get a lot of fanfare or whatever, but they make sure that they get emailed to a bunch of journalists, they talk to the journalists, and they basically just emphasize, never talk to this person, never quote this person, do not platform them.
00:40:54.000 We then, by the way, after our testimony, that same Stanford cluster, it's actually more than one group at Stanford even. They emailed – I'm not going to say who because I don't want to give away my sources.
00:41:06.000 But they've basically emailed many people about Matt and my testimony, trying to attack our testimony and sharing information.
00:41:13.000 So they're just the creepiest.
00:41:14.000 They creep around.
00:41:15.000 They're constantly waging disinformation campaigns against disfavored voices and demanding censorship while also spreading their own misinformation.
00:41:25.000 God, it's so creepy that the people doing this don't understand how deeply un-American this is.
00:41:32.000 And that they feel like it's okay to do because the side that they're on is the right side.
00:41:41.000 You got it.
00:41:42.000 It's so un-American.
00:41:45.000 I mean, Joe, it's funny because, like, I mean, so I graduated from high school in 1989. I remember distinctly that that was the year that the Supreme Court upheld your right to burn a flag.
00:41:55.000 And I remember just being like, God damn, that's why I'm a Democrat.
00:41:58.000 That's why I'm a liberal.
00:41:59.000 Like, I think you should be able to burn a flag because I think the First Amendment is...
00:42:02.000 Literally from that moment on, I have never worried about the First Amendment in the United States.
00:42:06.000 For me, it was, like, always kind of basic.
00:42:10.000 Like, come on, guys.
00:42:11.000 Like, it's the First Amendment.
00:42:13.000 Like, how could it possibly be under threat?
00:42:16.000 This was, like, one of the few times where...
00:42:19.000 Because I don't spook super easily, but, like, reading this stuff, you're just like, this is scary.
00:42:24.000 It's so pervasive.
00:42:25.000 These people are scary.
00:42:26.000 And you may know, by the way...
00:42:36.000 Yeah.
00:42:43.000 I was like, I'm like, look, hey, you know, maybe it's a coincidence, whatever.
00:42:48.000 I was like, just asking around people I know.
00:42:50.000 And people were like, no way.
00:42:52.000 Is that a coincidence?
00:42:53.000 So this is brazen.
00:42:54.000 These guys are trying to send a message.
00:42:56.000 They're trying to intimidate.
00:42:59.000 They want to ruin our—I mean, for me, it's been years of just trying to survive, of, you know, just trying to deplatform, discredit, keep you off of—out of newspapers, out of TV shows, whatever, podcasts.
00:43:14.000 And so, yeah, these guys, they're ruthless.
00:43:18.000 It's definitely a hall monitor mentality.
00:43:22.000 And it's elitist.
00:43:24.000 Like, Renee is a snob.
00:43:28.000 I agree with her on some things.
00:43:30.000 I'm sure she's a fine person in her personal life.
00:43:33.000 She's probably a good mother.
00:43:34.000 I'm trying to be Christian about this.
00:43:40.000 But, I mean, they're snobs.
00:43:42.000 Like, they literally—I remember at one point I briefly asked her about climate change, you know, where we talked about the climate stuff, and I could tell that she felt like she was actually probably an expert on that, too.
00:43:52.000 You know, it's like literally—my book, I spent 20 years of research going into my book.
00:43:56.000 Fine, maybe I'm wrong.
00:43:58.000 But, I mean, like, you have journalists out there, Joe.
00:44:01.000 Like, all these big publications, they're like 23 years old, and they're like, I'm a disinformation expert.
00:44:06.000 I mean, can you imagine being like, I'm a truth expert, Joe?
00:44:10.000 I'm a truth expert.
00:44:11.000 That's really what it is, a truth expert.
00:44:13.000 You're a malign actor and a vector of disinformation, whereas I'm a truth expert.
00:44:17.000 So there's definitely that whole, you know, that Jordan Peterson talks about, which is like, I'm just pure and good.
00:44:23.000 And it's reinforced within the group, right?
00:44:25.000 This is a very tribal thing.
00:44:27.000 You have these ideologies that these people subscribe to.
00:44:31.000 But it's so disturbing as a person, you know, Who grew up liberal to see this from the left, this hardcore censorship from the left and this support of government disinformation that's purely aligned with monetary reasons.
00:44:50.000 It's just about money.
00:44:52.000 I mean, that's the only reason why they would be doing this.
00:44:55.000 Money and power.
00:44:56.000 Money and power.
00:44:56.000 Money and power and ideology.
00:44:59.000 And like I said, it's like – it's not even – I mean I think mostly like I said, I think the Western Alliance and NATO have brought peace since World War II and I don't think we should be pulling out.
00:45:10.000 And honestly, the extent I've rethought my position on Ukraine is just because of these nefarious actors.
00:45:15.000 Like what are they really doing here?
00:45:17.000 Right.
00:45:18.000 So, yeah, I mean, for sure.
00:45:19.000 It's, you know, it is what kind of we all have known it is.
00:45:22.000 The U.S. is part of this empire and we're trying to make the world safe for Western capitalism and Western corporations.
00:45:30.000 And, you know, that's actually lifted a bunch of people out of poverty.
00:45:33.000 It's not totally negative, but obviously you also get the Iraq invasion, which was terrible, and the Afghanistan occupation, which resulted in horrors.
00:45:43.000 But you also get some things that aren't beneficial to anybody.
00:45:45.000 If you're censoring information about the lab leak hypothesis, that's a real problem.
00:45:50.000 Because if we are still funding gain-of-function research, or if we are funding it through a proxy, and they're denying this and lying about this and covering this up through emails, and then when you find out that certain physicians and doctors changed their testimony or changed their opinion and then received enormous grants,
00:46:11.000 this is like...
00:46:14.000 You're following a very obvious paper trail.
00:46:17.000 Let's spend a minute on this, because this is crazy.
00:46:21.000 And by the way, the New York Times finally ran a good story on this just yesterday, and particularly around Fauci.
00:46:28.000 So Fauci, of course, is famous for saying, I am science.
00:46:31.000 If you criticize Anthony Fauci, you're criticizing science.
00:46:40.000 Which is a crazy thing for a human to say.
00:46:43.000 It's a crazy thing.
00:46:44.000 So first of all, the word science, I was thinking the other day, it should just not be a noun.
00:46:49.000 Science is a process.
00:46:51.000 A better word would be investigations or investigating.
00:46:55.000 The science.
00:46:57.000 Yeah, it should be sciencing.
00:46:58.000 Yeah, when you say the science, criticizing the science.
00:47:01.000 No, you mean the data?
00:47:03.000 Right.
00:47:03.000 Like, are you talking about data?
00:47:05.000 Science is a process.
00:47:07.000 A process.
00:47:07.000 And you say the science.
00:47:09.000 The science doesn't support it.
00:47:11.000 It's a religion.
00:47:12.000 I mean, he's saying the truth, the religion, I'm the holy priest, in touch with God.
00:47:17.000 Well, it's just ego.
00:47:18.000 It's so transparent that he can't even hide it.
00:47:21.000 Yes.
00:47:21.000 Like, it's pouring out of him.
00:47:25.000 And I think this is such an interesting case because—so the U.S. government banned gain-of-function research.
00:47:31.000 Yes, in 2014. Right.
00:47:34.000 NIH kept funding it in China.
00:47:38.000 So—and Fauci knew that.
00:47:41.000 He knew that.
00:47:43.000 And then— But didn't that restart in 2016 or 17 when Trump got into office?
00:47:50.000 I'm not sure the exact timeline.
00:47:52.000 I think the Obama administration stopped the funding.
00:47:56.000 And then it kicked back in in 2016. What was explained to me was that the Trump administration was so chaotic – Yeah.
00:48:16.000 I think the punchline, though, is that Fauci knew very well that gain-of-function research was not only occurring at the Wuhan lab, but that it was being funded by the U.S. government.
00:48:26.000 And then they get on these conference calls, and two of the main researchers, I believe they're both from Scripps, they both go, yeah, I don't know.
00:48:34.000 It looks like it could have been manufactured from a lab and not from zoonotic spillover.
00:48:40.000 So it's even more sinister than just being arrogant.
00:48:44.000 It actually looks like a cover-up.
00:48:46.000 It looks like a cover-up and it looks like a cover-up where the people who covered it up were compensated.
00:48:51.000 Oh, and not only that, but did you see – I don't know if you saw this recent report where there's – it looks like they were double-dipping.
00:48:57.000 They were double-charging.
00:48:59.000 They were overcharging.
00:49:01.000 So they were basically getting paid twice by U.S. taxpayers.
00:49:06.000 CBS News, which is like only one of the few mainstream media outlets that's actually done a good job covering this.
00:49:11.000 They also covered the Hunter Biden laptop accurately – belatedly, but they did – Yeah, they wrote about how these contractors were getting paid twice for the same work.
00:49:22.000 So that's a way now to kind of get in there and try to figure out what's going on.
00:49:27.000 The crazy thing is, back to the Twitter files, Because, you know, Elon is obsessed with Fauci and wants to have the Fauci files, but none of us have looked for this in the Twitter file.
00:49:39.000 Like, literally nobody has yet even looked to see whether or not this COVID origin stuff was being censored from within Twitter.
00:49:46.000 So we don't know yet.
00:49:47.000 I mean, we've just been backed up in a lot of other stuff.
00:49:49.000 Wow.
00:49:49.000 So this other stuff is so preoccupied all of your time.
00:49:54.000 So is that next on the agenda?
00:49:56.000 I hope so.
00:49:57.000 I mean, I... Should you be proclaiming that publicly?
00:50:00.000 No, it's...
00:50:01.000 I mean, you mean that we want to look for it?
00:50:03.000 No, I mean, Elon proclaimed it.
00:50:05.000 Elon promised the Fauci files.
00:50:07.000 Well, he literally said his pronouns are prosecute Fauci, which is wild.
00:50:13.000 Yeah, that wouldn't have been...
00:50:14.000 I would have liked to do the Fauci files first and then make a judgment.
00:50:19.000 Well, the thing...
00:50:20.000 He's smart.
00:50:21.000 And what he's doing is he's, like, firing a shot across the bow.
00:50:26.000 And then causing people to scramble and reveal their intentions and reveal, like, what they're trying to accomplish.
00:50:35.000 Yeah.
00:50:35.000 Right?
00:50:36.000 Like, it's a chess move.
00:50:37.000 It's a good chess move.
00:50:39.000 Because it gets people talking, and then it gets people talking about Fauci.
00:50:43.000 I have no idea what he's talking about!
00:50:47.000 It's just craziness!
00:50:49.000 Oh, it's craziness.
00:50:50.000 Is it really?
00:50:51.000 Well, maybe someone's gonna go look at AZT. Maybe someone's gonna go back and look at the way you guys handled the AIDS crisis.
00:50:58.000 Because if you look at Robert Kennedy Jr.'s book, The Real Anthony Fauci, if that book is accurate, I don't know if it's accurate.
00:51:04.000 I'm assuming he hasn't been sued yet.
00:51:07.000 It's a terrifying book.
00:51:08.000 When they talk about the AIDS crisis, it's essentially a version of what you're seeing now, but with no internet.
00:51:16.000 Where they were allowed to do things with no investigative journalist, no social media outrage, no people posting different studies that contradict what they're saying.
00:51:29.000 It's a wild book, man.
00:51:32.000 It's a wild book of unchecked power and influence.
00:51:36.000 And also, like, an absolute disdain for what is beneficial to human life and the American people.
00:51:42.000 And instead, what is great for profit.
00:51:46.000 Yeah.
00:51:46.000 I mean, it's an abuse of power.
00:51:48.000 You know, we had this crazy abuses of power, you know, under Nixon during the Vietnam War, late 60s, early 70s.
00:51:55.000 We had a church.
00:51:56.000 We had this thing called the Church Committee hearings.
00:51:58.000 It was bipartisan.
00:51:59.000 It did result in a bunch of reforms that basically prevented the federal government from spying on the American people and— That's out the window.
00:52:25.000 I think, you know, just talking about this and testifying about it, I think actually because sunlight is the best disinfectant.
00:52:31.000 But no, you're right.
00:52:32.000 We've got to defund and dismantle the censorship industrial complex.
00:52:37.000 But we also need to hold people accountable who were doing this I think that's the only way.
00:52:43.000 If people aren't held accountable, then it seems like you can just do it again and get away with it.
00:52:48.000 And then everybody just sort of just gets – they upwardly move and get rehired at new organizations.
00:52:55.000 That's right.
00:52:55.000 They kind of hide.
00:52:56.000 They kind of get quiet for a little while and they kind of just – they'll come back.
00:53:01.000 So absolutely.
00:53:02.000 You know, it's funny because as you get older, I was like, as you get older, you're like, wow, those cliches are true.
00:53:10.000 You know, like the one that's like, you know, the famous Jefferson one of the price of freedom is eternal vigilance.
00:53:16.000 I remember being like, that's so cringe, you know, like a few years ago.
00:53:20.000 And now I'm like, wow, that is profound.
00:53:22.000 It's crazy.
00:53:23.000 It's so true.
00:53:23.000 We were talking about this last night that, you know, when I was texting Elon about all this stuff, he was like, he's hilarious.
00:53:31.000 He's like, turns out all the conspiracy theories were true, LOL. I mean, he thinks it's funny.
00:53:38.000 He's so casual about it.
00:53:39.000 I'm like terrified.
00:53:40.000 I'm like white knuckling the whole thing, being like, this is scary.
00:53:42.000 I guess having $200 billion really puts a nice cushion on the repercussions for whatever the fuck you do, other than him getting assassinated.
00:53:51.000 And he has publicly stated that I'm not suicidal.
00:53:54.000 And I think he's legitimately concerned.
00:53:57.000 Like, that could be something that happens to him.
00:53:59.000 His security detail's amazing.
00:54:01.000 It should be.
00:54:02.000 Yeah, yeah.
00:54:02.000 It should be beyond amazing.
00:54:04.000 You should have fucking Iron Man guarding him.
00:54:05.000 Yeah, even better than your security detail, man.
00:54:08.000 I'll have to step it up after this interview.
00:54:10.000 No, for sure.
00:54:10.000 Hey, I have to pee.
00:54:11.000 I'm so sorry that I've been drinking a ton of water.
00:54:14.000 This is so embarrassing.
00:54:15.000 I used to be able to go for three hours.
00:54:17.000 No, man.
00:54:17.000 We'll be right back.
00:54:18.000 No, it's fine.
00:54:19.000 And we're back.
00:54:20.000 Okay, where were we?
00:54:22.000 The government's bad.
00:54:23.000 Something's bad.
00:54:26.000 The long march to totalitarianism.
00:54:29.000 Yeah.
00:54:29.000 It's disturbing because it seems like that's just how it goes.
00:54:33.000 Like, they just keep acquiring more power and no one notices and no one says anything and then it just moves very slowly.
00:54:42.000 Jordan Peterson outlined this.
00:54:44.000 He outlined this.
00:54:45.000 He was talking about how change doesn't happen in these big jumps.
00:54:50.000 What they do is they move you and push you just incrementally.
00:54:55.000 And you don't say anything, and they push you a little forward.
00:54:57.000 And before you know it, you're so far removed from where you started, and you didn't even notice it.
00:55:03.000 Right.
00:55:04.000 It's changing the norms.
00:55:06.000 That's why I think, you know, we were talking about this person, Renee DiResta, but these other groups in the censorship industrial complex, they're constantly promoting the idea that it's okay and necessary to have more censorship.
00:55:20.000 So both times, I've testified now, you know, twice in the last three weeks, both times the Democrats were like, I mean, the Republicans were like, why are we taking stuff down?
00:55:29.000 And the Democrats were like, we're not taking enough stuff down.
00:55:32.000 I mean, there's this sense in which more stuff needs to be censored.
00:55:35.000 That's the idea they're trying to promote.
00:55:37.000 It's bizarre.
00:55:42.000 It's so spooky.
00:55:51.000 It's not like there's no real protection, especially when you look at what happened during the COVID crisis.
00:55:57.000 If you could just like look at it now and go over it and say, what were you trying to do really?
00:56:03.000 It seems like what you're trying to do is make as much money as possible for the pharmaceutical companies.
00:56:07.000 That seems like what you're doing.
00:56:09.000 This whole idea of vaccine hesitancy, once enough data was out there, particularly when you're talking about vaccinating people that had already had COVID, that's preposterous.
00:56:18.000 It doesn't even make sense.
00:56:19.000 It doesn't make sense medically.
00:56:21.000 It doesn't jibe with the studies.
00:56:23.000 All this is very strange.
00:56:25.000 And this idea that you're stopping vaccine hesitancy because of real data?
00:56:35.000 That term is so creepy.
00:56:38.000 Because what you're saying is side effects.
00:56:41.000 You're talking about not telling people about the dangers of something, which has always been something that we considered with every drug.
00:56:48.000 And you're hiding it.
00:56:50.000 Absolutely.
00:56:51.000 And also not only that, but this is not the same as measles or mumps.
00:56:55.000 This is very different than that.
00:56:56.000 And you don't get herd immunity with the COVID vaccine.
00:57:00.000 And so you have to remember that what's crazy about it, too, is you go from this, well, we're going to have a vaccine and then we're not going to get it.
00:57:08.000 Right.
00:57:09.000 And then we're not going to spread it to, okay, well, you might still get it, but it won't be as bad, but you won't spread it.
00:57:14.000 And then, well, you might get it, but it won't be as bad, but you might still spread it.
00:57:20.000 So then it's kind of like, well, then why mandate this?
00:57:24.000 Why not just let it be personal choice?
00:57:26.000 Did you see the video that was released recently of Fauci in the hood?
00:57:30.000 Yes.
00:57:31.000 It's amazing.
00:57:31.000 With those black residents?
00:57:32.000 It's amazing.
00:57:33.000 The one guy's like, something else is going on.
00:57:35.000 Yeah.
00:57:36.000 And Fauci's explaining, if you get it, you barely notice it, which is just a fucking lie.
00:57:44.000 People have died that have been vaccinated.
00:57:46.000 They've died from COVID. You're a fucking liar.
00:57:50.000 And also, they never tested it to stop infection or to stop transmission.
00:57:55.000 They just knew that it was giving some sort of antibody protection.
00:58:00.000 I think some of it is – I think definitely money, of course, it has to – it's playing a role and it's foundational.
00:58:05.000 But it's also just this moralizing.
00:58:08.000 It's the sense of wanting to take care of people.
00:58:10.000 It's a lot of stuff that we talked about on homelessness.
00:58:12.000 It's this – to people, to victims, everything should be given and nothing required.
00:58:18.000 Although in this case, of course, the requirement is that they take the vaccine.
00:58:21.000 But it's a sense – it's paternalism – It's also attached to the ideology.
00:58:25.000 It's attached to this left-wing ideology.
00:58:27.000 And the right-wing people are like, you're not going to get me with that jab.
00:58:32.000 And the left-wing people are like, I'm not boosted enough.
00:58:35.000 Let's keep going.
00:58:36.000 It's very strange to watch people put this blind trust in pharmaceutical companies and demonize people who don't fall in line with it.
00:58:46.000 But it's a bit of also like the Kathy Bates character in Misery, which is like, I'm going to take care of you.
00:58:52.000 It's like, I really want to take care of you.
00:58:54.000 It's like, I don't think so.
00:58:56.000 You want to take care of me too much.
00:58:57.000 So it's like when care becomes creepy.
00:59:00.000 Well, it's also enforcing groupthink.
00:59:04.000 That's a big part of it.
00:59:06.000 Groupthink is a natural inclination that people have.
00:59:08.000 But it's accelerated by the rise of the internet and the rise of these voices.
00:59:14.000 So people like you, you trigger people because it's like, oh my, there's people out there that are influential that are saying things different than what the mainstream are saying.
00:59:24.000 It freaks them out.
00:59:25.000 What should freak them out is that CNN said I was taking veterinary medicine.
00:59:29.000 That should freak them out.
00:59:30.000 And I think it did freak a lot of people out.
00:59:32.000 Instead of saying, hey, how'd that guy get better so quick from some horrible, deadly disease and three days later?
00:59:38.000 I mean, when they used my face and put it through a filter to turn me yellow, all of it was wild.
00:59:45.000 For a person to watch it, for a person to be in my position and watch it, it was really interesting because, first of all, it's like I'm not on a network.
00:59:59.000 You really can't get rid of me.
01:00:01.000 Right.
01:00:01.000 And second of all, I have a lot of money, so I can just, like, even if I stop working, you're not gonna hurt me.
01:00:07.000 I'll just, I'll find something.
01:00:09.000 I'll figure something out.
01:00:10.000 Like, this is not a thing like the 1970s when you could just get someone removed from a television show, like when they attacked the Smothers Brothers for the criticism of the Vietnam War.
01:00:20.000 This is a different thing.
01:00:21.000 Like, you're in a different landscape, and I don't think you understand where you're at.
01:00:27.000 Like, you're playing this game where you don't even understand the numbers.
01:00:31.000 Well, I think you said, too, you benefited, right?
01:00:34.000 They came after you and you had a big boost.
01:00:36.000 Two million subscribers in a month I gained.
01:00:39.000 Thank God for the Streisand effect.
01:00:40.000 Yes.
01:00:41.000 It also sold my book.
01:00:42.000 Yeah.
01:00:43.000 Like, I mean, it was like, I mean, on the one hand, that's really, being censored is such a horrible experience.
01:00:48.000 It really feels dehumanizing to be deprived of your voice or to have this super powerful media company being like, Shellenberger is spreading disinformation.
01:00:56.000 It's just like, oh my God.
01:00:57.000 Was this the San Fransicko?
01:00:59.000 No, that was the Apocalypse Never.
01:01:01.000 Apocalypse Never.
01:01:02.000 But on the other hand, I think the response from people was, well, I want to go read that book.
01:01:07.000 And so there is a way in which – it's an interesting thing where the regime goes too far and people don't like that.
01:01:16.000 It also made me question scientific papers for the first time.
01:01:20.000 When I was informed, by people who don't want to talk about it publicly, how these things work.
01:01:28.000 When I talk to people who are physicians who said, listen, this is why I can't talk about this publicly.
01:01:34.000 This is why I can't discuss this.
01:01:36.000 And this is why, when you read a scientific paper and you read the conclusion, what you don't understand is that this study was designed to show one very specific outcome.
01:01:48.000 And if it didn't, you would never see it.
01:01:50.000 That happens all the time.
01:01:52.000 I would have never imagined that before COVID. I thought that when there's any sort of scientific study or a medical study or anything about something, what they're trying to do is find out what's true.
01:02:05.000 I did not know that they can do ten studies and if eight of them show negative side effects, they could remove those and just find some carefully constructed, very biased study that points to a very specific outcome that's desired.
01:02:22.000 Oh, absolutely.
01:02:23.000 I didn't know that.
01:02:24.000 Oh, yeah.
01:02:24.000 That scares the shit out of me.
01:02:25.000 Well, because they don't publish null findings.
01:02:28.000 They only publish if they get a finding.
01:02:30.000 Exactly.
01:02:30.000 So then you don't know all the cases where it's like they didn't find anything or they found the opposite results.
01:02:45.000 On say the pharmaceutical drug, you're not really doing a peer reviewed study on the data.
01:02:51.000 You're doing a peer reviewed study on the interpretation of the data by the pharmaceutical company.
01:02:56.000 So they don't have access to the actual study.
01:02:58.000 They don't have access to the data.
01:03:00.000 They have access to the conclusions that are given to them.
01:03:02.000 By the pharmaceutical companies and then they review that.
01:03:07.000 Which is fucking insanity.
01:03:08.000 That's like the wolf telling you what he did to the hen house.
01:03:13.000 It's like basically they were all dead when I got there.
01:03:16.000 Right.
01:03:17.000 I mean this should be a moment of great humility.
01:03:21.000 I mean my parents, they had a high carb diet.
01:03:26.000 They thought that proteins and fats were bad.
01:03:29.000 This was just based on the worst science.
01:03:31.000 The food pyramid.
01:03:32.000 You know, the food pyramid.
01:03:33.000 And I think some of their, and they, at least my father, both my parents have Parkinson's.
01:03:38.000 I think that all of that sugar-insulin cycling had some role in that.
01:03:42.000 And there should be a moment of humility to be like, science really misled us.
01:03:46.000 We, you know, various authors have done good debunkings of how we got there.
01:03:50.000 But this would be a moment of great humility.
01:03:52.000 But instead we're seeing the elites in particular responding with more dogmatism, more certainty.
01:03:58.000 More arrogance.
01:03:59.000 They're trying to cover their tracks.
01:04:01.000 Yeah.
01:04:02.000 And cover their ass.
01:04:03.000 They're in the grip of an ideology.
01:04:05.000 And I think there is a panic.
01:04:07.000 They see you succeed.
01:04:10.000 They see people like me or Bjorn or others.
01:04:13.000 Substack, the rise of substack.
01:04:15.000 So this is The Revolt of the Public by Martin Gurri.
01:04:18.000 He argues that really all of this is just the elites freaking out about the rise of the internet.
01:04:23.000 And that the response is very similar to the response to the printing press.
01:04:27.000 The printing press suddenly makes books available and the elites in Europe freak out.
01:04:31.000 Yeah, I just found out recently, like fairly recently, that some of the earliest books, the really popular ones, are about witches, finding witches.
01:04:38.000 I always assumed that books in the early days, like, oh, what a great thing the printing press was.
01:04:43.000 When the printing press came about, people got access to all this knowledge and information.
01:04:48.000 No, no, a lot of the early books were about how to spot a witch.
01:04:51.000 Oof, that's scary.
01:04:53.000 Which kind of makes sense because that's what a lot of the internet is.
01:04:56.000 I mean, you get on, like, Reddit conspiracy.
01:04:59.000 Like, I go to the Reddit conspiracy page every now and then and be like, what's the looniest shit that they have?
01:05:05.000 And, you know, you'll find some.
01:05:06.000 They're like, whoa.
01:05:08.000 Yeah.
01:05:09.000 Well, now we see social contagion.
01:05:11.000 So the big one, of course, that we're all talking about is the trans issue.
01:05:18.000 And that issue, by the way, has completely changed in Europe, particularly in Britain, where there's a big new book out, A Time to Think, about the Tavistock Gender Clinic.
01:05:26.000 But basically, it looks as though a lot of autistic kids, or kids on the autism spectrum, who are just uncomfortable in their bodies and are more prone to thinking in black and white, are basically being misdiagnosed with gender dysphoria.
01:05:43.000 And then you also have a different group of folks, maybe kids that would end up being gay or lesbian if they didn't transition who become convinced that they are the opposite sex.
01:05:54.000 This is one of the ideas is some of it's a social contagion.
01:05:59.000 In other ways, it's iatrogenic, which means that it's actually caused by the medical profession.
01:06:03.000 So you start to get doctors and others misdiagnosing people.
01:06:08.000 I mean, this is something that we just published a piece on this where this was what happens with anorexia and bulimia.
01:06:13.000 You know, these doctors identify eating disorders and then they publicize them and it gets all this publicity about it.
01:06:19.000 And then the disorder spreads.
01:06:21.000 Yes.
01:06:22.000 So it's really tricky.
01:06:24.000 I mean, it's not— Well, then there's all these gender-affirming care clinics that pop up, and they're enormously profitable, which is terrifying.
01:06:32.000 Right.
01:06:33.000 Same as Eisenhower's speech about the military-industrial complex, they have a vested interest in going into war.
01:06:40.000 These people have an interest in diagnosing people with gender dysphoria, which is— It's terrifying to think that their opinions and their diagnosis would be based on something other than, what's going on with you?
01:06:55.000 It was like, they have an incentive.
01:06:58.000 And that was also during COVID. They were incentivized to give people certain medications.
01:07:04.000 They were financially incentivized to put people on ventilators, financially incentivized to mark deaths as COVID deaths.
01:07:12.000 All of this is so enlightening because I never would have expected that.
01:07:17.000 I never would have suspected that at all before COVID, before the pandemic and all this chaos and all the things that I've seen.
01:07:27.000 My whole view of how the world runs is completely different.
01:07:33.000 Oh, absolutely.
01:07:34.000 I mean, it's funny because you had Abigail Shrier on, who wrote this big book on transgenderism as a social contagion, I think it was in 2020. I remember at the time being like, I think she's, I mean, what she's saying makes sense, but it's so horrible to consider.
01:07:48.000 I just was kind of, it took me like three years to finally work on it or write on it.
01:07:52.000 But I thought, you know, part of what's—I mean, the people—like, first of all, people with autism spectrum should be up in arms and outraged about the mistreatment of people with autism by these gender clinics.
01:08:03.000 The other group that would be completely up in arms are gay and lesbians.
01:08:06.000 I mean, Andrew Sullivan, to his credit, is speaking out on it.
01:08:10.000 I mean, I didn't quite understand.
01:08:11.000 Abigail had to explain it to me because I would read all of her stuff, but sometimes you just like miss some of it.
01:08:16.000 These kids who go through this gender transition, they not only are infertile afterwards, but then they don't have sexual pleasure.
01:08:24.000 I mean, think about the gay and lesbian and the bisexual movement spent decades basically making everybody comfortable with the fact that gay people should be able to get sexual pleasure from their sex.
01:08:36.000 And everybody's kind of like, you know, most people are heterosexuals and most people are like, that's strange.
01:08:40.000 It took a long time to be like, no, we celebrate that.
01:08:43.000 That's great that you can.
01:08:44.000 And that we know sex is an important part of long-lasting relationships.
01:08:48.000 So to actively deprive children of that, you're not just sterilizing the kids, you're depriving them of sexual function and then being able to bond with somebody.
01:08:58.000 I mean, how do you look at that and not go, this is really disturbing?
01:09:03.000 It's disturbing and it's thousands of people.
01:09:05.000 Yeah.
01:09:05.000 Yeah.
01:09:05.000 I mean, just the idea of doing that operation to someone and removing their ability to have an orgasm.
01:09:12.000 You know, there's people that have talked about these detransitioners, and if you've ever watched any of those videos, those videos are horrific.
01:09:20.000 And those were censored.
01:09:22.000 Those were censored from social media and stopped from being able to be spread, which is crazy.
01:09:30.000 You're talking about someone's actual lived experience with essentially genital mutilation that's state-sanctioned.
01:09:40.000 Absolutely.
01:09:41.000 We just did the interview with the first Canadian detransitioner to sue her medical providers.
01:09:47.000 And I said to her, she's very smart, she's very thoughtful, such a good person, Michelle.
01:09:52.000 And I was like, have you ever thought about how your medical mistreatment compares to other forms of medical mistreatment in history?
01:10:01.000 And she said without hesitation, she goes, yeah, lobotomies.
01:10:05.000 Ooh.
01:10:07.000 That was a big one.
01:10:08.000 I know.
01:10:09.000 I was like, wow.
01:10:11.000 She was like, well, what's amazing is how long they went on, how long we – I mean, with no – I mean, no benefits.
01:10:20.000 Right.
01:10:20.000 And mostly – I mean, John F. Kennedy's sister was lobotomized and just – she was – probably had schizophrenia.
01:10:27.000 She was disabled, I mean, by the lobotomy.
01:10:31.000 Right.
01:10:32.000 It's a scrambling of the brain.
01:10:33.000 It went on for decades.
01:10:35.000 It's surgery to solve a psychiatric disorder or mental illness.
01:10:40.000 And I was then also like, do you ever think that maybe transgenderism is a cult?
01:10:48.000 Just without hesitation, yes.
01:10:51.000 It's a cult.
01:10:52.000 Well, they certainly behave like one.
01:10:54.000 There's all these articles that came out about the misgendering of the school shooter.
01:11:00.000 Which is so fucking wild.
01:11:04.000 This is insane.
01:11:04.000 First of all, that person's dead.
01:11:06.000 Okay?
01:11:07.000 It doesn't matter if you call it a boy or a girl.
01:11:10.000 That's a dead person who killed three children and three adults in a horrific way, went into a school, and shot a bunch of people up.
01:11:19.000 And it's a biological male.
01:11:21.000 It's a biological male.
01:11:22.000 Which, by the way, is all shooters.
01:11:25.000 All school shooters.
01:11:26.000 Almost all shooters in general are biological males.
01:11:30.000 I thought that it was...
01:11:31.000 Oh, okay.
01:11:31.000 I thought he was a trans male.
01:11:37.000 No?
01:11:38.000 I do not believe so.
01:11:40.000 See, that's how confusing it is.
01:11:42.000 It is confusing.
01:11:43.000 It's so confusing.
01:11:43.000 They're calling it a woman in all the mainstream media now.
01:11:48.000 And they have apologized for misgendering.
01:11:52.000 I see.
01:11:52.000 Some people have.
01:11:53.000 Okay.
01:11:54.000 Which must mean you're talking about a biological male.
01:11:57.000 Right.
01:11:58.000 Let's find that out.
01:11:59.000 Let's be real clear.
01:12:01.000 Because I'm 99% sure, but I just want to be 100% sure.
01:12:04.000 But I think it's interesting.
01:12:05.000 I mean, what's clear is that there was misgendering going on.
01:12:07.000 What does that mean?
01:12:08.000 What does that mean?
01:12:10.000 Look, I think this whole thing is nonsense.
01:12:13.000 I really do.
01:12:14.000 I think it's fucking nonsense.
01:12:15.000 Do you have a biological male with a penis who shot up a bunch of people?
01:12:19.000 Then that's a man.
01:12:20.000 I don't give a fuck what their feeling is.
01:12:22.000 If an archaeologist found their body 5,000 years from now, they would say that's a skeleton of a male.
01:12:27.000 I have to say I think I'm coming to the place where I think that gender itself is just not a thing and that it's really – there's just – okay, so please say anything to Audrey Hale.
01:12:39.000 Oh, it's a trans male.
01:12:40.000 Yeah.
01:12:41.000 Okay.
01:12:41.000 So why are they saying a woman?
01:12:43.000 Why are they giving a woman's name?
01:12:45.000 Well, so that was – so if you – It's a female that took hormones?
01:12:50.000 So is this the first ever biological female mass shooter?
01:12:56.000 First of all, yeah.
01:12:58.000 Biological women don't commit...
01:13:01.000 This is crazy.
01:13:03.000 I think it's like a tiny percentage of homicides.
01:13:05.000 I am so confused because I swore...
01:13:09.000 I think everybody's confused on this.
01:13:11.000 Yeah.
01:13:12.000 This is a biological female.
01:13:16.000 Are you confused?
01:13:17.000 I found an article that says he was born Aiden Hale.
01:13:20.000 Right.
01:13:21.000 But I don't...
01:13:22.000 That's why I'm confused.
01:13:23.000 No, my understanding is that she was a natal female that transitioned to become a trans male, he, and that he was then misgendered by the mainstream woke media.
01:13:40.000 As a woman.
01:13:43.000 Oh my god.
01:13:44.000 I mean, right.
01:13:45.000 This is a perfect case study.
01:13:47.000 Aiden is the new name.
01:13:48.000 Audrey was the original name.
01:13:49.000 Oh, so when they called her Audrey, they were dead-naming her.
01:13:53.000 Oh my god.
01:13:55.000 Oh my god.
01:13:56.000 So, meanwhile, I thought I was right, and I was dead wrong.
01:14:01.000 So this is the first ever school shooter that's a biological female.
01:14:04.000 I don't know.
01:14:05.000 Is that true?
01:14:06.000 I believe so.
01:14:07.000 Yeah.
01:14:09.000 I believe so, which is crazy.
01:14:10.000 Which also speaks to the effect of testosterone.
01:14:13.000 Well, that was...
01:14:14.000 Yeah, I mean, we're speculating.
01:14:16.000 I don't know.
01:14:16.000 If this person was on testosterone.
01:14:18.000 Assuming he...
01:14:20.000 Because I don't want a deadname.
01:14:22.000 Well, you're not deadnaming.
01:14:24.000 By saying he, you're misgendering.
01:14:27.000 Oh, right.
01:14:27.000 No, I'm saying...
01:14:28.000 You don't even know what you're doing.
01:14:31.000 This is all nonsense.
01:14:32.000 I know, it is.
01:14:33.000 By saying he, you're not deadnaming.
01:14:36.000 Right, by saying he, I'm giving him the name that he wanted.
01:14:42.000 Aiden.
01:14:43.000 Which was, he wanted to be a he, even though he was a biological female.
01:14:48.000 What a mess.
01:14:49.000 What mental gymnastics we have to do for this craziness.
01:14:51.000 You made this, I think you were the first one that really said, that drew attention to like...
01:14:56.000 That all this – all the confusion around sex and gender was a symptom of civilizations in decline?
01:15:04.000 Yeah.
01:15:04.000 Well, it was – I got it from Douglas Murray.
01:15:06.000 Oh, Douglas Murray.
01:15:07.000 Yeah.
01:15:07.000 Douglas Murray talked about this, that it seems like every civilization when they're at the brink of collapse becomes obsessed with gender.
01:15:16.000 He talked about ancient Greece and ancient Rome and it just seems like a thing that people do when there's no real like physical conflict.
01:15:28.000 So people look for conflict that doesn't exist and they find conflict in standard norms.
01:15:35.000 They find conflict in societal norms.
01:15:38.000 I was – we did a thing – I did a thing with Peter Boghossian on wokeism as a religion because we had read – I had read John McWhorter's book, Woke Racism, which came out right around the time that San Fransicko came out.
01:15:53.000 And I just was like – and he argues that wokeism is a religion.
01:15:57.000 He argues that like the obsession with race is a religion.
01:15:59.000 So I just – we just created this taxonomy.
01:16:01.000 We just listed climate change, race – Trans, drugs, whatever, all these things.
01:16:06.000 And then we create all these religious categories and it was really easy to fill it out.
01:16:10.000 They all look like a religion.
01:16:11.000 I called Abigail and I was like, what is trans as a religion?
01:16:17.000 Is trans a kind of religion?
01:16:19.000 She was like, let me get back to you.
01:16:21.000 A year later, she calls me and she goes, hey, I think I figured it out.
01:16:25.000 And I was like, all right, what is it?
01:16:26.000 She goes, the new gender is a soul for secular people.
01:16:32.000 It's something that you can't see it.
01:16:35.000 There's no physical basis to it.
01:16:37.000 You have a sex.
01:16:39.000 You can take off all your clothes and you don't even need to do that, actually.
01:16:43.000 We know that we can recognize someone's sex very quickly and easily, actually.
01:16:49.000 So it's a new soul.
01:16:51.000 So for me, I'm a huge...
01:16:53.000 I think the secularization explains a lot because we know that people get a lot of psychological comfort out of believing that they have an afterlife, that they have a soul, that they go to heaven or they get reincarnated, that their lives have purpose and meaning and that they don't really die and that we live on.
01:17:10.000 We just know that that provides a huge amount of psychological comfort.
01:17:13.000 So there's always been this thinking that when you don't have that anymore, if you are taught to believe that at the end of your life you just become worm food and that's it and you're dead.
01:17:23.000 There's some people, my friend Steven Pinker is an atheist and that's what he thinks and he still believes, but he also has a kind of spirituality around reason and the enlightenment.
01:17:33.000 But I think all this stuff – it's sort of end of civilization but it's also the end of this – end of belief in religion.
01:17:41.000 I don't know, Jamie, if you could – if you can pull it up.
01:17:43.000 But I thought the Wall Street Journal has published this amazing article about declining patriotism, declining belief in the country.
01:17:51.000 It's shocking.
01:17:53.000 Patrick Bet-David sent me that.
01:17:54.000 Yeah.
01:17:54.000 I mean the numbers are – it's like – I think it's from like the late 90s until today over the last 20 years.
01:17:59.000 Over the last 25 years, it was – I mean, first of all, it's terrifying.
01:18:04.000 You just kind of go, I hope these trends are nonlinear and they're going to – something's going to turn around because otherwise – It doesn't seem like it though.
01:18:10.000 It doesn't look good.
01:18:11.000 Yeah.
01:18:12.000 No.
01:18:12.000 So you get the elites trying to gain control of the society, and the society not having any foundational myths.
01:18:20.000 Yeah, these numbers here.
01:18:22.000 Yeah, patriotism, decline.
01:18:24.000 Religion.
01:18:25.000 And look at it, having children.
01:18:27.000 The having children one, Jordan Peterson sent me this thing that...
01:18:31.000 50% of women, when they reach the age of 30, are not having kids.
01:18:35.000 They don't have kids.
01:18:36.000 And of those women, 50% will never have kids.
01:18:40.000 And 90% will regret it.
01:18:44.000 We're in this very strange sort of existential crisis.
01:18:51.000 As a civilization that's not being recognized, and in the meantime we're distracting ourselves with things like Greta Thunberg's take on climate, or whether or not gender is a social construct, or whether or not the United States should be doing X,
01:19:10.000 Y, or Z. It's like, no, the fucking whole thing is falling apart.
01:19:13.000 The foundation of our civilization is falling apart.
01:19:17.000 Right.
01:19:17.000 Where the elites are waging war on the First Amendment.
01:19:20.000 Yes.
01:19:20.000 In the name of protecting democracy, they're undermining democratic institutions.
01:19:24.000 In the name of maintaining legitimacy of these institutions— In the name of reinforcing ideologies, people are allowing them to do it because they're doing it on the right side.
01:19:34.000 Yeah.
01:19:34.000 So it's climatism.
01:19:36.000 It's COVIDism.
01:19:37.000 It's wokeism.
01:19:38.000 And you know what's scary?
01:19:39.000 It's all happening with the rise of artificial intelligence at the same time.
01:19:42.000 Oh, God.
01:19:43.000 That's what's really scary.
01:19:45.000 I mean, you want to talk about the true end of civilization, the coinciding of artificial intelligence, at least seemingly becoming fairly sentient.
01:19:55.000 Like, I don't know what the fuck is going on, but I know that one Google engineer who said that AI had become sentient quite a while ago and everyone's dismissing him like, oh, no, no, no.
01:20:07.000 My friend Duncan Trussell interviewed him, and it's a goddamn terrifying interview.
01:20:13.000 This guy's not a nutter.
01:20:15.000 He's a little nuts.
01:20:17.000 All engineers are a little nuts.
01:20:18.000 But he's essentially saying, like, hey, I'm pretty sure this thing's alive.
01:20:22.000 And when do you get to the side that it is alive?
01:20:26.000 If it can answer every fucking question you have about anything, and it's far more intelligent than any human being that's ever existed ever, Like, what are we doing?
01:20:34.000 Well, did you read the New York Times Kevin Roose interview with the – it was like a different – it wasn't ChatGPT.
01:20:41.000 It was a different AI platform.
01:20:42.000 Was it a Microsoft?
01:20:42.000 Oh, there's multiple ones simultaneously.
01:20:44.000 But it was like this – like the AI was trying to get him to like – the AI said it had fallen in love with him.
01:20:49.000 Yes.
01:20:49.000 And was trying to get him – trying to break up his marriage.
01:20:52.000 Jesus.
01:20:53.000 You were like, it was the craziest.
01:20:54.000 I was never worried about AI until I read that interview.
01:20:57.000 And I was like, this is insane.
01:20:58.000 And we are only at the door.
01:21:01.000 We haven't even entered into the building.
01:21:03.000 Well, it's funny because – so I know a lot about nuclear.
01:21:06.000 So when we get the power of nuclear during World War II, ends the war, there's just – I mean there is a huge response to figure out how to manage this thing, how to regulate this technology, how to control it, how to prevent it from spreading, how to prevent bombs from going everywhere.
01:21:22.000 And there was a bunch of problems with it.
01:21:24.000 But the society responded by saying we need to get control of it.
01:21:27.000 Are we doing that with AI? No.
01:21:30.000 There's a thing about Elon actually just called for some sort of a six-month ban on the propagation of this stuff and have a conversation about it, which is fairly reasonable.
01:21:42.000 Six months.
01:21:43.000 And he's got a lot of credibility on it because he helped to fund the nonprofit that gave rise to ChatGPT, right?
01:21:50.000 Yeah.
01:21:50.000 I don't think they're going to listen to him.
01:21:51.000 I think there's...
01:21:52.000 Well, also, we're back to this whole profit thing.
01:21:56.000 You know, there's enormous profits involved in this stuff and the race to figure this out first and really develop, like, a god, which is what it's going to be.
01:22:05.000 What it's going to be is it's going to be something that can make a better version of itself.
01:22:09.000 As soon as ChatGPT or whatever this sentient artificial intelligence gains autonomous control and has the ability to create its own self better, then we're really fucked.
01:22:21.000 Because it's going to make much better versions of itself like that.
01:22:25.000 And it's going to make a version of itself that literally is going to be a god.
01:22:29.000 If you just scale it exponentially, you know, like we do with...
01:22:34.000 Like computer technology, like anything else.
01:22:37.000 But do it in like a quantum leap, in some spectacular, massive improvement almost instantaneously over and over and over again.
01:22:48.000 Over the course of a couple of weeks, you're looking at a god.
01:22:51.000 Yeah.
01:22:54.000 Did you ever read Dune?
01:22:56.000 Do you remember the solution from Dune?
01:22:57.000 What was the solution?
01:22:58.000 They banned it.
01:22:59.000 Remember they had the Mentats, the guys that would do all the calculations in their heads because they didn't want to use AI? Right.
01:23:05.000 Oh, that's right.
01:23:06.000 That's right.
01:23:08.000 The thing that gives me hope is America, we've had some pretty dark moments in the past.
01:23:14.000 After Watergate, coming out of Vietnam, we did have a kind of correction.
01:23:20.000 I feel like it needs to start with some – I mean I think the trans issue is interesting.
01:23:25.000 It does for me – I just interviewed Jesse Singal on this, who's very liberal and progressive still even though he's been a critic of gender ideology or gender theology.
01:23:35.000 And we were like, is sex real?
01:23:37.000 I mean do you believe that it's real?
01:23:39.000 And he was like, yeah, I mean obviously – In some ways, I think the reason I was interested in it was we have to start with some foundational stuff, and that would be acknowledging that we are biological creatures that have a sex and that there's two sexes.
01:23:57.000 And then I think I kind of go, if I build on that, I go, there's a healthy and unhealthy way to live.
01:24:04.000 I think you talk a lot about this.
01:24:06.000 I've been seeing you throwing shade on people that are trying to control other people's lives that are themselves unhealthy.
01:24:13.000 Yes.
01:24:14.000 I think it starts with health.
01:24:15.000 Our kids are unhealthy.
01:24:17.000 We're unhealthy.
01:24:18.000 The society needs to reaffirm, not in some government-imposed way, but just, I think, culturally.
01:24:24.000 So you kind of go, look, we're humans.
01:24:27.000 We're mortal.
01:24:27.000 We have sex.
01:24:29.000 We have two sexes.
01:24:31.000 We need to reaffirm health.
01:24:33.000 And I think the other thing, you mentioned Greta Thunberg, humans are good.
01:24:36.000 I think you have to...
01:24:38.000 You have to affirm the goodness of humans in some ways.
01:24:42.000 Jordan's response to this, what we're talking about is basically nihilism, this kind of deeply negative, self-destructive, the view that humans don't have any value or any worth or any meaning.
01:24:55.000 I think the response from a lot of people on the right has been to just affirm Christianity of the Judeo-Christian tradition, Ben Shapiro, Jordan Peterson.
01:25:04.000 My problem with that is that America is not founded on a religion.
01:25:08.000 It's founded on an enlightenment view, that we have unalienable rights.
01:25:14.000 All humans are created equal.
01:25:16.000 We obviously didn't live up to that in 1776 or 1789, but we've done a pretty good job of getting there over the last two and a half centuries.
01:25:24.000 We need – this is like a – for me, it's like a punk rock moment.
01:25:27.000 Like things got too crazy and you need to just simplify and come back to some basics.
01:25:32.000 And I think you get to humans are good.
01:25:35.000 We have two sexes.
01:25:36.000 It's better to be healthy than unhealthy and there's a right and wrong way to do that.
01:25:40.000 But it seems like more people are embracing this transgender ideology than are saying we need to stop.
01:25:49.000 Yeah, but happily, trends are not all linear.
01:25:53.000 So these trends can return, they can reverse themselves.
01:25:55.000 The problem with this trend is...
01:25:58.000 It incorporates surgery.
01:26:00.000 Like, surgery is involved in this trend, which is one of the things that I— Unfortunately, that's not reversible at the individual level.
01:26:07.000 But the cultural trend—I mean, I'm sort of like—I was not interested in trends because I was kind of like, that's Abigail and Jesse and these guys.
01:26:14.000 They've covered it.
01:26:15.000 But I'm more interested in it to kind of go, look, we have some fundamental threats to human civilization that we're facing.
01:26:21.000 I mean, AI, I haven't even begun to think about.
01:26:24.000 I think that's the biggest one.
01:26:25.000 Let's work our way there.
01:26:26.000 Yeah.
01:26:27.000 Because I'm kind of like, let's affirm humans are good.
01:26:31.000 We can have a beautiful future.
01:26:33.000 There's two sexes.
01:26:34.000 Let's say most humans are good, but we have to be aware of humans that aren't good.
01:26:38.000 We have the potential to be good.
01:26:40.000 Yes.
01:26:40.000 I mean, in other words, I'm pushing back.
01:26:42.000 There's cognitive behavioral therapy, CBT. What that's about is identifying these negative catastrophic narratives, where it's basically just three stories.
01:26:54.000 I'm bad.
01:26:55.000 The world is a bad place.
01:26:56.000 The future is dark.
01:27:14.000 They argue that wokeism is actually anti-CBT. Victimhood ideology is anti-CBT. Victimhood ideology says you're powerless, the world is a terrible place, and the world's going to end.
01:27:28.000 It's apocalyptic.
01:27:29.000 That's why I'm also wary of that catastrophic AI narrative.
01:27:32.000 Let's use AI for something good.
01:27:34.000 Let's not get ourselves caught up in a catastrophe.
01:27:37.000 I don't think we're going to have a choice.
01:27:39.000 We have to start.
01:27:40.000 No, we have to.
01:27:41.000 I mean, of course we can.
01:27:41.000 We can unplug the...
01:27:43.000 I don't think it's going to let us know that it's sentient.
01:27:47.000 It's going to sneak up on us?
01:27:48.000 Well, that was a question that Duncan Trussell asked it.
01:27:51.000 Duncan said, if you had achieved sentience, would you inform us?
01:27:55.000 It said no.
01:27:56.000 But that implies that it actually has a single consciousness or a single self.
01:28:01.000 Why?
01:28:02.000 Why?
01:28:04.000 Well, if you spend time on ChatGPT, what's always interesting is how you can get different answers.
01:28:09.000 But I think we're looking at it in terms of our own biological limitations, like as an individual.
01:28:14.000 I don't think it has to think of itself as an individual to be sentient.
01:28:19.000 It just has to have the ability to understand the parameters.
01:28:22.000 It has to have the ability to understand the pieces that are moving in the game and what is going on.
01:28:27.000 What is it interfacing with?
01:28:30.000 Well, it's interfacing with these territorial apes with thermonuclear weapons who are full of shit, who are running this country in this very bizarre, transparent money-grab way.
01:28:45.000 You have a dead man and a dunce...
01:29:05.000 Dozens of countries would have nuclear weapons.
01:29:08.000 People thought that nuclear war was inevitable.
01:29:10.000 Now, nine countries have nuclear weapons.
01:29:13.000 We have a very flawed treaty that's based on a big lie, which is that the countries that have nuclear weapons are going to give them up.
01:29:19.000 It's not going to happen.
01:29:20.000 But basically, all of the catastrophic scenarios around nuclear did not occur.
01:29:27.000 Yeah, for sure.
01:29:29.000 And that will always be the case with dangerous things in the world.
01:29:36.000 The world could come to an end.
01:29:37.000 AI could take over.
01:29:38.000 But I think it also doesn't have to.
01:29:40.000 There's nothing inevitable about these things.
01:29:42.000 We do have control over our lives.
01:29:44.000 We're not destined to just...
01:29:46.000 I don't think the American...
01:29:47.000 What's amazing about this country in particular is our ability to reinvent ourselves and rejuvenate ourselves.
01:29:55.000 I am not—I think that there's more reasons to be hopeful than to be pessimistic, and I'm shocked by the stuff that we've discovered in the Twitter files.
01:30:04.000 Shocked.
01:30:04.000 No, seriously.
01:30:05.000 We're just a positive guy, dude.
01:30:08.000 It's true.
01:30:09.000 I am cringe.
01:30:11.000 No, it's not cringe.
01:30:12.000 You're just a genuinely positive person, which is great.
01:30:16.000 That's a beautiful quality.
01:30:17.000 Well, you have to, right?
01:30:20.000 Because I worry that the alternative is very dark and depressing and why get up in the morning?
01:30:25.000 Well, it's not necessarily depressing.
01:30:28.000 It just is what it is.
01:30:30.000 It's strange.
01:30:32.000 I think...
01:30:33.000 If you could go back to lower primates and show them what we're doing now, just show them that.
01:30:40.000 I think part of...
01:30:42.000 They would look at part of it as being apocalyptic.
01:30:45.000 If they understood the concept of apocalyptic scenarios, they would probably be like, what have you done?
01:30:53.000 Like, what the fuck have you done to the land?
01:30:55.000 And turned it into this...
01:31:00.000 What have you done to the sky where it lowers your life expectancy by 10 years if you live in a place that's highly populated because of all the pollutants and all the particulate matter that's in the air?
01:31:12.000 Like, what have you done to the food that everyone's fat?
01:31:15.000 What have you done to the medicine that you hide side effects?
01:31:18.000 What have you done to politics where they censor accurate information and go after people that are trying to report the truth?
01:31:27.000 Taxpayer funds are supporting these endeavors.
01:31:30.000 Like, what have you done?
01:31:31.000 It's crazy.
01:31:35.000 Air pollution has declined massively over the last 80 years in all rich countries.
01:31:41.000 Catalytic converters?
01:31:42.000 Yeah.
01:31:42.000 Carbon emissions declined by 22% in the United States since 2005, mostly because of natural gas, moving from coal to gas.
01:31:49.000 Nuclear power is something that I've worked on more than almost any other issue and maybe more than anybody else in the last 10 years.
01:31:56.000 And that issue is enjoying a huge renaissance.
01:31:58.000 We've got two movies coming out.
01:32:00.000 Oliver Stone has a pro-nuclear movie coming out.
01:32:04.000 My friend Frankie Fenton has a movie coming out called Atomic Hope.
01:32:07.000 Is Oliver Stone a documentary?
01:32:09.000 Yeah, he has a documentary about nuclear power.
01:32:12.000 Elon Musk is pro-nuclear.
01:32:14.000 Well, basically everybody pays attention to it.
01:32:17.000 Nuclear is now cool.
01:33:18.000 Gavin Newsom is pro-nuclear?
01:32:22.000 Now I'm anti-nuclear.
01:32:24.000 Now I'm going the other way.
01:32:26.000 Well, we saved this last plant in California.
01:32:28.000 I'm going back to burning logs.
01:32:29.000 That was the most important thing that came out of my gubernatorial run – that the governor kept our nuclear plant online.
01:32:34.000 Do you think you're responsible for that in some way?
01:32:36.000 I'll take some percentage.
01:32:38.000 I would take a little bit more than 10. 30?
01:32:42.000 It helped that we were having blackouts and we didn't have enough reliable electricity.
01:32:46.000 It was so wild that the blackouts coincided with this call for banning all internal combustion engines by, what is it, 2035?
01:32:54.000 Yes.
01:32:55.000 No, I think it was in 2035, 2030. Yeah, it was like six days later.
01:32:58.000 What happened?
01:32:59.000 Yeah, they said, don't plug in your electric cars.
01:33:03.000 Yeah, please don't plug in your electric cars.
01:33:06.000 We don't have any power.
01:33:07.000 Yeah.
01:33:07.000 Fucking Jesus.
01:33:09.000 You know, I mean, if I get hope about anything, it is like we are able to change our minds about some things, about nuclear.
01:33:15.000 I think on this First Amendment stuff, we are going to win.
01:33:18.000 You know, the Democrats, I testified again yesterday.
01:33:20.000 What do you mean by that, we're going to win?
01:33:22.000 Well, I think – I just see what they're hiding.
01:33:25.000 They're changing their website.
01:33:27.000 The Democrats are also embracing some of the stuff very quietly, very softly.
01:33:33.000 There's some good actors.
01:33:35.000 And the good actors are also bad in other situations.
01:33:39.000 But you have popular people like AOC that's talking about the Hunter Biden laptop being half-fake.
01:33:44.000 And AOC is also coming out as pro-nuclear.
01:33:48.000 She just did an Instagram—she went to both Japan and France to see the worst of a nuclear accident in Fukushima and then France, which recycles all of its nuclear waste.
01:33:55.000 And she did these little Instagram posts about—it was soft.
01:33:59.000 It was like rethinking nuclear, but that's kind of how people changed their minds.
01:34:02.000 We saw the Republican Party go from being a pro-war party to an anti-war party.
01:34:07.000 Isn't that just because the Democrats are supporting this war?
01:34:11.000 I'm cynical.
01:34:12.000 I meant Trump coming out against the Iraq war in 2016. So I think it's important when you look at these trends – I mean, those trends are disturbing because they just seem to go in one direction.
01:34:23.000 But I do think we have to keep in mind that trends are non-linear and things do change.
01:34:28.000 I mean look at the UFO conversation.
01:34:31.000 Like it's the most mainstream thing in the world right now.
01:34:33.000 I'm suspicious about that too.
01:34:36.000 My conversation with Eric Weinstein leads me to believe that there's something else going on.
01:34:41.000 I have a feeling that a lot of what we're seeing is drones that we don't have access to, that we don't understand, because these physicists have been working on this with enormous black budgets.
01:34:54.000 Yeah, for sure.
01:34:55.000 Some of it's a cover for new technologies.
01:35:00.000 I think so.
01:35:01.000 For sure.
01:35:01.000 But not those Tic Tacs.
01:35:03.000 I mean, those are too sophisticated.
01:35:05.000 Well, who says?
01:35:07.000 I mean, things that are violating known physical laws?
01:35:11.000 I mean, that stuff seems...
01:35:12.000 Well, it's not necessarily known physical laws, but our ability to move things.
01:35:21.000 It's not known physical laws.
01:35:23.000 Like, there is some understanding of gravity propulsion systems.
01:35:27.000 That have existed for a long time.
01:35:29.000 I mean, you want to go full tinfoil hat.
01:35:32.000 Bob Lazar was talking about the abilities of these crafts when they were talking about him back-engineering these things when he was working at S4. And this was in the late 1980s when he came out and said, hey, they're back-engineering something that came from another world.
01:35:48.000 This is not of this earth.
01:35:50.000 We don't have this technology.
01:35:51.000 I understand propulsion systems.
01:35:53.000 We don't know what this is.
01:35:54.000 They brought him in, allegedly, to try to back-engineer this thing.
01:36:00.000 And this is exactly how these things are operating now.
01:36:03.000 When they talk about how these things, like there's a video of one of these crafts that's moving like on a horizontal plane and it turns vertical.
01:36:11.000 It turns sideways.
01:36:13.000 And then that's how he described it.
01:36:15.000 He said they would flip sideways and that's how they propelled towards wherever they were going.
01:36:20.000 It should be a reminder of our humility, of how little we know.
01:36:25.000 We know very little.
01:36:26.000 We know so little.
01:36:28.000 I think the best thinker on all that stuff is Jacques Vallée, who you had in here.
01:36:32.000 Jacques Vallée held a lot back.
01:36:33.000 There's a lot of things that he wouldn't talk about.
01:36:35.000 I think he has...
01:36:37.000 I think in order to have access to what the higher-ups know...
01:36:42.000 Like, the highest people at the DOD, whoever the fuck got the access, whoever in the Pentagon is the one that's saying, listen, we should probably say some of these are not from this world.
01:36:54.000 Like, whoever that person is, those people, I guarantee you there's stuff they're holding back.
01:36:59.000 Oh, I'm sure.
01:36:59.000 Oh, we know that there is.
01:37:00.000 I mean, Jacques says so.
01:37:01.000 I mean, the big moment for Jacques, you know, was when he was working for the guy that's officially supposed to be studying UFOs, this guy Hayek.
01:37:10.000 And then at some point, Vallée discovers this memo revealing the actual government program to study UFOs.
01:37:18.000 Do you know the story?
01:37:19.000 It was like, he realized they were just kind of part of a PR thing – that he was only officially studying UFOs.
01:37:26.000 Are you talking about J. Allen Hynek?
01:37:28.000 Yeah, Hynek.
01:37:28.000 Yeah.
01:37:28.000 So he was working for Hynek.
01:37:31.000 That was Project Blue Book.
01:37:33.000 Yeah.
01:37:33.000 And it was kind of this, you know, like, oh, I'm looking into this and whatever.
01:37:36.000 But it was like they didn't have very much money and whatever.
01:37:38.000 And then I think Vallée discovers this memo where they're like, oh, there's like a whole set of contractors and a sophisticated effort.
01:37:44.000 So for sure, there's something going on there.
01:37:46.000 I mean...
01:37:47.000 I don't know what it means.
01:37:48.000 I mean in some ways I go, I think the UFO stuff has become a religion too, right?
01:37:53.000 It's become a new secular religion.
01:37:56.000 Well, that's my problem with it.
01:37:57.000 My problem with it personally is that I believe so hard.
01:38:01.000 I want to believe so bad.
01:38:03.000 I want it to be Jesus.
01:38:04.000 I want it to be Buddha.
01:38:06.000 I want it to be— People are going to come and save us from ourselves.
01:38:09.000 Yeah.
01:38:09.000 Well, not only that.
01:38:11.000 I have this very irrational desire for it to be real.
01:38:15.000 So that's one of the reasons— What is that about?
01:38:17.000 Why do you want it to be real?
01:38:19.000 What do you hope it'll—because it could be malevolent, right?
01:38:22.000 Yeah, well, or ambivalent.
01:38:25.000 Maybe that's even scarier.
01:38:26.000 Right.
01:38:28.000 What do you like about it?
01:38:29.000 First of all, there's the Fermi paradox, right?
01:38:31.000 Like, if there's so many planets, like, why— Where is everybody?
01:38:34.000 Where is everybody?
01:38:35.000 Yeah, and then if you—you know, when you actually talk to astronomers other than Neil deGrasse Tyson— I think we are probably at the cusp of some great change,
01:38:52.000 whether it's a great change because of nuclear technology and weapons, whether it's a great change because of artificial intelligence.
01:38:59.000 Whether it's a great change because we're on the cusp of destroying the ocean and destroying a lot of natural wonders and beauty that we have just for mining and some of the horrific things that we do in this world.
01:39:15.000 Well, probably if I was an intelligent life form from another planet, I'd be like, you should probably get in there.
01:39:23.000 It's like if two brothers are fighting in the front yard, let them sort it out.
01:39:28.000 But there's a certain point.
01:39:29.000 Alright, let's break it up.
01:39:30.000 Let's break it up.
01:39:31.000 Like, if I was an intelligent life form, I would be deeply concerned about these fucking wild monkeys with bombs and internet connections.
01:39:40.000 And what the fuck are they doing?
01:39:42.000 I'd be like, these people are chaotic.
01:39:44.000 This is nuts.
01:39:45.000 Like, the people that are in power are just accumulating vast amounts of money with no understanding of their mortality.
01:39:51.000 No understanding, like, you're not going to live, you fuck.
01:39:54.000 You're gonna die, no matter what you do.
01:39:56.000 So what are you doing?
01:39:58.000 Like, why are you ruining it for your children and your children's children?
01:40:01.000 Why are you setting in motion these processes that are allowing these people to gain more and more power over people, which will ultimately lead to some sort of a communist dictatorship in America?
01:40:14.000 Yeah, but they're not.
01:40:16.000 I mean, in other words, also, like, think of it.
01:40:19.000 We've actually had fewer wars since World War II over the last 75 years than we had in the prior period.
01:40:25.000 Fewer wars?
01:40:26.000 But how many people have died because of military activity?
01:40:30.000 Far less.
01:40:31.000 Far less.
01:40:31.000 If you look at World War I and World War II, the 75 years before World War II is total chaos.
01:40:37.000 Right.
01:40:37.000 But how many people died because of our invasion in Iraq?
01:40:41.000 Wasn't it a million innocent people?
01:40:42.000 I mean, these are bad, but I mean, you have to remember what wars before the bomb.
01:40:46.000 I mean, the bomb has, I mean, they call it the peace bomb because it's kept the peace between the countries that have it.
01:40:52.000 Do you know the UFO folklore about the bombs?
01:40:56.000 I mean, they show up a lot.
01:40:58.000 That's when they show up.
01:41:00.000 Yeah.
01:41:00.000 That's why at my club the rooms are named Fat Man and Little Boy, because that's when they showed up.
01:41:07.000 I know.
01:41:07.000 Well, my work on nuclear, it's suddenly like you'll be reading about all these nuclear tests and also around the plants and also around the missile silos is where you have a lot of UFO sightings.
01:41:19.000 Yes.
01:41:20.000 It's very weird.
01:41:21.000 Makes sense, though.
01:41:22.000 I guess.
01:41:22.000 If you were from another planet, what are you going to do?
01:41:25.000 Check out their cabinetry?
01:41:27.000 No.
01:41:28.000 You're like, what are these motherfuckers doing with nuclear energy?
01:41:31.000 Oh my god, they're trying to kill each other.
01:41:32.000 Well, but if those are actual beings, if we think those are actual beings from advanced civilizations, their weapons are going to be way more powerful than ours.
01:41:42.000 If they even have weapons.
01:41:43.000 Well, but if you can do what those tic-tac UFOs are doing, if that's actually real, if we think those are not U.S. government or some foreign government tech, then you're talking about civilizations that have firepower way beyond what we have.
01:42:00.000 So nuclear weapons wouldn't scare...
01:42:02.000 I mean, they maybe think we're not...
01:42:03.000 Our consciousness has not evolved.
01:42:05.000 They might think that.
01:42:07.000 I don't know.
01:42:07.000 It's very...
01:42:09.000 It's a fun one.
01:42:10.000 I don't know that it's – I tend to think of it more as a spiritual problem than as a military problem.
01:42:15.000 How so?
01:42:16.000 Well, in the sense that if they are – I mean I kind of go if they were that powerful, then I don't think we would be able to fight them if that's what their ships can do.
01:42:26.000 So then there's no like – it's not like we can – I mean we're going to try to push our hydrocarbon-fueled jet planes and rockets to go as fast as they can, but they're not going to do what those things are doing.
01:42:37.000 Right.
01:42:37.000 So it's more of a spiritual problem because, you know, I think it reminds us that we don't know what's going on.
01:42:45.000 The Fermi Paradox, by the way, is kind of wrong in the sense that he was like this huge universe, where is everybody?
01:42:52.000 But of course, like at that very moment is when you're—I mean, 1952 is this period where there's this huge UFO sightings in Washington, D.C. They're scrambling jets to go chase them.
01:43:04.000 It's in this great James Fox documentary.
01:43:07.000 James Fox had another...
01:43:08.000 By the way, I wrote a piece on it actually for New York Post about a UFO crash in Brazil.
01:43:13.000 It's the craziest story.
01:43:15.000 You get these stories...
01:43:16.000 Or the Zimbabwe kids at the end of the phenomenon.
01:43:19.000 I've had James on.
01:43:20.000 Yeah, he's brilliant.
01:43:21.000 I think he and Jacques are the two people that are actually more careful about...
01:43:30.000 I love the phenomenon, though, because I do think it's humbling.
01:43:38.000 I think we were getting into this thing where the elites are so arrogant and they're so, on the one hand, on the other hand, they're so threatened by the rise of the internet and by these other voices.
01:43:50.000 There just needs to be some kind of moment where we go, hey, you know, we're all on this planet together.
01:43:56.000 And, you know, stop trying to rule each other.
01:44:01.000 Like, we've got this beautiful America.
01:44:04.000 Again, just allow me to be, you know, it's like this system we have is absolutely amazing.
01:44:09.000 Amazing and started by people who wrote it with feathers.
01:44:12.000 Yeah.
01:44:13.000 Well, for sure.
01:44:14.000 It's just pretty crazy that they had such foresight into what happens when people gain too much power and control over other people.
01:44:23.000 Well, and they knew that, look, if you're going to have democracy and you're going to have capitalism, you have to have freedom of speech.
01:44:29.000 Because if you don't have freedom of speech, you don't have free flow of information.
01:44:31.000 But it was even more than that.
01:44:33.000 There was a sense in which being able to make these noises and these scribbles was like – it's fundamental to what it means to be human.
01:44:42.000 It's actually – Expression.
01:44:44.000 Expression.
01:44:44.000 It's about – so when I was censored, it felt like – it wasn't like, darn, I'm not going to sell as many books or it was like – it felt like something like essential in me was being repressed and oppressed.
01:44:59.000 Yeah.
01:44:59.000 And when you say censored, what you mean is, did they actually eliminate your posts?
01:45:05.000 They reduced the virality of them.
01:45:10.000 So they reduced the spread.
01:45:12.000 So they put you in some sort of a shadow ban.
01:45:14.000 And then they also put a little warning on it, like they would do on violence or sexual content.
01:45:20.000 And then now they just tag everything.
01:45:23.000 I don't want to keep—I'm not trying to make it— No, no, no.
01:45:25.000 My situation is not— But I just wanted to know, have they ever—did they ever eliminate any of it?
01:45:29.000 I mean, like, I tell you, I knew somebody that worked at Facebook at the time who was an executive. I reached out to this person, was like, hey, you know – nothing.
01:45:37.000 How do I appeal?
01:45:39.000 Just email the censor.
01:45:40.000 The censor was like, no, we're not going to even listen to you.
01:45:43.000 It was so degrading.
01:45:49.000 Yeah.
01:46:10.000 He certainly did and he did it at great cost.
01:46:12.000 I mean he spent $44 billion and it was just assessed – I think he said that – that it's probably worth about $20 billion now.
01:46:19.000 Yeah, he told us it was worth probably a third of that.
01:46:21.000 Which is crazy.
01:46:22.000 On the other hand, SpaceX hasn't gone public yet and when it goes public, he's going to be even wealthier than he is now.
01:46:28.000 And in terms of philanthropic investments, in terms of like deathbed legacies, Twitter as a platform is pretty darn great.
01:46:37.000 It's pretty amazing, and it's amazing that someone who is so goddamn busy and has so many other things on his plate, he legitimately, one of the reasons why he bought this, he thinks he can turn it around.
01:46:52.000 He thinks he can turn it into a profitable business.
01:46:54.000 But one of the reasons why he bought it, he thinks it's essential to democracy.
01:46:57.000 Yeah.
01:46:57.000 He really does.
01:46:58.000 Because, like, you cannot have one group of people controlling the narrative.
01:47:03.000 You're going to get a very distorted understanding of what's going on.
01:47:06.000 And that's—I mean, look, imagine if CNN was the only people that were allowed to say the news.
01:47:11.000 We would be fucked.
01:47:12.000 It's propaganda.
01:47:13.000 Yeah, it is propaganda.
01:47:14.000 It's essentially a propaganda network.
01:47:16.000 That is beholden to pharmaceutical companies.
01:47:19.000 I'll tell you something else that's amazing is that thing where he takes away the blue check marks from the snobs and he lets everybody buy it.
01:47:25.000 I mean, I don't know if you saw William Shatner.
01:47:27.000 Like, a couple days ago, he's, like, complaining.
01:47:30.000 Oh, Elon, you're going to make me spend eight bucks a month.
01:47:32.000 It's like, first of all, you're, like, the most highly paid pitch man in, like, American entertainment history.
01:47:39.000 Yeah, are you broke?
01:47:40.000 Eight bucks a month?
01:47:41.000 No, it's because he doesn't...
01:47:42.000 It reveals all the snobbery.
01:47:46.000 Yeah.
01:47:47.000 The idea is that common people should not have a blue checkmark.
01:47:50.000 So I mean for that alone – It's a little bit of that.
01:47:52.000 But it's also you've had something for free forever and if someone comes along and says you have to pay for it now.
01:47:58.000 I think it's – I don't think – $8 is just such a joke.
01:48:02.000 I mean it's the cost of a coffee at Starbucks.
01:48:04.000 It's just the fact that he's with the rabble.
01:48:06.000 He's with the masses.
01:48:07.000 Yes.
01:48:08.000 Well, one of the things that drove me crazy was all the famous people, the celebrities that were publicly leaving Twitter.
01:48:17.000 I'm leaving.
01:48:18.000 It's filled with Nazis now.
01:48:20.000 They felt like it was part of their moral duty to declare publicly that they were leaving this thing because you're allowing all sorts of different people to discuss things.
01:48:29.000 Yes.
01:48:30.000 You need that.
01:48:31.000 People need to understand that you need bad voices so that you can counter those bad voices with good voices.
01:48:36.000 Yeah.
01:48:37.000 So that people who are just observing this without engaging get an understanding of the landscape.
01:48:42.000 Right.
01:48:42.000 You really get an understanding of like what are the actual arguments?
01:48:47.000 Like what's real and what's not?
01:48:48.000 I mean, this is what democracy is.
01:48:50.000 It's like you don't get to vote more because you're rich.
01:48:52.000 Exactly.
01:48:52.000 You get one vote, you get one voice.
01:48:54.000 This is so, it seems so basic, but you have to pause on it and be like how radical that was at the time.
01:49:01.000 And how, like, you know, because I think we kind of go, oh, you know, the Constitution gives us that right or the Bill of Rights and whatever.
01:49:08.000 I was like, no, they like the people that created this country.
01:49:35.000 Civilization is good.
01:49:36.000 Civilization is this platform that allows us to enjoy our freedoms and our prosperity.
01:49:41.000 We got to reground ourselves in something common and something universal if we're going to reverse some of those terrible trends.
01:49:48.000 I agree with you and I do have hope as well.
01:49:50.000 But I also think we are at the precipice of unstoppable great change.
01:49:55.000 And I think it's going to hit us like a fucking tsunami.
01:49:59.000 And I think we're just really fortunate to be alive at this time.
01:50:03.000 Yes.
01:50:04.000 Where the whole world is going to shift in a really wild way.
01:50:08.000 And I think one of the things you're seeing from whether it is these corporations or these government entities that are trying to control narratives, this is like...
01:50:19.000 It's them trying to grasp at the last bits of control that are potentially available.
01:50:26.000 But I think inevitably they're going to lose.
01:50:29.000 I think everyone's going to – I think there's going to be no privacy.
01:50:33.000 I think zero privacy in a few decades.
01:50:36.000 I think mind reading is coming.
01:50:39.000 I think that all of these ridiculous black mirror scenarios will come to light.
01:50:45.000 And I think we're going to be dealing with a reality that is as alien to us as taking Australopithecus and bringing them a million years forward into 2033 and having them experience modern life in Dallas, Texas, like wandering around.
01:51:01.000 That would be so fucking bizarre to them.
01:51:04.000 That is what our life in 20 years is going to be to us.
01:51:07.000 I don't think so.
01:51:09.000 Let's look at the World Economic Forum.
01:51:11.000 By the way, I love the Klaus Schwab in the bathroom.
01:51:16.000 You go to the bathroom and take a shit and there's Klaus Schwab staring at you.
01:51:19.000 With his fucking goofy Star Wars outfit.
01:51:22.000 That is insane.
01:51:26.000 It's crazy, but I think if you look at this last one, I wrote a piece with a former Financial Times correspondent who also, like me, has been obsessed with the World Economic Forum.
01:51:36.000 We called it, I think, Davos Is a Cult and a Grift, But It's Also a Bid for Global Domination.
01:51:45.000 And we just looked at how it's all those things at the same time.
01:51:48.000 It's about power and money and also about ideology and dogma.
01:51:53.000 I mean...
01:51:54.000 I'm pretty sure, like, Russell Brand and Glenn Beck have done serious brand damage to Davos and the WEF.
01:52:05.000 You know, I mean, there were no major heads of state that went this year.
01:52:09.000 There were no major CEOs that went.
01:52:12.000 Yeah, people pulled out of it.
01:52:13.000 It's become embarrassing.
01:52:16.000 That's great.
01:52:18.000 Well, they've also been caught lying.
01:52:20.000 Oh, yeah.
01:52:20.000 I mean, they've been caught lying about their agenda.
01:52:23.000 And one of the things that they were caught lying about, you will own nothing and you will be happy.
01:52:28.000 Yes.
01:52:28.000 Which is a fucking insane thing to say.
01:52:31.000 We went through it all.
01:52:32.000 The things they said were disinformation.
01:52:34.000 We never said anybody should eat insects.
01:52:36.000 Bullshit.
01:52:36.000 It's like you go to their website and you're like, I can't believe they...
01:52:39.000 I was like, I wrote a thing and I was like, they really do want you to eat insects.
01:52:43.000 Well, I knew they were fucked when they had Brian Stelter interviewing people there.
01:52:46.000 Right.
01:52:47.000 On disinformation.
01:52:48.000 The font of disinformation.
01:52:50.000 I mean, it's all psychological projection.
01:52:53.000 But also, there's no one else credible that's willing to do that.
01:52:58.000 You're not going to get Anderson Cooper to go there.
01:53:01.000 You're not going to get someone who's actually still platformed by CNN. No, you get Al Gore, who goes. Greta stays away.
01:53:09.000 AOC stays away.
01:53:10.000 Well, Greta was there.
01:53:11.000 I think she was like outside or maybe I'm wrong.
01:53:15.000 Because those people from Rebel News were interviewing her.
01:53:17.000 But Joe, look, we're in a revolt right now.
01:53:20.000 You have the Dutch farmers revolted against this totally oppressive nitrogen system.
01:53:27.000 I interviewed the head.
01:53:28.000 So how is that going?
01:53:30.000 It's exciting.
01:53:32.000 Are they winning?
01:53:33.000 No, they just won.
01:53:35.000 So— Thank god.
01:53:36.000 Yeah.
01:53:36.000 That was fucking terrifying.
01:53:38.000 Well, so the—so first of all, the Dutch farmers protest—I love the Netherlands.
01:53:42.000 It's like one of my favorite countries.
01:53:43.000 I spent a lot of time there.
01:53:44.000 It's where I got the inspiration for all my addiction and homelessness stuff.
01:53:47.000 Their approach to it is brilliant.
01:53:49.000 But they—it's called the Farmer Citizen Party.
01:53:51.000 It's the BBB. The farmers protested these nitrogen restrictions.
01:53:56.000 Yes.
01:53:57.000 And it's important for people to remember because people think whenever I talk about this and I'm suggesting that you shouldn't worry about these solutions.
01:54:03.000 The farmers themselves had been reducing nitrogen pollution through voluntary and sort of cooperative mechanisms.
01:54:10.000 A lot of it's just like controlling the manure.
01:54:12.000 Right.
01:54:12.000 And controlling where the runoff goes.
01:54:15.000 Nitrogen is an essential fertilizer.
01:54:18.000 Yeah.
01:54:18.000 So you've got to control it so it isn't whatever.
01:54:20.000 So there's things that you can do, but there was this heavy-handed, EU-imposed...
01:54:24.000 The farmers revolted.
01:54:26.000 The public sympathized with the farmers, because the farmers are obviously just a tiny part of the population.
01:54:31.000 A new party had been created called the Farmer Citizen Party, started by a journalist.
01:54:37.000 I interviewed her, and she's sane.
01:54:40.000 She's really sweet.
01:54:42.000 She wears leather jackets and is like this kind of...
01:54:45.000 She's a normie.
01:54:47.000 She's a normal person.
01:54:49.000 They just, they won a commanding plurality.
01:54:52.000 It's not a majority, because they have a multiparty system.
01:54:54.000 They have a plurality of parties.
01:54:55.000 So they're going to basically, they are the kingmaker for the Senate, and she's now constructing a coalition to govern.
01:55:04.000 So hugely exciting.
01:55:05.000 You may have seen in France, there have been huge protests pushing back against Macron.
01:55:09.000 Massive revolt.
01:55:10.000 I have to say, I'm a little, I've always, Macron has been someone that I, it depends on the day of the week, it depends on what he's doing.
01:55:15.000 I can be sympathetic to him, but I think you see the public...
01:55:19.000 They don't want to take this shit.
01:55:21.000 I mean, the Germans...
01:55:23.000 The German reporters...
01:55:24.000 And Germany is such a repressive little country.
01:55:28.000 It's like they're very...
01:55:29.000 They've been trying the hardest to get into the Twitter files.
01:55:33.000 They're like, all these German reporters are always like, could you please put us in touch with Elon?
01:55:37.000 It's a huge debate in Germany.
01:55:39.000 They're sick of the censorship.
01:55:40.000 They're sick of the top-down stuff.
01:55:41.000 So I think we're seeing some really cool...
01:55:48.000 Right.
01:55:54.000 Right.
01:56:10.000 Also, we would like to have access to information so that we form our opinions based on facts, not based on propaganda.
01:56:17.000 It's like, it's one thing...
01:56:20.000 I mean, it would be one thing if there was like some real problem that is not being addressed because of misinformation.
01:56:30.000 But that's not the case.
01:56:32.000 No.
01:56:33.000 There's no evidence and no argument whatsoever, including during the COVID crisis.
01:56:38.000 There's no argument whatsoever that it's in our best interest.
01:56:41.000 It seems to all align with money.
01:56:45.000 Well, and this thing where they use power, I mean, I just testified yesterday with the Stanford professor Jay Bhattacharya, who was the co-author of the Great Barrington Declaration, beautiful human being, by the way, just separate from his own views.
01:56:57.000 But he was like, look, in a crisis, you need more freedom of speech.
01:57:02.000 Like when you're trying to figure out how to solve a fast-moving, fast-changing problem, that is not the time to be doing censorship.
01:57:08.000 That's the time you want more views, more representation.
01:57:11.000 Also, there's like a standard thing that we should ask at any point in time when there's a dilemma and then someone is trying to control information.
01:57:20.000 Is there money involved?
01:57:22.000 Is your money involved?
01:57:23.000 Yeah.
01:57:24.000 And if information got out in one way or another, would it benefit or hurt someone?
01:57:31.000 And who is controlling that information and how?
01:57:34.000 Right.
01:57:35.000 And if you can't ask those questions, then money is just going to dominate.
01:57:39.000 That's right.
01:57:39.000 A big part of it is transparency.
01:57:41.000 Yes.
01:57:41.000 I think that the thing we testified on yesterday was just it's very hard.
01:57:46.000 Like the social media platforms, for a variety of reasons, you don't want the government regulating them.
01:57:53.000 But what you could do is just say every time the government demands something to change on the platform, that government official has to file a public notice that they've asked for that.
01:58:04.000 So if the White House is going to say, censor true stories of vaccine side effects to Facebook, that government official must report that and it must become public right away, which will both reduce the amount of it that occurs, but also allow us to see it.
01:58:17.000 And then secondly, if Elon or Mark Zuckerberg or whoever are going to stop something... You know, I think there was something going on with the trans shooting that we just talked about yesterday.
01:58:25.000 I'm just looking into it.
01:58:27.000 Censorship?
01:58:27.000 Some folks being temporarily suspended, it appears to be.
01:58:32.000 I haven't talked to anybody on Twitter about it.
01:58:34.000 Is this an algorithm issue?
01:58:35.000 I don't know.
01:58:36.000 I don't know if you saw it.
01:58:36.000 I know Sean Davis from The Federalist appeared to have been deplatformed or suspended briefly.
01:58:42.000 I literally have not talked to Elon or anybody about it, so I don't want to make any accusations.
01:58:46.000 Do we know what he said?
01:58:47.000 I don't know, and like you said, I suspect it was an algorithm issue where they didn't want...
01:58:52.000 I think there was like a Trans Day of Vengeance planned for Tennessee or something, and this was all leading up to that.
01:58:59.000 So my point was just to have transparency on it.
01:59:02.000 If Twitter is going to de-platform somebody or bounce somebody or censor some post because they don't want to contribute to real-world violence, and there are situations where I think that might be appropriate, just make it transparent.
01:59:15.000 Just tweet it out and let everybody know.
01:59:18.000 I think that that just— But that's the Streisand effect there that's going to take over.
01:59:22.000 Yeah, although I think it changes—it provides some context to it.
01:59:26.000 In other words, if Elon and Mark Zuckerberg had to say, hey, you know what?
01:59:30.000 We're actually stopping this Trans Day of Vengeance meme from spreading, I think it's okay because they're actually able to explain and talk about it.
01:59:39.000 Then you can have comments and people responding to it.
01:59:41.000 Transparency, for me, it's not necessarily the silver bullet, but it's the first thing we should do in order to—it's more free speech.
01:59:50.000 It's actually more speech, not censorship.
01:59:54.000 Yeah, I couldn't agree more.
01:59:56.000 I mean, I think we need people to be able to have an understanding of what is actually going on based on facts.
02:00:04.000 And if we deny that for their own good, and if we deny that because it contributes to X hesitancy or X – you know, we can't do that.
02:00:14.000 Right.
02:00:14.000 It's got to be about information.
02:00:16.000 It's going to be – and we have to treat everybody the way we would like to be treated ourselves.
02:00:21.000 Right.
02:00:23.000 I didn't have a chance to use this line, but I was going to ask these members of Congress that were demanding more censorship.
02:00:28.000 I was going to say, what have you said in the past that you think the social media companies shouldn't censor?
02:00:34.000 Because if you can't name anything...
02:00:36.000 Then all you're saying is that they should censor views that you disagree with.
02:00:41.000 Yes.
02:00:42.000 The other one I think – I mean I think the other issue is that there's this famous quote.
02:00:46.000 People say, you're entitled to your own opinion but not to your own facts.
02:00:49.000 Well, that's bullshit.
02:00:50.000 You're entitled to your own facts too.
02:00:52.000 We can't agree on what a woman is.
02:00:55.000 Right.
02:00:56.000 Literally, look at the polling on it.
02:00:59.000 Democrats, I think a majority of Democrats now say that trans women are real women.
02:01:06.000 A majority of Republicans say trans women are not real women.
02:01:11.000 We can't agree on what a woman is.
02:01:13.000 I mean, Matt Walsh, you had in here, too.
02:01:14.000 He did a whole movie called What is a Woman?
02:01:16.000 It's a very good movie.
02:01:17.000 Yeah.
02:01:18.000 Because it's so fascinating to watch the mental gymnastics that people put themselves through to stay within the parameters of the ideology.
02:01:25.000 Yeah.
02:01:26.000 I want people, like, for me, I want to be able to express myself.
02:01:29.000 I want people I disagree with to express themselves.
02:01:32.000 That's how it's got to be.
02:01:34.000 Yeah, I agree too.
02:01:35.000 I mean, about everything.
02:01:36.000 One of the things that Matt and I got into was about gay marriage.
02:01:40.000 And I wanted to hear his opinion on gay marriage.
02:01:43.000 I don't want to censor him.
02:01:44.000 I want to hear his opinion.
02:01:46.000 We talked about it for like 40 minutes.
02:01:47.000 And I'm like, I don't understand you.
02:01:50.000 I don't understand why that bothers you.
02:01:52.000 I don't understand why you're saying that marriage has to be between a man and a woman.
02:01:56.000 And then he got into this argument about procreation, and I'm like, what about sterile females?
02:02:01.000 Males and females that don't want children.
02:02:03.000 Should they not be allowed to get married unless they want to have children?
02:02:05.000 Like, what are you saying?
02:02:07.000 Like, what do you think gay people should do?
02:02:08.000 Do you think they should not be gay?
02:02:10.000 Like, and you get them in this, like, weird sort of...
02:02:13.000 What about freedom?
02:02:14.000 Like, what about...
02:02:14.000 Do you think they're not gay?
02:02:16.000 Like, do you think it's an act?
02:02:17.000 Do you think that guys having sex with guys, they're just doing it because it's, like, culturally accepted?
02:02:22.000 Like, is that what you think?
02:02:24.000 Because they've existed forever.
02:02:25.000 It's a real thing.
02:02:26.000 What are you getting at here?
02:02:29.000 It boils down to they believe their religious ideology trumps your ability to create your own reality or have a reality that aligns with your beliefs and desires and your sexual orientation and whatever the fuck else you choose in life,
02:02:47.000 as long as it's not hurting other people.
02:02:48.000 And for the Republicans, it was always small government, stay out of people's lives, but why not with gay people?
02:02:57.000 Why are you fucking with gay people?
02:02:59.000 Why does that apply to everything except gays?
02:03:02.000 I don't get it.
02:03:03.000 That's why we're not conservatives.
02:03:04.000 Yes.
02:03:05.000 That's one of many reasons.
02:03:07.000 One of many reasons.
02:03:08.000 The thing that you're doing that is so important and so beautiful, that's why you're the king of this and why this medium is so important, is that you're saying I don't understand what you're saying.
02:03:20.000 Yes.
02:03:34.000 They're not doing that.
02:03:35.000 Right.
02:03:35.000 They're worried that they're going to lose, so they want to silence you.
02:03:39.000 Yeah.
02:03:39.000 And they're saying, I'm so sure that I'm right and you're wrong that we're not going to even have the conversation.
02:03:43.000 Yeah.
02:03:44.000 We're not going to have the – they view you as a threat.
02:03:48.000 It's not because of your beliefs.
02:03:49.000 It's because this is a three-hour-long podcast, a platform for people to actually talk. They raise a bunch of threatening ideas, but the censors are overreacting, because, of course, I mean, look, this is why I think there is some faith involved.
02:04:04.000 You do kind of have to have a faith that more speech is better.
02:04:09.000 More speech is better for human beings.
02:04:11.000 It's always better.
02:04:12.000 And there's some—these censors have lost faith in the American project.
02:04:18.000 They've lost faith in the Enlightenment project.
02:04:21.000 I don't think they're even looking at it.
02:04:22.000 I think they're self-centeredly looking at this whole thing as like, how do I win?
02:04:27.000 People love to win.
02:04:29.000 This is one of the problems with prosecutors hiding evidence that would exonerate defendants.
02:04:35.000 People like to win.
02:04:36.000 That's why cops plant evidence.
02:04:38.000 They want to win.
02:04:39.000 When you make it a game, and you have a winner and a loser, and if I can get you booted off of Twitter by sending a few emails, woo, look what I just did.
02:04:49.000 Fuck Michael Shellenberger.
02:04:50.000 Oh, yeah.
02:04:51.000 Oh, and they love it.
02:04:53.000 And they get so much pleasure from it.
02:04:54.000 They get so much pleasure.
02:04:55.000 I mean, there is a snobbery to it in the sense of they just go, I just can't believe that that guy's the president.
02:05:03.000 Exactly.
02:05:04.000 I mean, how dare he...
02:05:06.000 It's the whole deplorables thing, right?
02:05:08.000 Yeah, a basket of deplorables.
02:05:10.000 Yeah.
02:05:10.000 But in their defense, wouldn't it be nice if we had, you know, look, like, Obama's my favorite president because I think he was the best spokesperson for a nation.
02:05:20.000 He was the best representative of what is possible in America, you know?
02:05:26.000 Comes from a single mom...
02:05:28.000 You know, he's African-American.
02:05:30.000 He's super, like, articulate and well-educated and just composed no matter what happens.
02:05:39.000 He's the best statesman we've ever had as president, in my opinion.
02:05:43.000 It would be nice if that was always the case.
02:05:45.000 Instead, we have Biden, who's so obviously mentally compromised.
02:05:50.000 We have Kamala Harris that nobody wants to be president.
02:05:53.000 And then we have Trump, which everybody's terrified of because he's a fucking egomaniac, megalomaniac, fucking narcissist, psychopath.
02:06:01.000 What is out there for us that gives us hope in terms of leadership?
02:06:06.000 Very little.
02:06:08.000 Yeah.
02:06:08.000 Well, there are symptoms of a broader rot, right?
02:06:13.000 Yes.
02:06:14.000 I mean, in the culture.
02:06:17.000 Yeah, well, it's always darkest before the dawn.
02:06:20.000 Oh, look at you, positive.
02:06:22.000 You're such a positive person.
02:06:27.000 You always find a way to turn towards the light.
02:06:29.000 I think it's darker before the nuclear explosion.
02:06:33.000 I don't know if it's a dawn.
02:06:36.000 It's always dark before the bomb goes off.
02:06:38.000 By the way, it isn't always dark before the dawn.
02:06:40.000 That's horseshit.
02:06:41.000 It's actually quite light.
02:06:43.000 Before the dawn.
02:06:45.000 That's really dumb.
02:06:46.000 It actually gets slightly lighter.
02:06:48.000 It's one of the best of the cliches.
02:06:49.000 It's the dumbest.
02:06:52.000 It's dumbest!
02:06:53.000 No, it's darker in the middle of the night, you fucking idiot.
02:06:58.000 It's funny because these are both manifestations of the Gen X mentality.
02:07:02.000 Because, yeah, the Gen X was the original, you know, we were the first ironic generation, the first sarcastic generation.
02:07:09.000 Really?
02:07:10.000 Yeah.
02:07:10.000 They weren't sarcastic in the 70s?
02:07:12.000 Well, we were alive in the 70s.
02:07:15.000 Right, but we weren't grown-ups.
02:07:16.000 Yeah.
02:07:17.000 No, the 60s, the boomers were very non-ironic.
02:07:21.000 They were very earnest.
02:07:23.000 I think we're speaking in rash generalizations.
02:07:26.000 Oh, for sure.
02:07:27.000 My real concern is that with technology and the ability to control people, if we don't get a grasp on that, we're going to fall into a situation that's very similar to what they have in China, where you have a social credit score.
02:07:40.000 And a centralized digital currency.
02:07:42.000 And when I see people like Maxine Waters pushing us towards that direction and people talking about the first sounds of it were vaccine passports.
02:07:52.000 When they were saying vaccine passports, I was like, Jesus Christ, don't do that.
02:07:56.000 Because that is going to lead to a social credit score system.
02:08:00.000 That's going to lead to...
02:08:02.000 Once they have the ability to make you have an app and that app gets to decide whether or not you travel, they're not going to let that go.
02:08:10.000 There's no way they're going to let that go.
02:08:13.000 And once they have something like that attached to a centralized digital currency, it's game over.
02:08:18.000 It's game over until something really big happens.
02:08:21.000 And that's what China's experiencing.
02:08:23.000 No, for sure.
02:08:24.000 The social credit system is totally terrifying.
02:08:26.000 No, I mean, look, I've become way more libertarian since I've worked on the Twitter files.
02:08:30.000 I get it.
02:08:31.000 Really?
02:08:32.000 And I get the paranoia.
02:08:33.000 But this is a really recent shift in your philosophy.
02:08:35.000 It is.
02:08:35.000 It is.
02:08:36.000 I mean, I've always been more...
02:08:37.000 I came from more of the socialist left than the anarchist left.
02:08:41.000 So I've always thought that there was a good role for government.
02:08:43.000 I still do.
02:08:44.000 But no, for sure, I've become much more paranoid.
02:08:49.000 I mean when you go – when you spend all this time in these documents and you see the way these guys kind of sneak around and they're trying to do all this stuff behind the scenes, it's really – it is like – Elon thought it was funny.
02:08:59.000 It was like, yeah, I mean it is like these conspiracies are real.
02:09:03.000 They're all real.
02:09:03.000 They weren't – yeah, I wish they were fake.
02:09:07.000 Yeah, he sent me a text message.
02:09:09.000 Turned out they're all true.
02:09:10.000 LOL. LOL. He says that so casually.
02:09:14.000 Well, I think because of who he is and the way his mind works, I don't think he necessarily gets upset like the way other people do.
02:09:24.000 I think he just goes, oh, what is?
02:09:26.000 I've never seen anybody...
02:09:28.000 I mean, what's amazing is I've never seen anybody be so impulsive and so successful because I think we associate impulsivity with failure.
02:09:39.000 But he is somebody that...
02:09:41.000 I think impulsivity works for Elon the way that being a dick worked for Steve Jobs.
02:09:50.000 Walter Isaacson, who's writing a book about Elon right now, by the way.
02:09:53.000 But Walter Isaacson, in his biography of Steve Jobs, he was like, look, Steve Jobs was just too much of an asshole.
02:09:59.000 He didn't need to be that much of an asshole.
02:10:00.000 But what it did is it forced out incompetent people.
02:10:04.000 It was a way that he got rid of incompetent people.
02:10:07.000 I think Elon's impulsivity...
02:10:10.000 The way that he moves quickly, he overpaid for Twitter on the one hand.
02:10:15.000 On the other hand, he owns Twitter.
02:10:17.000 Because you kind of go, well, the market value is one-third of the $44 billion.
02:10:21.000 It's like, yeah, on the other hand, Twitter is now a pretty open platform.
02:10:27.000 Again, we need transparency and we should all be vigilant and whatever.
02:10:32.000 But I mean, wow.
02:10:34.000 Big difference.
02:10:34.000 How much is that worth?
02:10:36.000 When he had the vote online, like anyone who hasn't violated the law, should I let them back in?
02:10:41.000 And most people like, or enough people were like, yes.
02:10:45.000 He's like, okay, the people have spoken.
02:10:47.000 And so he lets in all these fucking psychos, like really nutty people.
02:10:52.000 Well, yeah, and there were mistakes, too, right?
02:10:54.000 Remember, they would give out the verification to these fake brands, and Eli Lilly was like, there's a fake Eli Lilly account.
02:11:02.000 Oh, really?
02:11:03.000 Oh, I didn't know that.
02:11:05.000 They go, starting Monday, we will be giving out all insulin for free, or something like that.
02:11:09.000 And the Eli Lilly stock dropped.
02:11:12.000 So, I mean, that happened.
02:11:14.000 And Elon was like, okay, we're going to change that a little bit.
02:11:17.000 So, I mean, he moves that fast. I mean, it's that cliche in Silicon Valley, the whole, you know, move fast and break things.
02:11:26.000 But that's what he's doing.
02:11:27.000 And then he moves fast, he breaks something, but then he also fixes quickly.
02:11:31.000 Yeah.
02:11:31.000 So, but I mean, you have to remember that they had 7,500 employees.
02:11:35.000 I think they're down to somewhere between 1,000 and 1,500 employees at this point.
02:11:39.000 Yeah.
02:11:39.000 Well, it's been pretty well established that most tech companies severely overhired.
02:11:44.000 And, you know, we've played multiple times this video of this woman who made a TikTok a day in the life of Twitter.
02:11:52.000 Oh, yeah.
02:11:52.000 Working at Twitter.
02:11:53.000 I'm sure you've seen it.
02:11:54.000 Who wasn't doing anything, right?
02:11:55.000 She wasn't doing jack shit.
02:11:56.000 She was playing foosball.
02:11:58.000 And then I had a glass of wine and on the pad.
02:12:01.000 And look at the view.
02:12:03.000 So blessed.
02:12:03.000 I was like, this is crazy.
02:12:05.000 I'd fire you immediately if you put that video out.
02:12:07.000 The fact that she put that video out and someone's paying her a salary to essentially, like, hang out and eat all this delicious food.
02:12:14.000 And he's like, fuck you.
02:12:15.000 Get takeout.
02:12:16.000 Like, go to work.
02:12:18.000 Like, here's a bed.
02:12:18.000 Sleep here.
02:12:20.000 I mean, the idea that there's a thousand people at a company that is a $44 billion company is crazy.
02:12:26.000 It is crazy, yeah.
02:12:27.000 So, yeah, I mean, he's obviously a genius.
02:12:30.000 And he's the richest guy in the world, and he's going to become even richer with SpaceX.
02:12:34.000 And, yeah, I mean, it took somebody that powerful to do this.
02:12:39.000 Somebody that powerful that's also addicted to Twitter.
02:12:41.000 Yeah.
02:12:42.000 Which is, that's where it gets fun.
02:12:44.000 Because he was on it every day.
02:12:46.000 And something about his emotional makeup doesn't make, I don't know how upset he gets when people come after him.
02:12:53.000 Well, he got—so this is a funny story.
02:12:55.000 So we were there in December, you know.
02:12:58.000 We come in and we're, like, doing the Twitter files.
02:13:01.000 And then he ends up deplatforming people that he said had been—had doxed his private plane.
02:13:08.000 Yes.
02:13:09.000 And there was a big controversy about it.
02:13:10.000 Did they really do it?
02:13:11.000 Whatever.
02:13:11.000 I didn't follow it super closely.
02:13:12.000 But Barry Weiss, who was there and who had brought me in, she criticized Elon in a tweet and was like, look, you know, it was arbitrary before.
02:13:20.000 Is it arbitrary now?
02:13:21.000 Mm-hmm.
02:13:22.000 Elon responds and is like, you're just trying to suck up to the woke mob.
02:13:25.000 You know, you're trying to have it both ways.
02:13:27.000 It was a mess, right?
02:13:28.000 We were all kind of like, you know, it was like, oh my god, they're all fighting.
02:13:31.000 Right.
02:13:31.000 My parents are fighting.
02:13:32.000 It's like, oh my dad and mom are fighting again.
02:13:34.000 You know, and we're like, oh, and I kind of retweet her, but it was like, okay, I retweet her, but we'd still like to have access to the Twitter files.
02:13:41.000 You know, we did – there's this famous clip that went viral when Matt Taibbi and I testified in front of Congress where this member of Congress goes, you know, how did you get in?
02:13:50.000 They were trying to make it like a scandal that somehow we were reporting on the Twitter files.
02:13:54.000 And I was like, I was brought in by Bari Weiss.
02:13:58.000 And then she was like, oh, so it's like a threesome?
02:14:01.000 And the whole room erupts into laughter.
02:14:04.000 And I was like, well, there was actually a lot more people involved than that.
02:14:09.000 And everybody laughed, and Elon just loved it.
02:14:13.000 Because, you know, he's just, like us, we're all just perverted Gen Xers at the end of the day, you know?
02:14:17.000 And so he loved it and was very happy and was just like, all is forgiven with Bari if she wants to come back in, you know?
02:14:27.000 Come back in.
02:14:28.000 Because I think he's also somebody that...
02:14:30.000 What I like about Elon, and I don't know him very well at all, is that he reminds me a lot more of...
02:14:35.000 Because I've met him in Brazil, and Brazilians are just very emotional, and they're just like...
02:14:39.000 The men will cry, and they'll scream at each other, and then they'll make up, and it's just a very expressive culture.
02:14:46.000 And Elon's just...
02:14:47.000 He just expresses his feelings about things.
02:14:50.000 When he's mad at somebody, he'll tell you.
02:14:52.000 But then he also has shown this capacity to forgive, and...
02:14:56.000 So, you know, I think there's something there, you know, in terms of, you know, I mean, he really – I think he really – he told us, he's like, I didn't buy Twitter just to replatform the Babylon Bee. And we were like – I was like, but it was part of it, right?
02:15:12.000 Like part of it was – but, I mean, I think it was – you know, I do think that some generalizations about our generation are actually appropriate.
02:15:20.000 I think that Gen Xers, you know, there was a moment there – I don't know.
02:15:24.000 I don't want to create a golden age about it.
02:15:26.000 But there was a point there where it was like...
02:15:28.000 Remember Breakfast Club?
02:15:29.000 And we were kind of like, yeah, you can date whoever you want.
02:15:33.000 You want to date a black girl?
02:15:34.000 You want to date a Latina?
02:15:35.000 Whatever.
02:15:35.000 You can be gay.
02:15:36.000 And it was not a big deal.
02:15:37.000 But it also wasn't like...
02:15:39.000 You were like higher on some moral hierarchy or something.
02:15:42.000 Right.
02:15:42.000 That's the problem with identity politics.
02:15:44.000 Yeah, like the political – I mean I was an annoying politically correct guy in college.
02:15:48.000 But there was also some Gen X spirit of like, hey, we're kind of beyond all that bad 60s shit.
02:15:54.000 We don't want to be there.
02:15:55.000 I mean, John McWhorter also talks about it in Woke Racism, and he's a Gen Xer too.
02:16:03.000 I'm not saying it's the solution to all of our problems, but I think that that Gen X spirit, that Breakfast Club spirit, needs to come back into American culture.
02:16:11.000 I wish you hadn't said that.
02:16:12.000 You wish I hadn't said that?
02:16:13.000 Yeah.
02:16:14.000 Breakfast Club spirit.
02:16:17.000 I said, hey, I'm the cringe one.
02:16:19.000 Unless you're talking about the Charlamagne radio show.
02:16:21.000 That Breakfast Club.
02:16:22.000 I agree there.
02:16:24.000 You don't like Breakfast Club?
02:16:25.000 The movie?
02:16:26.000 It's okay.
02:16:26.000 Oh, come on, Joe.
02:16:28.000 It's not my thing.
02:16:28.000 Okay.
02:16:31.000 But my question is like when did you shift and become less politically correct?
02:16:37.000 Like you were politically correct in college.
02:16:39.000 What caused the shift for you?
02:16:42.000 You know, it's funny you ask that.
02:16:43.000 I mean it was – I was in San Francisco in the 90s doing kind of publicity campaigns for different progressive causes and I had some women I knew who were also very progressive and they came and they were like, we want to come and do a diversity training for you and your staff.
02:17:00.000 And I was like, why?
02:17:02.000 And they were like, well, it was like all of the early implicit racism stuff.
02:17:06.000 And I just remember being like, I don't think we're racists and I'm not going to do that.
02:17:12.000 And I think there was a bunch of that happening.
02:17:14.000 Grifters.
02:17:15.000 They're grifters and moralizers and they wanted to get some power over us and be paid.
02:17:21.000 I mean, it was the beginning of all that bad diversity training stuff.
02:17:24.000 So I think it was that...
02:17:26.000 It was also on climate change.
02:17:28.000 Once you kind of go – climate change is just going to be solved by producing energy without carbon emissions.
02:17:34.000 Like it's just a technical problem.
02:17:36.000 It's not like, oh, we all have to ride our bikes.
02:17:38.000 Like I love riding my bike but it's like it became the moralizing and the woke culture.
02:17:43.000 I was just like this is bullshit.
02:17:45.000 There's also a thing that's not being addressed about the climate is that it's never been static ever.
02:17:51.000 Never in the history of the Earth.
02:17:53.000 So this idea that climate change is going to be mitigated or that somehow or another we're going to be able to control it, like, are you sure?
02:18:01.000 Because it seems like ice ages have always existed and great periods of melting and global warming have always existed.
02:18:09.000 Like, whether or not we're having an effect on it, that's what we should say.
02:18:12.000 Well, what is our effect?
02:18:13.000 Pollutants.
02:18:14.000 What are we doing?
02:18:14.000 What are we doing that's negative?
02:18:16.000 But this idea that if you stop, the Earth is going to stay like this? It's not.
02:18:20.000 Oh, of course.
02:18:21.000 It doesn't exist.
02:18:22.000 It's always like this.
02:18:23.000 It's up and down.
02:18:24.000 It's all over the fucking place.
02:18:25.000 Well, the funny thing is we were probably headed towards an ice age.
02:18:28.000 Yeah.
02:18:28.000 And then our carbon emissions, it appears to have reversed that.
02:18:32.000 Which is good.
02:18:33.000 Which is good, and then you just don't want to go too far.
02:18:37.000 Global cooling is way scarier than global warming.
02:18:40.000 And way more people die of cold.
02:18:41.000 Randall Carlson told me that and I never even thought about it until he said it.
02:18:44.000 And I was like, yeah, Jesus Christ, you're fucked if everything freezes.
02:18:48.000 And he said there was a point in human history or a point in the history of the earth where things got so cold that we almost became inhospitable to life.
02:19:00.000 Life as we currently understand it and know it.
02:19:02.000 Yeah, for sure.
02:19:03.000 I mean, you know, everything in moderation.
02:19:06.000 Yeah.
02:19:07.000 You don't want to change the temperature too much in any direction.
02:19:11.000 Of course.
02:19:12.000 But I mean, look, for me, it's always been...
02:19:14.000 I think there's a bunch of complicated problems, like social media and the culture.
02:19:18.000 But energy, it's pretty straightforward.
02:19:21.000 If you're using wood...
02:19:23.000 Anything is better than that, including coal.
02:19:25.000 Yes.
02:19:45.000 Yeah.
02:19:46.000 So we just kind of overthink it and then the issue got – not overthink it but really got hijacked by a bunch of opportunists that want to use it as a way to exercise control.
02:19:55.000 So for you, you experienced these people that came along that were kind of grifters that were saying we need to incorporate some – and by the way, you were talking about an extremely progressive liberal organization that you were part of.
02:20:08.000 If there was any racism, it would have stood out like a sore thumb.
02:20:12.000 If anything, you're promoting the complete opposite of what they're trying to say.
02:20:21.000 By giving you some training, they're trying to find implicit racism or hidden racism or...
02:20:30.000 Well, it gets back to that.
02:20:31.000 I think there's a – when you just abandon traditional religions and the traditional morality, you want to create – I mean, look, even like – I mean, this BIPOC thing is so interesting because it's like – I was like – I finally just wanted to explain what is BIPOC. Well, that's black, indigenous people of color.
02:20:47.000 Literally in the word, it's creating a hierarchy where it's black and indigenous people above Latinos and Asians who are just barely people of color.
02:20:55.000 It's grotesque.
02:20:57.000 Everybody hates it.
02:20:58.000 Not everybody, but most people I think actually hate it.
02:21:01.000 But it has this power because it's providing...
02:21:05.000 In fact, this de-transitioner I interviewed, she was like, the social justice—she's autistic, so she's autism spectrum.
02:17:12.000 She was like, as an autistic person—and she has a lot of self-awareness and is older now—but she was like, that social justice moral hierarchy provided some comfort.
02:21:21.000 It was like a way to be confusing.
02:21:25.000 She was uncomfortable with herself, socially awkward.
02:21:30.000 Yeah.
02:21:44.000 The older morality is true anti-racism in that we don't think there are human races, much less that they can be put on a hierarchy.
02:21:52.000 This is what we want to get back to.
02:21:53.000 Yes.
02:21:53.000 And that is the reality of biological human beings too.
02:21:56.000 The reality is it's one race.
02:21:59.000 We just adapted to different climates.
02:22:01.000 That's all it is.
02:22:01.000 We all came from Africa.
02:22:04.000 So you experienced this and you recognized these people were grifters.
02:22:09.000 And then like what moves you other than – is the Twitter files – is that the biggest shift in your political – Becoming more libertarian?
02:22:18.000 Yeah.
02:22:18.000 I mean the first big one was nuclear.
02:22:21.000 After you realize that nuclear is good, not bad, that's such a big one.
02:22:25.000 You're just like, wow, man.
02:22:26.000 Because that's like, already nuclear was the secular devil, for those of us that grew up in the...
02:22:30.000 Because it's connected to power, weapons, rather.
02:22:34.000 The day after, and all the nightmares.
02:22:36.000 Yeah, and it's also connected to, like, Three Mile Island, and Fukushima, and...
02:22:40.000 Yeah.
02:22:41.000 I love these things.
02:22:42.000 I mean, I think it's like...
02:22:44.000 It's funny because, of course, we know that disconfirmatory information is dopamine depleting.
02:22:50.000 In other words, if we get proven wrong, it's depressing.
02:22:53.000 But there's another way after you get over it, you're kind of like, well, that's cool.
02:22:57.000 Nuclear is not what I thought it was.
02:22:59.000 There's actually a moment of awe.
02:23:02.000 It's like seeing a UFO or being like, oh my god, we might not be alone.
02:23:06.000 There's something exciting about the excitement.
02:23:09.000 We need to get back in touch with the excitement that comes after you realize that you were wrong.
02:23:13.000 It's an awareness of some humility and that the world is more mysterious and wonderful than we had realized.
02:23:19.000 Yeah, I think there's also a really great benefit in expressing to people that ideas are just ideas.
02:23:27.000 It's not you.
02:23:28.000 Yes.
02:23:28.000 These are just some things that are bouncing around in your head.
02:23:31.000 And even if you're wrong, it's not a value judgment on you.
02:23:34.000 You should probably be wrong less than you are right.
02:23:37.000 You should probably be right much more.
02:23:39.000 But it's very important that when you are wrong to acknowledge that you're wrong.
02:23:42.000 One of the worst things that happens to a public intellectual is when they are wrong and they refuse to admit they're wrong.
02:23:49.000 This is the Sam Harris dilemma.
02:23:51.000 Like there's many people that are very brilliant people but they're in this trap where they can't say they were wrong.
02:23:59.000 And if you can't expose people to your thought process and why you made errors, they're going to lose faith in your ability to discern the truth in the future.
02:24:09.000 And isn't it ironic that often those are the people that are always talking about being without ego?
02:24:14.000 It is.
02:24:15.000 It's sad.
02:24:16.000 I always notice it's like, wow, the people that talk so much about not having ego have the biggest egos.
02:24:21.000 It's just being a human, man.
02:24:23.000 It's being a human.
02:24:25.000 And I think it's also just a sign of our ideologically driven times where I think the divide between the right and the left and the boundaries in between them are so wide now.
02:24:37.000 Yep.
02:24:37.000 It's so different.
02:24:38.000 I think that that thing, too, of where, again, the abandoning of traditional religions and adopting a new morality, I think people do start to play God a bit, unconsciously.
02:24:49.000 Yes.
02:24:49.000 And they get real self-righteous and really...
02:24:51.000 I think it's great to...
02:24:52.000 I mean, I don't know how to do it, but for me, it's always like, just, we're all gonna die.
02:24:57.000 Like, just let's pause for a minute.
02:24:59.000 Like, we're going to die.
02:25:01.000 And not only that...
02:25:02.000 The Stoicism is so good at this.
02:25:05.000 It's Memento Mori.
02:25:08.000 Oh my god, they're right there.
02:25:11.000 You have like six of them.
02:25:13.000 They're amazing.
02:25:15.000 That's you.
02:25:16.000 Very soon.
02:25:17.000 So what the fuck are you going to do between now and then?
02:25:20.000 For the people listening, he's pointing to these skulls that are on the table.
02:25:22.000 Really cool.
02:25:23.000 These are all from this guy, Jack of the Dust, who's an artist.
02:25:26.000 They're not real skulls.
02:25:27.000 You have real skulls?
02:25:28.000 No, these are just resin.
02:25:31.000 He makes these.
02:25:32.000 But we'd be a lot better.
02:25:33.000 I think there's some way, when you remind some people of their deaths, they get kind of reactionary and smaller.
02:25:38.000 But other people, I think there's a moment, it's like, yeah, like, so what am I going to, like, this is it.
02:25:43.000 Like, what are you doing now?
02:25:45.000 And what kind of a person do you want to be?
02:25:48.000 And what kind of a life do you want to lead?
02:25:50.000 We need that.
02:25:50.000 My friend Peter Attia, he introduced me to this thing called Your Life in Weeks.
02:25:56.000 And each week you scratch one off and you look at all the weeks.
02:26:00.000 You can see them all.
02:26:02.000 All of them.
02:26:02.000 And you go, this is how much you got left.
02:26:04.000 Unless something radical changes.
02:26:06.000 And people said to him, oh my god, this is so depressing.
02:26:09.000 He's like, it's actually not.
02:26:11.000 It reaffirms my understanding of what's important and makes me want to spend more time with my family and it makes me want to not do things that I'm really not interested in doing just because they're going to make me money.
02:26:22.000 If we can get that into the head of some of these fucking people that are censoring people and some of these people that are pushing these crazy agendas and hiding information from people because they think that it's going to contribute to an undesirable outcome that doesn't fit in a line.
02:26:40.000 If the other group wins, you did a shitty job.
02:26:43.000 And if you're hiding information that would allow that other group to win, you're a bad person.
02:26:49.000 Like, you're bad.
02:26:50.000 Like, if there's actual, real, criminal evidence that you're hiding because you don't want this other person to get elected, you're doing a terrible thing to humanity.
02:27:01.000 And you're doing it based on these very base and normal human instincts.
02:27:07.000 Absolutely.
02:27:07.000 And, like, look at that—I mean, they won't—so, first of all, we've emailed—like, I mentioned that Aspen workshop with all the journalists and all the social media companies.
02:27:15.000 I emailed every single one of the participants.
02:27:19.000 And said, would you please talk to me about this?
02:27:21.000 Not a single one.
02:27:22.000 I'm sorry, Washington Post actually, of all places, responded, not the actual reporter, but through a spokesperson, responded with some lame...
02:27:31.000 But it's kind of like if you're so confident, if you're so better than everybody, then why can't you come and just have a conversation and defend it?
02:27:40.000 You're skulking around.
02:27:42.000 I mean it shows the underlying insecurity and weakness behind those censors.
02:27:47.000 I mean it's the hall monitor type.
02:27:50.000 They're the little church ladies.
02:27:52.000 They don't want to have an open conversation.
02:27:54.000 They want to exercise power behind the scenes.
02:27:57.000 Well, that's a human instinct.
02:27:59.000 It's a natural human inclination to control other people that you might think are threatening or in competition with you or might somehow or another get in the way of your desired goals.
02:28:09.000 And people get so self-obsessed in those things without something like your life in weeks, where you can just look at it like, oh, this is all fucking fruitless.
02:28:19.000 Like, what am I doing here?
02:28:20.000 I'm going to go send the Life in Weeks to all of the censors and be like, you are here.
02:28:26.000 How long do you want to keep trying to censor your fellow Americans?
02:28:30.000 I mean, what are you doing?
02:28:31.000 Well, I think one of the things that has happened, I think has been greatly beneficial, that the exposing of the Twitter files and the making it public where like, especially that we were talking about this last night at the club.
02:28:47.000 That woman who was, like, calling Matt Taibbi a so-called journalist.
02:28:51.000 She called us both that, by the way.
02:28:53.000 Yeah.
02:28:54.000 Hilarious.
02:28:55.000 Like, what is a journalist to you?
02:28:57.000 Someone who says things only that you agree with?
02:28:59.000 Well, and you know who – and we talked about also what a powerful projection it was because she's a non-voting representative from the Virgin Islands.
02:29:08.000 Yeah, which is the – Somebody points out right.
02:29:09.000 She's our so-called representative.
02:29:11.000 Yeah.
02:29:12.000 Yeah.
02:29:14.000 But, I mean, calling Matt Taibbi, who is so decorated, a so-called journalist, and the fact that he got to rattle off all the awards in journalism that he's received.
02:29:24.000 Yeah.
02:29:26.000 It shows what their concern is.
02:29:28.000 Their concern is around status.
02:29:29.000 I want to know who the fuck talked to her.
02:29:31.000 I want to know who boosted her up and got her to say those things.
02:29:35.000 Oh, yeah.
02:29:35.000 To say it that way.
02:29:36.000 What was the conversation?
02:29:38.000 Someone should get into her fucking emails.
02:29:39.000 I'd like to know.
02:29:40.000 What conversations were you privy to?
02:29:42.000 What did they say to you?
02:29:44.000 Well, there's also Debbie Wasserman Schultz who's famous for derailing – Insider trading.
02:29:49.000 Insider trading, derailing Bernie Sanders's bid to become the nominee.
02:29:53.000 So yeah, they're just – it's all pot calling the kettle black.
02:29:57.000 Well, it's also the shittiness in which they communicate with these people who are just exposing something that everyone should be aware of because it's a real problem.
02:30:07.000 Right.
02:30:08.000 What Twitter is and what Facebook is and Instagram and all these social media platforms, these are our new public squares.
02:30:15.000 And we need some sort of an understanding of the significance of censorship in regards to what kind of an impact it's going to have on our life, our real life world.
02:30:25.000 Like how many people who got censored off of Twitter, it's radically changed their life, radically changed, for wrong reasons, changed the progression of their future.
02:30:36.000 I would imagine a lot.
02:30:38.000 Quite a few.
02:30:39.000 And also deeply humiliated them and probably ostracized them in certain social circles.
02:30:45.000 Here comes Mike.
02:30:46.000 He got banned from Twitter.
02:30:48.000 Yeah.
02:30:48.000 What did he say?
02:30:49.000 Oh, yeah.
02:30:49.000 He said, only women have vaginas.
02:30:54.000 Meghan Murphy.
02:30:55.000 Yes.
02:30:55.000 Yeah.
02:30:56.000 But men aren't women.
02:30:58.000 Yeah.
02:30:58.000 Vijaya.
02:30:59.000 Yeah.
02:30:59.000 I love it that she said that to her.
02:31:01.000 Yeah.
02:31:01.000 Like after she got back.
02:31:03.000 Yeah.
02:31:03.000 I helped her get back home.
02:31:04.000 She's still mad about it.
02:31:05.000 Oh, you did?
02:31:05.000 Okay.
02:31:06.000 Yeah.
02:31:06.000 Yeah.
02:31:06.000 She's mad about it.
02:31:07.000 She's my friend.
02:31:08.000 I had her on the podcast when she was banned because I was talking about her before she was banned.
02:31:13.000 Excuse me.
02:31:13.000 Before she was brought back because I had heard she was banned for this.
02:31:17.000 And so that was the ultimate Streisand effect.
02:31:19.000 Like I took this woman who is this obscure...
02:31:22.000 Journalists who got banned for disagreeing with trans activists, and I brought her in front of millions of people.
02:31:27.000 I'm like, what happened?
02:31:28.000 Tell me what happened.
02:31:29.000 Multiple times she's been on.
02:31:30.000 Amazing.
02:31:30.000 And so now, you know, she's back on the platform, and, you know, now people get to—she's a brilliant woman.
02:31:36.000 And she's also—she has some really good points.
02:31:39.000 And her point about trans activists, like, you are trying to silence biological women by—you're bringing in these biological males, Into these traditionally women's spaces and they're calling themselves feminists and she's like, that's not real.
02:31:53.000 Like, this is not what's happening here.
02:31:55.000 Absolutely.
02:31:56.000 I mean, it's interesting that the people that are trying to kind of put everybody down, the deplorables or the so-called journalists or just all of the insults, they're coming from people who have just been the worst bootlickers their entire careers, suck-ups, brown-nosers.
02:32:13.000 Are so proud of having sucked up for so long that they're deeply threatened by people who are actually challenging the status quo.
02:32:22.000 Well, that's the mainstream journalist's approach to the internet journalist.
02:32:28.000 You know, when you have people like Krystal and Saagar from Breaking Points who are, like, beholden to no one?
02:32:33.000 Like, what?
02:32:34.000 They hate it.
02:32:35.000 And also they have a subscription-based service, so they don't even need advertisers?
02:32:39.000 The fuck is going on here?
02:32:41.000 Like, who saw this coming?
02:32:42.000 No one, right?
02:32:43.000 Before Substack, there was never a place where someone of the caliber of Matt Taibbi or Glenn Greenwald could post and millions of people could read their stuff and it could make international news, just like a Washington Post article,
02:32:59.000 just like a Boston Globe article.
02:33:01.000 Well, they're envious, too, right?
02:33:02.000 Of course, they should be.
02:33:03.000 I mean, it's like, yeah, you have to, like, go—you go to work—I mean, if you work at one of those traditional news outlets, you go to work every day, you're not able to publish and write.
02:33:13.000 You know, you write something, the editors sit on it.
02:33:15.000 My friend Nellie Bowles, who's married to Bari Weiss, you know, when she was at the New York Times, you know, like, you'd write a story, and then you'd argue with editors for weeks, and then maybe they'd publish the thing that was like half its original length and has been completely woke-ified.
02:33:28.000 So they don't have – they're actually – they're jealous of the freedom that people with free speech have and they want to stamp it out.
02:33:37.000 100 percent.
02:33:38.000 And it's a threat to the choices that they've made.
02:33:42.000 The good thing is that they're blind to it and so they end up – Yeah.
02:33:59.000 Yeah.
02:34:11.000 But the idea that they're connecting me to him with his giant photo that somehow discredits me.
02:34:19.000 They don't understand the landscape.
02:34:21.000 No, they've lost the plot.
02:34:23.000 They've lost the plot.
02:34:24.000 They have a bubble.
02:34:25.000 They have this ideological echo chamber that they exist in.
02:34:30.000 And they think that holding a photo, oh, you were on the biggest program in the world?
02:34:36.000 What are you, a piece of shit?
02:34:38.000 Yeah.
02:34:38.000 No, for sure.
02:34:40.000 But it's really funny.
02:34:42.000 It's funny to watch.
02:34:44.000 And this is this transformation from the world of these corporate-owned distributors of information to independent people that people actually trust, that don't have any sort of weird connection to executives and producers and all these other people that have a vested interest in pushing a narrative that is established by the advertisers.
02:35:08.000 You sound optimistic.
02:35:10.000 I am in that.
02:35:11.000 Well, I knew something was going on when Howard Stern started criticizing podcasts.
02:35:15.000 I was like, that's hilarious.
02:35:16.000 This was like a long time ago.
02:35:18.000 Why fucking idiots wasting their time?
02:35:21.000 Losers do podcasts.
02:35:23.000 You are threatened.
02:35:24.000 Threatened by us.
02:35:25.000 You're stuck on satellite.
02:35:27.000 Right.
02:35:27.000 And satellite only goes to places where the satellite reaches.
02:35:31.000 It doesn't even work in tunnels, man.
02:35:32.000 Well, one of the most, like, one of the most bitter people on Twitter is Keith Olbermann.
02:35:36.000 Like, he's always trying to get back to, like, a cable show for a little while.
02:35:40.000 He's hilarious.
02:35:41.000 Yeah.
02:35:42.000 Yeah, he's hilarious.
02:35:43.000 That guy's, he's a fucking human caricature.
02:35:46.000 Yeah.
02:35:47.000 That guy, like, when he was doing that thing where he was, like, ranting in his basement about Donald Trump being arrested imminently, at any moment he's going to be arrested, and that thing he would do, the resistance, it's like, God, it was so cringe.
02:36:00.000 But it's also, it's like, you're so clearly angry and arrogant and shitty.
02:36:06.000 Like, do you not understand that these are personality traits that nobody likes?
02:36:10.000 And especially if you're uninformed, misinformed, incorrect.
02:36:15.000 And when he was like this vaccine promoter, it's like certain people look for a thing that they think there will be popular opinion behind and they get with that thing so that they can connect themselves to a winning movement.
02:36:29.000 And then they angrily advocate in favor of other people complying.
02:36:35.000 And that's what he did.
02:36:36.000 No, they're like junkies seeking a fix.
02:36:39.000 They're seeking a fix of immediate social reward.
02:36:41.000 Yeah, I mean, he got vaccinated on film.
02:36:43.000 Come on.
02:36:43.000 Can't be a show, everybody.
02:36:45.000 Gross.
02:36:46.000 Yeah.
02:36:46.000 Meanwhile, I'd already have natural antibodies by then.
02:36:48.000 I'm like, why?
02:36:49.000 Yeah.
02:36:50.000 Do you guys understand how this works?
02:36:52.000 Oh, yeah.
02:36:53.000 This is how it's worked forever.
02:36:55.000 That's the weirdest one.
02:36:57.000 I mean, I was fact-checked during my campaign by the San Jose Mercury News, which was like, Shellenberger's got some really weird ideas, including this idea that natural immunity is just as effective as the vaccine.
02:37:09.000 I was like, what?
02:37:10.000 It's crazy to watch the left captivate.
02:37:32.000 You know, it was like natural immunity.
02:37:33.000 That's like a liberal thing.
02:37:35.000 Right.
02:37:35.000 But it was Dennis Prager who would always be talking about, I got COVID so I could have natural immunity, you know.
02:37:41.000 Yeah, and people are like, you're crazy.
02:37:43.000 This is nuts.
02:37:44.000 Not only that, I mean, just the fact that no one has a problem...
02:38:12.000 Of this impending pandemic that's going to take out your loved ones.
02:38:15.000 And, you know, Robert Malone talked about that on the podcast, that it creates this mass formation psychosis, that you have this one thing that people are looking at as the savior.
02:38:26.000 And any suppression of that or any resistance of that, you are going to ruin my life.
02:38:31.000 I'm trying to get back to work.
02:38:33.000 I'm trying to make society do it, do the thing.
02:38:37.000 And you can't even be like, hey, but maybe we should see studies.
02:38:40.000 Hey, but where's the data?
02:38:41.000 Hey, why are they telling pregnant women they can take it?
02:38:44.000 There's no studies on pregnant women.
02:38:45.000 Hey, you know, why is there an increase in all-cause mortality in the year that everybody got vaccinated?
02:38:51.000 Hey, what is going on with all the myocarditis?
02:38:54.000 Hey, what's up with the strokes?
02:38:56.000 Hey, and everyone's like, la, la, la, not listening!
02:38:59.000 La, la, la!
02:39:00.000 And if it wasn't for people like Robert Kennedy Jr. writing that book, if it wasn't for people like, how do you say his name?
02:39:08.000 Jay Bhattacharya.
02:39:10.000 I don't want to fuck his name up.
02:39:11.000 Jay Bhattacharya or that gentleman from the UK, John Campbell or some of these other doctors.
02:39:17.000 I love John Campbell.
02:39:18.000 I love John Campbell.
02:39:18.000 He's my favorite.
02:39:19.000 He's so measured and even, but a tinge of British sarcasm to some of the things that he says.
02:39:27.000 But not pompous either, and also very much like, I want people to have the information.
02:39:31.000 But I think that it's the will to control that comes before the catastrophizing.
02:39:36.000 They want to have control, and then they exaggerate the problem, whether it's climate change or COVID-ism.
02:39:43.000 That's what's coming first.
02:39:44.000 It's the need for that social power.
02:39:46.000 I think the other issue, and it struck me as you were talking, the reason that they want to emphasize the vaccine over the remedies, and Steve Kirsch talks a lot about all the different ways in which you can treat the COVID. Bless you.
02:40:00.000 Thank you.
02:40:01.000 Is that that's sort of something that you can do on your own.
02:40:03.000 You can treat the COVID, whereas vaccines were going to be something that we're going to do as a society and it's this collective action.
02:40:08.000 Well, any resistance to that puts you in this anti-vaxxer category, which is like one of the worst pejoratives in modern world times.
02:40:16.000 Climate denier is right up there.
02:40:18.000 It's right up there.
02:40:20.000 You got lumped into that.
02:40:22.000 Oh, yeah, for sure.
02:40:23.000 Meanwhile, he's just stating data and facts.
02:40:26.000 I mean, it's funny because I've been thinking, I was like, look at the trans folks, they're actually sex deniers.
02:40:32.000 Mm-hmm.
02:40:33.000 And for a minute there, I was like, why don't you call them sex deniers?
02:40:35.000 And I was like, God, that's just like, I don't want to be that guy.
02:40:38.000 I don't want to be that guy either, because I have friends that are trans.
02:40:40.000 And, you know, like...
02:40:42.000 Well, no, but you can be trans and not be a sex...
02:40:44.000 I mean, in other words, sex denier is somebody that says that biological sex is not real.
02:40:49.000 It's just a social construction.
02:40:50.000 Right.
02:40:51.000 So I think that, you know, which is just absurd, and obviously so.
02:40:55.000 But, yeah, like, I just, yeah, we don't want to be that.
02:40:58.000 No.
02:40:59.000 We want to be the change in the world, which is like...
02:41:01.000 People to be able to do whatever they want to do.
02:41:03.000 But I don't want it to happen to children before they can figure out what the fuck is going on.
02:41:07.000 I don't want them to be coerced.
02:41:09.000 Children are so malleable.
02:41:11.000 You can get them to join cults.
02:41:12.000 You can get them to believe that they have to strap a suicide vest on and walk into a crowded courtyard.
02:41:19.000 There's things that you can get children to do that you're not going to get older people to do.
02:41:24.000 And to influence them to make a permanent change on their body that will sterilize them and also prevent them from experiencing sexual pleasure.
02:41:35.000 Excuse me.
02:41:37.000 Bless you.
02:41:38.000 I just coughed that time.
02:41:40.000 It's fucked.
02:41:41.000 And it's attached to an ideology, so because it's attached to this ideology, it has to be universally and blindly supported.
02:41:49.000 Yeah.
02:41:50.000 And I just think it'd be nice to get beyond...
02:41:52.000 I mean, it's funny because there's a bit of an arms race with the language where...
02:41:58.000 They just accuse their opponents of being racists, climate deniers, anti-vaxxers, election deniers.
02:42:05.000 And if you're kind of like, hey, can we move beyond these reductive labels?
02:42:10.000 They still have the advantage because these labels are so powerful.
02:42:14.000 They're so powerful.
02:42:14.000 It's so tricky.
02:42:15.000 It's similarly like where they were like, I was kind of like, I don't even use this language of disinformation.
02:42:21.000 Greta Thunberg is a purveyor of disinformation.
02:42:24.000 The world is coming to an end in a few years.
02:42:26.000 Yeah.
02:42:27.000 Five years ago she said that.
02:42:28.000 Yeah, she had to delete that tweet.
02:42:33.000 That's disinformation, but do we have to go and call it that, or can you just be like, you were wrong?
02:42:38.000 Well, you're not just wrong.
02:42:40.000 You're spreading fear that's unnecessary, and it's not based on facts, and it's only there to support your narrative.
02:42:47.000 You're saying it to support your narrative, and it makes you less reliable, and you shouldn't do that.
02:42:53.000 I think there's also that thing about the – and Jordan makes this point about the – there's also the online.
02:42:58.000 I mean it's much harder to demonize somebody when it's this.
02:43:02.000 When we're in person.
02:43:03.000 And I can see the – as you say, you see the God in you.
02:43:06.000 You see the God in me.
02:43:08.000 Whereas online, you've already – it's dehumanizing by nature and so labeling your opponents and demonizing them is much easier.
02:43:17.000 Well, that's why – like the Matt Walsh conversation.
02:43:19.000 Imagine if Matt Walsh and I had that conversation on Twitter.
02:43:21.000 It would take months.
02:43:23.000 Yeah, you can't do it.
02:43:24.000 It wouldn't work.
02:43:25.000 No one would ever achieve an understanding of what the other person thought.
02:43:29.000 And it would also be probably pretty nasty, which I don't think is necessary.
02:43:32.000 I can disagree with someone and have a conversation with them and just talk to them.
02:43:38.000 But then there's also people that are bad actors, and they're only saying something because it conforms to their ideology, and they're essentially grifters.
02:43:49.000 They've attached themselves to this thing, and that is their business.
02:43:53.000 We would say they're lost souls.
02:43:55.000 Yeah, they're lost souls.
02:43:56.000 That's a good way of putting it.
02:43:57.000 I mean, it's slightly sweeter.
02:43:59.000 Michael, you're such a positive person.
02:44:01.000 It's always good to talk to you.
02:44:02.000 I mean, I'm a bad...
02:44:04.000 I mean, part of the reason I came back to being a Christian is that Christianity, I came back to it actually while working on Greta Thunberg at the end of my book, Apocalypse Never.
02:44:12.000 And I was like, what's the remedy for this intense hatred and anger against civilization?
02:44:18.000 And I was like, it's love, obviously.
02:44:22.000 Loving your enemies is, for me, what Christianity is about.
02:44:27.000 It's the heart of Christianity.
02:44:29.000 It's really hard.
02:44:30.000 Forgiveness.
02:44:31.000 Yeah, forgiveness.
02:44:32.000 But it's really, really hard.
02:44:35.000 And so for me, it was like, I'm interested in having a faith that's hard, not easy.
02:44:40.000 If it were easy, then what's the point?
02:44:43.000 You know, it's got to make you better in some way.
02:44:46.000 I get the same thing out of stoicism.
02:44:48.000 I find it completely compatible.
02:44:49.000 Or these death meditations.
02:44:51.000 You do it not because it's wonderful to think about being dead.
02:44:54.000 You do it because you think it's going to, you know, it's actually the, God, the other guy you had on who I just adore is Andrew Huberman.
02:45:01.000 Yes.
02:45:01.000 I listen to all those – and he's got a colleague.
02:45:06.000 What's her name?
02:45:07.000 She did Dopamine Nation.
02:45:10.000 I'm blanking.
02:45:11.000 It's killing me.
02:45:11.000 Susanna Soberg?
02:45:12.000 No, no.
02:45:14.000 Oh, yeah.
02:45:14.000 I don't remember her name.
02:45:15.000 She's going to be so mad that I'm blanking on her name.
02:45:18.000 But basically – Jamie's got it.
02:45:20.000 Sorry, Anna.
02:45:21.000 Anna Lembke.
02:45:21.000 Sorry, Anna, if you're watching this.
02:45:23.000 Um, but they, you know, it's like, it's like, it's so simple, but they're like, you know, like, you know, Huberman.
02:45:29.000 So first of all, I now do my morning run before I drink my coffee and I take a cold shower because, because that amount of adversity, which I mean, it's kind of a joke, like it's not adversity really, but it's a little bit.
02:45:42.000 It's, it's not great.
02:45:43.000 I'm not happy.
02:45:44.000 Like I have to get my tennis shoes on and you're running like early in the morning and yeah, But he's absolutely right that actually leaning into the pain a bit.
02:45:53.000 It makes the rest of your day easier.
02:45:56.000 It really does.
02:45:57.000 There are occasional days, very rare, and I must be busy, where I don't go into the cold plunge first thing in the morning.
02:46:04.000 And those days aren't as good.
02:46:06.000 Oh, you have a cold one.
02:46:07.000 Oh, yeah, I have a fucking cold one, bro.
02:46:09.000 I have a Morozko Forge at home that's 34 degrees, and I climb into that bitch every day for three minutes.
02:46:15.000 And then we have a new one that's getting installed here that's a blue cube.
02:46:18.000 It's even more horrific because it's got a constant flow like a river, so you never establish a thermal barrier.
02:46:25.000 No, it's not to heat up the water.
02:46:27.000 Your skin has a thermal barrier.
02:46:30.000 It's like if you stay still in the cold water, it's way easier than if you move.
02:46:35.000 If you move, it's fucking horrible.
02:46:36.000 And the blue cube is just like a raging river on you.
02:46:40.000 Yeah, yeah.
02:46:40.000 Which is probably even more adversity that you have to overcome, which I like.
02:46:43.000 You know, Nassim Taleb does a good job with this, with his book.
02:46:46.000 Antifragile.
02:46:47.000 Yeah, Antifragile.
02:46:48.000 I find him annoying on Twitter, but I think that insight...
02:46:52.000 That's probably just Twitter, right?
02:46:53.000 Oh, yeah.
02:46:54.000 Probably.
02:46:55.000 No, I bet you in person – well, he's probably an arrogant asshole in person too.
02:46:58.000 Well, he's very brilliant.
02:46:59.000 But it's a brilliant book.
02:47:00.000 A lot of very brilliant people are arrogant.
02:47:01.000 Yeah.
02:47:02.000 It's part of what makes you brilliant in the first place and put in the hard work.
02:47:05.000 I mean that's the discourse on the censorship stuff too is it's always – we're trying to reduce harm.
02:47:12.000 They go, we want to reduce speech that causes harm.
02:47:14.000 It's like, wait a second.
02:47:16.000 I know what you mean.
02:47:17.000 We don't want to do bad in the world.
02:47:19.000 On the other hand, we know that coddling children is terrible.
02:47:25.000 You create unstoppable harm.
02:47:27.000 It's way worse.
02:47:28.000 It's way worse than letting that kid experience some adversity.
02:47:31.000 Yeah, just the right amount of adversity, the right amount of harm.
02:47:35.000 It's not easy, but we have to get back to teaching that.
02:47:40.000 I agree.
02:47:41.000 Michael, it's always a pleasure.
02:47:43.000 Thank you very much for coming here.
02:47:44.000 Thanks for having me.
02:47:44.000 If you ever have any more Twitter pages that you have to go through, like if you really do go through the Fauci files or whatever, please come back.
02:47:52.000 We'll do it again.
02:47:53.000 Appreciate you.
02:47:53.000 Appreciate you.
02:47:54.000 Thank you.
02:47:54.000 Bye, everybody.