The Joe Rogan Experience - February 13, 2023


Joe Rogan Experience #1940 - Matt Taibbi


Episode Stats

Length

2 hours and 46 minutes

Words per Minute

157.4

Word Count

26,184

Sentence Count

2,116

Misogynist Sentences

27

Hate Speech Sentences

24


Summary

In this episode of the Joe Rogan Experience, Joe talks with journalist Matt Taibbi about his reporting on the Twitter Files. They discuss the formalized relationship between agencies like the FBI and DHS and companies like Twitter and Facebook, the censorship requests those agencies routed to the platforms, and the suppression of the Hunter Biden laptop story. They also cover Trump's removal from Twitter under the "glorification of violence" policy, the push for hate speech laws, FBI informants in groups like the Proud Boys, the decline of corporate media, and the rise of independent journalism on platforms like Substack.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 Hello, Matt Taibbi.
00:00:13.000 Hey, Joe.
00:00:14.000 How's it going?
00:00:16.000 Good to see you.
00:00:17.000 It's always so hard to get rolling after you've been talking.
00:00:20.000 I'm always excited to see you, so we're just blabbing, and now we're rolling.
00:00:24.000 So what's cracking?
00:00:26.000 A lot.
00:00:27.000 A lot.
00:00:27.000 It's been a crazy couple of months.
00:00:31.000 I've enjoyed your work with the Twitter files.
00:00:33.000 I enjoy all your work, but I really have enjoyed the Twitter files.
00:00:36.000 That has been some really fascinating views behind the curtain.
00:00:41.000 It's been one of the weirder, more surreal experiences of my life because, you know, as a reporter, you're always kind of banging away to try to get one little piece of reality,
00:00:57.000 right?
00:00:58.000 Like you might make 30 or 40 phone calls to get one sentence.
00:01:02.000 The Twitter files is Oh, by the way, here, you know, take a laptop and look at 50,000 emails, you know, full of all kinds of stuff.
00:01:12.000 And so it's, you know, for somebody like me, it's like a dream come true.
00:01:17.000 We get to see all kinds of things, get answers to questions that we've had for years.
00:01:22.000 And it's been really incredible.
00:01:25.000 Has anything been surprising to you?
00:01:30.000 A little bit.
00:01:31.000 I think going into it, I thought that the relationship between the security agencies like the FBI and the DHS and companies like Twitter and Facebook, I thought it was a little bit less formal.
00:01:46.000 I thought maybe they had...
00:01:48.000 Kind of an advisory role.
00:01:50.000 And what we find is that it's not that.
00:01:53.000 It's very formalized.
00:01:55.000 They have a really intense structure that they've worked out over a period of years where they have regular meetings.
00:02:04.000 They have a system where the DHS handles censorship requests that come up from the states and the FBI handles the international ones, and they all flow to all these companies. It's a big bureaucracy, and I don't think we expected to see that.
00:02:20.000 It's very bizarre to me that they would just openly call for censorship in emails and these private transmissions but ones that are easily duplicated.
00:02:34.000 You could send them to other people.
00:02:36.000 It can easily get out.
00:02:38.000 Like that they're so comfortable with the idea that the government should be involved in this censorship of what turns out to be true information.
00:02:45.000 Especially in regards to the Hunter Biden laptop, that they would be so comfortable that they would just send it in emails.
00:02:52.000 Yeah.
00:02:53.000 Yeah.
00:02:53.000 Well, I think that shows you the mentality, right?
00:02:55.000 Yeah.
00:02:55.000 Like that they really genuinely felt that they were impregnable, that they don't have anybody to answer to.
00:03:03.000 I mean, a normal person doesn't put incriminating things in emails because we all – have the expectation that someday it might come out.
00:03:13.000 But these folks didn't act that way.
00:03:15.000 I mean, you see – I was especially shocked by an email from a staffer for Adam Schiff, the congressperson, the California congressman.
00:03:27.000 And they're just outright saying, we would like you to suspend the accounts of this journalist and anybody who retweets information about this committee.
00:03:38.000 You know, I mean, this is a member of Congress, right?
00:03:42.000 Most of these people have legal backgrounds.
00:03:45.000 They've got lawyers in the office for sure.
00:03:47.000 And this is the House Intelligence Committee.
00:03:50.000 You would think they would have better operational security.
00:03:54.000 Another moment that was shocking to me, there was an email from an FBI agent named Elvis Chan in San Francisco to Twitter.
00:04:06.000 And they're setting up this Signal group, which is going to include all the top...
00:04:27.000 Phone numbers, right?
00:04:29.000 And the Word doc is just called secret phone numbers, right?
00:04:34.000 And I'm thinking, this is how they taught you to do it at Quantico?
00:04:37.000 Really?
00:04:38.000 You know?
00:04:39.000 I mean, even a journalist can't miss that, you know what I'm saying?
00:04:45.000 Call it something else.
00:04:46.000 I don't know.
00:04:47.000 That part of it was amazing.
00:04:49.000 So strange.
00:04:50.000 It's so strange to get such a peek.
00:04:53.000 Because I don't think anybody ever anticipated that something like this would happen, where Twitter would get sold to an eccentric billionaire who's intent on letting all the information get released.
00:05:03.000 Yeah, I mean, I think Elon Musk, essentially, he spent $44 billion to become a whistleblower of his own company.
00:05:12.000 And, I mean, I don't really fully know his motives in doing that.
00:05:19.000 I think he's got a pretty developed sense of humor, though.
00:05:33.000 Yeah.
00:05:42.000 I mean, that's – $44 billion is a lot to spend on that thrill, but I'm glad he did.
00:05:48.000 Well, he truly believes that censored social media is a threat to democracy.
00:05:53.000 He really believes that.
00:05:55.000 Absolutely.
00:05:56.000 Yeah, and I believe it too.
00:05:57.000 I just don't have $44 billion.
00:05:59.000 Even if I did, I'd be like, I don't want that heat.
00:06:02.000 Right, right, yeah.
00:06:04.000 I don't think that's what I would spend it on, but no, he believes that.
00:06:09.000 I think he also believes that the credibility of these companies...
00:06:17.000 Can only be restored by telling people what they talk about in private or what they have been talking about with the government and that sort of thing.
00:06:28.000 So he might be right about that.
00:06:31.000 We'll see, I guess.
00:06:33.000 I think he is.
00:06:34.000 I mean it's going to be interesting.
00:06:35.000 It's going to be interesting to see how this plays out.
00:06:37.000 There's an amazing amount of resistance against him and just the publicity campaign against him has been fascinating to watch.
00:06:47.000 People go from thinking that Elon Musk is the savior that's bringing us these amazing electric cars and engineering new reusable rockets to he's an alt-right piece of shit who wants Donald Trump back in the office and it's very wild.
00:07:02.000 The speed with which they can sort of shuffle somebody into the Hitler of the Month Club routine, right?
00:07:11.000 We've always done this with foreigners, whether it's Noriega or Saddam Hussein or Milosevic or Assad or whatever it is.
00:07:21.000 We have a playbook for...
00:07:23.000 Cranking out negative information about foreigners who get in our way for whatever reason.
00:07:29.000 But now we've kind of refined that technique for domestic people who are inconvenient.
00:07:38.000 I think they did it with Trump, obviously.
00:07:41.000 They try to do it with Trump.
00:07:43.000 Tucker Carlson with you.
00:07:45.000 You know, I mean, you got a taste of that for a few times.
00:07:48.000 Yeah, it's interesting.
00:07:49.000 Right?
00:07:50.000 And then, you know, with Elon, yeah, he went from being the guy who made electric cars sexy to like, you know, something to the right of Viktor Orban in like 10 seconds.
00:08:03.000 It's amazing.
00:08:04.000 It is amazing, and the narrative is spread through progressive people.
00:08:08.000 Well, they'll just say it now.
00:08:10.000 It's like they've received the memo, the memo's got to them, and then they just...
00:08:14.000 I hear people in LA, I hear people that I know, like, oh, Elon's just so crazy.
00:08:19.000 It's like something happened to him.
00:08:20.000 He went nuts, and he's a right-winger now.
00:08:22.000 Like, how?
00:08:23.000 What are you saying?
00:08:24.000 Like, what examples do you have?
00:08:25.000 Like, they don't have an example.
00:08:27.000 They just have this narrative that reached them, the signal.
00:08:30.000 Like, Elon bad now.
00:08:31.000 Oh, Elon bad now.
00:08:32.000 Elon bad now.
00:08:34.000 Elon bad now.
00:08:34.000 And they just start saying it.
00:08:36.000 And you go, like, what examples are you using of, like, his behavior?
00:08:39.000 Well, he let Trump back on the platform.
00:08:41.000 Okay.
00:08:43.000 The Taliban's there.
00:08:44.000 Right, yeah, exactly.
00:08:45.000 You don't have a problem with the Taliban?
00:08:46.000 The Taliban just bought blue checkmarks.
00:08:48.000 Do you know that?
00:08:49.000 Did they really?
00:08:50.000 Yes.
00:08:51.000 I didn't know that.
00:08:51.000 Yeah, they're buying blue checkmarks so they could be verified.
00:08:54.000 The real terrorist.
00:08:56.000 The fucking Taliban is on and no one has a problem with it.
00:08:58.000 The CCP's on Twitter.
00:09:00.000 Right.
00:09:00.000 No one has a problem with it.
00:09:02.000 Right.
00:09:02.000 But they're like, Trump, they'll let Trump back on.
00:09:04.000 Look, Trump is hilarious.
00:09:07.000 He's a ridiculous person.
00:09:09.000 But don't you think it's better that his tweets get out there and then a bunch of people get to attack him in the tweets?
00:09:16.000 And if those tweets that people attack him with are good, if people are saying good things, then those things get retweeted and liked and then they rise up to the top of the algorithm.
00:09:26.000 It's all good.
00:09:27.000 Like you need a voice against someone like that.
00:09:30.000 You can't have that guy howling into the wind on some QAnon forum and all those wackos just so they're only talking to each other with no pushback at all.
00:09:39.000 If you really don't like Trump, you want him on Twitter.
00:09:41.000 Absolutely.
00:09:42.000 You want that guy to have some pushback.
00:09:44.000 You want people to be talking against what he's saying.
00:09:47.000 You want Twitter, the real Twitter now, which will actually fact check everybody.
00:09:51.000 They fact check Biden, right?
00:09:53.000 They'll fact check him.
00:09:54.000 So if he says something stupid, they'll go, no, that's not what's true.
00:09:57.000 Here's what's true.
00:09:58.000 Right.
00:09:59.000 That would be good.
00:10:00.000 And that was actually for a while Twitter's official policy.
00:10:04.000 They had something called the public interest policy, which specifically laid out exactly what you said.
00:10:10.000 Like when a world leader, no matter who it is, says anything, we want it to be out there because we want it to be debated.
00:10:19.000 We want people to see it.
00:10:21.000 We want people to talk about it.
00:10:22.000 We want people to reach conclusions about it.
00:10:24.000 And one of the things that we found in the Twitter files was after January 6th, there was this intense debate within the company where they were basically saying, oh, thank God we're going to repeal the public interest policy or we're going to poke a hole in it,
00:10:40.000 right?
00:10:41.000 And no longer have that.
00:10:48.000 So they invented a new policy called glorification of violence, or they called it that.
00:10:58.000 And essentially what they said was you had to look at Trump not in terms of each individual tweet, but in terms of what they called the context surrounding, his whole career, all the people who followed him,
00:11:14.000 whether or not they were violent, whether or not they said the things that were offensive.
00:11:18.000 It's like the speech version of stochastic terrorism.
00:11:22.000 I don't know if you've ever heard that term.
00:11:25.000 Stochastic terrorism is this idea that you can incite people to violence by saying things that aren't specifically inciting but are statistically likely to create somebody who will do something violent even if it's not individually predictable.
00:11:45.000 And that's what they did with Trump.
00:11:47.000 They basically invented this concept that, yes, he may not have actually incited violence, but the whole totality of his persona...
00:11:57.000 Is inciting.
00:11:58.000 So we're going to strike him.
00:12:00.000 And so they sort of massively expanded the purview of things they can censor just in that one moment.
00:12:09.000 And you can see it in these dialogues, how they came to that decision, which is just fascinating.
00:12:15.000 It's just such an extraordinary amount of power to give people, the ability to censor people on the biggest public forum in the world.
00:12:22.000 It's so extraordinary in the fact that they can come up with these justifications for why this is a good idea without anyone pushing back, without anyone saying, do you understand where this goes to?
00:12:33.000 This eventually leads to government control of all thought and speech.
00:12:36.000 This is where you're going.
00:12:37.000 You're allowing the government to influence you based on one specific problematic individual, and that could spread out into every one of us.
00:12:44.000 Right.
00:12:44.000 All of us.
00:12:45.000 Easily.
00:12:45.000 Quickly.
00:12:46.000 Right.
00:12:47.000 Right.
00:12:47.000 I mean, we heard at the World Economic Forum, right?
00:12:51.000 We heard the – Brian Stelter was there.
00:12:54.000 Brian Stelter.
00:12:55.000 Brian Stelter is now at the World Economic Forum.
00:12:57.000 What can we do about these problems?
00:13:00.000 He looked very comfortable there, didn't he?
00:13:02.000 Of course he does.
00:13:03.000 He's with evil lizard people that are trying to control the world.
00:13:06.000 That's his bosses.
00:13:07.000 He knows how to handle that kind of situation.
00:13:10.000 I've been around evil lizard people.
00:13:14.000 He looked as happy as maybe he's ever been.
00:13:17.000 Well, he's probably very excited just to be working again in any way, shape, or form.
00:13:22.000 That's true.
00:13:23.000 He's not a guy that really is supposed to be in front of a camera, right?
00:13:25.000 He's supposed to be a journalist, but he's not even good at that.
00:13:28.000 So what he's doing now is holding water for the evil leaders of the world who want to institute hate speech policies nationwide and, you know, centralized digital currency and they want everybody to eat bugs and you will own nothing and be happy.
00:13:42.000 This is the fucking people he's working for now.
00:13:44.000 Because he's basically a prostitute.
00:13:46.000 And, you know, they hired him to go over there and do that and he's like, what can we do?
00:13:51.000 What can we do better?
00:13:52.000 What can we do different to get everybody to stand in line?
00:13:55.000 Yeah.
00:13:55.000 What can we do?
00:13:56.000 And for a journalist to sit there, there was that one moment where that woman, Věra Jourová, she's an EU official, and she's talking about hate speech laws.
00:14:08.000 And then she touches the knee of somebody sitting next to her and saying, you're going to have that in America soon.
00:14:13.000 Yeah.
00:14:17.000 Yeah.
00:14:17.000 You know, like, that's not offensive to him.
00:14:20.000 Like, a European basically saying, oh, yeah, you're going to have this too soon.
00:14:25.000 Like, even though it's completely antithetical to everything that we believe in in this country.
00:14:30.000 Well, I think when you're working in a corporate news structure, and you could speak to this better than I could, obviously, but...
00:14:35.000 I think when you're working in an environment where you have editors and people in your ear and you have producers and you have narratives that the company is pushing and then you have sponsorships that you're beholden to, it's very difficult to form any sort of problematic or controversial independent thought and then try to express it publicly.
00:14:56.000 You're not going to do it.
00:14:57.000 It's just too scary and sketchy.
00:14:59.000 So when you're trying to keep that job and here's a guy like Brian Stelter who already lost one of the biggest gigs in all of broadcast news.
00:15:06.000 He was on fucking CNN. And then, you know, here he's standing there and they're saying, you're going to have hate speech laws in America, too.
00:15:13.000 Okay, everything's running smooth.
00:15:14.000 Everyone's smiling.
00:15:16.000 He's not suitable for that role.
00:15:19.000 He doesn't belong there.
00:15:21.000 You don't have the stones to carry that conversation in a way that's going to benefit all these people that are listening to it.
00:15:28.000 What you want is someone who's in that position that goes, hold on, what do you think is hate speech?
00:15:33.000 What's hate speech to you and what's hate speech to me?
00:15:36.000 And who gets to decide?
00:15:37.000 Yeah.
00:15:38.000 How is that going to be adjudicated?
00:15:40.000 Like, what's your definition?
00:15:42.000 What's your definition of hate?
00:15:44.000 What's your definition of speech?
00:15:46.000 Exactly.
00:15:46.000 You know what I mean?
00:15:47.000 There are a lot of questions.
00:15:48.000 Does context matter?
00:15:49.000 Yeah.
00:15:49.000 And how do you decide?
00:15:50.000 And obviously when you're looking at things over text, context gets very blurry.
00:15:55.000 You don't know if someone's joking around.
00:15:57.000 Like, there's so many pages that I get sent that are satire.
00:15:59.000 I've got a hilarious one.
00:16:01.000 There's a new one.
00:16:02.000 Rick Rubin sent me this one.
00:16:03.000 He's like, this has got to be satire, right?
00:16:05.000 And it's brilliant, brilliant satire.
00:16:08.000 But there's this person who has like the best version of super liberal, like my children will never eat food from a gas stove, like that kind of shit.
00:16:19.000 And there's so many of these, like it's hard to tell who's what and what's real.
00:16:26.000 It's just one of those things where it's hard when you're looking things through text because people are sneaky.
00:16:34.000 They're really good at it.
00:16:35.000 And people are so ridiculous in real life that a really subtle parody is very hard to discern.
00:16:41.000 So is that hate speech?
00:16:42.000 If someone's doing it as a parody, is that hate speech?
00:16:45.000 When do you decide that something is hateful?
00:16:48.000 And that's exactly why traditionally in this country, judges have always said, well, they haven't always said it, but they eventually came around to the idea that we can't involve ourselves in these questions.
00:17:03.000 They're too difficult, and it's not our job.
00:17:07.000 We're going to step in in only the most extreme cases, right?
00:17:10.000 So the current standard is, you know, a Supreme Court case, Brandenburg v.
00:17:16.000 Ohio, which outlaws incitement to imminent lawless action, right?
00:17:23.000 So you have to be basically saying, Let's go get them.
00:17:27.000 Go get them.
00:17:28.000 Break into the White House.
00:17:47.000 You know, you can spend endless amounts of time building sandcastles trying to figure out what is what.
00:17:54.000 And it will always end in a place where the government, you know, interprets it to its greatest advantage.
00:18:02.000 And that's why we, you know, we don't want it.
00:18:05.000 You know, I mean, ultimately, it's not a good thing for...
00:18:09.000 For most people, but...
00:18:11.000 It's just very hard for people to realize, even though this thing that you're talking about wielding, this weapon, will work against your enemies, it can ultimately also be used against you.
00:18:24.000 That was the thing with the Patriot Act.
00:18:27.000 When the, you know, indefinite detention, when they were talking about just being able to detain people, and Obama was like, don't worry, well, I would never do that, but you're not going to be the president forever.
00:18:37.000 Like, someone else is going to come along.
00:18:39.000 And perfect example, that next person was Trump.
00:18:42.000 Right.
00:18:43.000 Well, what if someone's crazier than him?
00:18:45.000 Like, what if something happens?
00:18:46.000 What if there's some sort of a nuke goes off somewhere?
00:18:50.000 And then everybody gets way more radicalized.
00:18:52.000 And then you can get a really fucking insane, like, Stephen King character.
00:18:57.000 What was that movie?
00:18:58.000 Where the one evil guy becomes president and...
00:19:01.000 Oh, God, the Greg Stillson.
00:19:05.000 Dead Zone.
00:19:05.000 Dead Zone.
00:19:06.000 Thank you.
00:19:07.000 I mean, we're not far removed from that in terms of plausible plots that this wacky country can fall into.
00:19:15.000 And that's the same thing with censorship.
00:19:18.000 They can use it against you.
00:19:20.000 So if you think you're using this to push back against right-wing extremism, they can use that to push back against progressive ideas that would genuinely benefit good people, genuinely benefit families, genuinely benefit people in need, genuinely benefit people in terms of healthcare and education.
00:19:38.000 They can stop that.
00:19:39.000 They can stop that if it's unprofitable with the same sort of tools.
00:19:43.000 Absolutely.
00:19:44.000 You gotta have free speech.
00:19:45.000 It's the most important thing we have and it's the one thing that separates us from everybody else.
00:19:50.000 So when you have liberals and progressives that are screaming against removing people from platforms and stopping this and stopping that, understand what the fuck you're saying.
00:19:59.000 Yeah.
00:20:00.000 And they don't, right?
00:20:02.000 It's just convenience.
00:20:04.000 It'll work against my enemies.
00:20:06.000 You talk about how they can use it to shut down things on the other side.
00:20:13.000 We see reports in these files of government agencies...
00:20:35.000 That's a liberal issue.
00:20:37.000 That's a progressive issue.
00:20:40.000 Progressives want generic vaccines to be available to poor countries.
00:20:45.000 But you can use this tool to eliminate speech about that if you want to.
00:20:51.000 I think that's what they don't get is that the significance is not who.
00:20:57.000 The significance is the tool.
00:20:59.000 Like, what is it capable of doing?
00:21:03.000 How easily is it employed?
00:21:06.000 And, you know, how often is it used?
00:21:10.000 And they don't focus on that.
00:21:12.000 They focus on, oh, it's Donald Trump, so therefore we want it.
00:21:15.000 And that's where their mistake is.
00:21:19.000 It's a very interesting and very nuanced conversation as to what should be allowed and what should not be allowed and why.
00:21:28.000 And I think it's complex and it's ever-changing and it depends upon the tools that are involved and depends upon what are you talking about?
00:21:36.000 And then it also depends upon, like, here's a big one that drives me nuts about this January 6th.
00:21:42.000 Why is it okay for the FBI to have agents that incite people to go into the Capitol?
00:21:48.000 Why is that okay?
00:21:49.000 What benefit?
00:21:51.000 Is that for society?
00:21:53.000 Like, how much influence did they have?
00:21:57.000 How much rabble-rousing influence did they have?
00:22:00.000 How much coercion?
00:22:02.000 I mean, why is that okay?
00:22:05.000 So this is another topic that is fascinating because it hasn't gotten a ton of press.
00:22:11.000 But if you go back all the way to the early 70s, the CIA and the FBI got in a lot of trouble for various things.
00:22:22.000 The CIA for assassination schemes involving people like Castro, the FBI for COINTELPRO and other programs, domestic surveillance.
00:22:33.000 And they made changes after congressional hearings, the church committee, that basically said, the FBI, from now on, you have to have some kind of reason to be following somebody or investigating somebody.
00:22:48.000 You have to have some kind of criminal predicate.
00:22:50.000 And we want you mainly to be investigating cases.
00:22:54.000 But after 9-11, they peeled all this back.
00:22:58.000 There was a series of attorney general memos that essentially refashioned what the FBI does.
00:23:06.000 And now they don't have to be doing crime fighting all the time.
00:23:09.000 Now they can be doing basically 100% intelligence gathering all the time.
00:23:15.000 They can be infiltrating groups for no reason at all.
00:23:19.000 Not to build cases.
00:23:20.000 But just to get information.
00:23:23.000 And so that's why they're there.
00:23:26.000 They're in these groups.
00:23:28.000 They're posted up outside of the homes of people they find suspicious.
00:23:35.000 But they're not building cases.
00:23:37.000 They're not investigating crimes.
00:23:40.000 It's sort of like Minority Report.
00:23:43.000 It's pre-crime.
00:23:45.000 And the public has accepted this You know, without much trouble.
00:23:51.000 Yeah, there's a little bit of pushback from people online, then it goes away, where there's no real repercussions.
00:23:56.000 Like the Governor Whitmer case.
00:23:57.000 Right.
00:23:57.000 Where there's 14 people involved in kidnapping, and 12 of them are FBI informants.
00:24:04.000 Right.
00:24:04.000 Which is fucking bonkers.
00:24:06.000 And then the two guys are doing hard time.
00:24:08.000 They're like, we thought it was fantasy.
00:24:10.000 Like, we're idiots.
00:24:12.000 We didn't know.
00:24:13.000 Like, one of the guys literally said, I never planned on doing anything.
00:24:16.000 To me, it was just fantasy.
00:24:18.000 Yeah, well, I mean...
00:24:20.000 They're morons who get talked into this.
00:24:22.000 And imagine you're getting talked into this by 12 people who turn out to be informants.
00:24:27.000 Right.
00:24:28.000 That's wild.
00:24:29.000 Yeah, yeah.
00:24:30.000 And why are there so many informants, you know, like, hanging around with...
00:24:34.000 Fucking head of the Proud Boys.
00:24:35.000 Right.
00:24:36.000 The head of the Proud Boys was an informant.
00:24:37.000 Is that true?
00:24:38.000 I didn't know that.
00:24:38.000 You didn't know that?
00:24:39.000 No.
00:24:39.000 What's his name?
00:24:40.000 Enrique Tarrio?
00:24:41.000 Yeah, pull that up.
00:24:43.000 He was a fucking informant.
00:24:44.000 So this guy who is at the head of the Proud Boys, the guy who's organizing the things, he was an FBI informant.
00:24:52.000 Wow.
00:24:54.000 Do you know the story of the Proud Boys?
00:24:56.000 The real origin story?
00:24:57.000 No, I don't actually.
00:24:58.000 Oh my god, the origin story is amazing.
00:25:01.000 The origin story, here we go.
00:25:04.000 Proud Boys leader was prolific informer for law enforcement.
00:25:08.000 Enrique Tarrio, leader of the Proud Boys extremist group, has a past as an informer for federal and local law enforcement, repeatedly working undercover for investigators after he was arrested in 2012, according to a former prosecutor and a transcript of a 2014 federal court proceeding obtained by Reuters.
00:25:25.000 So, the Proud Boys started off as a joke on Anthony Cumia's radio show, where Gavin McInnes, who was a regular guest, they made a joke about one of the guys who was an intern, and they were doing a joke about him being in a musical,
00:25:42.000 and the musical, like, Proud of My Boy.
00:25:45.000 And they were singing a song like, we're the Proud Boys, proud of my boy.
00:25:48.000 And they're like, we're going to put together a group called the Proud Boys.
00:25:52.000 And so they decided to have like this fake group of people.
00:25:55.000 And to get into this group, you had to, they had to punch you in the arm and you have to read off, like, remember different breakfast cereals.
00:26:03.000 Like, it's all, like, really hilarious, stupid shit.
00:26:07.000 But then Gavin, you know, Gavin's one of those guys that just, like, he's a legitimate maniac, which was great when he was running Vice, and not so great when he gets involved in extremist groups.
00:26:18.000 Right.
00:26:21.000 Fuck these crazy Antifa people.
00:26:24.000 We're going to develop our own group.
00:26:25.000 We're going to go after them.
00:26:27.000 We're going to fight them when we go to these...
00:26:28.000 Because they would show up at any time he had to do speeches and they would protest him.
00:26:32.000 He's like, we're going to beat the Proud Boys.
00:26:34.000 We're going to show them.
00:26:34.000 We're going to fight them.
00:26:35.000 And I'm like, hey, hey, hey.
00:26:37.000 That doesn't end well.
00:26:38.000 And obviously it ended terribly.
00:26:40.000 And that's where the Proud Boys origin story came from.
00:26:43.000 That's amazing.
00:26:44.000 It came from the Anthony Cumia radio show.
00:26:47.000 From Opie and Anthony.
00:26:48.000 Anthony from Opie and Anthony.
00:26:51.000 Dude, it's a joke.
00:26:52.000 Anthony came on the podcast and told the origin story of it.
00:26:56.000 There's a famous clip that's on YouTube, and it's hilarious.
00:26:59.000 And then it ends up being a central issue in the presidential campaign.
00:27:06.000 Yes!
00:27:07.000 Yes!
00:27:08.000 It's nuts!
00:27:09.000 America couldn't be more ridiculous.
00:27:12.000 Dude, but if you know Anthony, and you know Gavin, like I've known Gavin forever, Gavin's just nuts.
00:27:18.000 He's not an extremist.
00:27:20.000 He's not an evil person.
00:27:22.000 He's just nuts.
00:27:23.000 But he fucked up when he created that group.
00:27:26.000 I'm like, dude, you can't just punch people in the face.
00:27:29.000 Because you punch people in the face, they come back with a bat, now you have a war.
00:27:33.000 You can't just say, we're going to go hit people.
00:27:36.000 That always ends badly.
00:27:38.000 Right, right, right.
00:27:38.000 But he just was having fun and kept pushing it.
00:27:41.000 He's a push the limits, push the envelope guy.
00:27:44.000 And then now it's this hate group that people bring up in political speech and the proud boys and the white supremacists and like the proud boys.
00:27:53.000 And they have a life of their own now, right?
00:27:55.000 Now it's beyond – they don't even like him anymore.
00:27:57.000 They kicked him out of a meeting.
00:27:59.000 He went to a meeting in Vegas.
00:28:00.000 He's persona non grata in the Proud Boys now.
00:28:04.000 He can't even go to the Proud Boys.
00:28:05.000 A thing that he started as a joke has now completely ran away from him.
00:28:10.000 Because like with any group, right?
00:28:11.000 If you have a group that people can join, anyone could join.
00:28:14.000 Well, anyone could join.
00:28:15.000 You're gonna get assholes to join it.
00:28:16.000 Of course.
00:28:16.000 And you're also gonna get law enforcement.
00:28:18.000 And that's how you got this Enrique Tarrio guy.
00:28:20.000 Right, right.
00:28:21.000 Wild.
00:28:22.000 Yeah, yeah.
00:28:23.000 That's unbelievable.
00:28:24.000 It's wild.
00:28:24.000 So I guess they found his cooperator agreement.
00:28:29.000 Yeah.
00:28:29.000 And what, did he continue to be the head of the Proud Boys after that?
00:28:33.000 I think they arrested him recently.
00:28:35.000 That's the interesting, also, the thing about these guys is they don't get off scot-free.
00:28:39.000 Like, they make you work as an informant, they make you be a snitch, and then they put you in jail.
00:28:44.000 Right.
00:28:44.000 They'll still put you in jail.
00:28:45.000 But they won't put you in jail for life.
00:28:48.000 We're going to give you 10 years instead of 50. And so these guys just do it anyway.
00:28:53.000 I mean, they treat you like a real bitch.
00:28:57.000 Like, they don't treat you like a half a bitch or like, you know, hey, we're going to work out a deal.
00:29:01.000 No, they treat you like a bitch.
00:29:03.000 Two days before a far-right mob stormed the U.S. Capitol, police arrested the leader of the Proud Boys militia group for burning a Black Lives Matter flag at a different protest.
00:29:12.000 Wow!
00:29:13.000 Meanwhile, he's black, which is hilarious.
00:29:15.000 Right, right.
00:29:16.000 The fact that, like, you can't burn that flag.
00:29:18.000 Meanwhile, if you burn an American flag, no problem at all.
00:29:21.000 Right.
00:29:21.000 That's fine.
00:29:22.000 Yeah.
00:29:22.000 That's fine.
00:29:23.000 I mean, I actually, I'm for that.
00:29:26.000 You're for them arresting people for burning flags?
00:29:28.000 No, no, no.
00:29:28.000 I'm fine with people burning flags if they feel like it.
00:29:32.000 Yeah.
00:29:32.000 I don't think you should arrest people for burning a flag.
00:29:35.000 Like, I don't think you should arrest people for burning a piece of paper.
00:29:38.000 Right.
00:29:38.000 Like, you should be able to, if you want to burn something in protest, it's not like we can't make more flags.
00:29:43.000 Right.
00:29:43.000 Like, what are you doing?
00:29:44.000 Yeah, we got plenty.
00:29:45.000 If you want to buy a flag and burn it, okay.
00:29:47.000 That's your big thing?
00:29:47.000 Go ahead.
00:29:48.000 Yeah, exactly.
00:29:49.000 I think we should make flags out of asbestos.
00:29:51.000 Solve all the problems.
00:29:54.000 That would make for a lot of unphotogenic protests, but that would be funny.
00:29:59.000 What the fuck?
00:30:03.000 I'm just astounded by the lack of ability to see the future.
00:30:09.000 The lack of ability to see where all this goes.
00:30:12.000 In letting this happen, in not being outraged, in letting the government creep into all aspects of your life in this way.
00:30:20.000 There's no net positive.
00:30:23.000 In the end, it's just...
00:30:25.000 Government control of speech and thought and content and everything you do.
00:30:31.000 Right.
00:30:32.000 And complete capture of the media and all of that.
00:30:36.000 And that's kind of what we're trying to fight against.
00:30:42.000 The one heartening thing is that the quote-unquote mainstream press is now – really, it's in free fall now, right?
00:30:52.000 Its influence is more and more limited every day.
00:30:57.000 The problem is that something needs to step up in its place and – but they're just – they don't have any authority at all outside their own little bubble anymore.
00:31:10.000 Propaganda is a scary thing, and when you have mainstream news organizations going along with what appears to be propaganda with no pushback at all, like, where's journalism? Journalism is such an important part of any sort of functioning culture where people need to find out what the real information is.
00:31:34.000 And there's people that have a responsibility to try to find that information and then give it to people so they can make informed decisions and they can know what the workings behind the machine are, or what's the wiring?
00:31:45.000 What's happening?
00:31:46.000 How are these decisions getting made?
00:31:48.000 When the corporate media doesn't do that anymore, we're fucked.
00:31:53.000 And you, in your time, have seen that.
00:31:57.000 You've seen this transition into the media becoming an arm of propaganda.
00:32:06.000 As opposed to what it was in the 70s or what it was in the 60s, where it was the news.
00:32:11.000 This is what's happening.
00:32:13.000 This is what we've uncovered.
00:32:14.000 This is our undercover investigation.
00:32:16.000 These are our facts.
00:32:18.000 Our informant has told us this.
00:32:20.000 Now we know that.
00:32:22.000 Nixon did this.
00:32:23.000 Kennedy was aware of that.
00:32:25.000 We know these things now because of real journalism.
00:32:28.000 And it seems like For whatever reason, there's two branches going on with journalism.
00:32:35.000 There's people like you and Bari Weiss and Glenn Greenwald and the Substack people that are like, hey, hey, hey, hey, this is not what I fucking signed up for.
00:32:46.000 I'm here to do actual real journalism and you people in these gigantic mainstream organizations are losing your fucking minds.
00:32:55.000 You're crazy and you're doing it for so many reasons.
00:33:00.000 Because Trump sucks, because you're pushing a woke agenda, because you want...
00:33:05.000 Whatever the reason is.
00:33:07.000 You've decided to become a part of a propaganda machine.
00:33:10.000 And it's not journalism anymore.
00:33:12.000 You're ignoring really important stories that are inconvenient to the narrative that you feel like it's your obligation to push.
00:33:22.000 Yeah.
00:33:22.000 Remember when Trump became president and he was making noise about not letting certain people have credentials to get into the White House?
00:33:30.000 And there was this big hue and cry like, oh my god, he's not going to let us into the White House.
00:33:35.000 And my first reaction to that was, who fucking cares if you're not let into the White House?
00:33:41.000 You have an adversarial relationship already.
00:33:44.000 You're supposed to, with government, right?
00:33:46.000 If they don't let you in...
00:33:49.000 Just report on it anyway.
00:33:52.000 It's not a big deal, but for the new generation of journalists who've come in, they imagine themselves, because they're socially the same people that they're reporting on, they hang out in the same circles, they go to the same parties.
00:34:07.000 The idea of not being let behind the rope line is an atrocity to them.
00:34:14.000 They don't understand it.
00:34:15.000 And they see their role as helping explain the point of view of power.
00:34:20.000 I mean it's just what we're talking about with Brian Stelter before.
00:34:23.000 Like my job here is to kind of sell this.
00:34:27.000 Yeah.
00:34:43.000 But if it fucks up, we got to report that too.
00:34:46.000 Like, you know, our job is to ask difficult questions.
00:34:51.000 And if we have difficult truths, we got to report those things.
00:34:54.000 And so we're not really on your side.
00:34:57.000 Like, we're not your friends.
00:34:59.000 You know, you can hang out with us.
00:35:01.000 We can hang out with you at a campaign stop.
00:35:05.000 But there's supposed to be tension there.
00:35:10.000 There's always supposed to be tension there.
00:35:12.000 And what you see now is that there's no tension at all, right?
00:35:17.000 Like, there's just this sort of seamless community of people who all think the same way.
00:35:26.000 You know, whether it's...
00:35:28.000 Rachel Maddow or Don Lemon or whoever it is and some Biden administration official.
00:35:38.000 They're all kind of on the same page.
00:35:40.000 They see themselves as part of the same group.
00:35:42.000 They see themselves as having the same mission.
00:35:44.000 But the press has to have its own mission or else it's not legitimate.
00:35:50.000 Well, I don't think they're journalists.
00:35:52.000 I mean they're really just television propagandists.
00:35:54.000 That's what they are.
00:35:55.000 And they're working for these enormous corporations and it benefits those corporations to have a narrative.
00:36:01.000 And so you have a spokesperson for the narrative.
00:36:03.000 There's no way Don Lemon is a journalist.
00:36:06.000 There's no way Brian Stelter is a journalist.
00:36:08.000 They're just not.
00:36:09.000 Right, right, but...
00:36:11.000 They are the way, like, that insurance lady on those insurance commercials is a stand-up comic.
00:36:18.000 You know what I mean?
00:36:20.000 The progressive lady.
00:36:21.000 Yeah, is she really a comedian?
00:36:22.000 I guess she might make a few people laugh, like, maybe, you know, but if she comes to the green room of the comedy club, we're all gonna look at her and go, hey, what are you doing here?
00:36:31.000 Like, this is...
00:36:32.000 It's not like Sarah Silverman showing up.
00:36:35.000 She's not really a comedian.
00:36:36.000 That lady is a – she's a paid spokesman for the insurance company that just happens to make you laugh.
00:36:42.000 And what those people are is paid spokesmen for the narrative that just happen to be saying what's happening in the world.
00:36:49.000 But they're not journalists.
00:36:51.000 Right.
00:36:51.000 They're reading off – you know, a version of something.
00:36:57.000 And then their idea of journalism is sort of digging up facts to defend whatever that is.
00:37:05.000 And not looking at things objectively.
00:37:08.000 There's a very clear narrative, and you have to push that narrative.
00:37:12.000 And when it changes, you don't say a goddamn thing.
00:37:15.000 When the science changes, and when new information comes out that refutes everything you've said in the past, you just shut the fuck up and keep moving.
00:37:23.000 Right.
00:37:24.000 They do that, and they shouldn't do that because, again, it's another thing that loses you faith with audiences, right?
00:37:36.000 And this is another thing that drives me crazy about this propagandistic model of media, is that in addition to being wrong, it doesn't work.
00:37:47.000 Right?
00:37:48.000 Like propaganda, it only goes so far.
00:37:52.000 You actually need people to trust you, and trust is a complicated thing.
00:37:57.000 You know, you have to have a relationship with your audiences.
00:38:01.000 And audiences will not believe you if they see you making a mistake and then they see you not owning up to it.
00:38:09.000 Right?
00:38:09.000 Yeah.
00:38:10.000 Which is why, you know, you and I both have had this experience of saying something wrong and then coming out and saying, you know what, I screwed that up.
00:38:17.000 Like, I did that.
00:38:18.000 I made a really bad call about the beginning of...
00:38:22.000 The Ukraine war.
00:38:24.000 I didn't believe the reports that were coming out.
00:38:27.000 They sounded wrong to me.
00:38:30.000 And I kind of wrote a sarcastic column about how this is ridiculous.
00:38:36.000 Everyone's being so credulous about this.
00:38:38.000 And then, of course, the invasion happened shortly after that.
00:38:40.000 And I look like an idiot.
00:38:44.000 But you gotta come out and say after that, you know what?
00:38:48.000 I fucked up.
00:38:48.000 Yeah, I fucked up.
00:38:49.000 I was over my skis on that thing.
00:38:51.000 And, you know, if you don't do that...
00:38:54.000 They never trust you.
00:38:55.000 They never trust you.
00:38:56.000 And they shouldn't trust you.
00:38:57.000 Right.
00:38:57.000 And you should.
00:38:58.000 When you fuck up, you should own it.
00:39:00.000 Right.
00:39:00.000 You're just a human being, you know, and you're talking about things in real time.
00:39:04.000 Especially when you're doing a podcast, you're literally thinking out loud publicly.
00:39:08.000 Right.
00:39:09.000 And it becomes a permanent record.
00:39:11.000 Right, right.
00:39:12.000 Good luck with that.
00:39:13.000 Yeah.
00:39:13.000 No, I don't know how.
00:39:15.000 That's got to be, you know, incredibly difficult to, you know, you're not even sitting there doing what I do.
00:39:22.000 It's just choosing my words carefully and typing them, right?
00:39:25.000 Well, a lot of times I'm intoxicated.
00:39:28.000 Fucking good solid amount of the time.
00:39:31.000 Yeah.
00:39:31.000 No, it's – but that's also why people like it because they feel like you're really hanging out with them having a conversation because that's what it sounds like.
00:39:40.000 Because it is what it's like.
00:39:41.000 I mean it's like the same way I talk to you.
00:39:44.000 You and I could be at dinner right now and we would probably be having the same conversation.
00:39:48.000 Yeah, exactly.
00:39:49.000 And people can pick up on that.
00:39:50.000 You know what I mean?
00:39:51.000 I always thought that was one of the reasons that your show is so successful is that people don't detect that there's something staged about it.
00:40:03.000 But you can see that so clearly in every other kind of... It's a dirty business.
00:40:17.000 It's a dirty business and it's dirty for them too because if they could be free, they would like to.
00:40:23.000 I think almost everybody would like to just be able to be themselves and have their own opinions and be able to express themselves and be able to think about things openly.
00:40:31.000 When you are working for a major news organization and you have an enormous paycheck every week that comes to you, if you keep this thing going, you have to keep this charade going, keep this con going, you're going to keep going.
00:40:45.000 You're going to be Rachel Maddow.
00:40:48.000 I love the fact that the way you compared Rachel Maddow to Bill O'Reilly.
00:40:53.000 It's the same person.
00:40:54.000 It's the same thing.
00:40:55.000 It's just one's doing it with progressive values and one's doing it with right-wing values.
00:41:00.000 Right.
00:41:00.000 And you were really the first person who's a liberal guy to openly say that.
00:41:04.000 I'm like, yes.
00:41:05.000 Yeah, that's what it is.
00:41:06.000 Yeah, no.
00:41:08.000 You identify an audience.
00:41:10.000 You give them stuff they want to hear.
00:41:13.000 You do it over and over again, rinse, repeat, blah, blah, blah.
00:41:17.000 And, you know, look, that's a dangerous pattern that you can fall into when, you know, the business works the way it does.
00:41:28.000 But...
00:41:29.000 People do it, you know?
00:41:30.000 And the converse of that is that if you work in these organizations, and this is something that, believe it or not, Noam Chomsky wrote about a million years ago in a book called Manufacturing Consent.
00:41:45.000 You see coming up in the business that when somebody tries to buck the system and tries to force through an unpopular story or refuses to write a story that's not true or does anything that the editors don't like,
00:42:03.000 They see that those people are moved out of the business sooner rather than later, right?
00:42:10.000 They just sort of end up being washed out with reputations for being difficult people.
00:42:16.000 Chris Hedges is somebody who comes to mind, right?
00:42:19.000 They kind of just squeeze you out.
00:42:22.000 There's no particular thing that happens.
00:42:26.000 And that just sends signals down the ranks of people in journalism that if you want to get ahead, just keep doing the shit that we want you to do.
00:42:37.000 You don't have to be a genius to figure out what that is.
00:42:40.000 Just keep doing it, and you'll eventually rise up through the ranks.
00:42:46.000 And before you know it, you'll have your own show, or you'll be running a desk.
00:42:53.000 But you won't have anything to say because early on, you'll have made the decision to abandon your individuality.
00:43:02.000 That's the key to the whole thing is that it's not people who are making these big decisions to sell out when they're 50. They make the decision to sell out when they're 22 or 23. At the very start, when they first see it and they understand how the business works and they start climbing,
00:43:18.000 that's when they sell out.
00:43:23.000 It's who they are.
00:43:25.000 Well, you see it in politics too, right?
00:43:26.000 Like you see it like people like AOC who starts out as this like, you know, really kind of inspiring story.
00:43:33.000 Girl wore her shoes out campaigning, just walking around going door to door, and now she's cozying up to the likes of Nancy Pelosi and they're all in this weird sort of group together making decisions.
00:43:48.000 And people don't like it.
00:43:50.000 They see where it's going.
00:43:51.000 Like, oh, if you want to be president, you have to go down this road.
00:43:54.000 And you're going down that road.
00:43:55.000 Exactly.
00:43:56.000 And you're going to be that lady that presses the button that drops the bombs.
00:43:58.000 You're going to be that lady that, like, sends the fucking drones out and finds some way to justify the fact that it kills 90% civilians.
00:44:05.000 You're going to be that person because that's where that person starts.
00:44:08.000 You start out idealistic.
00:44:10.000 You start out this person who's very progressive and really wants, you know, to help lower income families and really wants to help inner city schools, really wants to help.
00:44:19.000 And then along the way, you get indoctrinated into the system and you figure out how everything works.
00:44:25.000 This is how you have to do it.
00:44:27.000 This is how you play ball.
00:44:28.000 This is the bill you have to sign.
00:44:30.000 This is what you have to get in on.
00:44:31.000 In order to get this, we have to do that.
00:44:33.000 In order to get that, we have to do this.
00:44:35.000 And next thing you know, you're a fucking politician.
00:44:38.000 Right.
00:44:38.000 And you're the Speaker of the House, and you're doing insider trading and making hundreds of millions of dollars because everybody else is doing it.
00:44:44.000 Right, right.
00:44:45.000 Remember when she came in and everybody was – there was a whole clan of sort of senior Democratic Party officials in Congress who were giving her a hard time because she had – she was on Twitter a lot.
00:45:02.000 Right.
00:45:02.000 And she was being successful with Twitter.
00:45:04.000 And they were like, you know, you have to make the choice between whether you want to be on social media or whether you want to be a politician.
00:45:10.000 And I actually admired her at the time.
00:45:12.000 I didn't agree with her about everything.
00:45:14.000 But I – like you, I thought her story was interesting.
00:45:17.000 I thought that she had...
00:45:39.000 They let you know, right?
00:45:40.000 Look, if you ever want to be a committee chair, if you want to get in line for these powerful positions, if you want to get appropriations money sent to whatever district, you got to play ball, right?
00:45:53.000 And if you do, then you very quickly start climbing the ladder.
00:45:59.000 If you don't, you end up just somebody who...
00:46:03.000 Tends to be on the outside and is portrayed as a nut.
00:46:07.000 Bernie Sanders.
00:46:07.000 Bernie Sanders or Ron Paul, right?
00:46:10.000 I mean that's how it works.
00:46:15.000 There are people who are kind of like in Congress – You might disagree with what they believe, but they're honest.
00:46:23.000 And you can tell they're honest because the leadership hates them and doesn't give them the opportunity.
00:46:30.000 But you're absolutely right.
00:46:30.000 It's the same thing.
00:46:35.000 It's an organizational sort of belief system.
00:46:40.000 You either have to go with it or not.
00:46:43.000 I just think it's especially offensive in journalism. The whole idea of being a journalist, you're not doing it for money or power.
00:46:57.000 If you're going into this business, what are you doing it for if you're not going to be trying to break cool stories?
00:47:05.000 Go into some other thing.
00:47:07.000 Go into finance, right?
00:47:09.000 I never understood that.
00:47:10.000 That part of it doesn't make any sense to me.
00:47:12.000 Do you think that this understanding of this now, because people are talking about it, and then the birth of Substack and the fact that it's become very successful, and that people are flocking towards genuine independent journalists, whether they're on social media like YouTube or Twitter or Substack.
00:47:29.000 Do you think that this is in many ways like changing the way young people see the possibilities?
00:47:38.000 Because I think young people looking at the two options, like one, you can kind of be a hero.
00:47:45.000 Or two, you can kind of be a whore.
00:47:47.000 Right.
00:47:48.000 You know, there's a lot of whores.
00:47:50.000 They don't seem happy.
00:47:51.000 These whores seem very upset at everything and they're always pulling their fucking hair out and they're probably on antidepressants.
00:47:57.000 And then you have these people that are like breaking stories and it's like, oh my god, journalism is alive.
00:48:04.000 It's just alive in like, you know, when people travel with a fire and they have like embers and they're...
00:48:11.000 They're blowing on them and they get them to the next camp and then they could start a fire with it.
00:48:15.000 That's journalism right now.
00:48:16.000 It's not this raging bonfire that everybody can go get warmed.
00:48:20.000 You know, the information will warm everyone.
00:48:22.000 No, it's like these people have like small wooden vessels filled with embers and they're blowing on them as they run through the woods and people are fanning them to try to keep them alive.
00:48:33.000 But I think For young people that are considering paths, like what to do with their future, they don't want to be contained.
00:48:40.000 They want to be free.
00:48:41.000 And because of social media and because of the fact that any kid can just start a YouTube page and just start talking about things.
00:48:48.000 Exactly.
00:48:49.000 And because of that kind of ability to now do your own show, becoming independent is becoming not just more plausible but more attractive.
00:48:58.000 Absolutely.
00:49:00.000 And I think, you know, younger people have less tolerance for phoniness, or at least historically they did.
00:49:09.000 It's been a little weird lately.
00:49:12.000 I haven't always been sure of that lately.
00:49:15.000 But, you know, people who are going to go into journalism when they're 18 or 19...
00:49:24.000 Once upon a time, they all wanted to be Woodward and Bernstein or Sy Hersh or Hunter Thompson or whoever it was.
00:49:32.000 They just wanted to be a rule breaker, somebody who told the truth and consequences be damned.
00:49:40.000 Because that's what it's about.
00:49:41.000 It's about being free and speaking your mind, right?
00:49:44.000 Like what is it William Blake said?
00:49:45.000 You can always be ready to speak your mind and a base man will avoid you, right?
00:49:49.000 Like that's what journalism is.
00:49:51.000 Like you derive power from your willingness to say the unpopular true thing, right?
00:49:59.000 And that's an attractive idealistic thing for a young person.
00:50:04.000 But if they see That path closed, they're not going to go into...
00:50:11.000 I mean, why would you go into journalism and try to work at The New Yorker or MSNBC if you know you're never going to get to do that, basically?
00:50:24.000 Right.
00:50:25.000 Why would you?
00:50:26.000 But you can create your own show with almost no overhead.
00:50:31.000 And do the same thing and have a much bigger impact and you'll have a bigger audience even.
00:50:38.000 Much bigger.
00:50:39.000 Right?
00:50:39.000 Much bigger.
00:50:40.000 That's what's done.
00:50:40.000 Yeah.
00:50:41.000 Yeah.
00:50:41.000 I mean, as you know, as you very well know, right?
00:50:44.000 I mean, that's...
00:50:46.000 And these corporate media companies...
00:50:53.000 They have been living for a long time on their names and on the memory of the prestige that their names inspired.
00:51:00.000 But if people, if they actually had to sell how much reach they had now, they wouldn't have much to talk about, right?
00:51:12.000 Like, their audiences are shrinking.
00:51:16.000 Their influence is very, very small.
00:51:20.000 You know, the jobs that they're offering are just less and less exciting for young people.
00:51:28.000 We were talking about CNN. Like, how does CNN even keep the lights on?
00:51:31.000 They have an enormous building in Atlanta.
00:51:33.000 Giant CNN sign on the front of it.
00:51:36.000 And they get terrible ratings.
00:51:37.000 Oh, yeah, absolutely.
00:51:38.000 And they have so many people working there.
00:51:40.000 You have a whole giant building filled with people, and then your product, no one wants it.
00:51:48.000 People are just accidentally watching it.
00:51:52.000 Flipping through the channels they're watching.
00:51:54.000 There's nothing compelling that they have to offer, yet they are in the business of selling compelling information.
00:52:02.000 You're literally the most compelling thing, because the news is supposed to be one of the most compelling things.
00:52:08.000 Everybody traditionally would come home and watch the evening news because you need to know what the fuck is going on in the world.
00:52:13.000 But now because of social media and because of just websites and phones and just news off your apps, the different apps that people use, Google apps, no one cares anymore.
00:52:24.000 So you're just howling into the wind.
00:52:27.000 It's something...
00:52:28.000 It's background you see at an airport, maybe?
00:52:32.000 Yeah.
00:52:32.000 Right?
00:52:33.000 Right.
00:52:33.000 But yeah, you're right.
00:52:34.000 They have a huge building in New York, too.
00:52:37.000 You know, the Time Warner Center and...
00:52:39.000 Bizarre.
00:52:41.000 And they're not even in the top 20, I think, of cable news shows anymore, right?
00:52:46.000 So...
00:52:47.000 And then look what happened with CNN+. I mean...
00:52:50.000 You know, they went and they hired Chris Wallace and they were going to launch this big subscription service, CNN Plus, where I guess the idea was they were going to get people to pay to watch the same stuff they were already refusing to watch for free.
00:53:09.000 Then they had to cancel the service after three weeks.
00:53:13.000 They spent hundreds of millions of dollars on it.
00:53:15.000 I mean, how could they have not seen that?
00:53:17.000 How could they think that people wanted to pay for something that they don't like for free?
00:53:24.000 I don't know.
00:53:25.000 I don't know.
00:53:25.000 But what's clear is that they don't have any conception of where the way out is for them, right?
00:53:34.000 Well, they're like blockbuster video.
00:53:36.000 There's no way out.
00:53:38.000 Well, I think there would be a way out if they started actually doing their jobs, but they just don't know what that is anymore.
00:53:44.000 Well, hasn't the new guy said they want to switch away from opinion, editorial-type news stories and public people to people that just disseminate objective views of information?
00:53:56.000 Just like this is what's happening in the world.
00:53:57.000 This is what's going on.
00:53:58.000 This is the crash that happened.
00:54:00.000 This is the that.
00:54:01.000 They have said that.
00:54:03.000 But is it too late?
00:54:05.000 The problem is like people now have associated CNN with bullshit propaganda.
00:54:10.000 Right.
00:54:11.000 They treat you like you're dumb.
00:54:12.000 They think you're stupid.
00:54:13.000 They think, you know, that you could just tell people that someone's taking horse dewormer and you could just repeat it over and over again and people believe it.
00:54:21.000 I mean, the amount of damage that they did to their own reputation saying things like that. Because most people would look at that and go, is he really doing that?
00:54:30.000 And then some people would go, well, that's not even true.
00:54:33.000 Like, wait a minute, CNN says this?
00:54:35.000 What else are they lying about?
00:54:36.000 What about international stories?
00:54:40.000 What about financial stories?
00:54:41.000 What about things that have to do with crypto?
00:54:44.000 What narratives are they spitting out that are just bullshit?
00:54:48.000 Right, right.
00:54:49.000 And there's a lot of them, you know?
00:54:52.000 Yeah.
00:54:53.000 You remember when the Bounty Gate story came out and then...
00:54:57.000 Bounty Gate?
00:54:58.000 What's Bounty Gate?
00:54:58.000 Bounty Gate was this weird story that came out, I think it was in 2020, when basically they were reporting that Russians were paying bounties in Afghanistan to kill American soldiers.
00:55:12.000 Yeah.
00:55:13.000 And it turned out to be like, you know, basically one theory that somebody within the intelligence community was positing.
00:55:26.000 The army itself came out a couple of months later and said, yeah, we don't really have evidence for this.
00:55:31.000 And then, you know, a year later they came out even more strongly and said, you know, we can't back that up.
00:55:38.000 Well, here's a crazier one.
00:55:40.000 The Russiagate.
00:55:42.000 That's the craziest one.
00:55:43.000 Absolutely.
00:55:43.000 The fact that they pushed that for three years, and they've never come out and said, we were misinformed.
00:55:49.000 That is not the case.
00:55:51.000 There really wasn't this crazy collusion between Russia and Donald Trump.
00:55:56.000 And in fact, there was some information that seems to point to Hillary Clinton having had involvement with Russia too, and that they've kind of all had involvement with Russia.
00:56:05.000 And this wasn't some grand conspiracy to elect a Russian puppet as the President of the United States.
00:56:11.000 Sorry.
00:56:12.000 Yeah.
00:56:13.000 It was a three-and-a-half-year sort of mass hysteria experiment, right?
00:56:21.000 I mean this is one of the things – it's one of the reasons I got kind of quietly moved out of mainstream journalism, right?
00:56:29.000 I didn't have a particular problem at Rolling Stone.
00:56:35.000 Early on in the Trump years, I said, there's something wrong with the story.
00:56:41.000 I think there are elements of it that aren't provable.
00:56:45.000 I don't think we should be running this stuff, you know?
00:56:47.000 And then before I knew it, I was working independently.
00:56:52.000 But anyway, in the Twitter files, we're finding stuff that now tells you the truth.
00:57:00.000 What the truth actually was during that time.
00:57:03.000 Like, for instance, one of the big Russiagate stories was from early 2018 when Devin Nunes – remember, he was the Republican congressman.
00:57:12.000 He was the head of the House Intelligence Committee at the time.
00:57:15.000 He wrote a memo basically saying, we think they faked FISA applications.
00:57:23.000 We think the FBI used the Steele dossier to try to get surveillance authority against some Trump people like Carter Page.
00:57:33.000 And we think they lied and cheated to do that.
00:57:38.000 And so he submitted this classified memo and not only was he denounced everywhere as a liar and wrong and all that, but there was this big story that was all over the place that the hashtag #ReleaseTheMemo had been amplified by Russian bots.
00:57:58.000 You probably don't remember this, but this story was everywhere in January and February of 2018. This idea that #ReleaseTheMemo was basically a Russian operation and that Nunes was benefiting from it.
00:58:16.000 Well, I'm reading the Twitter files.
00:58:18.000 I was looking for something else entirely and then suddenly we come across a string of emails internally at Twitter where the Twitter officials are saying, you know, we're not finding any Russians at all behind this hashtag and we told the members of Congress who asked about this that there are no Russians involved in this because Dianne Feinstein,
00:58:45.000 Richard Blumenthal of Connecticut, they all came out with this accusation about it being linked to Russia.
00:58:52.000 We told them that there's nothing there and they went and they did it anyway, you know?
00:58:56.000 And so there are lots of stories like that now that are kind of falling apart, right?
00:59:02.000 Most people I think don't even know that the Russia collusion thing was bullshit.
00:59:06.000 I think the general public that heard that Russiagate narrative, the people that haven't looked into it past what they've seen on television, probably still believe there's some sort of collusion.
00:59:17.000 Yeah, because there's never been a reckoning for it.
00:59:21.000 After the WMD thing, which went on for a surprisingly long time considering how little evidence there ever was for that, and considering that there were lots of journalists at the time who would have liked to have proved Bush wrong about that,
00:59:42.000 it still took years and years and years for the business to admit that they screwed that up.
00:59:48.000 They blamed it almost entirely on one person, Judy Miller from the New York Times.
00:59:55.000 Really?
00:59:55.000 Yeah.
00:59:56.000 Other people who got that story just as wrong, like Jeffrey Goldberg.
01:00:01.000 He's now the editor of Atlantic Magazine.
01:00:03.000 There are all kinds of people who totally screwed that story up and got promoted.
01:00:10.000 But there was at least a little bit of reflection about getting a big story wrong.
01:00:16.000 But that's such a big story.
01:00:18.000 Right?
01:00:19.000 That's such a big story.
01:00:20.000 Yeah.
01:00:21.000 The fact that there really were no weapons of mass destruction and we really did start a war for nothing that really did kill somewhere in the neighborhood of a million innocent people.
01:00:30.000 Right.
01:00:31.000 It's over a fake news story.
01:00:33.000 Yeah, over a fake news story.
01:00:34.000 I mean, there should be sorrow within news organizations about a mistake of that magnitude.
01:00:43.000 And the fact that there's no repercussions, and that the people who pushed that very same story, the story that led to the public support of the war, were promoted.
01:00:54.000 Yeah, and not only did we promote the people who got that story wrong, except in that one case with Judy Miller who was sort of villainized.
01:01:07.000 But people were fired who were questioning it, like Phil Donahue had a show, a very highly rated show on MSNBC at the time.
01:01:16.000 He lost his job.
01:01:18.000 What did he say?
01:01:19.000 He was just very critical of the war effort.
01:01:21.000 He didn't believe the whole thing.
01:01:23.000 And that's why he lost the show?
01:01:26.000 Yeah, they took him off the air.
01:01:28.000 Jesse Ventura will tell you the story.
01:01:30.000 He lives in a compound in Mexico, Casa MSNBC, he calls it, because they hired him thinking that because he was a former Navy SEAL, he was going to be pro-war.
01:01:44.000 When they found out on the phone that he didn't feel that way, that he was very skeptical of the whole thing, they basically bought out his contract.
01:01:55.000 They just paid him the balance and said, thanks, but no thanks.
01:02:01.000 We're not going to want that show of yours.
01:02:04.000 So he was right, but instead of going on the air, he got a mansion in Mexico.
01:02:13.000 So, you know, the business has a history of doing stuff like this.
01:02:20.000 But at least in the WMD episode, they had the decency to admit now, you know, like a decade later, we screwed that up.
01:02:32.000 It has the reputation of being a media mistake.
01:02:35.000 They haven't done that yet with the Russia stuff.
01:02:38.000 Well, the WMD thing, though, there's no repercussions because over time everybody had kind of either forgotten about it or been overwhelmed by news stories.
01:02:46.000 And when the WMD came out, when that sort of thing came out at the beginning of the war, you're also dealing with a very different internet and the news cycle wasn't as extreme and dynamic.
01:02:58.000 Like nowadays, things that happen like no one gives a shit that Epstein was murdered and that the cameras were shut off and that there's no list of all the people that went to the island.
01:03:08.000 That's just gone.
01:03:09.000 There's too much new stuff to come that is in front of your eyes that you have to pay attention to.
01:03:13.000 Right.
01:03:14.000 That's the news cycle.
01:03:15.000 We do need a few more answers on that one, I think.
01:03:18.000 You think?
01:03:18.000 Yeah.
01:03:19.000 Yeah.
01:03:19.000 I mean, it's wild.
01:03:21.000 So that's the news cycle of today that is just overwhelming and then it just keeps coming forward and you can't stop it.
01:03:28.000 You know, when you were working at Rolling Stone, did you interact with Jan Wenner a lot?
01:03:33.000 Yeah.
01:03:33.000 I did.
01:03:34.000 Did you see my conversation that I had with him?
01:03:36.000 Yeah, yeah, absolutely.
01:03:37.000 Did you think that was kind of sad in a way?
01:03:40.000 Very, very young, yeah.
01:03:42.000 I mean, look, Jan and I, Jan was always good to me, you know.
01:03:48.000 He didn't agree with my politics.
01:03:50.000 He didn't agree with my approach to the job a lot.
01:03:54.000 And I know that my stories got him in a lot of trouble socially.
01:03:58.000 So that came out from time to time.
01:04:00.000 But he never went to the step of firing me.
01:04:04.000 And he let me do the stories that I wanted to do for the most part.
01:04:10.000 But I think, as you found out, somewhere along the line, he lost interest in being part of a real actual journalistic venture,
01:04:28.000 right?
01:04:28.000 Yeah.
01:04:28.000 I think he has a hard time concentrating on the nuances of all these different things and balancing out – like when he was talking about the government regulating the internet.
01:04:41.000 That was the most shocking.
01:04:43.000 Look, I was a fan of the guy because I've always been a fan of Rolling Stone.
01:04:47.000 And I'm a giant fan of Hunter S. Thompson.
01:04:49.000 And I knew he and Hunter had this very close relationship.
01:04:53.000 It was complicated, but yeah.
01:04:54.000 I'm sure.
01:04:54.000 But I wanted to talk to him just about that.
01:04:57.000 And I enjoyed that.
01:04:59.000 But then to say you were a fan of that guy, and now you're talking to me.
01:05:08.000 About the government censoring the internet.
01:05:11.000 And when I was like, you're talking about the same people that gave us bad information about weapons of mass destruction that led to a war.
01:05:19.000 And then he's sort of balancing that out.
01:05:21.000 No, those were politicians.
01:05:23.000 That's the government.
01:05:25.000 Right.
01:05:25.000 You're talking about the government.
01:05:27.000 Yeah.
01:05:28.000 To see him try to wrestle with that in his head and to realize he had never wrestled with it before.
01:05:34.000 Yeah.
01:05:35.000 Yeah.
01:05:35.000 That's what was fucked.
01:05:37.000 It's like, I mean, is he just tired?
01:05:40.000 Is he just older?
01:05:41.000 Or is he just like completely insulated in those liberal cocktail parties where, you know, you have to wear a mask and you have to say this.
01:05:49.000 And if you don't say that, you'll be ostracized from the social group.
01:05:53.000 Like what, what kind of narrow bandwidth are you operating on?
01:05:59.000 Like that you could just say that.
01:06:01.000 Yeah, I mean, again, I feel bad because, you know, I'm living in a home probably that was paid for by Jan Wenner and all that stuff.
01:06:09.000 And he was, like I said, he was always good to me personally, even though we had some pretty intense disagreements and arguments and there was a lot of yelling.
01:06:22.000 That went on.
01:06:23.000 But I think, you know, what happens is that, yeah, you do end up in a bubble.
01:06:33.000 And even people who spent their whole lives in the journalism business, and not just in the journalism business, but rock and roll journalism, right?
01:06:44.000 You're supposed to be kind of on the edge, right?
01:06:48.000 And Hunter S. Thompson was...
01:06:50.000 Was completely out of control.
01:06:53.000 His writing was wild and free.
01:06:56.000 That was what was beautiful about it.
01:06:57.000 And it took a lot of guts to publish that.
01:07:01.000 And to send somebody like that on the campaign trail was a revolutionary idea at the time.
01:07:09.000 But Jan...
01:07:14.000 And this came out in 2016 because he endorsed Hillary...
01:07:20.000 I asked permission to write a counter to that and endorse Bernie.
01:07:28.000 But his whole reasoning was, when we were young and we were supporting McGovern, we were wrong, because McGovern was a bad candidate and Nixon got elected.
01:07:41.000 We needed to support somebody else who had a better chance of winning.
01:07:46.000 And so his whole idea was that youthful idealism is nice and all, but it's not pragmatic, right?
01:07:54.000 And this was the place that he had ended up in.
01:07:57.000 And that leads you to other things, like internet censorship.
01:08:03.000 One of the reasons that...
01:08:05.000 One of the first signs that I knew that I wasn't going to have a future at the magazine is when he told me I just flat out shouldn't touch the Russia story anymore.
01:08:17.000 This was in the first year of the...
01:08:27.000 I think there's also just being the boss.
01:08:46.000 Yeah, and you're insulated in the fact that no one wants to challenge you.
01:08:53.000 Yeah.
01:08:54.000 And look, most of the people who worked at Rolling Stone had kind of a love-hate relationship with Jan.
01:09:02.000 Like, he could be tempestuous and he could be...
01:09:07.000 He would go into fits of anger and things would happen.
01:09:12.000 But on the other hand, like, there was a lot of great journalism that came out of Rolling Stone.
01:09:17.000 And he had...
01:09:20.000 You know, he had a really sort of brilliant sense of how magazines worked and what audiences would like in magazines.
01:09:30.000 And that mechanism in his head functioned extremely well for like 40 years or so.
01:09:37.000 And so, you know, people respected him as a leader on that front.
01:09:44.000 But, you know, over time the magazine, you know, it started to become, I don't know, it lost its sense of purpose.
01:09:57.000 And yeah, it became something other than what it had been, right?
01:10:03.000 It was a symbol of rebelliousness once.
01:10:05.000 And now it's the opposite.
01:10:08.000 It is the opposite.
01:10:09.000 Yeah, it was sad to me to have that conversation with him.
01:10:13.000 Part of it I really enjoyed, talking about Hunter and talking about the early days of the magazine, what it was like to take a chance on a magazine like that in this counterculture environment that they found themselves in.
01:10:25.000 But then, you know, sometimes people just get tired, man.
01:10:30.000 They just get tired and they get old and they kind of give in to narratives.
01:10:33.000 They don't want to explore the subtle nuance of each individual topic because sometimes those are uncomfortable.
01:10:40.000 And you have to wrestle with those thoughts.
01:10:42.000 And sometimes people would rather just medicate themselves and go to sleep.
01:10:45.000 Yeah, and also, you can also get used to not answering difficult questions, which also came through in that interview, I think.
01:10:53.000 Right.
01:10:53.000 Because you're the boss.
01:10:54.000 You're the boss, right?
01:10:55.000 Like, you're used to being thrown, you know, a whole bunch of softballs.
01:11:01.000 And, you know, if you're not in that place, you know, yeah, that can be difficult, too.
01:11:08.000 I have a good friend who used to be an executive at Google and we were at a party.
01:11:13.000 And it was me and my wife and this lady who was one of the big wigs at YouTube.
01:11:19.000 And so she sits down with us and we're talking and she's, you know, asking me about, you know, podcasting and this and that.
01:11:26.000 And we're having this conversation.
01:11:27.000 And I say, how does YouTube decide what gets marked as bad?
01:11:35.000 Because there's a conversation between Sam Harris and Douglas Murray, two public intellectuals, that someone put on their YouTube playlist.
01:11:44.000 They didn't even make this conversation.
01:11:47.000 They just like, this is something that I have on my channel, on my playlist.
01:11:51.000 And they got flagged for that.
01:11:55.000 And yeah, they got a strike against their channel for that.
01:11:59.000 And I said to her, I go, why would that be?
01:12:01.000 And she goes, because it was hate speech.
01:12:03.000 And so I go, how did you just say that?
01:12:05.000 I go, how did you just say it was hate speech?
01:12:07.000 I go, do you know the content of the conversation?
01:12:09.000 You're talking about two public intellectuals.
01:12:11.000 Right.
01:12:12.000 And you just said it was hate speech.
01:12:13.000 And my wife is like squeezing my knee because she sees I'm getting red.
01:12:17.000 You know, I'm like, you just said it was hate speech.
01:12:19.000 I go, do you understand what they talked about?
01:12:21.000 Do you know what they talked about?
01:12:23.000 Why did you just say that?
01:12:24.000 But she was just like, what they said was hate speech.
01:12:27.000 She just had this arrogance because she was in this position of power where she could say, that's hate speech, that's not hate speech.
01:12:34.000 And this was quite a while ago.
01:12:36.000 It was like 2015, 16. Oh, wow, so it's early.
01:12:39.000 Yeah, early.
01:12:40.000 Wow.
01:12:41.000 But her saying, because it was hate speech, like this look in my, that's what it is.
01:12:45.000 Do you remember that commercial where it's like during the drug war, the height of the drug war propaganda, and during the- The brain on drugs thing?
01:12:55.000 No, no, no.
01:12:56.000 The one where there's a man and a younger man, and they're eating dinner, and he's saying, if you buy drugs, you support terrorism.
01:13:04.000 And he's like eating his food at a steakhouse.
01:13:07.000 He goes, what do you mean it supports terrorism?
01:13:09.000 It just does.
01:13:09.000 It just does.
01:13:10.000 You're supporting terrorism.
01:13:12.000 It's like this no-nonsense guy, and then this young goofy guy like, what are you saying?
01:13:17.000 I just like to smoke pot.
01:13:19.000 If you buy drugs, you support terrorism.
01:13:21.000 You ever seen that commercial?
01:13:22.000 I don't remember it.
01:13:24.000 Let's play it.
01:13:24.000 Play it from the beginning.
01:13:25.000 Play it from the beginning, because this is...
01:13:27.000 It's a ploy.
01:13:28.000 What?
01:13:29.000 This drug money funds terror.
01:13:31.000 It's a ploy.
01:13:32.000 A ploy.
01:13:33.000 A manipulation.
01:13:35.000 Ploy.
01:13:36.000 Drug, money, funds, terror.
01:13:37.000 I mean, why should I believe that?
01:13:39.000 Because it's a fact.
01:13:40.000 A fact?
01:13:41.000 F-A-C-T fact.
01:13:43.000 So you're saying that I should believe it because it's true.
01:13:47.000 That's your argument.
01:13:48.000 It is true.
01:13:53.000 Says who?!
01:13:54.000 Says who, motherfucker?
01:13:57.000 And that's me and that lady from YouTube.
01:13:59.000 Because it's hate speech.
01:14:01.000 Because it is hate speech.
01:14:02.000 Because it's true.
01:14:03.000 While you're eating salad.
01:14:04.000 What the fuck?
01:14:05.000 She was probably eating too, right?
01:14:06.000 Yeah, she was eating.
01:14:07.000 Yeah, we were eating.
01:14:09.000 I was so red.
01:14:10.000 I was so hot.
01:14:11.000 I was like, what?
01:14:12.000 And this is before I was even getting censored on YouTube.
01:14:14.000 This was in the early days.
01:14:15.000 Like, you know, no one gave a fuck about what I was doing.
01:14:19.000 Right.
01:14:19.000 And sitting there with this woman, telling her about this interesting conversation about the change.
01:14:26.000 This was like when he was doing The Strange Death of Europe, that book, Douglas Murray's book.
01:14:31.000 Uh-huh, uh-huh.
01:14:32.000 Yep.
01:14:33.000 And, you know, they were having this conversation about what happens when religious ideology starts to change the environment of these European cities.
01:14:41.000 But that fucking attitude that people have, like, because it's true.
01:14:46.000 Because it is true.
01:14:47.000 Fucking says who?
01:14:48.000 You can't just say that.
01:14:50.000 You tell me how.
01:14:51.000 You tell me how you know.
01:14:52.000 We're going to talk forever, motherfucker.
01:14:54.000 We're going to sit here forever.
01:14:55.000 I'm not going to be done with this dinner.
01:14:57.000 I'm going to put my fork down.
01:14:59.000 You tell me.
01:15:00.000 You fucking tell me.
01:15:01.000 How is it funding drugs?
01:15:03.000 Is it funding terrorism?
01:15:04.000 How is that fucking conversation between Douglas Murray and Sam Harris?
01:15:08.000 How's that hate speech?
01:15:09.000 Right, but they're used to not having to go past that first part of the conversation.
01:15:15.000 Exactly.
01:15:16.000 My wife was squeezing my knee, she was like squeezing my knee under the table.
01:15:23.000 She's like, oh Jesus, I know what's happening here.
01:15:26.000 Right, right.
01:15:27.000 No, but the people, and that was another strange thing about my experience at Rolling Stone.
01:15:36.000 Like, early, I guess it was 2017, 2018, when they first started to really aggressively police the internet, I did a story about how they wiped out a bunch of accounts. This was after the Alex Jones thing.
01:15:56.000 Facebook just sort of zapped a whole bunch of accounts.
01:16:01.000 Some of them were just sort of ordinary, hardworking people who had built up these independent media channels.
01:16:08.000 The company just sent them notices, you are engaged in coordinated inauthentic activity and your page is down.
01:16:14.000 This is after they'd spent tens of thousands of dollars on Facebook ads to build up their pages and everything.
01:16:22.000 They weren't bots, they were real people.
01:16:24.000 And not only could I not convince other people in the business that it was a significant story that these companies were now doing this, but within Rolling Stone, you know,
01:16:40.000 the story, the headline had to be refashioned.
01:16:46.000 If you look at it now, the story is called, Who Will Fix Facebook?
01:16:51.000 Because they wanted to imply that the problem was that Facebook was out of control and needed to be policed more.
01:17:01.000 My headline that I submitted was very different.
01:17:04.000 It was something like, you know, censorship on Facebook is out of control or whatever it is.
01:17:09.000 But this belief that the censorship is a good thing, that we need more of it, I just think it became an upper class kind of New York, Washington, just cocktail party belief,
01:17:28.000 right?
01:17:29.000 I mean, it was something you started to hear from people right around the time that Trump got elected.
01:17:34.000 Oh, we just need more of that, you know?
01:17:36.000 We have to do something to reckon with all those people or whatever it is.
01:17:40.000 And they haven't let go of it, I don't think, have they?
01:17:43.000 I don't know.
01:17:44.000 Have they?
01:17:45.000 Yeah.
01:17:46.000 I think they're still in that place for the most part.
01:17:53.000 This move toward that World Economic Forum version of a more regulated internet, I think we're only in the beginning stages of that.
01:18:06.000 I think they're going to take many steps, you know, that are going to be much more significant in the future to try to prevent, you know, things like your show from breaking out, right?
01:18:19.000 Like, they're not going to want that in the future.
01:18:22.000 Well, how does the World Economic Forum actually wield influence?
01:18:26.000 Like, what works there?
01:18:28.000 How does it work?
01:18:29.000 One thing we know about this year's event is – we know two things.
01:18:33.000 One, that hookers are going there in droves.
01:18:35.000 There was an article about the increase in the population of prostitutes, which makes sense.
01:18:42.000 Yeah, it makes a lot of sense.
01:18:43.000 Yeah, when the boys are away.
01:18:45.000 And then the other thing was that Klaus Schwab and George Soros decided to opt out this year.
01:18:53.000 Which is interesting.
01:18:54.000 That is interesting.
01:18:55.000 Maybe it's getting too hot.
01:18:56.000 Because it used to be a thing that really didn't get much public attention.
01:18:59.000 They could go there and they could all have these meetings and decide the fate of the world and try to sort of move the world in general directions.
01:19:08.000 And then there was also Michael Schellenberger released a bunch of stuff this week showing that they lie about things that they've said that have become very problematic.
01:19:18.000 One of them was you will own nothing, you'll be happy.
01:19:21.000 And they said, no...
01:19:23.000 That was all just internet conspiracy theories.
01:19:27.000 It's not true.
01:19:27.000 But it is true.
01:19:28.000 And then there were websites.
01:19:29.000 They had a whole advertising campaign based on this.
01:19:34.000 They did that.
01:19:34.000 They said that.
01:19:35.000 They absolutely put it up.
01:19:37.000 They really did.
01:19:38.000 Or I think another one was eating bugs, right?
01:19:42.000 Yes.
01:19:42.000 Yeah, exactly.
01:19:44.000 Well, I don't have a problem with eating bugs.
01:19:47.000 I do have a problem with people trying to say what is good and is not good for the world when I know that if you say it is good, it's gonna benefit enormous groups economically, and it's gonna lock other people out.
01:20:04.000 And I think that's what they're doing with things like plant-based meat.
01:20:07.000 When all those people were saying plant-based meat is the future, like, the fuck it is.
01:20:11.000 It's really bad for you.
01:20:13.000 It's really bad for you.
01:20:14.000 Not only that, it's monocrop agriculture, which is terrible for the land.
01:20:19.000 It's terrible for living creatures.
01:20:21.000 This idea that if one life equals one life, you're way better off buying cows and eating cows than you are buying corn.
01:20:29.000 Because in order to grow a stalk of corn, a lot of shit has to die.
01:20:33.000 And if you're using monocrop agriculture and using industrialized farming methods, and you're controlling enormous swaths of land with only one crop, that is totally unnatural, doesn't exist anywhere in nature, and in order to do that, you have to poison everything else.
01:20:47.000 You have to kill all the animals.
01:20:49.000 You have to strip the topsoil.
01:20:52.000 You have to use industrialized fertilizer.
01:20:54.000 You can't grow things that way normally.
01:20:57.000 That's why there's only—using these industrialized methods, there's only like 60 more crop cycles that they can do.
01:21:09.000 Like, there's only a certain amount of topsoil that's even left that's viable to grow food on because they don't use regenerative agriculture anymore.
01:21:19.000 The people at White Oak Pastures and Polyface Farms, like Joel Salatin and Will Harris, these grizzled old farmers who use these regenerative methods that are, like, almost boutique, they're very rare now.
01:21:35.000 But they're more popular than ever because people are aware of them, but most of the stuff that you buy is using industrialized fertilizer.
01:21:43.000 What these people are doing is they're letting cows graze, they take the manure, they use the manure as fertilizer, chickens roam the land, chickens peck the bugs and eat the stuff.
01:21:55.000 Pigs roam, and then they cycle where these animals are.
01:21:59.000 So what they're essentially doing is they're recreating nature in a contained environment, and that is actually carbon neutral.
01:22:06.000 It actually sequesters carbon in the soil in a lot of cases.
01:22:10.000 But if you want to buy plant-based food and plant-based meat, you're not getting that.
01:22:16.000 You're supporting monocrop agriculture, industrialized farming, and you're supporting very unhealthy food.
01:22:24.000 And the idea of a small group of people who meet in a ski resort town in Switzerland making the decisions about this for people all around the world.
01:22:38.000 And they're doing it on private jets.
01:22:40.000 He was there, I think, or he was never supposed to be there.
01:22:42.000 Bill Gates and Klaus Schwab were there.
01:22:44.000 It says, Post falsely claimed Bill Gates withdrew from Davos Forum.
01:22:49.000 Yeah, but Klaus Schwab did withdraw.
01:22:51.000 I think he was there, though.
01:22:52.000 Well, he said he couldn't make it.
01:22:54.000 It was a public release, and George Soros also said he couldn't make it.
01:22:58.000 Okay.
01:22:59.000 I mean, you could Google that.
01:23:00.000 I was looking.
01:23:01.000 I thought I saw a video of him talking.
01:23:03.000 Well, he's definitely talked at it before.
01:23:05.000 It could be from another time.
01:23:06.000 Klaus Schwab?
01:23:07.000 I think there was a thing that said that due to a scheduling conflict, he couldn't make it.
01:23:12.000 I thought so too.
01:23:12.000 I remember reading that too, but now I'm Googling it and it's like it doesn't come up.
01:23:17.000 Like I'm looking for did Klaus Schwab leave Davos?
01:23:19.000 Is he not there?
01:23:21.000 I don't get a report.
01:23:23.000 Did you Google did Klaus Schwab opt out of the World Economic Forum this year?
01:23:31.000 What?
01:23:31.000 Like, who organized this thing?
01:23:35.000 That's what's super bizarre.
01:23:37.000 Like, these billionaires, and then Justin Trudeau, and then they all fly there.
01:23:42.000 This is what happens when I Google that.
01:23:44.000 Did Klaus Schwab, you spelled Schwab wrong, but it's okay, opt out, is that how you spell his name?
01:23:51.000 It gives you the right answer.
01:23:52.000 It knows what you're looking for.
01:23:53.000 Did Klaus Schwab, what does it say?
01:23:58.000 It doesn't say that he's not there.
01:24:00.000 It just sort of seems like he's there.
01:24:04.000 I don't know.
01:24:05.000 It's confusing.
01:24:05.000 That's why I was trying to just clarify.
01:24:08.000 I mean this is part of this whole infrastructure with Aspen Institute, World Economic Forum.
01:24:16.000 What kind of influence do they actually have?
01:24:18.000 Like how do they – when we have young global leaders, when he talks about like Trudeau being one of our young global leaders and this is what they do.
01:24:27.000 They get their young global leaders that are indoctrinated into the World Economic Forum's ideas and they implement them in politics.
01:24:35.000 Yeah, it's the same thing as Justin Timberlake being a Mouseketeer and then later on he gets to have a real career in entertainment.
01:24:42.000 I mean, it's the same exact concept.
01:24:48.000 They bring people along.
01:24:52.000 There's a feeder system for how people become powerful politicians.
01:24:55.000 We've seen how it works.
01:24:59.000 If you want to be a financial regulatory official, you run a desk at Goldman Sachs for a few years.
01:25:05.000 Next thing you know, you'll be running the World Bank, or you'll be the chief economist of the World Bank, or the chief economist of the ECB or the Bank of Canada or whatever it is.
01:25:23.000 There's just all these places where politicians come from.
01:25:27.000 You do a tour in the military, maybe even in the CIA. Maybe you work for a consulting firm like McKinsey.
01:25:36.000 You do a little time working for this or that politician as an aide and then they raise some money for you to become a candidate in Congress and next thing you know, you're running for president.
01:26:08.000 This idea that leaders from all over these countries are getting together and setting an agenda that may be completely contrary to what people in the individual nations might want.
01:26:26.000 Yeah, that's upsetting.
01:26:29.000 That seems totally anti-democratic and disturbing.
01:26:35.000 Yeah, I don't know.
01:26:37.000 I have a lot of fears about that.
01:26:39.000 Yeah, as do I. And this is a fairly new thing in terms of the public zeitgeist.
01:26:45.000 Like, people didn't know about the World Economic Forum six, seven years ago.
01:26:48.000 At least most people didn't.
01:26:49.000 They didn't hear about it.
01:26:50.000 They didn't hear about Klaus Schwab and, you know, you will own nothing and you will be happy.
01:26:54.000 When you hear them say stuff like that and hear them talk about young global leaders, you're like, what is happening here?
01:27:00.000 Like, you seem like a bad guy in a science fiction movie.
01:27:05.000 Like, he wears that crazy outfit that I showed you, the picture we have in the bathroom.
01:27:09.000 Yeah, yeah, yeah.
01:27:09.000 Like, could you be any more obvious that you're fucking insane?
01:27:14.000 Like, you're an insane, megalomaniacal, dictator-type character who wants to run the world, and you're literally dressing like a Star Wars character.
01:27:24.000 Yeah, there's gotta be some weird sexual fetishism thing going on there, too.
01:27:28.000 Pull that picture up, Jamie.
01:27:29.000 You know that nutty picture.
01:27:30.000 That picture is so...
01:27:31.000 Every time I go to take a leak, I look at that picture.
01:27:33.000 That's why it's there.
01:27:34.000 Because I'm like, what kind of freak shit is that guy into when no one's around?
01:27:38.000 Because if you're dressing like that publicly, and you're telling people they're going to eat bugs, and that you're going to own nothing, and then when people catch you on it, you go, those are conspiracy theories, we have nothing to say to those things.
01:27:51.000 Like, look at that fucking outfit.
01:27:53.000 He looks like a space druid.
01:27:55.000 That's awesome.
01:27:56.000 It's amazing.
01:27:57.000 Yeah.
01:27:58.000 The fact that he chose to leave his house dressed like that, like, yes, I will address the peons.
01:28:04.000 All the public needs to know that we are in control.
01:28:09.000 Look at this guy, like that weird star on the right one and whatever the fuck it says on the left one.
01:28:14.000 Right.
01:28:15.000 Yeah.
01:28:16.000 Yeah.
01:28:16.000 But on his vest, like what is that thing on his vest?
01:28:20.000 Is that the same thing?
01:28:21.000 What is that symbol?
01:28:24.000 It looks like a sun.
01:28:25.000 Right, but what does it stand for?
01:28:26.000 I could look that up.
01:28:27.000 It's also a goat with a cross on its head.
01:28:29.000 Oh, a fucking goat with a cross on its head.
01:28:32.000 It's a bull, I think.
01:28:32.000 Whatever it is, a bull.
01:28:34.000 Jesus Christ, you fucking psychos.
01:28:35.000 Science, ingenuity, truth, I think is what that says.
01:28:38.000 I almost wish that I didn't have this podcast and I could just go and hang out with those people.
01:28:44.000 But doesn't that look like the outfit?
01:28:45.000 Oh, wow, look at that one.
01:28:47.000 I don't think that's him.
01:28:47.000 No.
01:28:48.000 Well, how about The Great Reset?
01:28:49.000 He wrote a book called The Great Reset, and then they denied that The Great Reset is a thing they're working towards.
01:28:55.000 Like, bro, you wrote a book.
01:28:56.000 Right, yeah.
01:28:57.000 You wrote a book.
01:28:58.000 It's called The Great Reset.
01:28:59.000 This is not like the fucking, like, the Steele dossier.
01:29:03.000 Like, Trump didn't write a book called The Steele dossier and say, I never peed on anybody.
01:29:06.000 Like, this is what you're doing.
01:29:08.000 Right, right.
01:29:08.000 Or isn't New World Order another term that they – you see it on the background of some of their events.
01:29:18.000 How do they have influence though?
01:29:20.000 Like other than like are they just financing politicians?
01:29:23.000 And so they have this meeting where they get together and say, oh, this is what we want you to do.
01:29:29.000 And it's just understood that if you follow those people and you do that, you'll have some sort of a career.
01:29:36.000 Like almost like a workshopping thing.
01:29:38.000 Like a conference for people to get together that are fucking aluminum siding salesmen.
01:29:41.000 And like they find out what's the new tech and what's the latest stuff and sales techniques.
01:29:47.000 I assume it just works the same way that think tanks work in the United States, right?
01:29:51.000 Like if you...
01:29:53.000 If you...
01:29:55.000 Here it goes.
01:29:56.000 It was founded on the 24th of January 1971 by German engineer and economist Klaus Schwab.
01:30:02.000 Jesus, he founded that in 71?
01:30:04.000 The foundation, which is mostly funded by its 1,000 member companies, typically global enterprises with more than 5 billion US dollars in turnover, as well as public subsidies.
01:30:16.000 I'd like to find out what those subsidies are.
01:30:18.000 Views its own mission as improving the state of the world by engaging business, political, academic and other leaders of society to shape global, regional and industry agendas.
01:30:31.000 Boy, does that sound gross.
01:30:34.000 Yeah.
01:30:54.000 Celebrities and journalists up to five days to discuss global issues across 500 sessions.
01:31:00.000 There was some guy who was trying to interview someone from MSNBC. And, you know, he was like some independent journalist guy.
01:31:08.000 And he was trying to talk to this guy in the street.
01:31:10.000 And the guy from MSNBC said something along the lines of someone should knock that fucking guy out.
01:31:18.000 Like, threatened...
01:31:20.000 This guy for asking him questions about speaking truth to power.
01:31:25.000 That's okay.
01:31:26.000 Thank you.
01:31:26.000 Where are you from?
01:31:27.000 Rebel News.
01:31:28.000 Yes, but what is your interest?
01:31:30.000 What is your question?
01:31:31.000 What do you mean?
01:31:32.000 I'm covering the news.
01:31:33.000 I'm doing what your bosses are supposed to be doing.
01:31:36.000 Okay.
01:31:38.000 Why did you get so upset?
01:31:39.000 What's he so scared about?
01:31:41.000 No, I'm not scared.
01:31:42.000 No, you, your boss.
01:31:43.000 He seemed really scared.
01:31:44.000 He ran in there and called you out.
01:31:46.000 We have to know who is out here.
01:31:49.000 My name is Avi Yemini.
01:31:50.000 I work for Rebel News.
01:31:52.000 We're reporters.
01:31:53.000 We do what CNBC is supposed to be doing.
01:31:56.000 And he seemed a bit upset that we were asking some questions in the public area outside.
01:32:01.000 Yeah, if you're here, it's a public area.
01:32:03.000 It's no worries.
01:32:04.000 That's private area from here on.
01:32:07.000 So you're doing fine.
01:32:09.000 Wish you a very nice day.
01:32:11.000 You too.
01:32:13.000 I think it keeps going, because this is where they threaten to punch him out.
01:32:17.000 What's CNBC doing here?
01:32:19.000 I can't ask you?
01:32:20.000 No.
01:32:21.000 I'm sorry you didn't put a camera in my face, thank you.
01:32:23.000 Really?
01:32:24.000 But you're here as an invited guest and you're an editor for CNBC. Don't you think that's a bit of a conflict of interest?
01:32:31.000 I'd like you to go away.
01:32:32.000 I haven't agreed to an interview.
01:32:33.000 If you're doorstepping me, go away.
01:32:37.000 Don't touch the mic.
01:32:39.000 You're meant to be speaking truth to power.
01:32:43.000 Are you here just to take your marching orders?
01:32:45.000 Is that what you're here for?
01:32:47.000 Do you want to go away?
01:32:48.000 Not really.
01:32:49.000 I'm here to do what you should be doing.
01:32:52.000 Yeah?
01:32:53.000 Please take this out of my mouth.
01:32:54.000 I'm gonna have to go ask for security.
01:32:56.000 Alright, do that.
01:32:57.000 I like how he's got the cigarette.
01:32:59.000 There you go.
01:32:59.000 Yeah.
01:33:00.000 So he goes inside.
01:33:01.000 Keep it going, because he goes inside, and that's when he says someone should knock this fucking guy out.
01:33:05.000 Escort me.
01:33:05.000 I want to hear what he actually says.
01:33:07.000 I'm paraphrasing.
01:33:08.000 And now he's calling security to escort me off the premises, Will.
01:33:14.000 Let's go.
01:33:14.000 Let's go.
01:33:19.000 Let's go.
01:33:21.000 What's the problem?
01:33:22.000 You're my problem.
01:33:23.000 You've been very rude to me this morning.
01:33:24.000 You haven't asked me anything, so I'd like you to take the camera off me.
01:33:27.000 I've literally asked you questions politely, which should be your job.
01:33:32.000 That guy just littered, you fucking piece of shit.
01:33:34.000 That's your job, sir.
01:33:35.000 I'm doing your job.
01:33:36.000 I'm just not getting paid for by Klaus Schwab.
01:33:40.000 You were inside as he walked in a bit upset.
01:33:43.000 What did you hear him say?
01:33:44.000 I heard him say, I'm going to paraphrase it because I don't have the exact thing, but he came in sounding quite angry, saying, I'm going to punch him out.
01:33:52.000 Paraphrasing there, it was knock him out or punch him out, but, you know, he wanted to hit you.
01:33:58.000 Yeah, and there's an actual recording of it, but whatever.
01:34:01.000 We get it.
01:34:02.000 We get it.
01:34:02.000 Is that same guy the guy that got set up by Jim Jeffries show back in the day?
01:34:08.000 The guy that he was talking to?
01:34:09.000 Avi Yemeni.
01:34:12.000 I'll check.
01:34:13.000 Please check.
01:34:14.000 I think he is.
01:34:15.000 I think he's a guy that got set up and they took a bunch of his words out of context and tried to pretend that he was saying something horrible and he wasn't.
01:34:22.000 And he had a recording of the entire event because he recorded on his cell phone knowing that they were going to set him up.
01:34:29.000 If it's not that guy, we have to edit this part out.
01:34:34.000 Activist exposes Jim Jeffries' deceptive tactics.
01:34:37.000 Yeah, that's him.
01:34:38.000 Yeah.
01:34:40.000 Yeah.
01:34:40.000 I actually have a story that's very similar to that.
01:34:43.000 One of the things I wanted to talk about...
01:34:45.000 Sorry.
01:34:45.000 Go ahead.
01:34:46.000 No, that was just autoplay.
01:34:48.000 So the first time I got sent to cover the presidential campaign for Rolling Stone was in 2004, and I was on the plane with Kerry, you know,
01:35:04.000 and it's teeming with journalists, obviously.
01:35:07.000 And there was a story that came out, probably everybody's forgotten it, but there was a story that turned out to be fake that Matt Drudge put out about, well, maybe it wasn't fake, but it was at least not proved, that Kerry had a secret mistress in Africa,
01:35:23.000 right?
01:35:24.000 And if you look this up, you'll find stories about it that were out there.
01:35:29.000 And Kerry came out in the morning and all the journalists were sort of peppering him with questions about the mistress.
01:35:40.000 And, you know, I don't care about John Kerry, but I thought it was odd that...
01:35:45.000 They went straight from reading something where there's no evidence to posing this question and having it on camera, right?
01:35:54.000 So I asked some of the journalists, and I was kind of the new kid, I said, Why were you doing that?
01:36:05.000 Like, on the basis of what were you asking that question?
01:36:08.000 And the minute they perceived that I was actually trying to ask another journalist a question, like, for a story, this one guy, he sort of steps in front of all the other ones and he says, dude, this is a fucking no-fly zone,
01:36:25.000 right?
01:36:25.000 Like, in other words, we don't cover each other in here.
01:36:29.000 That was the message.
01:36:33.000 From that point forward, I was always in the back of the plane with the tech people whenever I covered presidential politics because the press does not like it.
01:36:54.000 Even though it is a crucial part of the story, it denies that it has that role and it insists on not being covered.
01:36:54.000 And you can see how nervous these guys get when a camera's on them.
01:36:59.000 Like, oh my, you're putting a camera on me?
01:37:01.000 When I Google John Kerry's Secret Mistress Africa, it brings me down a John Edwards hole.
01:37:09.000 Which is fucking weird.
01:37:10.000 That is bizarre.
01:37:11.000 But when you put it in the Bing, I get a story.
01:37:14.000 Oh!
01:37:16.000 So Microsoft won't censor it?
01:37:19.000 But Google will?
01:37:20.000 There's a couple stories about that.
01:37:23.000 Like John Kerry stuff comes up when I look at that.
01:37:26.000 Different John Kerry stuff?
01:37:27.000 Yeah.
01:37:27.000 Where it doesn't redirect to John Edwards, you know?
01:37:30.000 So that's even weirder, right?
01:37:32.000 Like you're getting different versions of reality based on what search engine you're using?
01:37:36.000 Well, you most certainly do.
01:37:37.000 If you use DuckDuckGo, you just get what's out there.
01:37:40.000 And when you use Google, you get really – like I noticed that during the pandemic.
01:37:44.000 There was a doctor that had a heart attack immediately after taking his second shot of – I think it was Moderna.
01:37:52.000 And so I was like, what is that about?
01:37:54.000 And this was like very early on.
01:37:56.000 And I googled it.
01:37:58.000 I could not find it.
01:37:59.000 I could not find the story.
01:38:00.000 And then I went to DuckDuckGo immediately.
01:38:03.000 And I was like, whoa, this is wild.
01:38:06.000 Like they're hiding this story.
01:38:07.000 Right.
01:38:07.000 So they would hide certain stories because they thought that they would increase vaccine hesitancy.
01:38:13.000 Yeah.
01:38:14.000 Yeah.
01:38:15.000 See, that's...
01:38:16.000 It's terrifying.
01:38:17.000 Right.
01:38:18.000 You're carrying water for the pharmaceutical companies.
01:38:20.000 Right.
01:38:20.000 Which is really spooky.
01:38:46.000 Florida doctor.
01:38:48.000 Adverse reaction vaccine heart attack.
01:38:53.000 Instantly on DuckDuckGo, I get all these articles about this guy that died.
01:38:58.000 Couldn't find shit on Google.
01:39:00.000 Well, I mean, look, they've gotten very sophisticated in their ability to suppress certain things, you know?
01:39:08.000 And, you know, this is where you see the influence of, you know, how money works with the content suppression thing.
01:39:20.000 I mean, You take something like the Digital Forensic Research Lab for the Atlantic Council.
01:39:26.000 It's one of the things that these platforms use to decide whether or not a news story is true.
01:39:35.000 But if you look at where they get their money, it's the German Marshall Fund, which is a mishmash of sort of sovereign wealth funds and Fortune 500 companies.
01:39:51.000 So it's – you're paying for the fact check essentially, right?
01:39:57.000 Like that's how all of these sites that are allegedly deciding what's true and what's not, they're all influenced, you know?
01:40:06.000 And that's another thing that drives me crazy is this persistent belief that people have that you can objectively decide what is true and what is not somehow – Yeah.
01:40:34.000 Yeah, independent fact checkers.
01:40:36.000 Independent fact checkers review certain things and you find them on social media where they have a little warning or a little notification afterwards.
01:40:44.000 And you actually go down the rabbit hole and say, well, what have you done?
01:40:48.000 A lot of it is subjective.
01:40:50.000 They've just decided that this is not true or decided this is partially true.
01:40:54.000 Or it's missing context.
01:40:56.000 That's their favorite thing.
01:40:57.000 Yeah.
01:40:57.000 Missing context is great.
01:40:59.000 Missing context, right?
01:40:59.000 Like, oh, it's true, but here are eight reasons why you should think otherwise.
01:41:03.000 Like, you know, that's not our job.
01:41:06.000 And by the way...
01:41:08.000 Reporting by itself is fact-checking.
01:41:10.000 That's the whole point of it.
01:41:11.000 We don't need a separate thing called fact-checking to go with report.
01:41:18.000 That whole phenomenon drives me nuts.
01:41:21.000 It's weird, though, that we don't have – I mean, it used to be Snopes.
01:41:26.000 And a lot of people used to go to Snopes, but then I read about Snopes and you find out all the wacky shit about the people that are involved in Snopes and that the guy who's the head of it is like very heavily left-leaning and then he married a prostitute and like all kinds of wild shit.
01:41:42.000 It's like Snopes is not like some like rock-solid, independent, purely objective organization that is dedicated to the dissemination of truthful information.
01:41:51.000 Like no, they're like fucking heavily left-leaning.
01:41:54.000 Right, right.
01:42:08.000 I mean, every outlet is subjective, but that's why you have to allow them all.
01:42:13.000 You know what I mean?
01:42:15.000 We're grownups.
01:42:17.000 Let's read all the stuff and then we'll decide.
01:42:21.000 But they don't want to do it that way.
01:42:23.000 They want to have a hierarchical system that decides what's more...
01:42:29.000 You talk about Google's search engine.
01:42:33.000 They had a thing called Project OWL that they implemented in, I think it was 2017, where they changed their way of measuring what stories come up first.
01:42:48.000 And they shifted to a model that emphasized what they called authority.
01:42:53.000 And when I asked them what that meant, they told me that the analogy they gave was, think about If you search for baseball previously, you might have gotten your local little league.
01:43:05.000 Now you're going to get MLB.com, right?
01:43:07.000 So whatever we consider the more authoritative source, and that's based on surveys of people, what people think is authoritative, that's what's going to come up first.
01:43:19.000 So instead of, if you search for, let's just say, Trotskyism, instead of getting the world's leading Trotskyist website, Right?
01:43:29.000 Which is the World Socialist website.
01:43:31.000 You will get like a New York Times story about Trotskyism instead, right?
01:43:35.000 Because they want to push you towards the authoritative source.
01:43:39.000 But that's subjective, right?
01:43:42.000 And again, it's hierarchical.
01:43:48.000 And it's away from the spirit of how we would like to ingest information, which is just, let's see all of it and make our own decision.
01:43:57.000 And if you did come up with your own search engine or your own fact-checking organization that decided what's true or is not true, the real fear would be that that would eventually get compromised and that someone would come along and they'd pay for your advertising and do this and do that and then slowly but surely get their hooks into you.
01:44:15.000 Which is what they've done with Wikipedia.
01:44:17.000 Yeah.
01:44:18.000 Wikipedia was originally like this open source, you know, kind of free thing.
01:44:24.000 Now, like just, I mean, I'm discovering this now with the Twitter files.
01:44:28.000 You can't get Twitter files information into Wikipedia because they will not recognize what they call like a, I forget the term they use.
01:44:39.000 It's not an authoritative source.
01:44:40.000 It's like a recognized source or something like that.
01:44:43.000 So as long as the big newspapers don't cover it, they don't have a site that allows them to put it into Wikipedia, that allows the algorithm to put it in.
01:44:54.000 Has no mainstream media source covered the Twitter files?
01:44:58.000 Not really, no.
01:45:00.000 They've done hit pieces on me and on Elon and on Barry, but they haven't covered the stuff in the stories.
01:45:10.000 Which is wild.
01:45:11.000 That is wild.
01:45:22.000 I mean, the FBI and Homeland Security having a system of sending moderation requests to every, you know, internet platform in the country, the idea that that's not a news story is insane to me.
01:45:41.000 I can't even process that.
01:45:43.000 But you have to make a conscious decision to not do that story, which is what they've done.
01:45:49.000 Which is really indicting.
01:45:51.000 Yeah, and of course, you know, they've done a gazillion stories about, you know, how I've become this evil sellout right-wing character.
01:46:07.000 The Washington Post actually humorously described me as a conservative journalist, and they scrubbed it within a day because there was so much blowback on Twitter.
01:46:21.000 I don't care about that so much because I'm used to it by now, but it's a message that's sent to other journalists, which is, if you step outside the club, we're just going to dump buckets of shit on you all day long, and that's going to be your life forever,
01:46:38.000 right?
01:46:38.000 You have to get used to that.
01:46:44.000 That's a new part of the business.
01:46:46.000 Once, when you broke a big story, you got plaudits from your peers.
01:46:52.000 And now, you know, it's a very different thing.
01:46:58.000 But you still got to do it, definitely.
01:47:01.000 You still got to do it, but more importantly...
01:47:04.000 When they continue to do that and call someone like you a right-wing journalist or call someone a far-right this or an alt-right that, and then people objectively know that that's not true, then it undermines all of their credibility and slowly but surely dissolves all confidence that people have in every other story they come up with.
01:47:23.000 And that's what we're seeing.
01:47:24.000 That's why the rel... like, what the New York Times is today, to the generation that's coming up today. I used to deliver the New York Times just because it was the New York Times.
01:47:34.000 It wasn't even profitable for me.
01:47:36.000 I thought it was cool that I was delivering the New York Times because I delivered the Boston Globe and I delivered the Boston Herald and I got a route for the Times.
01:47:43.000 That's awesome.
01:47:44.000 And the Times was a pain in the ass because I had to drive. Like, if I was doing the Boston Globe, which was the most popular paper... What town were you in?
01:47:51.000 Boston.
01:47:51.000 Newton.
01:47:52.000 Newton.
01:47:53.000 So I would deliver.
01:47:54.000 I would get up every morning.
01:47:56.000 I'd work 365 days a year.
01:47:58.000 You have to deliver every day if you have a paper route.
01:47:59.000 I used to have a paper route in Massachusetts too.
01:48:01.000 So I would get up and I would go to the depot.
01:48:04.000 I'd pick up my heralds.
01:48:06.000 I'd pick up my globes at a different place.
01:48:08.000 Then I'd go and get my New York Times at a different place.
01:48:11.000 And the New York Times was a nightmare because, like, if I was delivering the Boston Globe, if I'm on one street, I might have ten houses on the street.
01:48:18.000 But the fucking Times, I might have one and then might have to go a mile to my next house.
01:48:26.000 You've got to carry the bag the whole way, right?
01:48:28.000 And I would try to coordinate my routes, right?
01:48:30.000 So I would have a route that was all the Globes, and then I would have a route that was the Heralds, and then I would have the Times.
01:48:37.000 And the Times was a nightmare, but I delivered the Times just because it was the New York Times.
01:48:41.000 Because you thought it was cool.
01:48:43.000 It was cool!
01:48:43.000 They had a blue plastic bag.
01:48:45.000 Everyone else had a white one.
01:48:47.000 And, you know, it's like, if that guy got the Times delivered, that's a smart dude.
01:48:51.000 That's a guy who's reading the New York Times.
01:48:52.000 And I read the New York Times.
01:48:54.000 So I was like, you know, I'm going to deliver the Times.
01:48:56.000 Like, I felt like I was a cooler person for delivering the New York Times.
01:49:00.000 That does not exist with the 21-year-olds of today.
01:49:04.000 When I was 21, that was like, I mean, I had aspired to be a more intelligent, more well-read person, and that to me was a symbol of that.
01:49:14.000 And I would get a free copy of it every day, and I would read it after I was done working.
01:49:18.000 That doesn't exist anymore.
01:49:20.000 Nobody gives a fuck about the New York Times, and they think about them as like some left-wing Tumblr blog.
01:49:26.000 That's what it is.
01:49:29.000 It's not even like the establishment paper.
01:49:35.000 No.
01:49:36.000 Like you say, it's like a Facebook group for a small group of wealthy people who all went to the same schools.
01:49:51.000 I don't know.
01:49:52.000 It's dull.
01:49:55.000 And uninformative at the same time.
01:50:15.000 These kids have access to all these independent people talking about things, whether they're doing it on YouTube or podcasts or Substack or whatever it is.
01:50:24.000 They get access to independent people that are talking about real information.
01:50:29.000 And every time the New York Times prints bullshit, every time the Washington Post prints bullshit.
01:50:34.000 It further undermines their credibility and further slides them down this inevitable road that they're on.
01:50:43.000 Right.
01:50:44.000 And it's a road to obscurity.
01:50:45.000 Nobody's going to care anymore.
01:50:46.000 It's like you're seeing it take place with CNN. You're seeing it take place with all these cable news networks.
01:50:51.000 You're seeing it take place with late night television.
01:50:53.000 Nobody gives a fuck about it anymore.
01:50:55.000 I know.
01:50:56.000 And it's slow. You have a bad business model and you're not adapting.
01:51:01.000 Yeah, the creativity's gone for late night comedy, right?
01:51:07.000 Like, that's not really a thing anymore.
01:51:09.000 You got Jimmy Fallon doing a song about different variants of COVID. Did you see that?
01:51:16.000 Could we see it, though?
01:51:18.000 Have you seen it?
01:51:20.000 I saw the Anti-Vax Barbie one, which was amazing.
01:51:24.000 I didn't see that one.
01:51:25.000 But this one is so strange.
01:51:27.000 It's like...
01:51:29.000 I mean, what did they drug him up with to get him to agree to do this?
01:51:33.000 Like, I want to be a fly on the wall in that meeting, where they go, okay, Jimmy, this is what we're going to do.
01:51:39.000 You're going to be singing about all the different variants, and it's like, here's a song, and you're going to be dancing and singing.
01:51:45.000 He's like, okay, okay, okay, okay, okay.
01:51:51.000 XBB.1.5, another brand of COVID-19 has arrived.
01:51:57.000 It's a new strain, but it isn't the same.
01:52:01.000 Sounds more like Elon Musk, his name.
01:52:04.000 It's XBB.1.5, not UB40 who sings Red Red Wine.
01:52:11.000 Put on your mask when you're inside a facility.
01:52:15.000 It could be a robot from a Star Wars trip.
01:52:20.000 God.
01:52:21.000 What is happening there?
01:52:22.000 So that's supposed to be...
01:52:24.000 Is that Devo that he's...
01:52:25.000 No.
01:52:26.000 Love Shack.
01:52:27.000 Yeah.
01:52:27.000 Oh, that's right.
01:52:28.000 It's B-52s, right?
01:52:30.000 No.
01:52:32.000 Who's Love Shack?
01:52:33.000 Was it B-52s?
01:52:34.000 That sounds right.
01:52:37.000 Yeah, it is a B-52s, right?
01:52:40.000 Yeah.
01:52:41.000 That's it, right?
01:52:45.000 Yeah, you're right.
01:52:46.000 Of course, of course, of course, yes.
01:52:47.000 Oh, God.
01:52:49.000 But that's just, like, straight up...
01:52:50.000 What is that?
01:52:51.000 That's insanity.
01:52:52.000 Like, look at his little dance.
01:52:54.000 Like, there's no enthusiasm in his face.
01:52:57.000 It's like he's drugged.
01:53:01.000 It kind of reminds you of a USO thing, right?
01:53:06.000 I don't know what it reminds me of.
01:53:08.000 It reminds me of madness.
01:53:10.000 It's just pure madness.
01:53:13.000 That's not why he got into show business.
01:53:17.000 Someone talked him into doing that.
01:53:18.000 That's not his idea.
01:53:20.000 Somebody had a meeting.
01:53:21.000 I don't think it's his idea.
01:53:22.000 30 million subscribers, and that video did not do very well.
01:53:25.000 104,000 views and 4,000 of them are us.
01:53:28.000 I know we've played it twice or three times now.
01:53:31.000 That's really...
01:53:32.000 Yeah, nobody gives a fuck about that.
01:53:33.000 That's nonsense.
01:53:34.000 How do they have that many subscribers, too?
01:53:36.000 They probably buy them.
01:53:37.000 That's the problem, too, is that you're finding out how many people buy social media followers.
01:53:41.000 Right, right.
01:53:43.000 Yeah, and that people were selling blue checkmarks at Twitter.
01:53:47.000 That's the other weird thing, where they were selling verified accounts.
01:53:52.000 So, like, you could pay someone, and they would get you verified.
01:53:55.000 People were spending, like, $5,000, $10,000...
01:53:58.000 How little of a life do you have to have for that to matter that much to you?
01:54:02.000 I guess if you're, like, an independent journalist or you're some sort of a YouTuber that's trying, like, if you have that check next to your name, that gives you more credibility.
01:54:12.000 I love when the fact they were removing check marks, like, we're going to take away your verification.
01:54:17.000 Oh, yeah, they did that to Thomas Chatterton Williams, remember that, after the Harper's letter?
01:54:22.000 No.
01:54:22.000 What was that about?
01:54:24.000 We still don't know.
01:54:25.000 They just took his checkmark away.
01:54:29.000 He was the guy who organized the Harper's letter, the pro-free speech declaration.
01:54:36.000 No, I'm not aware of that.
01:54:38.000 Or I forgot about it.
01:54:39.000 It was a petition.
01:54:44.000 Where he organized a bunch of high-profile people basically to say that, like, canceling is bad and we should all respect each other's opinions and, you know, support academic freedom and that sort of thing.
01:54:59.000 So we got people like Salman Rushdie and Noam Chomsky to sign onto it.
01:55:06.000 But then Bari Weiss and J.K. Rowling were also on the list, and it soon became a thing in the media that to be on the Harper's letter was like membership in a hate society.
01:55:28.000 And he was just absolutely dumped upon.
01:55:34.000 He was denounced as a racist even though he's black.
01:55:38.000 And he got his check taken away.
01:55:43.000 I don't know if he had it.
01:55:44.000 He was trying to get verified, it seems like.
01:55:47.000 I'm reading this article about it right now, and he's saying he tried to get verified here for the second time.
01:55:52.000 Maybe he got it taken away.
01:55:53.000 I think he got it taken away, because he had it at one point.
01:55:58.000 It says, denied verification.
01:56:02.000 Did he definitely have it at one point in time?
01:56:05.000 I'm just saying, I don't know that that's what this story sounds like it was.
01:56:09.000 It sounds like he was trying to get it again.
01:56:10.000 That is so amazing.
01:56:13.000 I mean, this brings me back to that conversation I had with that YouTube lady who said it's hate speech.
01:56:17.000 It is so amazing that you would say that freedom of speech and to be able to talk about things openly is somehow hate speech.
01:56:27.000 Yeah, no, the Harper's Letter thing was nuts because if you actually read the Harper's Letter, it's...
01:56:36.000 It's so anodyne.
01:56:38.000 It barely says anything at all.
01:56:41.000 It's very kumbaya.
01:56:44.000 It's just like, hey, let's all get along.
01:56:46.000 Let's not get people fired for saying harmless things, that kind of thing.
01:56:52.000 Crazy talk.
01:56:53.000 Right, yeah.
01:56:54.000 Outright crazy talk.
01:56:55.000 Yeah, and it became a huge thing.
01:56:58.000 I mean, it changed media.
01:57:00.000 Like, it ended up being one of the reasons that Matt Yglesias left Vox, which he co-founded, because he signed the letter and there was somebody on staff who felt threatened by that.
01:57:12.000 It was a whole kerfuffle within the media industry, which is, you know, endlessly navel-gazing anyway.
01:57:19.000 But yeah, no, the...
01:57:24.000 You can become the subject of one of those ridiculous villain for the day campaigns really, really easily now.
01:57:38.000 I think that that is changing because now people aren't scared to speak their mind on Twitter.
01:57:43.000 Like, you're seeing so much pushback.
01:57:46.000 When someone types something on Twitter now and it's ridiculous, now people aren't scared to go in after it.
01:57:54.000 They're not worried about losing their account, which they were before.
01:57:57.000 Which is, I think, one of the more interesting things about Elon Musk buying Twitter is that you are seeing a much more vigorous debate.
01:58:05.000 You're seeing a lot more trolling.
01:58:07.000 You're seeing a lot more people that are posting, like, ridiculous GIFs, like, to make fun of people after they say something stupid.
01:58:13.000 Yeah.
01:58:14.000 Yeah.
01:58:14.000 I mean, I hope that's one of the results, right?
01:58:18.000 Because the old Twitter was just a grindstone of official messaging, where if you said a thing even a micrometer outside whatever the narrative was, you could expect to just be descended upon by all these people.
01:58:36.000 And nobody – you just ended up not wanting to bother, right?
01:58:40.000 So you wouldn't say anything.
01:58:42.000 But I hope people are feeling encouraged to say more now.
01:58:49.000 But as my experience shows, you can still end up getting lots and lots of, you know, shit in the media for doing the wrong thing.
01:59:01.000 And that can last quite a long time.
01:59:03.000 But I mean, how much influence does that media really have anymore?
01:59:06.000 I mean, just because something's written down, how much different is it than people just having a conversation and putting that conversation on YouTube?
01:59:13.000 Like, the actual idea that someone writing an article about someone, like a hit piece on you, for instance, that that actually has an impact anymore.
01:59:24.000 It's really no different than two people on some sort of a progressive podcast talking about, oh, Matt Taibbi's a right-winger now.
01:59:33.000 It's like crazy.
01:59:34.000 Like, what happened to him?
01:59:35.000 He used to be the guy with the Rolling Stone.
01:59:37.000 He was so progressive.
01:59:38.000 He used to go after Wall Street.
01:59:41.000 Now...
01:59:42.000 Right.
01:59:42.000 It's not—those articles don't work.
01:59:45.000 No, they don't work.
01:59:45.000 And people aren't reading them.
01:59:47.000 Yeah, and look, I mean, you know this too, right?
01:59:52.000 Because when there was that whole movement to try to get Bernie Sanders to denounce you and everything like that, like, after you endorsed him— If you're not afraid of whatever the ultimate consequence is,
02:00:10.000 you learn that these cancellation episodes are survivable.
02:00:18.000 Once that happens, you lose your fear of it pretty quickly.
02:00:22.000 Well, they embolden...
02:00:25.000 People after they've survived it.
02:00:27.000 That's the other thing.
02:00:28.000 Yeah.
02:00:28.000 That now you realize that, oh, this is okay.
02:00:33.000 Like, I could do this.
02:00:34.000 Yeah.
02:00:34.000 Not only that, like, when all that stuff was happening with Spotify with me, I gained two million subscribers in a month.
02:00:42.000 Right.
02:00:42.000 So it worked the opposite way.
02:00:44.000 Like, Patrick Bet David did a whole thing on it.
02:00:46.000 He said, he goes, in my estimation, the amount of publicity they gave him was worth about $300 million.
02:00:54.000 That's great.
02:00:55.000 Yeah, but that's the new world.
02:00:57.000 Whereas using those methods before we had independent journalism, before we had the internet, before we had YouTube and all these different ways that you could just get a message out, it was a death sentence.
02:01:09.000 If they all came at you in targeted fashion like they did, you're fucked.
02:01:15.000 They were going to change the narrative of you.
02:01:16.000 And it changed the narrative already with some people.
02:01:18.000 Some people still believe certain things about me because they read it on CNN or they heard it on CNN. Sure, sure.
02:01:23.000 There's no way you can get around that, but for most people that are actually paying attention, all it does is undermine the credibility of those sources.
02:01:30.000 Anybody who's calling you a right-wing journalist, like anybody who knows you, knows that's straight horse shit.
02:01:37.000 Right.
02:01:37.000 The amount of damage they're doing to their own reputation by printing that, the individual author and the publication itself... the publication should be terrified of anybody who would be so willing to undermine their credibility by calling you a right-wing journalist over one point, one thing they disagree with you on.
02:02:02.000 So they're going to make this blanket statement that's so patently untrue.
02:02:08.000 Right.
02:02:09.000 If that was my news organization, I'd be like, what the fuck are you doing?
02:02:14.000 Do you know what damage you're going to do by calling him a right-wing journalist?
02:02:17.000 Now, 100,000 people that are going to read this, you've got 50,000 people that now think you're capable of being full of shit.
02:02:24.000 Right.
02:02:25.000 And they're probably going to go over to Substack or Spotify or whatever.
02:02:29.000 I mean, like, that's...
02:02:31.000 Yeah, they haven't figured that out yet.
02:02:34.000 But, you know, there's still the collateral damage of, you know, they're able to say nasty things about you that people hear, which is not fun.
02:02:43.000 But, you know, you're right.
02:02:45.000 There was a moment before independent media where if they all decided to do it, you were done, you know?
02:02:54.000 I mean, I remember the first time that, you know, I knew there was a story coming out about me and my past with the Exile, and I knew I was in serious trouble.
02:03:06.000 At the time, there was no alternative.
02:03:11.000 If the club kicked you out, there was nowhere else to go in journalism or in any kind of media job.
02:03:20.000 But that's no longer the case.
02:03:24.000 They don't have that absolute power anymore.
02:03:28.000 They have less power than the independents, which is nuts, and it happened so quickly.
02:03:32.000 Right, right.
02:03:33.000 I mean, if you look at, like, what Krystal and Saagar have done with Breaking Points, Breaking Points is fucking gigantic now.
02:03:41.000 Absolutely.
02:03:41.000 And I remember when they weren't independent, they were thinking about going independent, and I was like, I'll help you.
02:03:46.000 I'm like, we can do this.
02:03:47.000 You guys are fantastic.
02:03:49.000 You're honest.
02:03:51.000 You talk about things.
02:03:52.000 I might not agree with you, but you're talking about things based on your actual interpretation of what's going on and your opinions on these things.
02:03:59.000 That's what people want.
02:04:00.000 Yeah, and they had a concept that at the time was forbidden, which was people on the opposite end of the political spectrum trying to have a civilized conversation.
02:04:13.000 Remember when they used to have a show like that on Fox, Hannity and Colmes?
02:04:17.000 Right.
02:04:17.000 Remember that?
02:04:18.000 But that was sort of like pro-wrestling.
02:04:21.000 Colmes' job was to get pinned.
02:04:23.000 Right.
02:04:23.000 Yeah.
02:04:24.000 You know?
02:04:25.000 I mean...
02:04:27.000 It wasn't a real fight.
02:04:30.000 Yeah.
02:04:30.000 Oh, he tapped out quick, too.
02:04:32.000 Yeah.
02:04:32.000 Yeah, he always got pinned.
02:04:33.000 But that's what the show was.
02:04:35.000 It was like, we got a guy from the left, and we got a guy from the right.
02:04:38.000 And, you know, Hannity went on to become a big star, and where did Colmes go?
02:04:43.000 He quit.
02:04:44.000 Probably got brain damage from being pinned so many times.
02:04:50.000 That would be an interesting one.
02:04:52.000 Whatever happened to Alan Colmes, that would be an interesting question.
02:04:55.000 Does he have a podcast or anything?
02:04:56.000 Where is that guy?
02:04:57.000 But he was just so wishy-washy.
02:04:59.000 He was like the perfect caricature of a left-wing guy confronted by a strong, right-wing, pro-America Sean Hannity.
02:05:11.000 I'm friends with Trump.
02:05:13.000 I remember I knew a guy named Jeff Cohen who was briefly on Crossfire.
02:05:20.000 He played the from the left person.
02:05:24.000 He told me afterwards that the role of the liberal in that show was to be somebody who couldn't punch back.
02:05:35.000 Right?
02:05:36.000 Like, they were trying not so much to talk about the politics, but to highlight the kind of weeniness of that character, right?
02:05:45.000 Yeah.
02:05:46.000 And because he didn't go along with it, he didn't last...
02:05:52.000 Terribly long in that role, but that's the kind of person they wanted.
02:05:56.000 They wanted somebody who was kind of snivelly, retreating, right?
02:06:02.000 And the conservative was always like this attacking, aggressive...
02:06:10.000 No-nonsense.
02:06:11.000 No-nonsense, sort of cackling, confident character.
02:06:16.000 It made for good TV, but as politics, it was totally nuts.
02:06:23.000 So what Krystal and Saagar are doing is the better version of that.
02:06:28.000 An honest version of that.
02:06:30.000 An honest version of it.
02:06:31.000 And they really are friends.
02:06:32.000 They like each other.
02:06:33.000 He is on the right.
02:06:34.000 She is on the left.
02:06:35.000 And they disagree on things.
02:06:37.000 But it works.
02:06:38.000 It really works.
02:06:39.000 Right.
02:06:39.000 And why is that forbidden?
02:06:43.000 That's a really interesting question, right?
02:06:45.000 Yeah.
02:06:45.000 Like, why do people not want us to know that it's possible for people on the right and the left to talk in a civilized way and disagree on some things but still get along?
02:06:55.000 Yeah.
02:06:57.000 Well, why doesn't anybody want us to know that?
02:07:00.000 I think that's a question that's worth exploring.
02:07:04.000 Why does CBS not want us to know that?
02:07:07.000 Why does Fox, for that matter, not want us to know that?
02:07:10.000 Well, I think the fear is that if you do allow that, like say if you're NBC or CNBC, and you allow this right-left thing to happen on your show, what if the person on the right makes a really good point?
02:07:25.000 And what if they swing people more towards the right?
02:07:28.000 Like what if this person's on the air multiple times and they're really compelling and maybe they're better at arguing or maybe they're more reasonable or maybe they're more objective or maybe they're more calm.
02:07:39.000 Maybe whatever about them is more attractive than the person who's on the left.
02:07:43.000 Now all of a sudden you got a problem.
02:07:44.000 Because now you have people that are tuning in specifically for this one woman or one man who is right-leaning on a network that has a progressive agenda.
02:07:54.000 You have a left-wing agenda.
02:07:55.000 Right, right.
02:07:56.000 And you're funded by these left-wing super PACs and left-wing special interest groups and left-wing advertising.
02:08:03.000 And you're like, hey, hey, hey, what is this fucking abortion is murder argument that this guy just made reasonably?
02:08:11.000 What is this term limits argument that this person made?
02:08:15.000 What is this argument this person made about getting money out of politics?
02:08:19.000 Are you fucking crazy?
02:08:20.000 Get that off of there.
02:08:22.000 Yeah, and that's too bad because what ends up happening is we end up in this sort of system of bifurcated media where everybody's in armed camps, like they don't talk to one another because there's no model for that in American society.
02:08:41.000 We don't have a place where we can see people of differing political opinions getting along with one another and acting like civilized human beings.
02:08:51.000 It doesn't really exist in establishment culture, establishment media.
02:08:58.000 But that's why people are rejecting it.
02:09:01.000 Because they know.
02:09:03.000 They're picking up their kids at school and talking to their neighbors who they know have totally different politics and they're getting along fine with them.
02:09:11.000 Yeah.
02:09:12.000 Right?
02:09:12.000 And so they know it's a lie, you know?
02:09:16.000 And I think it's exhausting.
02:09:19.000 It's starting to run its course, which is great because I've been waiting for it to run its course for a long time.
02:09:24.000 So it's cool to see.
02:09:27.000 Well, that's why the World Economic Forum and things along those lines are so fascinating, because you can see that they're the ones holding the strings, dangling the narratives in front of people and making them attractive, and then you realize, like, well, this is not our real problem.
02:09:41.000 Our problem is not really these narratives.
02:09:43.000 Our problem is who's promoting these narratives and what are they doing while they're promoting that and we're distracted.
02:09:48.000 Well, they're trying to institute a centralized digital currency.
02:09:51.000 They're trying to give people vaccine passports and some sort of a social credit score system.
02:09:56.000 They're trying to do all sorts of weird methods of control that you're not going to be able to get out of it if you're on the left or if you're on the right.
02:10:03.000 It's going to fuck up everybody's life.
02:10:05.000 And in the meantime, we're arguing about who's right, Greta Thunberg or Andrew Tate.
02:10:10.000 It's like these distractions that they put in front of us in the media that get us so hyped up while real shit is going down that there's real decisions that could be made that might affect you forever.
02:10:23.000 The amount of freedom you have, your ability to travel.
02:10:26.000 In China, you say the wrong thing.
02:10:28.000 You can't buy a plane ticket.
02:10:30.000 You can't go anywhere.
02:10:31.000 Oh, sorry, you're not allowed to buy a home.
02:10:33.000 We saw you tweet about something we found disagreeable.
02:10:37.000 Well, we have to worry about that now, too, in the States, though, right?
02:10:42.000 You have to worry.
02:10:43.000 If you cross a certain line, are they going to make it difficult for you to process your credit card transactions?
02:10:51.000 Will you be unemployed?
02:10:52.000 Will you be unemployable?
02:10:55.000 Will you not be able to use PayPal anymore?
02:11:00.000 It's stuff like that, that kind of creeping dystopian systems of control.
02:11:12.000 It's a big news story, and I think people recognize that it's a serious thing, but we don't see it talked about very much in the corporate press because, again, they're in favor of it, right?
02:11:27.000 But I'm terrified by all that stuff.
02:11:31.000 When PayPal was saying they were going to fine people for misinformation, I'm like, hey, hey, hey, you guys are just supposed to be a way I can buy things online.
02:11:41.000 Yeah, that's it.
02:11:42.000 When you're saying, what misinformation?
02:11:44.000 On what?
02:11:45.000 On what platform?
02:11:46.000 On any platform?
02:11:47.000 What if I'm at home?
02:11:48.000 Are you listening?
02:11:49.000 Like, how the fuck do you, what are you saying when you're saying misinformation?
02:11:52.000 You're gonna fine me?
02:11:53.000 Where's that money going?
02:11:54.000 You're stealing money from me?
02:11:56.000 Because you don't agree?
02:11:57.000 And what happens if it turns out that misinformation turns out to be true?
02:12:01.000 Yeah, no.
02:12:02.000 I did a story about, I think it was MintPress, they had funds frozen by PayPal, if I remember correctly.
02:12:15.000 And yeah, but the idea that this company... they should be doing one thing.
02:12:21.000 They're trying to make a transaction happen.
02:12:24.000 Why are they in the truth business?
02:12:27.000 Exactly.
02:12:29.000 That can only happen if something has gone wildly wrong in society and somebody feels the need to start using all these different pressure points to control people, like whether or not you can process credit card transactions.
02:12:44.000 Or, you know, PayPal.
02:12:48.000 You know, if you leave a record of, you know, certain kind of web surfing, maybe, you know, that's going to, you know, be a negative that will appear somewhere.
02:13:01.000 Like, yeah, that stuff is all scary.
02:13:04.000 Like, we...
02:13:06.000 It's a serious problem, I think.
02:13:10.000 Especially when it comes to money.
02:13:11.000 Like, you can freeze funds, you can move money around, so you can withdraw money from someone's account.
02:13:18.000 Because you think, like, what was it, like $2,500 or something along those lines?
02:13:23.000 Something like that, yeah.
02:13:23.000 Some fine that you would get for misinformation.
02:13:26.000 So if your grandma posts some crazy shit about Trump really winning the election, you know, you have a crazy QAnon grandma, like, they're gonna steal her money?
02:13:35.000 Yeah.
02:13:36.000 Like, what are you saying?
02:13:38.000 Right.
02:13:38.000 Or the Canadian trucker thing, the GoFundMe thing, right?
02:13:42.000 Like, you raise a whole bunch of money.
02:13:43.000 Well, not just the GoFundMe thing.
02:13:45.000 How they froze their fucking bank accounts.
02:13:47.000 Yeah.
02:13:48.000 Yeah.
02:13:48.000 And all these people did was protest.
02:13:51.000 Right.
02:13:52.000 And you don't have to agree with the protests, but you certainly have to be freaked out by their response to it.
02:13:58.000 Well, also freaked out that Trudeau is one of the Young Global Leaders of the World Economic Forum and that Trudeau labeled the truckers as misogynists and racists.
02:14:10.000 Right.
02:14:11.000 Like, says who?
02:14:13.000 Where are you getting this from?
02:14:15.000 You're not even pointing to a thing they've said.
02:14:17.000 You're just saying that in this blanket statement to try to diffuse everything they've said and everything they stand for.
02:14:24.000 It's so transparent and such a checkers move in a world of 4D chess.
02:14:30.000 Yeah, I mean, I always think back to this moment in the 2016 presidential campaign when Bernie was drawing some blood against Hillary by talking about the gigantic speaking fees she was taking from banks.
02:14:47.000 And they tried to throw a bunch of stuff back at him.
02:14:50.000 None of it worked until one day she came out and she said this thing, if we break up the banks tomorrow, will that end racism?
02:14:59.000 And suddenly, this idea sort of popped out into the ether that talking about Hillary Clinton's ties to banks was somehow racist or somehow not progressive,
02:15:18.000 right?
02:15:18.000 And for Sanders, who had grown up his whole life in that ecosystem, there was nothing more terrifying than being accused of racism or misogyny or whatever it was, because they came out with the Bernie bro thing right after that.
02:15:32.000 Yeah, exactly.
02:15:33.000 And it was a disciplinary method basically, right?
02:15:37.000 Yeah.
02:15:39.000 If you go to a certain place, we're going to start dropping these words on you, and those words are not survivable in certain areas.
02:15:48.000 So it's very effective, but I think it can't be effective forever, I don't think.
02:15:56.000 No, I think that effectiveness is waning like a bucket with a hole in the bottom of it.
02:16:03.000 And I don't think there's any escape.
02:16:05.000 I think that the path they've put themselves on, you can't return from it.
02:16:10.000 And I think they're doomed.
02:16:12.000 I really do.
02:16:13.000 I think we're looking at a future where almost all credible media is independent.
02:16:17.000 I really do.
02:16:19.000 I just don't think they're gonna make it.
02:16:20.000 I think the only thing that Hollywood and these entertainment corporations will be good for is creating things that are exorbitantly expensive, like films with special effects.
02:16:32.000 Well, they are good at that still.
02:16:33.000 Yeah, but that's the only thing they'll be able to do.
02:16:35.000 They make good action movies with superheroes in them, I guess.
02:16:38.000 I mean, you can make films with iPhones now.
02:16:41.000 Right.
02:16:41.000 I mean, a real film.
02:16:42.000 Like, I have a guy that was on the podcast named Sonny from Best Ever Food Review.
02:16:48.000 He goes to these countries and samples their exotic foods and travels and sees their cultures and hangs out with these tribal people.
02:16:57.000 Really interesting show on YouTube.
02:16:59.000 He went to Egypt and they confiscated all of their equipment, everything.
02:17:04.000 They took all of their cameras, even though they had visas, they had working visas.
02:17:08.000 He said it's the worst place to film.
02:17:10.000 They're very restrictive.
02:17:12.000 And since then, because he made a video about it, they've actually changed the laws.
02:17:16.000 Point being, he decided to film the entire episode on iPhones.
02:17:21.000 And it looks great.
02:17:22.000 Right.
02:17:23.000 It looks amazing because these new iPhones are so fucking good and these new Samsung phones are so good.
02:17:28.000 You don't need really complicated equipment anymore.
02:17:31.000 You can make a really great 4K video with your phone easily.
02:17:38.000 Right.
02:17:38.000 You have plenty of storage.
02:17:39.000 It's not hard to do.
02:17:41.000 Yeah.
02:17:41.000 And so that's what they did.
02:17:42.000 And now he's looking at it like, man, why are we traveling with all this shit?
02:17:45.000 I could just have my cameraman use iPhones.
02:17:48.000 Right.
02:17:48.000 Yeah, and you don't need a big institutional backer anymore.
02:17:52.000 And if the only way they can fight back against independent content creators is by calling every single one of them a racist, misogynist, right-winger, or whatever...
02:18:08.000 Pretty soon you're going to get to the situation where we're near it now, where all those people are running into one another and every single one of them in a room has already been through episodes like that, right?
02:18:23.000 It loses its power at that point.
02:18:26.000 Once you've done it to a million people or two million people, people stop being shocked by the term.
02:18:34.000 It's like crying wolf.
02:18:35.000 Yeah, exactly.
02:18:36.000 It's a weird time.
02:18:39.000 Really interesting.
02:18:40.000 It's fun.
02:18:41.000 Yeah.
02:18:42.000 It's fun to watch everything get fucking thrown up into the wind and scatter all over the place.
02:18:47.000 It's fun.
02:18:47.000 It is fun, and I'm actually having fun with the job for the first time in a long time, so I hope you are too.
02:18:55.000 I'm having a great time.
02:18:57.000 Yeah.
02:18:57.000 Do you enjoy doing Substack and being independent and doing things the way you do and doing your podcast?
02:19:04.000 Are you enjoying that?
02:19:05.000 Yeah.
02:19:06.000 Yeah, I mean...
02:19:09.000 It's a little different because I used to have the luxury to spend 10 weeks on investigating something and I don't anymore.
02:19:16.000 I got to crank stuff out, right?
02:19:18.000 But I would never have been able to do this Twitter Files thing, you know, just on a lark without asking lots of people for permission.
02:19:27.000 You know what I'm saying?
02:19:28.000 Can I ask you this?
02:19:29.000 How does Elon set that up?
02:19:32.000 How did he approach you guys?
02:19:35.000 Individually?
02:19:37.000 You know, I mean, I woke up and I got a text one day, basically.
02:19:44.000 And, you know, would you like to do this?
02:19:49.000 And the answer, of course, is yes.
02:19:52.000 And by the way, people talk about this all the time.
02:19:55.000 They want to make a big deal about...
02:19:59.000 Because there was an army of people after the first Twitter files who all said the same thing.
02:20:04.000 Like, imagine doing PR for the richest man on earth, right?
02:20:07.000 Like, that was the universal response of all the Mehdi Hasans of the world.
02:20:15.000 Look, the story here is about organizations that are vastly more powerful even than the richest man on earth.
02:20:25.000 It's about the FBI, the NSA, the CIA, the DOD, the DHS. And it's an opportunity to see how these agencies operate in the wild.
02:20:40.000 When you get a source like that, like, it's not important what their motives are.
02:20:47.000 What's important are what your motives are, you know?
02:20:50.000 And my motives are, I want to know what was going on and how these organizations operate.
02:20:59.000 And you would never turn down that – no real journalist would turn down that opportunity.
02:21:05.000 And incidentally, I kind of like Elon Musk.
02:21:11.000 I mean he's got a – He's got a sense of humor about this and I think his ideology in terms of...
02:21:27.000 You know, the desire for putting this out there, I mean, who would do this?
02:21:32.000 Who would spend that much money to do this?
02:21:34.000 His sense of humor is an internet sense of humor.
02:21:37.000 Oh, absolutely.
02:21:38.000 Like, he posted that meme of the pregnant man next to the photo of Bill Gates and his pot belly, and he said, in case you want to lose a boner real fast, and he put that on Twitter.
02:21:50.000 I mean, imagine getting dunked on by the richest man in the world on Twitter.
02:21:56.000 But that's hilarious.
02:21:58.000 It's hilarious.
02:21:59.000 Right?
02:21:59.000 Yeah.
02:22:00.000 Again, could you imagine John Jacob Astor or, you know, one of the Guggenheims doing like a teenage joke in public?
02:22:10.000 Of course not.
02:22:12.000 It's funny.
02:22:14.000 It's the same phenomenon with Trump.
02:22:18.000 It's like...
02:22:19.000 You know, I couldn't stand his politics, but if you denied that he was funny, you were lying.
02:22:25.000 Like, the campaign was funny.
02:22:27.000 When he called Kim Jong-un Little Rocket Man, I mean, come on.
02:22:33.000 That shit's hilarious.
02:22:34.000 The guy's got great timing.
02:22:35.000 I mean, he could have been a stand-up.
02:22:37.000 His timing is excellent.
02:22:40.000 Yeah, he probably would have been good.
02:22:44.000 Yeah, sure.
02:22:46.000 Probably would have said some hilarious shit.
02:22:48.000 Look, the guy can – and he's good off the cuff.
02:22:51.000 It's not saying he's the best leader we've ever had.
02:22:53.000 It's not saying he's a great statesman or a great president or even a good representation of what America is supposed to stand for.
02:22:59.000 Because he's not.
02:22:59.000 He's not.
02:23:00.000 And it's a real problem.
02:23:01.000 He's a real problem because there's a lot of people that are conservative-minded, fiscally conservative, hardworking people.
02:23:07.000 They don't like any of his antics.
02:23:09.000 And they're forced to choose between someone they deeply disrespect versus someone who they also deeply disrespect.
02:23:17.000 It's like, what am I doing here?
02:23:19.000 How do I go?
02:23:20.000 I'm trying to figure out if I'm conservative anymore.
02:23:22.000 Am I a liberal now?
02:23:24.000 And you don't know because you don't want to align yourself with problematic personalities that also embody some of the economic ideas that you agree with.
02:23:33.000 Right, right.
02:23:34.000 But his send-up of the whole process?
02:23:38.000 Yeah.
02:23:39.000 Was accurate, you know?
02:23:42.000 The swamp.
02:23:43.000 And the self-seriousness of it.
02:23:46.000 He was constantly kind of making fun of how seriously people like Hillary Clinton took themselves.
02:24:01.000 Yeah.
02:24:04.000 There was no way it was not going to land.
02:24:06.000 Like, you know, Jeb Bush, you know, saying, my mother is the strongest person in the world, and him, you know, saying she should be running.
02:24:16.000 You know, like, that stuff was just, it was designed, it was never not going to work, you know?
02:24:24.000 And the fact that none of us, or none of the reporters could see it at the time was kind of amazing.
02:24:31.000 Yeah, well, I think people were just so terrified that an asshole like that could actually win and become president.
02:24:37.000 And I remember we were watching.
02:24:38.000 We did an End of the World podcast from the Comedy Store.
02:24:42.000 We're a bunch of comics.
02:24:43.000 We were watching the election take place and talking shit while it was happening.
02:24:49.000 And then afterwards, it was all over.
02:24:53.000 We were all stunned.
02:24:54.000 And we went back to the Comedian's Bar.
02:24:56.000 And I was watching Jake Tapper on TV with this somber look on his face.
02:25:01.000 Well, it really does look like Trump is the president.
02:25:05.000 Like, the whole thing was so surreal and wild that, I mean, they just did everything they could to stop that from happening and it didn't work.
02:25:16.000 Yeah.
02:25:16.000 Yeah, no, I mean, and even until the end, I didn't think it was going to happen.
02:25:20.000 But then Florida came in.
02:25:22.000 Yeah.
02:25:22.000 Do you remember that moment?
02:25:23.000 Yeah.
02:25:24.000 Yeah.
02:25:25.000 And then I was like, holy shit, this is going to happen.
02:25:27.000 Like, one of the best things is watching the compilation of the Young Turks watching the election go down, being super confident at the beginning, and then towards the end they're like, FUCK! And then those are the same people that have to pretend that Biden's okay.
02:25:46.000 It's amazing how well this country is running while Biden is literally not there.
02:25:52.000 Oh, yeah.
02:25:53.000 Yeah.
02:25:55.000 Completely absent.
02:25:56.000 Yeah.
02:25:57.000 Out of it.
02:25:58.000 It's gone.
02:25:58.000 I mean, it's Weekend at Bernie's, the whole presidency.
02:26:02.000 And I love the idea that they're going to do Weekend at Bernie's 2, which is great.
02:26:07.000 Now, I wanted to ask you about this because I had a conspiracy theory.
02:26:10.000 I think when he announced that he was going to run again, he said that he was going to run again.
02:26:15.000 He talked about running again.
02:26:17.000 Then they start finding, like, classified files.
02:26:21.000 Absolutely.
02:26:22.000 Absolutely.
02:26:23.000 I mean, you saw last week Andrew Weissman, who was one of the lead prosecutors in the Mueller investigation, was tweeting all these things about Biden, right?
02:26:34.000 You know, so there's no question that the party and maybe some folks in certain agencies were sending him a message, I think.
02:26:45.000 Yeah.
02:26:47.000 Yeah, you can't prove that.
02:26:49.000 It's not easily provable.
02:26:51.000 But it certainly feels that way.
02:26:53.000 Yeah, it seems very transparent, like a lot of news stories.
02:26:58.000 It seems pretty obvious what's going on.
02:27:02.000 But again, that's what we need a real press for, to get to the bottom of it so that we can actually talk about it.
02:27:08.000 Yeah.
02:27:08.000 Yeah.
02:27:09.000 Yeah.
02:27:09.000 But clearly, yeah.
02:27:11.000 Right?
02:27:12.000 I mean, it's been six years.
02:27:14.000 They didn't go looking for that stuff in the Corvette until there was suddenly a decision that, nah, we don't really want him running again.
02:27:24.000 But it's just so amazing to watch the hypocrisy play out.
02:27:29.000 Like, do you not remember what you said about the documents at Mar-a-Lago?
02:27:34.000 That wasn't that long ago.
02:27:36.000 And at least in Mar-a-Lago, they were in a safe.
02:27:39.000 Oh, mine were in a locked garage.
02:27:42.000 Oh, your garage was locked?
02:27:44.000 Well, those are impossible to get into.
02:27:46.000 They're way harder to get into than a safe.
02:27:49.000 Was it in the backseat of a classic car?
02:27:52.000 No.
02:27:53.000 He had them in his fucking Corvette, right?
02:27:56.000 Didn't he?
02:27:57.000 Yeah, exactly.
02:27:58.000 Yeah, and he made a big deal about that too.
02:28:00.000 And then they weren't just there.
02:28:01.000 They're in multiple locations.
02:28:02.000 Right.
02:28:03.000 So they keep finding these classified documents after he had given Trump so much shit.
02:28:09.000 Like, they wanted to take Trump down for having these classified documents.
02:28:13.000 They're making it like this huge sticking point.
02:28:16.000 I mean, they're throwing the Espionage Act at him, which is, like, you know, five years a count.
02:28:21.000 Yeah.
02:28:22.000 Right?
02:28:22.000 I mean...
02:28:23.000 And meanwhile, Biden has even more.
02:28:26.000 Right.
02:28:26.000 I know.
02:28:27.000 I know.
02:28:27.000 And his own aides turned him in!
02:28:30.000 Yeah.
02:28:31.000 It's like an Inspector Clouseau act, the whole thing.
02:28:33.000 It's incredible.
02:28:35.000 What do you think happens in 2024?
02:28:37.000 Like, who do the Democrats pick?
02:28:38.000 Do you think they go with Kamala again, or does she develop a disease?
02:28:44.000 Maybe she has, like, anxiety, or maybe she's got restless leg syndrome and she can't do it anymore.
02:28:50.000 Yeah, something unfortunate is going to happen to her.
02:28:53.000 I mean, look, they have to know that she's not viable as a candidate because they tried twice already to make her the candidate in the last election cycle.
02:29:03.000 Well, maybe if they wait enough time.
02:29:06.000 The passage of time.
02:29:09.000 It wasn't just that she wasn't, you know, reaching a contender threshold.
02:29:17.000 She was basically flatlining despite massive media attention, you know?
02:29:24.000 Do you think that if they have good speechwriters and they get a hold of her and go, listen, this is your last fucking chance at this dance.
02:29:32.000 Okay, we gotta do this right.
02:29:33.000 And this is what you gotta do.
02:29:34.000 You gotta listen to the speechwriters.
02:29:36.000 No more ad-libbing.
02:29:38.000 No more doing this.
02:29:39.000 She did this a lot.
02:29:40.000 We're gonna write some stuff for you and no more going off script.
02:29:43.000 When she goes off script, she rambles.
02:29:45.000 You know?
02:29:47.000 Yeah, I don't...
02:29:49.000 I think they want Gavin Newsom to be the candidate.
02:29:52.000 Do you think that's sustainable though?
02:29:54.000 Oh, he's totally unlikable in the worst way, but I get the sense that that's who they want to be the candidate.
02:30:02.000 I mean, look, they had the guy in the White House when Biden went overseas.
02:30:07.000 Was he there?
02:30:08.000 Yeah, he went to the freaking White House.
02:30:10.000 He's filmed going into the White House when Biden takes a trip overseas.
02:30:14.000 What was he doing there?
02:30:14.000 I forget.
02:30:17.000 But this was at the time when he was running campaign ads against DeSantis.
02:30:22.000 Don't you think people go ballistic if they try to do Kamala Harris and him?
02:30:26.000 Because if Kamala Harris decides to stay...
02:30:29.000 Then, like, what, is she the president?
02:30:31.000 Or is he the president?
02:30:32.000 And if Kamala Harris leaves, people are going to freak out.
02:30:34.000 Like, where is she going?
02:30:36.000 Like, something has to almost happen.
02:30:37.000 Like, if I was writing a script, and I definitely don't want anything to happen to her.
02:30:42.000 But if I was going to write a script, I would say, like, some shit has to go down.
02:30:46.000 It could be a scandal.
02:30:49.000 Yes, that's what I'm saying.
02:30:50.000 Some shit has to go down.
02:30:51.000 And there's stuff there, isn't there?
02:30:53.000 I mean, there's stuff for their husband and finances.
02:30:58.000 Is there?
02:30:58.000 Yeah, I think so.
02:31:01.000 At least the appearance of it.
02:31:04.000 So who would it be?
02:31:05.000 It would be Gavin Newsom and who else?
02:31:07.000 I don't know.
02:31:08.000 I mean, look, they've clearly tried to make Buttigieg a candidate.
02:31:12.000 But again, he's another one of these people who was market tested extensively in the 2020 campaign.
02:31:19.000 I mean, I covered that campaign.
02:31:22.000 These candidates, it's somebody's idea of who would be a popular candidate.
02:31:29.000 But in reality, these candidates do not register with ordinary voters.
02:31:34.000 Like Beto O'Rourke was another one.
02:31:35.000 I watched them on the trail in Iowa, and voters just... he would sort of tearfully talk about problems at the border, and they just weren't interested.
02:31:51.000 He's a soap opera actor.
02:31:52.000 Yeah.
02:31:53.000 That's what he's like.
02:31:54.000 Right.
02:31:54.000 He's like a shitty soap opera actor trying to go through these lines and you're like, I don't resonate with anything you're saying.
02:32:01.000 He does look like the handsome young doctor who has an affair out of wedlock, right?
02:32:07.000 Yeah.
02:32:09.000 That actually is kind of, he might be better at that.
02:32:13.000 But the big story of 2020 to me was always how stubbornly high Biden's numbers were, right?
02:32:22.000 Like people did respond to him.
02:32:24.000 Even though he was clearly crazy.
02:32:27.000 Like, you know, he would go out there and his emotional register would be all off.
02:32:32.000 He would stick his finger in people's chests.
02:32:34.000 He would go off on people in crowds, but somehow people responded to that.
02:32:40.000 And he kept, you know, he wouldn't sink far enough in the polls so that he would be pushed out of the race.
02:32:49.000 How accurate are polls?
02:32:51.000 And how manipulated are polls?
02:32:53.000 I think polls are, they can be useful over periods of time, right?
02:32:59.000 Like, the polls were clearly wrong about Trump, you know.
02:33:15.000 The polling analyses, for instance, they'll do things like favorability and unfavorability ratings, but that sometimes doesn't take into account other issues, like that people will still vote for somebody they feel unfavorably toward if they hate the other candidate more, you know?
02:33:28.000 And I think they do tell you something.
02:33:32.000 I mean, as reporters, you should never get in the habit of being too reliant on them as indicators.
02:33:40.000 But if a candidate can't get above 2% or 3% over a year, then you might want to, you know, take that seriously.
02:33:52.000 And especially if it follows through and is matched by results.
02:33:57.000 But to me, it's almost like the heavyweight division when Tyson was a champion and there was no challengers.
02:34:03.000 That's true.
02:34:04.000 It's like, what do I get to be excited about?
02:34:06.000 There's no one that seems like they could step up.
02:34:10.000 That's true, yeah.
02:34:12.000 Bonecrusher Smith might have been the best guy he fought, right?
02:34:15.000 I'm trying to remember.
02:34:16.000 Well, there was a run.
02:34:17.000 Bruce Seldon looked good.
02:34:19.000 He looked the part.
02:34:20.000 There was a few guys.
02:34:20.000 Frank Bruno looked the part.
02:34:22.000 But Tyson was so dominant that there was no one to get excited about.
02:34:25.000 And it's not like saying that there's anyone dominant like that.
02:34:28.000 I mean, I guess Trump is pretty dominant, but he's got a lot of resistance on the right, too.
02:34:31.000 But the point is that in the left, there's no contender.
02:34:36.000 That is compelling, that you can see.
02:34:39.000 I mean, they've tried to push Buttigieg.
02:34:42.000 They gave up on Elizabeth Warren.
02:34:44.000 They've tried to push some of these people, but no one stands out.
02:34:47.000 Yeah.
02:34:48.000 I mean, they made a mistake.
02:34:50.000 I think they had a window where Bernie would have been a viable candidate.
02:34:53.000 I think they're scared of Bernie.
02:34:54.000 Oh, of course.
02:34:56.000 Same way they're scared of Tulsi Gabbard.
02:34:58.000 They can't control her.
02:34:59.000 Yeah.
02:35:00.000 No, they're not organizational people, right?
02:35:04.000 Even though Bernie tries very hard to be a good Democrat.
02:35:07.000 I mean, that's, I think, part of his personality that ended up being a fatal flaw, I think, for him.
02:35:15.000 I mean, he loves the Democratic Party.
02:35:16.000 He grew up in it.
02:35:18.000 And if you talk to him about it, he'll talk about his fond memories of the party and how he doesn't want to see it fractured.
02:35:28.000 But that ended up being his undoing.
02:35:29.000 He needed to be in burn-it-all-down mode the way Trump was.
02:35:35.000 Imagine if he did go that way.
02:35:37.000 Oh, my God.
02:35:37.000 He might have won.
02:35:38.000 Yeah, he might have won.
02:35:40.000 If he started ranting and raving, I'm mad as hell and I can't take it anymore?
02:35:43.000 Exactly.
02:35:44.000 Yeah.
02:35:45.000 Exactly.
02:35:45.000 But he didn't want to permanently damage either Biden or Hillary, even though, especially Biden, because he liked Biden as a person.
02:35:58.000 And this I know because I've talked to people who work for Bernie, and Biden was nice to him when Bernie came to the Senate.
02:36:09.000 He couldn't stand Hillary, right?
02:36:11.000 So he was very aggressive toward her in the beginning and that was when he was doing really, really well.
02:36:18.000 I think if he had pushed it a little further in that first year, if he had been a little bit more balls out, he might have won that one, you know?
02:36:31.000 Yeah.
02:36:31.000 It would have been tough.
02:36:32.000 I mean, he might have won the nomination, at least.
02:36:35.000 Well, when you saw the collusion within the Democratic Party during the primaries with Bernie that Donna Brazile talked about in her book, it's like there was an effort to try to get rid of him.
02:36:49.000 Oh, of course.
02:36:49.000 And it was like calculated maneuvers.
02:36:51.000 To try to get him out of there, which is really wild.
02:36:55.000 It's really wild to see the way these intricate little chess pieces move around behind closed doors, and then someone like Donna Brazile writes a book and comes out with it, and you get to see what they were up to.
02:37:08.000 Yeah, I mean, they had a whole system they had worked out.
02:37:11.000 The invisible primary, and the endorsements are all lined up ahead of time, and the money's all lined up ahead of time.
02:37:21.000 But Bernie did well to fight back against that.
02:37:25.000 I mean, I think his big accomplishment, looking back, is going to be the proof of concept that you can be the top fundraiser in a race without taking corporate money,
02:37:41.000 which he did do in 2020. That's an important thing that he figured out.
02:37:50.000 But you're right.
02:37:52.000 They have lots and lots of ways to put the thumb on the scale.
02:37:57.000 The difference is Trump overcame all of those.
02:38:01.000 Just with sheer bullshit and asshole-dom.
02:38:07.000 And you know what?
02:38:10.000 On one level, that's impressive.
02:38:12.000 But that's what it takes.
02:38:16.000 And I don't think...
02:38:19.000 I don't see a character like that in the Democratic side who's going to be able to pull that off.
02:38:24.000 Do you think?
02:38:25.000 No, I don't.
02:38:25.000 I don't see anybody like that.
02:38:26.000 I don't see anybody coming up and I don't see anybody in like the distant future either.
02:38:33.000 No.
02:38:34.000 No.
02:38:35.000 But do you think that Ron DeSantis can overcome Trump?
02:38:39.000 Ron DeSantis, I think, can get a lot of people that are on the fence.
02:38:43.000 Yeah, I think so.
02:38:44.000 Whereas I don't think Trump can.
02:38:45.000 I think so.
02:38:46.000 I mean, Trump is going to have diehard supporters, and I've learned never to write him off.
02:38:51.000 Like, I did that after the Access Hollywood thing happened.
02:38:55.000 I made the mistake of putting that down in print.
02:38:57.000 I'm never going to do that again.
02:38:59.000 The guy's like Jason, he never fucking dies.
02:39:03.000 He always comes back.
02:39:06.000 But DeSantis is...
02:39:10.000 He survived one of the things that's usually fatal for a Republican politician, which is the approval of pundits in the Washington Post and the New York Times.
02:39:24.000 They all kind of...
02:39:27.000 Portrayed him as the more civilized Republican alternative, and usually that's a death knell for a Republican candidate in the Trump era.
02:39:37.000 But he's still hanging around.
02:39:40.000 He's a powerful enough figure and veteran.
02:39:44.000 He's got a lot of things going for him and what he did with Florida actually turned out to have worked.
02:39:51.000 Right.
02:39:52.000 I think it's going to be difficult.
02:39:58.000 It's so unpredictable on the Democratic side right now.
02:40:04.000 It's weird to see the incumbent party be in such disarray at this stage of a presidential race.
02:40:13.000 I'm looking forward to covering it.
02:40:15.000 It's going to be fun.
02:40:17.000 I'm looking forward to seeing how it plays out without someone censoring Twitter.
02:40:21.000 Right.
02:40:21.000 Yes, there's that too.
02:40:22.000 That's a big one.
02:40:23.000 It's a big factor.
02:40:25.000 It's a giant factor.
02:40:26.000 I mean, if Twitter did not censor the Hunter Biden laptop story, if that went viral and everyone knew about it and they were forced to cover it on the news and they showed the images and all the talk about 10% to the big guy and the fact that he was getting these contracts with Burisma,
02:40:44.000 where he's making millions of dollars, totally unqualified to get that money, should not have been in that position of power, saying that he could use his influence and saying he could connect people and get people to the dance.
02:40:55.000 That was wild shit.
02:40:57.000 And the fact that they came along and censored that on Twitter.
02:41:01.000 And Facebook, yeah.
02:41:02.000 And Facebook.
02:41:03.000 And then Zuckerberg on this podcast talked about it.
02:41:06.000 I remember when you did that interview because, you know, you have moments in this period where media has been so controlled where you think...
02:41:19.000 Man, am I crazy?
02:41:21.000 Or did that just happen?
02:41:23.000 You know, like, I thought that was a really big deal when that happened.
02:41:26.000 It was a big deal.
02:41:27.000 And then, you know, when that interview, when Zuckerberg said it out loud, even though he had testified about it before, when he said it out loud in that setting and he kind of described it, you know, you have this sort of almost feeling of psychological relief, like, okay, all right, I wasn't crazy.
02:41:43.000 I'm not crazy.
02:41:43.000 You know what I'm saying?
02:41:44.000 Yeah, they really are doing that.
02:41:45.000 Yeah.
02:41:46.000 And...
02:41:47.000 The thing is, when he testified about it, you have to seek out that testimony.
02:41:50.000 You have to read a review of that testimony.
02:41:53.000 You could just be consuming a podcast just because you're jogging or whatever.
02:41:58.000 And he says, well, the FBI contacted us.
02:42:01.000 And you're like, what?
02:42:03.000 The FBI contacted Facebook and told you that this smacks of Russian disinformation, but it doesn't, so they lied. And you did something that helped get one guy elected over the other guy, based on lies, and lies that the FBI helped spread.
02:42:19.000 What else does the FBI lie about?
02:42:21.000 And then, you know, you get into this long history of manipulation, and you're like, holy shit!
02:42:29.000 Yeah.
02:42:30.000 Yeah.
02:42:30.000 And they're having these regular industry meetings now that we know about, right?
02:42:35.000 Yeah.
02:42:36.000 You know, we're looking at the agendas of those and, you know, it'll say other government organization...
02:42:46.000 I'm sorry, "other government agency" briefing, which is like a CIA briefing.
02:42:52.000 What are they talking about in those briefings?
02:42:54.000 What else are they locking down on?
02:42:57.000 Or what else are they amplifying, de-amplifying?
02:43:00.000 Who knows?
02:43:03.000 That's terrifying.
02:43:05.000 It would be nice to go back to a presidential campaign where maybe we have something more like an organic landscape to judge all this stuff.
02:43:14.000 Well, we certainly will on Twitter, unless something radical happens over the next two years, which is a possibility.
02:43:20.000 I mean, you know, Elon said that when he bought Twitter, it was on the fast track to bankruptcy.
02:43:25.000 And, you know, that was interesting to find out, too, that the reason why they took that deal was that it really wasn't profitable, you know, which is crazy, because he bought it for $44 billion and it's worth almost nothing.
02:43:36.000 But it's very valuable, although it's not profitable.
02:43:39.000 It's very valuable in what it can do.
02:43:41.000 And now he's trying to figure out a way to make it profitable.
02:43:44.000 Yeah, I don't envy him there.
02:43:46.000 Fuck.
02:43:47.000 Yeah, right.
02:43:47.000 I guess, like, the deal is to attract content creators and give them a better portion of the revenue than YouTube does.
02:43:55.000 Yeah.
02:43:55.000 And you could do that.
02:43:56.000 It does have the infrastructure to do something like that, and it could become a hub of podcasts.
02:44:01.000 Yeah.
02:44:01.000 You could easily have, I mean...
02:44:04.000 See, there's no incentive for people to keep their podcasts only on iTunes.
02:44:09.000 One of the things that iTunes has done, it's like a tremendous blunder, in my opinion, is they never figured out a way to monetize podcasts.
02:44:16.000 So they act as an aggregator for podcasts, but they never make any money off of it, which is nuts.
02:44:22.000 That's interesting.
02:44:22.000 I didn't know that.
02:44:23.000 Yeah, it's nuts.
02:44:24.000 It's fucking nuts.
02:44:25.000 If you think about the fact that Spotify makes fucking untold billions off of it, and Apple makes zero?
02:44:32.000 Hmm.
02:44:33.000 When you put your podcast on iTunes, on Apple Podcasts, all you're doing is linking an RSS feed to Apple Podcasts.
02:44:44.000 But Apple doesn't profit off of it.
02:44:46.000 They just distribute it.
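For context on what "linking an RSS feed" means in practice: a directory like Apple Podcasts just points at an XML file the show hosts itself and reads the episode list out of it. A minimal sketch of that fetch-and-parse step, in Python with only the standard library and a hypothetical feed URL:

```python
# Minimal sketch of what a podcast directory does with a show's RSS feed:
# fetch the publicly hosted XML and read out each episode's title and audio URL.
# The feed address below is hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/podcast/feed.xml"  # hypothetical feed URL

with urllib.request.urlopen(FEED_URL) as resp:
    root = ET.fromstring(resp.read())

# Standard RSS 2.0 layout: <rss><channel><item>...</item></channel></rss>
for item in root.find("channel").findall("item"):
    title = item.findtext("title")
    enclosure = item.find("enclosure")  # the audio file is referenced here
    audio_url = enclosure.get("url") if enclosure is not None else None
    print(title, "->", audio_url)
```

The audio itself stays on the creator's own host, which is the sense in which Apple "just distributes it."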
02:44:47.000 We had meetings with them years ago, before I ever went to Spotify.
02:44:51.000 And I was like, well, you guys don't make any money off of this at all?
02:44:55.000 Like, this is crazy.
02:44:56.000 Like, how could you have something that's so widely distributed through your company, and you make zero money off of it?
02:45:06.000 That seems like a pretty big oversight.
02:45:07.000 It's a giant fuck-up.
02:45:09.000 Because I think in the beginning, they thought of podcasts as being just a joke.
02:45:13.000 Like, no big deal.
02:45:14.000 Right.
02:45:14.000 And then it became this enormous, like, media thing.
02:45:19.000 Yeah.
02:45:19.000 Yeah, well...
02:45:21.000 No, it's just a lack of foresight, I guess.
02:45:25.000 But it's ironic, right?
02:45:26.000 The podcast is the one thing that you can't control,
02:45:29.000 that you can't algorithmically clamp down on, and that becomes the most enormously popular format.
02:45:38.000 I don't think that's a coincidence.
02:45:39.000 It's very bizarre.
02:45:40.000 Yeah.
02:45:41.000 Matt Taibbi, I appreciate you very much, and thank you for coming in here.
02:45:44.000 Joe, thanks so much for having me.
02:45:46.000 Thanks for everything you do.
02:45:47.000 No, thank you.
02:45:48.000 You're one of the last of the real ones out there doing it.
02:45:51.000 I appreciate it, and thanks for having me on.
02:45:54.000 Anytime.
02:45:54.000 I'm so glad your show's doing great.
02:46:00.000 Are these the skulls of your vanquished enemies?
02:46:02.000 Yes.
02:46:03.000 This is Brian Stelter.
02:46:04.000 This is Chris Cuomo.
02:46:06.000 This is...
02:46:07.000 What's the other guy's name?
02:46:09.000 The one that was... Jim Acosta.
02:46:11.000 Jim Acosta.
02:46:13.000 Yeah.
02:46:13.000 They're all...
02:46:14.000 I got them all.
02:46:15.000 That's great.
02:46:17.000 Excellent.
02:46:18.000 Well, may there be more next time I come on.
02:46:20.000 Yes, for sure.
02:46:21.000 Thank you, brother.
02:46:21.000 Appreciate you.
02:46:22.000 Thanks a lot.
02:46:22.000 Bye, everybody.