The Joe Rogan Experience - November 16, 2019


Joe Rogan Experience #1386 - Matt Taibbi


Episode Stats

Length

2 hours and 5 minutes

Words per Minute

172.8

Word Count

21,668

Sentence Count

1,735

Misogynist Sentences

30


Summary

Joe Rogan sits down with journalist Matt Taibbi. They open with Rep. Paul Gosar's (R-Arizona) tweet thread whose first letters spell out "Epstein didn't kill himself," then dig into the Jeffrey Epstein case: the broken cameras, the hyoid fracture, the killed news stories, and why coverage evaporated the moment he died. From there the conversation turns to Taibbi's new book, Hate Inc., and how the internet destroyed the press's old distribution monopoly, pushing outlets toward clickbait, partisan narratives, and the monetization of outrage, with social media engagement algorithms and platform censorship rounding out the discussion.


Transcript

00:00:03.000 So, Jamie pointed out, this congressman, is that who it is?
00:00:11.000 Jamie pointed this out, that there's a congressman, and he released a series of tweets, and the first letter of all these tweets, if you put them all together, it says, Epstein didn't kill himself.
00:00:20.000 Or did not kill himself?
00:00:21.000 Is that what it is?
00:00:22.000 Yeah, I think it's didn't.
00:00:23.000 He did upload it.
00:00:24.000 Yeah, how do you do the apostrophe?
00:00:26.000 Yeah, you can't.
00:00:27.000 You should have gone with did not.
00:00:28.000 Starting here with that evidence of a link.
00:00:30.000 Rep. Paul Gosar.
00:00:30.000 What are the odds that this guy did this accidentally?
00:00:34.000 Really small, right?
00:00:36.000 That's kind of like one of those monkeys typing Shakespeare things.
00:00:40.000 Yeah, I don't think it could work.
00:00:42.000 And the thing is, he did it backwards, right?
00:00:45.000 So you didn't see what the puzzle was until the last tweet.
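The trick being described, where each tweet's first letter contributes one character to a hidden message, can be sketched in a few lines of Python (the sample tweets below are invented for illustration, not the congressman's actual thread):

```python
# Toy sketch of reading an acrostic out of a tweet thread:
# join the first character of each tweet, in order.
def acrostic(tweets):
    """Return the message spelled by the first character of each tweet."""
    return "".join(t.lstrip()[0] for t in tweets if t.strip())

# Hypothetical three-tweet thread hiding the letters E, P, S.
thread = [
    "Every day brings new questions.",
    "People deserve answers.",
    "So where is the investigation?",
]
print(acrostic(thread))  # -> EPS
```

As the hosts note, there is no way to encode an apostrophe this way, which is why the hidden message has to read "didnt" rather than "didn't".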
00:00:49.000 Who caught that?
00:00:50.000 I got a tweet from someone about 35 minutes ago that I don't know if there's a bunch of people online paying attention to it or what, but someone alerted me and a few other people.
00:00:58.000 Does he have an image of that fucking crazy mask?
00:01:02.000 Is that in his shit too?
00:01:04.000 Okay.
00:01:05.000 He's a weirdo.
00:01:06.000 That might be the H of that.
00:01:06.000 Not until, I think it was, November 1st.
00:01:08.000 The V mask?
00:01:09.000 Yes.
00:01:10.000 What is that mask again?
00:01:11.000 V for Vendetta?
00:01:12.000 What was it representative of?
00:01:14.000 It's the Guy Fawkes mask.
00:01:15.000 Yes, that's right.
00:01:17.000 So this guy, he's thinking along alternative lines of thought.
00:01:21.000 But that is really an interesting way of saying it.
00:01:24.000 Alphabetry, that's called.
00:01:25.000 Yeah, just making a bunch of tweets.
00:01:27.000 Don't ever address it.
00:01:28.000 Just leave it there.
00:01:29.000 Walk away.
00:01:30.000 Lewis Carroll was famous for that.
00:01:32.000 Was he?
00:01:32.000 Yeah.
00:01:32.000 Yeah, that was one of, he did a lot of sort of tricks with words.
00:01:37.000 Did you read the book Gödel, Escher, Bach?
00:01:39.000 No.
00:01:40.000 Yeah, there's a whole bunch of stuff in there about people who used, who put puzzles in text.
00:01:46.000 You know, it's kind of a thing that people did, I guess, back more in the 18th century and before.
00:01:51.000 Yeah.
00:01:51.000 Well, this Epstein case is probably the most blatant example of a public murder of a crucial witness I've ever seen in my entire life, or anybody's ever seen.
00:02:02.000 And the minimal amount of outrage about this, the minimal amount of coverage, it's fucking fascinating.
00:02:09.000 I mean, what's amazing to me, just as somebody who works in the media, is that this was shaping up to be the biggest news story in history.
00:02:17.000 Yes.
00:02:19.000 The instant he died, or was died, or however you want to call it, the story just fell off the face of the earth.
00:02:28.000 It's like nobody's doing anything about it.
00:02:30.000 And I don't 100% understand that.
00:02:33.000 I mean, I get it, why that's happening, but it's just amazing.
00:02:38.000 Well, when the woman from ABC, what was her name?
00:02:41.000 Amy Robach, that lady, the one who had the frustrated moment, what she called a frustrating private moment, when she was talking about having the scoop and having that story and them squashing it.
00:02:58.000 Right.
00:02:59.000 This is all stuff that everybody used to think was conspiracy.
00:03:04.000 Everybody used to think this was stoner talk.
00:03:08.000 This is stuff where people are just delusional.
00:03:11.000 They believe all kinds of wacky conspiracies, but the reality is much less complicated.
00:03:16.000 Well, this is not possible.
00:03:17.000 This is one of those things that's so obvious.
00:03:20.000 It's so in everyone's face.
00:03:23.000 Well, there's a couple of things going on because there are many different ways this can play out.
00:03:27.000 I mean, you could have a news director who just sort of instinctively decides, well, we can't do that story because I might want to have Will and Kate on later or I might want to have this politician on later.
00:03:38.000 And it's not like anybody tells them necessarily that we can't do this.
00:03:42.000 They just decide it's too hot.
00:03:43.000 If you grow up in this system and you've been in the business for a long time, you have all these things that are drilled into you at almost like the cellular level about what you can and cannot get into.
00:03:57.000 But there were some explicit things that happened with Epstein, too.
00:04:00.000 I mean, there were a lot of news agencies that killed stories about him.
00:04:04.000 And we're hearing about some of them, Vanity Fair, this thing.
00:04:07.000 So, yeah, it's bad.
00:04:10.000 It's terrible.
00:04:11.000 Yeah.
00:04:12.000 When I found out that Clinton flew no less than 26 times on a plane with Epstein, I was like, dude, I haven't flown that many times with my mom.
00:04:23.000 How long did he know Epstein?
00:04:25.000 Yeah, I don't know.
00:04:27.000 But I mean, to have that many flights, to have the Secret Service people involved, I mean, that's incredibly bold.
00:04:33.000 What was he doing?
00:04:35.000 With just girls?
00:04:37.000 Is Clinton that much of a hound that he would go that deep into the well that many times, 26 times?
00:04:44.000 Well, that's the thing about the Epstein story that makes no sense to me.
00:04:47.000 Like, I thought that the percentage of people who were out-and-out, like perverts, who had a serious problem, like with pedophilia or whatever, was pretty small, you know?
00:04:58.000 Yeah.
00:05:47.000 Yes.
00:05:47.000 Yeah.
00:05:48.000 I mean, I'm not a hundred percent.
00:05:49.000 Yeah.
00:05:49.000 I haven't covered this story in depth.
00:05:51.000 I only really got into it a little bit.
00:05:53.000 We need you.
00:05:54.000 We need you in this one.
00:05:55.000 You're the guy.
00:05:57.000 This is a tough one.
00:05:58.000 I mean, you know, because it mixes a lot of things that are very tough to cover.
00:06:02.000 Yes.
00:06:02.000 You know, the intelligence world is very tough to cover.
00:06:05.000 You know, it's hard to get stories out of there that they don't want you to have.
00:06:09.000 Yeah.
00:06:09.000 And this is like the mother of all stories, you know, in terms of that.
00:06:14.000 And they're just little breadcrumbs here and there.
00:06:17.000 That whole thing about Acosta, you know, the Vanity Fair quote from him, that when he looked at the case, he didn't pursue it because, quote, I was told he belonged to intelligence.
00:06:29.000 What does that mean?
00:06:31.000 Who's intelligence?
00:06:32.000 You know what I mean?
00:06:32.000 Like, what agency?
00:06:34.000 What for?
00:06:35.000 And then you pair that with things like, you know, I have friends on Wall Street.
00:06:39.000 You tell me, I've never heard a single instance of this guy actually having a trade.
00:06:44.000 So what was this hedge fund doing?
00:06:46.000 I mean, if you think about it, a hedge fund's a perfect way to do blackmail, because you can just have people putting money in and out all the time, and it would look like...
00:06:56.000 Yeah.
00:06:56.000 So, very strange story.
00:06:59.000 Well, Eric Weinstein had a conversation with him.
00:07:01.000 You know, Eric Weinstein with Peter Thiel Capital.
00:07:04.000 Right.
00:07:04.000 He's like, this guy doesn't know what the fuck he's talking about.
00:07:06.000 Oh, yeah.
00:07:07.000 Financially.
00:07:08.000 Yeah, he's like, he's an actor.
00:07:09.000 Right.
00:07:09.000 This is nonsense.
00:07:11.000 Right, right.
00:07:11.000 That was his initial, almost instantaneous response.
00:07:14.000 Yeah, yeah.
00:07:14.000 And what real clients did he ever have?
00:07:17.000 What did he trade in?
00:07:18.000 How's he got a billion dollars or whatever he had?
00:07:21.000 Yeah, no.
00:07:22.000 Half a billion.
00:07:23.000 Under management?
00:07:24.000 Yeah, that's ridiculous.
00:07:25.000 Why did the guy who owns Victoria's Secret give him a $70 million home in New York City?
00:07:31.000 Like, what?
00:07:32.000 I mean, these are all things that would have been really interesting to get into, you know?
00:07:35.000 If he didn't try to kill himself twice.
00:07:38.000 The suicide didn't happen to him like in The Wire.
00:07:40.000 Poor fella.
00:07:41.000 Yeah, yeah.
00:07:42.000 It's just so unfortunate.
00:07:44.000 Yeah.
00:07:45.000 So unfortunate that the cameras died.
00:07:47.000 So unfortunate he sustained an injury that you usually only get through strangulation.
00:07:53.000 Right, yeah.
00:07:53.000 Someone murders you.
00:07:54.000 He fell on the ground and accidentally broke his hyoid bone.
00:07:57.000 Yeah.
00:07:57.000 Happens all the time.
00:07:58.000 Whatever.
00:07:59.000 No big deal.
00:08:00.000 I mean, it's so bizarre.
00:08:01.000 I can't stand conspiracy theories.
00:08:04.000 I'm one of these people who doesn't like reading it, but I can't make this story work in a way that isn't, you know, conspiratorial.
00:08:12.000 Well, that's the thing.
00:08:13.000 It's like, it gets to a point where you're like, okay, even Michael Shermer, who runs Skeptic Magazine, he's like, wait a minute, the cameras were not working?
00:08:21.000 Yeah.
00:08:22.000 I mean, it's like a bad excuse.
00:08:23.000 This seems like a conspiracy.
00:08:25.000 Fucking when Michael Shermer says, that guy doesn't believe in anything.
00:08:28.000 Right.
00:08:29.000 I mean, he is fucking, he's down the line on virtually every single thing that's ever happened.
00:08:34.000 He doesn't believe in any conspiracies.
00:08:36.000 Right.
00:08:36.000 Well, what's the innocent explanation for any of this?
00:08:39.000 Because none.
00:08:40.000 It doesn't make any sense.
00:08:41.000 You can't spin it in any way to make it not a crazy conspiracy theory.
00:08:46.000 Especially when the brother hires a doctor to do an autopsy.
00:08:50.000 Oh, yeah.
00:08:51.000 The doctor says, like, this guy was fucking murdered.
00:08:53.000 Right.
00:08:53.000 Yeah, Michael Baden, the famous guy from the HBO autopsy show.
00:08:57.000 Right.
00:08:57.000 Yep.
00:08:58.000 Absolutely.
00:08:59.000 Craziness.
00:09:00.000 Complete craziness.
00:09:01.000 And, you know, it's an example of...
00:09:06.000 The Epstein story is interesting because it's about villains on both sides of the aisle.
00:09:12.000 This is a classic.
00:09:13.000 This is something I've written about before.
00:09:14.000 The press does not like to do stories where the problem is bipartisan.
00:09:20.000 So when you have an institutional problem, when Democrats and Republicans...
00:09:24.000 Both share responsibility for it when, you know, or if it's an institution that kind of exists in perpetuity, no matter what the administration is.
00:09:32.000 We don't really like to do those stories.
00:09:34.000 Fox likes to do stories about Democrats.
00:09:38.000 MSNBC likes to do stories about Republicans.
00:09:39.000 But the thing that's kind of, you know, all over the place, they don't like to do that story.
00:09:44.000 Epstein is, you know, he's friends with Trump and with Clinton.
00:09:48.000 I mean, it looks like he has more friends on the Clinton side, but still...
00:09:51.000 And I think this is one of the reasons why this story doesn't have a lot of traction in the media, because neither side really likes the idea of going too deeply on it, feels like to me.
00:10:02.000 Well, but the blatant aspect of it, I mean, the closest that we have to that is the absolute murder, the Jamal Khashoggi murder.
00:10:12.000 That's the closest thing we have to it, or it's absolute murder.
00:10:15.000 Right.
00:10:15.000 This one, but it's also so insanely blatant, but now you have foreign actors that are involved in it and they all disperse and then left with this confusion of who's responsible for it.
00:10:25.000 Well, Saudi Arabia, that's another example where you can't really say it's, you know, one side of the...
00:10:31.000 Both parties have been incredibly complicit in their cooperation with the Saudi regime and in, you know, the massacres that are going on in Yemen.
00:10:41.000 It's a classic example of what Noam Chomsky used to talk about with worthy and unworthy victims, right?
00:10:46.000 Like if the Soviet communists did it, that was bad.
00:10:50.000 But if death squads in El Salvador killed a priest or a Catholic priest, you know, then that was something we didn't write about because they were our client state.
00:10:59.000 Yemen is a story we don't write about.
00:11:01.000 Syria is a story we do write about, but they're really equivalent stories.
00:11:06.000 But you're absolutely right, the Khashoggi thing, I don't think either party or either side's media really wants to get into that all that deeply.
00:11:14.000 How much is media shifting now?
00:11:18.000 You've obviously been a journalist for a long time.
00:11:20.000 How much are things changing in the light of the internet?
00:11:25.000 Well, a lot.
00:11:25.000 I mean, I have a new book out now that's really about this, right?
00:11:28.000 Why the business has changed.
00:11:30.000 What's it called?
00:11:31.000 Hate Inc.
00:11:32.000 Yeah, it's out now.
00:11:33.000 And it's really about how the press, the business model of the press has changed.
00:11:39.000 I mean, it's something that you talk about a lot.
00:11:41.000 I hear you on your show all the time talking about how news agencies are always trying to push narratives on people, trying to get people wound up and upset.
00:11:52.000 And that is a conscious business strategy that we didn't have maybe 30 years ago.
00:11:57.000 You know, you think about Walter Cronkite or what the news was like back in the day, you had the whole family sitting around the table and everybody watching, sort of a unifying experience to watch the news.
00:12:08.000 Now you have news for the crazy right-wing uncle, and then you have news for the kid in the Che t-shirt, and they're different channels, and they're trying to wind these people up, you know, to get them upset constantly and stay there.
00:12:22.000 And a lot of that has to do with the internet, because...
00:12:26.000 Before the internet, news companies had a basically free way of making money.
00:12:30.000 They dominated distribution.
00:12:32.000 The newspaper was the only thing in town that had a...
00:12:35.000 If you wanted to get a want ad, it had to be through the local newspaper.
00:12:39.000 Now with the internet, the internet is the distribution system.
00:12:42.000 Anybody has access to it, not just the local newspaper.
00:12:45.000 And so the easy money is gone and we have to chase clicks more than we ever had to before.
00:12:52.000 We have to chase eyeballs more than we have to.
00:12:53.000 So we've had to build new money-making strategies and a lot of it has to do with just sort of monetizing anger and division and all these things.
00:13:01.000 We just didn't do that before and it's had a profound difference on the media.
00:13:06.000 As a writer, have you personally experienced this sort of the influence where people have tried to lean you in the direction of clickbait or perhaps maybe alter titles that make them a little bit disingenuous in order to get people excited about the story?
00:13:21.000 I mean, you know, my editors at Rolling Stone are pretty good and they give me a lot of leeway to kind of explore whatever I want to explore, but I definitely feel a lot of pressure that I didn't feel before.
00:13:32.000 In the business because, especially in the Trump era, and I've written a lot about the Russia story, right?
00:13:39.000 But that's an example of one side's media has one take on it and another side's media has another take on it.
00:13:46.000 And if you are just a journalist and you want to just sort of report the facts, you feel a lot of pressure to fit the facts into a narrative that your audience is going to like.
00:13:56.000 And I had a lot of problem with the Russia story because I thought, you know, I don't like Donald Trump, but I'm like, I don't think this guy's James Bond consorting with Russian spies.
00:14:06.000 I think he's corrupt in other ways.
00:14:08.000 And there was a lot of blowback on my side of the business because, you know, people in sort of liberal, quote unquote, liberal media, you just have, there's a lot of pressure to have everybody fit into a certain narrative.
00:14:20.000 And I think that's really unhealthy for the business.
00:14:23.000 Yeah, very unhealthy, right?
00:14:24.000 Because as soon as people can be manipulated to conform to that narrative, then all sorts of stories can be shifted.
00:14:30.000 Oh, yeah.
00:14:31.000 Yeah, absolutely.
00:14:32.000 And the job used to be about challenging your audience every now and then, right?
00:14:37.000 Like, if you think a certain thing is true, well, it's our job to give you the bad news and say that you're wrong about that.
00:14:42.000 That used to be what the job was, to be a journalist.
00:14:45.000 Now it's the opposite.
00:14:46.000 Now we have an audience.
00:14:48.000 We're going to tell you exactly what you want to hear and we're going to reinforce what you think.
00:14:53.000 And that's very unhealthy.
00:14:54.000 A great example of this was...
00:14:57.000 In the summer of 2016, I was covering the campaign, I started to hear reporters talking about how they didn't want to report poll numbers that showed the race was close.
00:15:09.000 They thought that that was going to hurt Hillary.
00:15:12.000 In other words, we had information that the race was close.
00:15:15.000 And we're not telling this to audiences because they wanted to hear that it was going to be a blowout for Hillary, right?
00:15:22.000 And that didn't help Hillary.
00:15:24.000 It didn't help the Democrats to not warn people about this, right?
00:15:28.000 But it was just because if you turned on MSNBC or CNN and you heard that Trump was within five points or whatever it was, that was going to be a bummer for that audience.
00:15:38.000 So we stayed away from it.
00:15:41.000 And, you know, this is the kind of thing that it's not politically beneficial to anybody.
00:15:45.000 It's just, we're just trying to keep people glued to the set by telling them what they want to hear.
00:15:50.000 And that's not the news.
00:15:51.000 That's not our job, you know?
00:15:53.000 And it drives me crazy.
00:15:55.000 Yeah, it should drive you crazy.
00:15:57.000 What you said about journalism used to be something that you're challenging your reader.
00:16:02.000 You're giving them this reality that may be uncomfortable, but it's educational and expands their view of the world.
00:16:11.000 Where do they get that now?
00:16:12.000 They don't.
00:16:13.000 That's the whole problem.
00:16:15.000 You can predict exactly what each news organization, what their take is going to be on any issue.
00:18:24.000 Just to take an example, when the business about the ISIS leader, al-Baghdadi, being killed hit the news, instantaneously you knew that the New York Times, CNN, the Washington Post,
00:16:39.000 that they were going to write a whole bunch of stories about how Trump was overplaying the significance of it, that he was telling lies about it.
00:16:50.000 You knew they were going to make the entire thing about Trump.
00:16:53.000 And then, meanwhile, Fox had a completely different spin on about how heroic it was.
00:16:57.000 But news audiences didn't have anywhere to go to just simply hear, who was this person?
00:17:01.000 Why was he important?
00:17:03.000 What do the people in the region think?
00:17:06.000 What is this going to mean going forward?
00:17:08.000 Is it actually going to have any...
00:17:12.000 Are we going to have to continually...
00:17:14.000 Is there going to be a new person like this every time?
00:17:18.000 Are we actually accomplishing it?
00:17:20.000 You don't get that anywhere.
00:17:21.000 All you get is Trump is a shithead on one side and Trump is a hero on the other side.
00:17:26.000 That's not the news.
00:17:28.000 No.
00:17:29.000 But the thing is, it's like...
00:17:49.000 So where do we go where I see both sides?
00:17:52.000 Where's the middle ground where someone goes, well, this is true, but you've got to say this is honest too, and this is what's going on over on this side, and the Republicans have a point here, and there's no mainstream media place where you can go for that right now.
00:18:08.000 No, there isn't, and this is one of the things I write about.
00:18:11.000 This is one of the reasons why shows like yours are so popular.
00:18:13.000 I mean, I think...
00:18:15.000 There's a complete loss of trust.
00:18:17.000 They feel like people are not being honest with them.
00:18:19.000 They're not being straight.
00:18:22.000 They come to people like you and a lot of other independent folks who aren't the quote-unquote mainstream media.
00:18:33.000 Because it's not really thought, it's not reporting, it's not anything.
00:18:36.000 If you can predict 100% what a person is going to say, that's not thinking, that's not reporting, it's just marketing.
00:18:42.000 For someone like me, that's so disturbing.
00:18:44.000 I'm a fucking comedian and a cage-fighting commentator.
00:18:47.000 When people are coming to me, like, this is the source where you go for unbiased representations of what's going on in the world?
00:18:54.000 That's crazy.
00:18:55.000 Well, I mean, I saw your interview with Bari Weiss, right?
00:18:59.000 And you just, you did a simple base, you didn't go to journalism school, right?
00:19:03.000 No.
00:19:04.000 No.
00:19:04.000 So, she said something about how, you know, oh, she's an Assad toady, and you said, what does that mean?
00:19:12.000 You just ask the simple, basic questions, right?
00:19:15.000 What does that mean?
00:19:16.000 Where is that coming from?
00:19:17.000 How do you know that?
00:19:18.000 You know?
00:19:19.000 Like, journalism isn't brain surgery.
00:19:21.000 That's all it is.
00:19:22.000 It's just asking the simple questions that sort of pop to mind when you're in a situation, like where did this happen?
00:19:27.000 How do we know that?
00:19:28.000 That's true.
00:19:30.000 But there's a whole generation of people in the press now who just simply do not go through the process of just asking simple questions.
00:19:39.000 How do I know that's true?
00:19:40.000 After each story you report, you're supposed to kind of wipe your memory clean and start over.
00:19:45.000 So just because somebody was bad the last time you covered them doesn't mean that they're necessarily going to be the bad guy this time you cover them.
00:19:52.000 You have to continually test your assumptions and ask yourself, is this true?
00:19:57.000 Is that true?
00:19:58.000 Is this true?
00:19:59.000 How do we know this?
00:20:00.000 And we've just stopped doing that.
00:20:03.000 It's just a morass of pre-written takes on things.
00:20:08.000 And it's really, really bad.
00:20:10.000 And you can see why audiences are fleeing from this stuff.
00:20:14.000 They just don't have the impact they used to.
00:20:16.000 Well, it's really interesting that a lot of this is this unpredicted consequence of having these open platforms like Facebook where people are getting their news and then the algorithm sort of directs them towards things that are going to piss them off, which I don't even think necessarily was initially the plan.
00:20:34.000 I think the plan is to accelerate engagement, right?
00:20:37.000 So they find out what...
00:20:39.000 What you're engaging with, what stories you're engaging with, and then they give you more of that.
00:20:44.000 Like Ari, my friend Ari Shafir, actually tried this out.
00:20:48.000 And what he did was, he went on YouTube and only looked up puppy videos.
00:20:54.000 And that's all he looked at for, like, weeks.
00:20:56.000 And then YouTube only started recommending puppy videos to him.
00:21:01.000 So it's not necessarily that Facebook wants you to be outraged, but that when you are outraged, whether it's over abortion or war or whatever the subject is, you're going to engage more, and their algorithm favors you engaging more.
00:21:13.000 So if you're engaging more about something very positive, you know, if you're all about yoga and meditation, your algorithm would probably favor yoga and meditation because those are the things you engage with.
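The feedback loop being described, where whatever topic you engage with gets weighted up and fed back to you (puppies or outrage alike), can be sketched as a toy ranking function. All names and data here are illustrative, not any platform's actual algorithm:

```python
# Toy sketch of engagement-weighted feed ranking: count past engagements
# per topic, then sort candidate items by that count. The loop is
# self-reinforcing, since whatever dominates the history dominates the feed.
from collections import Counter

def rank_feed(candidates, engagement_history):
    """Order candidate items by how often the user engaged with their topic."""
    topic_weight = Counter(engagement_history)  # missing topics count as 0
    return sorted(candidates,
                  key=lambda item: topic_weight[item["topic"]],
                  reverse=True)

# A user who clicked puppy videos 12 times and political posts twice.
history = ["puppies"] * 12 + ["politics"] * 2
feed = rank_feed([{"id": 1, "topic": "politics"},
                  {"id": 2, "topic": "puppies"}], history)
print([item["topic"] for item in feed])  # -> ['puppies', 'politics']
```

This is the dynamic behind the puppy-video experiment mentioned above: the ranking itself is neutral, but it amplifies whatever the user already engages with.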
00:21:24.000 But it's natural for people to be pissed off and to look for things that are annoying, especially if you're done working and you're like, God, this world sucks.
00:21:32.000 What's going on that sucks worse?
00:21:34.000 And then you go to your Facebook and, oh, Jesus, look at this goddamn border crisis.
00:21:38.000 Oh, Jesus, look at this.
00:21:39.000 Well, fucking, here's the problem with these goddamn liberals.
00:21:42.000 They don't know.
00:21:42.000 And you engage, and then that's your life.
00:21:45.000 And then it's saying, oh, I know how to get Matt all fired up.
00:21:48.000 I'm going to fucking send him some abortion stories.
00:21:51.000 Woo!
00:21:51.000 Right.
00:21:52.000 And then that's your feed.
00:21:53.000 Right, yeah, exactly.
00:21:54.000 But there's so many economic incentives that go in there, right?
00:21:57.000 They know that the more that you engage, the longer that you're on, the more ads that you're going to see, right?
00:22:05.000 So that same dynamic that Facebook and the social media companies...
00:22:33.000 Well, the news companies figured out the same thing.
00:22:38.000 That they know you're going to just be in an endless cycle of sort of impotent, mute rage all the time.
00:22:45.000 But it's kind of addicting, you know?
00:22:47.000 And they know that.
00:22:48.000 And it's sort of like the tobacco companies.
00:22:50.000 They know it's a product that's bad for you.
00:22:53.000 And they just keep giving it to you because, you know, it makes money for them.
00:22:57.000 Yeah.
00:22:57.000 And it's just, the thing about it is, all of it is about ads.
00:23:03.000 Totally.
00:23:04.000 And how many clicks they get in ads.
00:23:05.000 If they just said, you can have a social media company, but you can't have ads.
00:23:09.000 There's a new federal law, no more ads on Facebook, no more ads on YouTube, no more ads on Twitter, no more ads on Instagram.
00:23:17.000 Good luck.
00:23:18.000 Right.
00:23:19.000 Yeah.
00:23:19.000 Those businesses were all collapsed.
00:23:21.000 Yeah.
00:23:21.000 Yep.
00:23:21.000 Yeah, but that seems to be what it is.
00:23:23.000 It's like they figured out that your data is worth a tremendous amount of money.
00:23:28.000 And the way they can utilize that money is to sell advertising.
00:23:32.000 Yeah, no, they get it coming and going because they're not only selling you ads, but they're also collecting the information about your habits, which they can then sell again.
00:23:42.000 So it's a dual revenue stream.
00:23:45.000 The media companies...
00:23:48.000 Basically, they're just consumer businesses where they're trading attention for ad space, right?
00:23:53.000 So if they can get you to watch four hours of television a day, they have that many ad slots that they can show you and they know how much money they're going to make.
00:24:01.000 But the social media companies get it two ways.
00:24:04.000 They get it by...
00:24:06.000 You know, attracting your eyeballs and then also selling your habits to the next set of advertisers, which, you know, is very insidious.
00:24:13.000 But what's interesting about this is that most people don't think about this as a consumer business, right?
00:24:18.000 Like, Americans, these days, are very conscious of, like, what they put in their bodies.
00:24:22.000 You know, they won't eat too many candy.
00:24:23.000 Well, depending on who they are, right?
00:24:25.000 But people at least look at what the calories are, but they don't think about the news that way or social media, what they put in their brains.
00:24:32.000 And it's also a consumer product.
00:24:33.000 Yeah, it really is.
00:24:35.000 I've gone over that many times with people that that's a diet.
00:24:38.000 This is your diet.
00:24:40.000 You have a mental diet as well as you have a physical food diet.
00:24:43.000 Absolutely.
00:24:44.000 You have an information diet.
00:24:45.000 And a lot of people are just eating shit with their brain.
00:24:48.000 It's the worst kind of junk food.
00:24:50.000 It's like a cigarette sandwich, the stuff that we eat.
00:24:53.000 It's so fucking bad.
00:24:55.000 And it's getting worse.
00:24:56.000 It is.
00:24:56.000 It is getting worse.
00:24:57.000 And what's weird is that this is a 10-year-old problem and no one saw it coming.
00:25:01.000 And it's kind of overtaking politics.
00:25:03.000 It's overtaking social discourse.
00:25:06.000 Everybody's wrapped up in social media conversations.
00:25:09.000 They carry them on over to the dinner table and it gets people in arguments at work.
00:25:13.000 And all this stuff no one saw coming.
00:25:16.000 No one saw this outrage happening.
00:25:28.000 I mean, I think some people in the tech business probably saw early on the potential for this.
00:25:37.000 But, you know, in terms of other businesses like the news media and also politics, I mean, you have to think about the impact of this on politics.
00:25:46.000 It's been enormous.
00:25:48.000 You know, I covered Donald Trump.
00:25:49.000 Trump really was just all about whatever you're pissed off about, I'm right there with you.
00:25:55.000 And people are just sort of pissed off about lots of things these days because they're doing this all day long.
00:26:00.000 And if you can take advantage of that, then you're going to have a lot of success.
00:26:06.000 And I think a lot of people haven't figured that out.
00:26:09.000 And some of these things are real causes.
00:26:11.000 Like people are upset about real things.
00:26:13.000 But it's just, you're absolutely right.
00:26:16.000 People did not see this coming and they didn't prepare for it.
00:26:18.000 It's just weird that it's one of the biggest sources of income online.
00:26:22.000 And people didn't see it coming.
00:26:24.000 I mean, Facebook is generating billions of dollars and now potentially shifting global politics.
00:26:31.000 Yeah, and the whole issue of a couple of companies like Facebook having control over what you do and do not see is an enormous problem that nobody really cares about.
00:26:44.000 I've tried to write about it a few times.
00:26:46.000 I've written a couple of features about it and about what a serious problem this is.
00:26:51.000 If you look at other countries, like Israel, China, there are a number of countries where you've seen this pattern of internet platforms liaising with the government to decide what people can and cannot see.
00:27:07.000 And they'll say, well, we don't want to see, you know, Palestinian protest movements, or we don't want to see, you know, the Venezuelan channel, Telesur, like, we want to take that off.
00:27:17.000 You think about how that could end up happening in the United States, and it is already a little bit happening.
00:27:23.000 It's a little bit, but it seems to be happening only in the terms of, like, leaning towards the progressive side, which people are okay with.
00:27:28.000 Because I think, especially in the light of Donald Trump being in office, this is acceptable censorship.
00:27:33.000 Yeah.
00:27:34.000 Yeah, but I think they're wrong about that.
00:27:35.000 I think they're wrong about that, too.
00:27:37.000 It's terribly dangerous.
00:27:38.000 It's very short-sighted.
00:27:40.000 Yes.
00:27:40.000 And I think there's also this thing that happens with people where they think, well, this is never going to happen to me.
00:27:50.000 You can do that bad thing to this person that I don't like, but as long as it's never going to happen to me.
00:27:55.000 But they're wrong.
00:27:56.000 History shows it always does happen to you.
00:27:59.000 So we're giving these companies an enormous amount of power to decide all kinds of things.
00:28:03.000 What we look at, what kind of political ideas we can be exposed to.
00:28:10.000 I think it's very, very dangerous.
00:28:12.000 That biased interpretation of what something is, that was what people talked about when the initial Patriot Act was enacted.
00:28:19.000 When people were like, hey, this might be fine with Obama in office.
00:28:24.000 Maybe Obama is not going to enact some of the worst clauses of this and use it on people.
00:28:30.000 Or the...
00:28:32.000 Was it NDAA? Is that what it was?
00:28:34.000 Yeah.
00:28:35.000 Some of the things were just completely unconstitutional, but don't worry, we're not going to use those.
00:28:39.000 But you're setting these tools aside for whatever fucking president we have.
00:28:45.000 Like, what if we have a guy who out-trumps Trump?
00:28:47.000 Right.
00:28:47.000 I mean, we never thought we'd have a Trump, right?
00:28:49.000 What if we have a next-level guy post-Trump?
00:28:52.000 What if there's some sort of...
00:28:55.000 Catastrophe, tragedy, attack, something that really gets people fired up, and they vote in someone who takes it up to another level.
00:29:02.000 And then he has these tools, and then he uses these tools on his political enemies, which is entirely possible.
00:29:07.000 Well, I mean, we've already seen that a little bit.
00:29:09.000 I mean, people don't want to bring this up, but...
00:29:13.000 A lot of the stories that have come out about Trump, they're coming from leaks of classified information that are coming from those war on terror programs that were instituted after 9-11.
00:29:22.000 The FISA Amendments Act, the NSA programs to collect data, they're unmasking people.
00:29:29.000 We have a lot of evidence now.
00:27:31.000 There was a lawsuit that came out about a month ago that showed that the FBI was doing something like 60,000 searches a month at one point, asking the NSA for the ability to unmask names and that sort of thing.
00:29:49.000 These tools are incredibly powerful.
00:29:51.000 They're incredibly dangerous.
00:29:52.000 But people thought after 9-11, they were scared.
00:29:54.000 So we want to protect ourselves.
00:29:56.000 So that's okay for now.
00:29:58.000 We'll pull it back later.
00:30:00.000 But you never do pull it back.
00:30:02.000 It always ends up being used by somebody in the wrong way.
00:30:07.000 And I think we're starting to see that that's going to be a problem.
00:30:10.000 Yeah, I'm real concerned about places like Google and Facebook altering the path of free speech and leaning people in certain directions and silencing people that have opposing viewpoints, and the fact that they think that they're doing this for good, because this is how they see the world, and they don't understand that you have to let these ideas play out in the marketplace of free speech and free ideas.
00:30:36.000 If you don't do that, if you don't do that, if you don't let people debate the merits, the pros, the cons, what's wrong, what's right, if you don't do that, then you don't get real discourse.
00:30:45.000 If you don't get real discourse, you're essentially, you've got some sort of intellectual dictatorship going on.
00:30:50.000 And because it's a progressive dictatorship, you think it's okay.
00:30:53.000 Because it's people who want everybody to be inclusive and, you know, I mean, this is a weird time for that.
00:30:59.000 It's a really weird time for that because, as you said, people are so short-sighted.
00:31:04.000 They don't understand that these, like, the First Amendment's in place for a very good reason and set up a long fucking time ago because they did the math.
00:31:11.000 They saw where it was going, and they were like, look, we have to have the ability to express ourselves.
00:31:15.000 We have to have the ability to freely express thoughts and ideas and challenge people that are in a position of power, because if we don't, we wind up exactly where we came from.
00:31:24.000 Yeah, no, and courts continually reaffirmed that idea that the way to deal with bad speech was with more speech.
00:31:34.000 And they did it over and over and over again.
00:31:38.000 The legal standard for speech still, I think, remains that unless it's directly inciting violence, you can have speech that incites violence generally, and the Supreme Court even upheld that.
00:31:52.000 You can have speech that comes from material that was stolen illegally.
00:31:57.000 That's okay.
00:31:58.000 But we had a very, very high bar for prohibiting speech always.
00:32:02.000 And the libel cases, the cases for defamation, you know, that also established a very, very high standard for punishing speech.
00:32:11.000 But now, all of a sudden, people have a completely different idea about it.
00:32:14.000 It's like, you know, forget about the fact that this was a fundamental concept in American society for, you know, 230 years or whatever, but they just want to change it, you know, without thinking about the consequences.
00:32:25.000 Well, that's where a guy like Trump could be almost like...
00:32:30.000 It's almost like a Trojan horse, in a way.
00:32:33.000 Like, if you wanted to play 3D chess, what you would do, you'd get a guy who's just so egregious and so outrageous, and then so many people oppose him.
00:32:41.000 Get that guy, let him get into a position of power, and then sit back.
00:32:45.000 Watch the outrage bubble.
00:32:47.000 I mean, I don't think that's what's happening.
00:32:52.000 But if I was super fucking tinfoil-hatty, that's how I would go about it.
00:32:57.000 I would say, this is what you want.
00:32:58.000 If you really want to change things for your direction, put someone that opposes it.
00:33:04.000 That's disgusting.
00:33:05.000 And that way people just, a rational, intelligent person is never going to side with him.
00:33:11.000 So they're going to side with the people that oppose him and then you could sneak a lot of shit in that maybe they wouldn't agree with in any other circumstance.
00:33:17.000 Yeah, Trump's election is sort of like another 9-11, right?
00:33:21.000 Like, you know, 9-11 happened.
00:33:22.000 All of a sudden, people who weren't in favor of the government being able to go through your library records or listen to your phone calls, and all of a sudden, they were like, oh, Jesus, I'm so freaked out.
00:33:31.000 Like, yeah, fine.
00:33:32.000 When Trump got elected, all of a sudden, people suddenly had very different ideas about speech.
00:33:37.000 Like, you know, hey, that guy's so bad.
00:33:41.000 Maybe we should consider banning X, Y, and Z. If he was conceived as a way to discredit the First Amendment and some other ideas, that would be a brilliant 3D chess move.
00:33:59.000 Yeah, super sneaky.
00:34:01.000 That's like China level, many steps ahead.
00:34:04.000 Yeah, exactly.
00:34:07.000 Where do you think all this goes?
00:34:10.000 It seems like this is, I mean, obviously you just wrote a book about it, but it seems like this is accelerating.
00:34:16.000 And it doesn't seem like anyone's taking a step back and hitting the brakes or opting out.
00:34:22.000 It seems like people are just ramping up the rhetoric.
00:34:25.000 Yeah, I mean, I think the divisiveness problem is going to get worse before it gets better.
00:34:32.000 The business model of the media now is so entrenched that, until some of these companies start going out of business because they're losing audience, because people don't trust them anymore.
00:34:48.000 The news is going to keep doing what it's doing.
00:34:51.000 The Hannity model is going to become normal for news companies.
00:34:56.000 I think it already basically is, you know, on both the left and the right.
00:35:01.000 And in terms of, you know, the internet companies...
00:35:04.000 They're consolidating.
00:35:06.000 They're getting more and more power all the time.
00:35:08.000 And I think we've already seen that people have, I think, too much tolerance for letting them make decisions about what we can and cannot see.
00:35:17.000 And I think it's going to get worse before it gets better.
00:35:19.000 I don't know.
00:35:19.000 What do you think?
00:35:20.000 That's what I think.
00:35:21.000 I mean, Facebook, Twitter, all these places.
00:35:23.000 I mean, Twitter has some of the most ridiculous reasons for banning people.
00:35:26.000 One of them is deadnaming.
00:35:27.000 Oh, yeah.
00:35:28.000 So if you call Caitlyn Jenner Bruce, like, hey, I like you better when you were Bruce, ban for life.
00:35:33.000 Right.
00:35:34.000 You can't even say, I like you better when you were Bruce, ban for life.
00:35:37.000 Right.
00:35:38.000 Yeah.
00:35:38.000 And actually, what's really interesting about that is...
00:35:43.000 That's a core concept that we've changed completely.
00:35:48.000 All the different ways in the past that we punish speech, we punish the speech, not the person.
00:35:54.000 So if libel, defamation, all those things, first of all, they were all done through the courts.
00:36:00.000 So you had a way to fight back if you thought you were unjustly accused of having defamed somebody or libeled somebody.
00:36:07.000 But if they found against you, the person who got something out of it was the person who was directly harmed, right?
00:36:13.000 And the courts judged that.
00:36:14.000 And it wasn't like you were banned for life from ever speaking again.
00:36:20.000 They just gave a bunch of money to a person who might have suffered some kind of career injury or whatever it was because of that.
00:36:28.000 And usually there was a retraction or it was removed from the press or whatever it was.
00:36:32.000 But it wasn't like we were saying, we're never going to allow you to be heard or seen from again.
00:36:37.000 We were sort of encouraging, optimistically, people to get better and to be different.
00:36:44.000 And now we're not doing that at all.
00:36:45.000 Now we're just saying, one strike or two strikes, whatever, you're gone.
00:36:50.000 And it's not like it's a public thing, so you can't sue over it.
00:36:53.000 Well, that's what's crazy about it, because it is a public utility, in a way.
00:36:58.000 Yes, it is.
00:36:59.000 It should be.
00:36:59.000 And even Jack Dorsey from Twitter admitted as much on the podcast, and he wishes that we would view it that way.
00:37:05.000 He's actually proposed two versions of Twitter.
00:37:08.000 A Twitter with their standard censorship in place, and then a Wild West Twitter.
00:37:13.000 And I'm like, sign me up.
00:37:15.000 How do I get on that Wild West Twitter?
00:37:16.000 Because the problem with things like Gab, and I've gone there a few times and watched it, and even Milo Yiannopoulos has criticized it for this, is that it's just so hate-filled, because it's the place where you can go and fucking say anything.
00:37:30.000 So the only people that it's attracting are people that just want to go there and just fucking shoot off cannons of N-bombs and call everybody a kike.
00:37:38.000 It's crazy.
00:37:40.000 And there's real communication there as well.
00:37:43.000 There's plenty of that, too.
00:37:45.000 But the sheer number of people that go there just to blow off steam because they can't say those things on Twitter or Facebook or any other social media platform without being banned, because of that, it becomes a channel for it.
00:37:59.000 And it's like, it doesn't get a chance.
00:38:01.000 It doesn't get a chance to...
00:38:02.000 The concept is great.
00:38:04.000 The concept is, if you're not doing anything illegal, we're not going to stop you.
00:38:07.000 You're not doxing anybody.
00:38:08.000 You're not threatening anybody's life.
00:38:09.000 We're not going to stop you.
00:38:10.000 Go ahead.
00:38:10.000 But if you...
00:38:16.000 Yeah.
00:38:30.000 And so you can't.
00:38:31.000 So even if you have controversial ideas that maybe some people would agree with and some won't, you can get banned for life for just controversial ideas.
00:38:38.000 Even controversial ideas that are scientifically and biologically factual, like the transgender issue.
00:38:46.000 Like if you say...
00:38:47.000 There's a woman, I brought her up a million times, Megan Murphy.
00:38:50.000 Murphy, yes.
00:38:51.000 A man is never a woman, she says.
00:38:53.000 They tell her to take it down.
00:38:54.000 She takes a screenshot of it, puts that up, takes it down, but takes a screenshot of the initial tweet.
00:39:00.000 Says, haha, look at that.
00:39:01.000 Banned for life.
00:39:02.000 Right.
00:39:02.000 A man is never a woman is a fact.
00:39:05.000 That is a fact.
00:39:05.000 It's a biological fact.
00:39:07.000 Now, if you decide to become a woman and we recognize you as a woman in society, well, that's just common courtesy in my eyes.
00:39:13.000 Like, you have a person who has this issue.
00:39:15.000 They feel like they were born in the wrong body.
00:39:17.000 Okay, I get that.
00:39:19.000 I'm cool with that.
00:39:19.000 But to make it so that you're banned forever, you can call someone a dumb fuck, an idiot, a piece of shit.
00:39:26.000 Your mother should have swallowed you.
00:39:28.000 Everybody's like, yeah, terms of service seem fine here.
00:39:31.000 Everything's good.
00:39:32.000 Say a man is never a woman.
00:39:34.000 Gone.
00:39:34.000 For life.
00:39:35.000 Right, yeah.
00:39:36.000 Call Caitlyn Jenner.
00:39:37.000 I liked you better when you were Bruce.
00:39:38.000 Done!
00:39:39.000 That's it.
00:39:40.000 Yeah.
00:39:40.000 No, and it's crazy, and obviously people see that, and they just get madder, and it makes people very, very resentful in ways that they wouldn't be otherwise.
00:39:51.000 And it makes...
00:39:52.000 There's no pathway.
00:39:53.000 There's no other thing, right?
00:39:56.000 There's no free speech platform that's universally accepted.
00:40:00.000 Right.
00:40:01.000 These ones, like I said, like Gab or there's a couple other ones out there, no one's using them.
00:40:06.000 It's a very small percentage of the people in comparison to something like Twitter, which is enormous.
00:40:11.000 Right.
00:40:11.000 And so because people don't want to be kicked off the platform, they're radically changing their behavior.
00:40:16.000 Yes, yes.
00:40:17.000 Self-censoring.
00:40:18.000 And we're seeing this a lot also with political ideas, too.
00:40:22.000 I have a podcast, Useful Idiots, it's called.
00:40:25.000 We try to talk to people who are kind of...
00:40:52.000 Nobody wants to associate with you.
00:40:54.000 No one wants to defend you.
00:40:58.000 You're suddenly like the kid with lice, and people don't want that to happen to them, so they stop saying X, Y, and Z, and they just go with the flow, go with the crowd.
00:41:09.000 And it causes this sort of, you know, uniform, conformist discourse that isn't really about anything, right?
00:41:19.000 Because people are just afraid to talk, which is crazy.
00:41:22.000 Yeah.
00:41:22.000 Well, you're not supposed to talk to someone – I experience this all the time – this idea of giving someone a platform.
00:41:29.000 Like, if I have someone on, like, a Ben Shapiro or someone like that, you shouldn't give that guy a platform.
00:41:34.000 Well, he's already got a platform.
00:41:36.000 Right?
00:41:36.000 Wouldn't it be better if I just talked to him and find out what his ideas are and ask him about those ideas?
00:41:41.000 We had a very bizarre conversation about gay people, where he's basically full-on biblical, religious interpretation of gay people, which to me...
00:41:51.000 It's always strange.
00:41:52.000 Like, okay, how do you stand on shellfish?
00:41:55.000 You know?
00:41:56.000 Are you just as strong on shrimp as you are on gay guys?
00:42:00.000 Right, pork.
00:42:01.000 Why is it gay guys?
00:42:03.000 The Bible's pretty clear on a bunch of different things that don't seem to fire people up the way homosexuality does.
00:42:11.000 Like, why?
00:42:12.000 Why do you care?
00:42:13.000 If you had a friend that was eating shrimp, would you go to his house if he had shrimp cocktail?
00:42:17.000 No.
00:42:17.000 But you wouldn't go to a friend's house...
00:42:35.000 You know, there's a bunch of shit in the Bible that you're like, well, God was wrong about that.
00:42:41.000 Like, how confident are you?
00:42:42.000 How confident are you that you can interpret God's word so perfectly that you let the lobster slide?
00:42:48.000 But all that butt-fucking, we've got to stop that.
00:42:51.000 You know, like, it's really weird.
00:42:53.000 But that's the whole point.
00:42:54.000 You challenge the idea, right?
00:42:55.000 Yes.
00:42:56.000 But the prevailing view now is that even having the discussion...
00:43:02.000 Yes.
00:43:02.000 Because you have a platform.
00:43:04.000 I mean, I read that thing in The Atlantic, you know, where they're like, you give people to...
00:43:09.000 I forget what the phrase was.
00:43:11.000 They were saying something like, you had...
00:43:13.000 I give people too many chances.
00:43:15.000 Too many chances, people who had already forfeited the right to have them, or something along those lines, right?
00:43:19.000 That guy was silly.
00:43:20.000 That guy gave up his hand when he said about me that I'm inexhaustible, but that he likes naps.
00:43:25.000 I go, oh, it's about you and your naps.
00:43:29.000 That's what it is.
00:43:30.000 You like naps.
00:43:31.000 So you don't like people that have energy.
00:43:33.000 I'm super sorry.
00:43:34.000 I thought that piece was really interesting because that whole idea that there are people who have forfeited the right to communicate forever.
00:43:44.000 Well, who decides that?
00:43:45.000 Again, there's this intellectual snobbism that goes on in, you know, frankly, my side of the media aisle, where, well, let's say what an appropriate thought is, what's right-thinking, what's wrong-thinking, you know,
00:44:01.000 who gets to have a platform, who doesn't get to have a platform, who we're going to call a monster, who we're not going to call.
00:44:08.000 I just don't understand the arrogance, where that comes from, to decide that some people, you know, I totally disagree with people like, you know, Alex Jones or Shapiro on, you know, most things.
00:44:19.000 But I don't think that they should be wiped off the face of the earth.
00:44:22.000 I mean, I don't know.
00:44:23.000 Well, it's interesting to challenge people on these weird ideas and find out how they come to them.
00:44:27.000 And you will get a lot of fence-sitters that will recognize the flaws in their thinking if you let them talk.
00:44:33.000 Because there's a lot of people that aren't sure either way.
00:44:36.000 Maybe they haven't invested a lot of time investigating it.
00:44:39.000 Maybe they really don't know what this guy stands for.
00:44:41.000 Maybe they just read a cartoonish version of who he is.
00:44:44.000 And then you get to hear him talk and you go, oh, well, I see the flaw in his thinking.
00:44:47.000 Or, oh, well, he's right about some things.
00:44:50.000 And a lot of people are right about some things.
00:44:52.000 Sure.
00:44:53.000 They're wrong about things and they're right about things.
00:44:55.000 And the only way you can discern that is you communicate with them.
00:44:58.000 But as soon as you de-platform people like forever, you're just going to make a bunch of angry people.
00:45:03.000 You're just going to make a bunch of people that are completely distrusting and you're going to absolutely empower the opponents of your ideas.
00:45:11.000 People that do get to...
00:45:13.000 When do they get a chance to have their voice?
00:45:15.000 Well, when they vote.
00:45:16.000 So the more you do this shit, the more you censor conservatives, the more they're going to vote against liberals.
00:45:22.000 This is just a fact.
00:45:23.000 There's no getting around that.
00:45:24.000 This is human nature.
00:45:26.000 Yeah, I mean, I lived in the former Soviet Union, you know, for...
00:45:31.000 11 years.
00:45:32.000 And 100%, if you lived in Soviet Russia and something was published by an official publisher, people thought it was basically full of shit.
00:45:42.000 But if it was in the samizdat, if it was in the privately circled stuff that had been repressed and censored, people thought that was the coolest thing in the world.
00:45:51.000 That was the hot ticket.
00:45:52.000 And you're automatically giving something cachet and And added weight by censoring it.
00:46:01.000 I mean, this is just the way it works.
00:46:03.000 It's human nature.
00:46:04.000 If people think that you don't want them to see something, they're going to run through it twice as hard, you know?
00:46:08.000 So I just don't understand a lot of that instinct.
00:46:11.000 I think people have this idea that it works, that, you know, that de-platforming works, but you can't de-platform an idea, you know?
00:46:20.000 You may be able to do it to a person or two, but eventually you have to confront the idea.
00:46:25.000 You can do it to a few people, and it has been successful, which is one of the reasons why people are so emboldened.
00:46:29.000 Like, they have successfully deplatformed Milo.
00:46:32.000 I mean, they really have.
00:46:33.000 It's very hard to hear him talk anymore.
00:46:37.000 He's not in the public conversation the way he used to be, because they kicked him off of all these different platforms.
00:46:43.000 And if you go into why they kicked him off these different platforms, even if you don't agree with him, and I don't on a lot of things, like, boy, I don't agree with kicking him off those platforms.
00:46:52.000 If you listen to what he got kicked off for, it's like, man, I don't know.
00:46:56.000 This doesn't seem like this makes a lot of sense.
00:47:00.000 Yeah, no, I mean, same thing with Alex Jones.
00:47:02.000 Alex Jones has said, you know, he's gone after me a couple of times in ways that were pretty funny, actually.
00:47:09.000 But when he was, you know, kicked off all these platforms, you know, I wrote a piece saying I think people are kind of doing an end zone dance a little early on this one, you know, because Jones is a classic example of how the system, the way the system used to work,
they would have punished him for being, you know...
00:47:52.000 Right, because the goalposts keep getting moved.
00:47:55.000 Right.
00:47:55.000 If you can ban him for that, then why don't you ban me for repeating the things that I said about Megan Murphy?
00:48:02.000 Right.
00:48:02.000 Or ban, because what I said about Bruce Jenner, ban this for that.
00:48:06.000 I mean, you get further and further down the line, you keep moving these goalposts, and next thing you know, you're in a very rigid situation.
00:48:12.000 Tightly controlled area where you can communicate, and you're suppressed.
00:48:17.000 And that just accelerates your desire to step out of that boundary.
00:48:22.000 And it makes you want to say things that maybe you wouldn't even have thought of before.
00:48:26.000 And also, logistically, it's an insane thing to even think about asking platforms To rationally go through all this content.
00:48:36.000 I talked to somebody who was a pretty high-ranking Facebook executive after the Alex Jones thing.
00:48:40.000 And he said, think about what we used to do just to keep porn off Facebook.
00:48:46.000 And we're dealing with, what, a couple of billion items of content every single day.
00:48:50.000 We had these really high-tech algorithms that we designed to look for flesh tones.
00:48:54.000 And that's how the Vietnamese running girl photo got taken off Facebook, because they automatically spotted a naked...
00:49:12.000 Oh, wow.
00:49:19.000 If it's that hard and that expensive for us to go through and just to keep child porn off of Facebook, think about how crazy it's going to be when we start having entry-level people deciding what is and is not appropriate political content.
00:49:35.000 It's not only going to be...
00:49:44.000 Well, that's why Twitter is so weird, because you can get away with shit on Facebook.
00:49:50.000 You can say things on Facebook, like Facebook doesn't have a policy about deadnaming, or Facebook doesn't have a policy about misgendering people, but they do have a porn policy.
00:50:01.000 Well now, Twitter, you can have porn!
00:50:03.000 Right.
00:50:04.000 I have to be very careful when I give my phone to my kids to make sure they don't open up the fucking Twitter app because I follow a lot of dirty girls and some of them, I mean, it's just right there.
00:50:15.000 There's no warning.
00:50:16.000 Bang!
00:50:16.000 Right in your face.
00:50:17.000 I mean, it's kind of crazy.
00:50:19.000 Right.
00:50:19.000 They have such an open policy when it comes to sex, which I'm happy they do.
00:50:24.000 I'm happy, not even that I want to see porn, but I'm happy that their attitude is just fine.
00:50:30.000 It's legal.
00:50:31.000 Do it.
00:50:32.000 You don't have to follow those people if you don't like...
00:50:34.000 It seems like it's in the American spirit to me.
00:50:37.000 That's what it all comes down to for me.
00:50:39.000 But yeah, no, the policies are completely inconsistent too with Twitter.
00:50:46.000 I've talked to people who've been removed from Twitter for saying pretty borderline things. Yeah.
00:51:08.000 I think a Clinton fan.
00:51:09.000 I forget what it was exactly.
00:51:10.000 But you'll see behavior that's much worse from people who have another political ilk and they will not be removed.
00:51:19.000 Or they might be a smaller profile person, they won't be removed.
00:51:22.000 So then what is that all about, right?
00:51:24.000 Like if it's only a person who has 20,000 followers or higher, we're going to, I mean, it's just so, you just can't do it.
00:51:30.000 There's just too many layers.
00:51:32.000 I mean, I'm against it just generally, but just in terms of the logistics, it doesn't make any sense.
00:51:37.000 I'm against it generally too.
00:51:38.000 And when I talked to Jack and he was explaining to me the problems with trying to manage things at scale, you really kind of get a sense of it.
00:51:46.000 Like, oh, you guys are dealing with billions and billions of humans using these things.
00:51:51.000 Right.
00:51:51.000 Yeah.
00:51:52.000 Yeah, but they're already, you know, in many countries around the world, they have armies of thousands of people who go through content to try to flag this or that kind of political content.
00:52:05.000 Yeah, and punish people.
00:52:09.000 I forget what the term was.
00:52:10.000 They had some really scary sort of authoritarian word for filtration centers or something like that.
00:52:16.000 The Chinese have armies of people.
00:52:20.000 I did a story about Facebook and how it was teaming up with groups like the Atlantic Council here in the United States.
00:52:28.000 Remember a couple of years ago, the Senate called in Twitter, Facebook, and Google to Washington and asked them to devise strategies for preventing the sowing of discord.
00:52:41.000 Basically, it was asking them to come up with strategies for filtering out fake news and then also certain kinds of offensive content.
00:52:51.000 But, you know, that is a stepping stone to what we've seen in other countries, I think.
00:52:56.000 And I think it's really worrisome, but nobody seems to care on our side of the aisle, which is very strange.
00:53:03.000 My side of the aisle.
00:53:03.000 Well, it's my side of the aisle as well.
00:53:05.000 It's a censorship issue, you know, and it's...
00:53:08.000 It's a short-sighted thing, as you said before.
00:53:12.000 And it's not even.
00:53:14.000 There's people that do pretty egregious things from the left, like the Covington School thing, when people were saying, we've got to dox these kids and give me their names, release their names.
00:53:24.000 These people are still on Twitter to this day.
00:53:27.000 We're talking about kids that just happen to have these Make America Great Again hats.
00:53:31.000 And I have a friend who used to live in that area who said, like, no, you don't get it.
00:53:34.000 Like, there's these stands.
00:53:35.000 These kids are on a high school, like, field trip.
00:53:38.000 There's these stands where you can buy these hats everywhere.
00:53:40.000 These kids bought the hats there.
00:53:42.000 They think they're being funny.
00:53:43.000 These guys play the music and then get in their face.
00:53:46.000 You take a photo of it, it looks like this guy's standing in this Native American guy's face.
00:53:50.000 But then you see the whole video.
00:53:52.000 It's, no, no, no.
00:53:53.000 The Native American guy was playing his drum, walking towards him.
00:53:58.000 And then everybody starts piling in.
00:53:59.000 Yeah, everybody just loses their minds, you know what I mean?
00:54:02.000 It's outrage cycle.
00:54:03.000 It's just so exhausting now, you know?
00:54:05.000 And signaling.
00:54:06.000 Everyone's signaling how virtuous they are.
00:54:08.000 Everyone's signaling that they're on the right side.
00:54:10.000 Everyone's signaling, you know, I want names.
00:54:13.000 Take these guys down.
00:54:14.000 Like, you're talking about 16-year-old kids.
00:54:16.000 Right.
00:54:16.000 It's so fucking crazy.
00:54:18.000 And what is he, he's guilty of smiling?
00:54:21.000 Right.
00:54:21.000 Is that what he's guilty of?
00:54:22.000 Yeah, no, he's got a MAGA hat on.
00:54:24.000 I mean, yeah, it's crazy.
00:54:26.000 And the signaling thing is crazy.
00:54:28.000 And, you know, for me, in the news business, a lot of people that I know went into journalism precisely because we didn't want to talk about our political views.
00:54:39.000 Like, the whole point of the job is like...
00:55:04.000 It's exactly what you're talking about.
00:55:05.000 People used to go to the news because they wanted to find out what happened in the world, and they can't do it anymore because everything that you turn on, every kind of content, is just editorialized content where people are sort of telling you where they stand on things.
00:55:19.000 You know, I don't want to know that.
00:55:20.000 I want to know what the information is.
00:55:22.000 Yeah, it's so hard.
00:55:23.000 How does this get resolved?
00:55:24.000 Because we're dealing with essentially a two-decade-old problem, right?
00:55:28.000 I mean, give or take.
00:55:29.000 Before that, before the social media and before the internet and websites, this just wasn't what it was.
00:55:38.000 You could count on the New York Times to give you an unbiased version of what's going on in the world.
00:55:43.000 I don't necessarily know that's true anymore.
00:55:45.000 No.
00:55:46.000 No, the Times has kind of gone over to this model as well.
00:55:49.000 They're super woke.
00:55:50.000 They've struggled with it.
00:55:52.000 There was an editorial, and I wrote about this in the book, that in the summer of 2016, this guy, Jim Rutenberg, wrote this piece, said, Trump is testing the norms of objectivity.
00:56:02.000 That was the name of the piece.
00:56:04.000 And basically what he said is Trump is so bad that we have to rethink what objectivity means.
00:56:10.000 We have to not only be true, but true to history's judgment, he said.
00:56:14.000 And we have to have copious coverage and aggressive coverage.
00:56:18.000 So we're going to cover Trump a lot.
00:56:20.000 We're going to cover him aggressively.
00:56:21.000 And we're going to show you, we're going to take a stand on this issue rather than just tell you what happened.
00:56:27.000 So rather than doing the traditional New York Times thing of just the facts – we tell you, you sort it out –
00:56:34.000 We're going to tell you what your stance should be.
00:56:40.000 Where do we go from here?
00:56:42.000 How does it get resolved?
00:56:43.000 I don't know, because unless the financial incentives change, they're not going to change.
00:56:51.000 You know, the business used to be, back when you were talking about it, the New York Times, and then there were three networks, and they were all trying to get the whole audience, right?
00:56:59.000 So they were doing that kind of neutral fact-finding mission, and it was working for them financially.
00:57:05.000 Now they can't do that because of the internet.
00:57:07.000 You're hunting for audience in little groups, and they're just giving you hyper-politicized stuff because that's the only way they can make money.
00:57:14.000 I don't know how we change it.
00:57:15.000 I don't know how we reverse it.
00:57:17.000 It's a problem.
00:57:19.000 It's so interesting, though, because, I mean, if you looked at human interactions, and if you looked at, you know, dispensing news and information, and you followed trends from, like, the 30s to the 40s to the 50s to the 60s to the 70s,
00:57:37.000 you'd be like, oh, well, people are getting better at this.
00:57:40.000 Whoa, whoa, whoa!
00:57:42.000 What the fuck is going on now?
00:57:44.000 Everything's off the rails.
00:57:46.000 There's two camps barking at each other.
00:57:48.000 There's blatant misinformation on both sides.
00:57:51.000 Blatant distortions of the truth.
00:57:52.000 Blatant editorializing of facts.
00:57:55.000 And you're like, hey, what happened, guys?
00:57:57.000 Yeah, no, it's crazy.
00:57:58.000 And not that the news didn't have distortions before.
00:58:04.000 Like, you think about, you know, we covered up all sorts of things.
00:58:09.000 You know, massacres in Cambodia, secret bombing, you know, use of Agent Orange.
00:58:14.000 Stuff like that just didn't appear in the news in a degree it should.
00:58:18.000 Now, though, you turn on either NBC or Fox...
00:58:23.000 And you're right.
00:58:24.000 You'll find something that's just totally full of shit within five minutes, usually.
00:58:28.000 And that did not used to be the case.
00:58:32.000 I think individual reporters used to take a lot of pride in their work.
00:58:37.000 And it's different now.
00:58:39.000 And now when you make mistakes in the business, you don't...
00:58:41.000 You don't get bounced out of the business in the way you used to, and that's really strange.
00:58:47.000 Only plagiarism, right?
00:58:48.000 Plagiarism still bounces you, doesn't it?
00:58:50.000 Plagiarism is pretty – yeah, that's usually fatal, right?
00:58:54.000 You're not going to usually recover from that.
00:58:55.000 I mean, some people have had kind of near misses with that, and they – I'm not going to name names, but no.
00:59:04.000 But you think about people who got stories like the WMD thing wrong.
00:59:08.000 Right.
00:59:09.000 Not only do they not get bounced out of the business, they all got promoted.
00:59:12.000 They're editors of major magazines now.
00:59:15.000 And so what does that tell people in the business?
00:59:18.000 Well, it tells you if you screw up, as long as you screw up with a whole bunch of other people, it's okay, which is not good.
00:59:24.000 And we used to have a lot of pride about that stuff in this business, and now we don't anymore.
00:59:30.000 You know, there isn't the shame connected with screwing something up that there used to be.
00:59:36.000 I think there's a real danger in terms of social media especially in not complying to the Constitution, not complying to the First Amendment.
00:59:45.000 I think there's a real danger in that.
00:59:47.000 And I don't think we recognize that danger because I don't think we saw what social media was until it was too late.
00:59:53.000 And then by the time it was too late, we had already had these sort of standards in place and the people that run it were already getting away with enforcing their own personal bias, their ideological bias.
01:00:06.000 And this is when you're at this position where you go, well, how does that ever get resolved?
01:00:12.000 They're not going to resolve it on their own.
01:00:13.000 They're still making ass loads of money.
01:00:15.000 What do you do?
01:00:16.000 Does the government resolve it?
01:00:17.000 Well, if Trump steps in and resolves it, it looks like he's trying to resolve it to save his own political career or to help his supporters.
01:00:26.000 Yeah, no, and no matter what, if Trump does anything about it, automatically everyone's going to be against it.
01:00:32.000 Right.
01:00:33.000 Even if there's some sense in there somewhere, people won't get behind it.
01:00:40.000 And if they do anything about it, there's going to be a correction time.
01:00:43.000 There's going to be a Gab time where it's going to be like that, where it's just going to flood with people that are just, like, with this newfound freedom that's just going to go...
01:00:53.000 Just shoot up the town, you know?
01:00:55.000 But how would you fix it now?
01:00:58.000 That's the thing, because it's not only about rules, it's also about culture.
01:01:01.000 People have already, they're in this pattern of, you know, not saying the wrong thing, and they don't, I think there's, we're in a culture that doesn't even really know how to deal with free speech if we actually had it in the same way we used to, you know?
01:01:16.000 No one seems to have a forecast.
01:01:18.000 No one's like, well, the storm is going to last about four years.
01:01:21.000 There's no forecast.
01:01:24.000 No.
01:01:24.000 Everyone's like, well, it's fucking uncharted waters.
01:01:28.000 Right, right.
01:01:29.000 But historically, the tendency is once you have a tool that kind of can be used to keep people in line and enforce compliance of ideas, then it always ends up worsening and becoming more and more dictatorial and authoritarian.
01:01:47.000 Yes.
01:01:47.000 Again, you go back to the Soviet example.
01:01:50.000 Once they started really exercising a lot of control over the press and literature and things like that, it didn't get better.
01:01:57.000 It just continued becoming more of an entrenched thing.
01:02:02.000 So that's what I worry about.
01:02:03.000 I think we're headed more in that direction.
01:02:06.000 Yeah, I think so too.
01:02:07.000 I'm just really concerned on both sides.
01:02:10.000 When people dig their heels in ideologically, the other side just gets even more convinced they're correct.
01:02:15.000 Oh, yeah.
01:02:17.000 Yeah, and there's no cross-dialogue of any kind anymore.
01:02:23.000 And even now, I mean, it's interesting.
01:02:27.000 You had Bernie Sanders on your show, and Sanders is one of the few politicians left who has this idea that we should talk to everybody.
01:02:37.000 Like, there are no illegitimate audiences out there.
01:02:40.000 And, like, you know, that's my job as a politician, is to try to convince you of things. But that's not normal in the Democratic Party anymore.
01:02:47.000 I mean, Elizabeth Warren has made a big thing about not going on Fox and about having certain people taken off Twitter.
01:02:56.000 And I think that's increasingly the sort of line of thought...
01:03:03.000 In mainstream Democratic Party thought now is that we're just going to rule out whatever that is, 47% of the electorate, we're just not going to talk to them anymore.
01:03:11.000 Right, right.
01:03:12.000 I don't know how that can possibly be a successful political strategy.
01:03:17.000 And what the point is, you know?
01:03:20.000 Yeah.
01:03:21.000 No, it doesn't make any sense.
01:03:22.000 I was reading something where people were going after Tulsi Gabbard for being on Tucker Carlson.
01:03:28.000 She's like, I'll talk to everybody.
01:03:30.000 And I'm glad she does.
01:03:31.000 And by the way, it's hard for her because she's kind of an outside candidate.
01:03:35.000 It's hard for her to get time on these other networks.
01:03:38.000 And so they want to punish her for being on Tucker Carlson's and then they have this, you know, reductionist view of who he is.
01:03:46.000 He's a white supremacist.
01:03:47.000 Like, oh, well, she supports white supremacists.
01:03:49.000 She goes on a white supremacist show.
01:03:51.000 Okay, is that what he is?
01:03:52.000 Is that really what he is?
01:03:54.000 There's a lot more than that.
01:03:56.000 There's a lot going on there.
01:03:57.000 You guys are fucking with life.
01:03:59.000 You know, you're fucking with the reality of life and you're saying it in these sentences.
01:04:05.000 You're printing it out in these paragraphs as fact and you're sending it out there irresponsibly.
01:04:10.000 And it's just really strange that people don't understand the repercussions of that.
01:04:14.000 Yeah, this is something we talk about on our podcast, Useful Idiots, all the time, is that it's a catch-22, right?
01:04:21.000 Like, you don't invite somebody like Tulsi Gabbard on to CNN, MSNBC, or they're kind of excluded from the same platforms the other politicians get.
01:04:30.000 So they go to other platforms, right?
01:04:33.000 And then you say, oh, you went on that platform, so you're illegitimate.
01:04:36.000 Yes.
01:04:36.000 You know, what do you want them to do?
01:04:37.000 Like, you know, they do the same thing with people who go on RT, for instance, right?
01:04:41.000 Right.
01:04:41.000 Oh, well, you're helping the Russians because you went on RT. Well, that's because you didn't invite them on any...
01:04:46.000 I mean, people are going to try to talk to anybody they can to spread their ideas.
01:04:52.000 And that kind of propaganda thing is pretty constant now.
01:04:56.000 In the use of the term, terms like white supremacists with Tucker Carlson...
01:05:01.000 I mean, there are a million terms now that you use to just kind of throw at people.
01:05:05.000 And what they're trying to do is create this ick factor around people, right?
01:05:08.000 Like, once someone gets a label associated with them, then nobody wants to be associated with that person.
01:05:16.000 Right?
01:05:16.000 Right.
01:05:16.000 And then they quickly kind of die out of the public scene.
01:05:19.000 And I think that's really bad, too.
01:05:24.000 It's just an anti-intellectual way of dealing with things, and I think it's not good.
01:05:29.000 It's weird that it's so prevalent.
01:05:31.000 It's weird that there's so few proponents of a more open-minded way of thinking.
01:05:38.000 Right, yeah.
01:05:39.000 And just on the Gabbard thing, we had Tulsi Gabbard on our show, too, and immediately we got accused, what, do you love Assad?
01:05:47.000 Right?
01:05:48.000 Do you want to bomb Syrians?
01:05:50.000 Do you want to murder Syrians?
01:05:51.000 No, you know, she's a presidential candidate, and we want to talk to her and hear what she has to say.
01:05:56.000 But they immediately go to the maximalist interpretation of everything.
01:06:00.000 And then what they're basically saying when they ask you those questions are, do you want to wear that label too?
01:06:06.000 Because she's got it already.
01:06:08.000 So if you have her on again, you're going to have that label.
01:06:11.000 And people, they see that.
01:06:13.000 And so people who don't have a big following and who are worried...
01:06:19.000 About their careers and about money and advertisers and stuff like that.
01:06:24.000 They think twice about interviewing that person the next time.
01:06:27.000 And that's another way to get at speech.
01:06:30.000 Exactly.
01:06:30.000 And again, I don't know how you get out of it.
01:06:35.000 I mean, I've experienced some blowback, I guess, but it hasn't worked yet.
01:06:43.000 You know what I mean?
01:06:44.000 It's not real.
01:06:45.000 It's just words.
01:06:47.000 Like, okay, okay.
01:06:49.000 But you're handling it the right way, man.
01:06:51.000 I think your audience is rewarding you for not bowing to it.
01:06:59.000 And I think that more people, if they took that example and said, I'm not going to listen to what the...
01:07:04.000 The PAC says about this.
01:07:06.000 I'm not going to be afraid of being called a name.
01:07:08.000 Fuck that.
01:07:09.000 I'm going to talk to who I want to talk to and I'm going to explore whatever ideas I want to explore.
01:07:16.000 Then this kind of stuff wouldn't be as effective.
01:07:20.000 But it's so easy to do to people and it's so easy for them to deplatform people.
01:07:24.000 It's so easy.
01:07:25.000 And shadow banning and all this other weird shit that's going on.
01:07:28.000 Yeah.
01:07:29.000 They're channeling people and pushing people into these areas of their platforms that makes them less accessible.
01:07:39.000 And I know where it comes from.
01:07:40.000 I mean, I was young and politically active once.
01:07:44.000 You know, you want to change the world.
01:07:46.000 You want to make it a better place.
01:07:47.000 So you're in college.
01:07:49.000 You don't have any power.
01:07:50.000 You don't have any way to make something into legislation.
01:07:56.000 So what do you do?
01:07:59.000 Social media gives you the illusion that you're having an impact in the world by Maybe getting somebody deplatformed or taken off Twitter or something like that.
01:08:08.000 It feels like it's political action to people, but it's not.
01:08:12.000 It's something that is open to people to do, but it's not the same as getting 60 members of the Senate to raise taxes on a corporation that's been evading them for 20 years.
01:08:26.000 You know what I mean?
01:08:27.000 That's real action.
01:08:30.000 This, you know, getting some random person taken off the internet is just not change, you know, but people feel like it is and they want to do the right thing.
01:08:38.000 So I get it, but no, it's not, you know, real political action, I don't think.
01:08:44.000 No, it's fucking gross.
01:08:49.000 Yeah.
01:08:50.000 And there's so much of it, and there's so little logic.
01:08:55.000 Also, and this must be a personal thing for you, but isn't this the unfunniest time in American history?
01:09:03.000 Yes and no, because you're rewarded for stepping outside the box.
01:09:09.000 That's true.
01:09:09.000 In a big way.
01:09:10.000 Like, yeah, you mean Dave Chappelle gets attacked, but guess what?
01:09:14.000 He also gets rewarded in a huge way.
01:09:17.000 When he goes on stage now, people go ape shit.
01:09:20.000 That's true.
01:09:21.000 And part of the reason why they go fucking bonkers is because they know that this guy doesn't give a fuck.
01:09:26.000 And he's one of the rare ones who doesn't give a fuck.
01:09:28.000 So when he goes up there, you know if he thinks something crazy about whatever it is, whatever protected group or whatever idea that he's not supposed to explore, that's not going to stop him at all.
01:09:40.000 He's going to tell you exactly what he thinks about those things, regardless of all this woke blowback.
01:09:45.000 He doesn't care.
01:09:46.000 And so because of that, he's rewarded even more.
01:09:48.000 And same thing with Bill Burr.
01:09:50.000 Same thing with a lot of comics.
01:09:51.000 I experience it with my own jokes.
01:09:53.000 More controversial bits get people more fired up now.
01:09:56.000 They love it.
01:09:57.000 Because everyone's smothered.
01:10:00.000 You're smothered by human resources and smothered by office politics and you're smothered by social discourse...
01:10:20.000 I think that's true.
01:10:27.000 Yeah, I just, I feel like, I mean, I'm not a comic, but I just imagine it must be a more challenging environment.
01:10:35.000 It's more challenging, but more rewarding, too.
01:10:37.000 My friend Ari said it best.
01:10:38.000 He said, this is a great time for comedy, because comedy is dangerous again.
01:10:42.000 Right, that's true.
01:10:43.000 That's true.
01:10:44.000 It kind of goes back to the Lenny Bruce era, when you could completely freak people out with saying a couple of things.
01:10:52.000 Sure.
01:10:53.000 For good or bad.
01:10:54.000 Richard Pryor, yeah.
01:10:56.000 You saw it with Louis C.K., right?
01:10:58.000 Louis C.K. is under the microscope now.
01:11:01.000 That joke that he made about Parkland is absolutely a Louis C.K. joke.
01:11:06.000 If you've followed him throughout his career...
01:11:09.000 What was the joke again?
01:11:09.000 I'm sorry.
01:11:10.000 The joke was, why am I listening to these Parkland survivors?
01:11:13.000 Why are you interesting?
01:11:14.000 Because you push some fat kid in the way?
01:11:17.000 See, you're laughing.
01:11:18.000 Right.
01:11:20.000 That is a Louis C.K. joke.
01:11:23.000 He's saying something fucked up that you're not supposed to say.
01:11:26.000 Throughout his goddamn career, he's done that.
01:11:28.000 That's what he's always done.
01:11:29.000 But after the jerking off in front of women and all that stuff and him coming out and admitting it and then taking a bunch of time off, now he's a target.
01:11:38.000 So now he does something like that and they're like, oh, he's alt-right now.
01:11:41.000 Like, no, this is what he's always done.
01:11:43.000 He's always taking this...
01:11:45.000 Sort of contrarian, outside the box, fucked up, but hilarious take on things.
01:11:51.000 And that bit, unfortunately, because it was released by someone who made a YouTube video of it, he didn't get a chance to...
01:11:57.000 He was gone for 10 months, and he had only done a couple sets when he was fleshing these ideas out.
01:12:02.000 I guarantee you he would have turned that idea into a brilliant bit, but he never got the chance.
01:12:06.000 Because it was set out there in the wild when it was a baby and it was mauled down by wolves.
01:12:11.000 It needed to grow.
01:12:14.000 These bits, they grow and they develop.
01:12:18.000 And that was a controversial idea that we're supposed to think that someone's interesting just because they survived a tragedy.
01:12:24.000 And his take is like, no, no, no, no, you're not interesting.
01:12:26.000 You're fucking boring.
01:12:28.000 You're annoying.
01:12:28.000 Get off my TV. And a lot of us have felt that way.
01:12:32.000 Sure.
01:12:33.000 He just, the way he said it was easy to take and put in, you know, out of context, put it in quotes and turn him into an asshole.
01:12:42.000 Yeah, but that's what comedy is, right?
01:12:44.000 It's taking with people the thoughts that everybody has and vocalizing that thing, that forbidden thing, in a way that people can kind of come together over, right?
01:12:55.000 I mean, I think that was a lot of what Richard Pryor's humor was about.
01:12:58.000 He took a lot of the sort of uncomfortable race problems, right?
01:13:04.000 Yeah.
01:13:05.000 And he just kind of put them out there, and both white people and black people laughed at it, right?
01:13:09.000 Like, together, you know?
01:13:11.000 And that was what was good about it.
01:13:13.000 But if you can't, if people are afraid to vocalize those things, if they think it's gonna, you know, ruin their career, I mean, I guess, you know, that makes it more interesting, right?
01:13:23.000 It does.
01:13:23.000 It's more high stakes.
01:13:24.000 But if you can navigate those waters and get to the promised land of the punchline, it's even more rewarding.
01:13:30.000 But you just have to explain yourself better.
01:13:33.000 You have to have better points.
01:13:34.000 You have to have a better structure to your material.
01:13:39.000 While the people who may find your idea objectionable, you coax them.
01:13:47.000 Like, hold my hand.
01:13:48.000 I'm going to take you through the woods.
01:13:50.000 We're going to be okay.
01:13:52.000 Follow me.
01:13:53.000 And boom!
01:13:54.000 Isn't that funny?
01:13:55.000 Right.
01:13:55.000 Right, right, right.
01:13:55.000 But you have to navigate it skillfully, and you have to navigate it thoughtfully, and you have to really have a point.
01:14:02.000 You can't have a half-assed point.
01:14:04.000 But you can't have a situation where it's fatal to be off by a little bit.
01:14:09.000 You know, like, there was a writer that I loved growing up, a Soviet writer named Isaac Babel.
01:14:15.000 Stalin ended up shooting him.
01:14:17.000 But he gave a speech about, I think it was in 1936, you know, to...
01:14:22.000 To a Soviet writers' collective.
01:14:24.000 And he said, you know, people say that we don't have as much freedom as we used to, but actually, all that, you know, the Communist Party has done is prevented us from writing badly.
01:14:35.000 The only thing that's outlawed now is writing badly, right?
01:14:38.000 And everybody laughed, but he was actually saying something pretty serious, which is that you can't write well unless you can, you know, screw up, too.
01:14:45.000 You know what I mean?
01:14:46.000 Like, on the way to being creative in a good way, You have to miss.
01:14:50.000 And if missing is not allowed, and there's high punishment for missing, you're not going to get art.
01:14:57.000 You're not going to get revelation.
01:14:58.000 You're not going to get all these things.
01:15:01.000 In comedy, it's particularly important because you have to work it out in front of people.
01:15:05.000 Absolutely.
01:15:05.000 I used to sit at a comedy club in Manhattan when I was a...
01:15:31.000 Yeah, I don't know.
01:15:32.000 But there's also people that are wolves, and they're trying to take out that little baby joke wandering through the woods.
01:15:39.000 They want that feeling of being able to take someone down.
01:15:43.000 Right.
01:15:44.000 And that's, you know, you're getting that now, too.
01:15:48.000 And so now because of that, there's like Yondr bags.
01:15:50.000 Like the Improv where I'm performing tonight, they use Yondr bags.
01:15:53.000 You have to put your cell phone in a bag when you go in there so you can't...
01:15:55.000 Record things.
01:15:56.000 Yondr bags.
01:15:57.000 Yes, it's a company called Yondr.
01:15:59.000 It's just so strange.
01:16:01.000 All the shows I did with Chappelle, he uses Yondr bags.
01:16:05.000 And the idea is to prevent people from...
01:16:07.000 From filming and recording and then eventually putting your stuff out there.
01:16:12.000 Well, you know, look, I'm kind of all for that.
01:16:15.000 I mean, I've seen this with politicians on the campaign trail.
01:16:19.000 They are so tight now in ways that they used to not be.
01:16:23.000 Well, you saw the Donald Trump thing.
01:16:24.000 Donald Trump Jr., where Trump Jr., they wanted him to do a Q&A, and he didn't want to do it, so they booed him.
01:16:32.000 The right-wing people were booing him.
01:16:34.000 They were yelling out, Q&A, Q&A, because they wanted to be able to talk.
01:16:38.000 Oh, I see.
01:16:39.000 They want to be able to say something to him.
01:16:40.000 And these are people that were like far right, far right people.
01:16:44.000 They just didn't think he was being right enough or he was playing the game wrong or he wasn't letting them complain to him.
01:16:50.000 Right, right.
01:16:50.000 Yeah, yeah.
01:16:51.000 No, that's bad.
01:16:52.000 And politicians are aware of that now and they're constantly aware that they're on film everywhere.
01:16:59.000 Right.
01:16:59.000 And so they're, you know, a thousand percent less interesting because they're, I mean, I remember covering the campaign in 2004 and I saw Dennis Kucinich give a speech somewhere and he was going from, I think,
01:17:15.000 Maine to New Hampshire.
01:17:15.000 And I said, well, can I get a ride back to New Hampshire?
01:17:18.000 He's like, yeah, sure.
01:17:18.000 So he, you know, takes me on the van.
01:17:20.000 He like takes his shoes off.
01:17:22.000 He's like cracking jokes and everything and like eating udon noodles or something.
01:17:26.000 Political candidates would not do that now.
01:17:28.000 They'd be afraid to be off the record with you.
01:17:31.000 Right, right, right.
01:17:32.000 And they're afraid to be around people and just behave like people, which is not good, I don't think.
01:17:39.000 It's the weirdest time ever to be a politician because it's basically you've got this one guy who made it through being hugely flawed and just going, ah, fucking locker room talk.
01:17:52.000 And everyone's like, well, yeah, it is locker room talk, I guess.
01:17:55.000 And then it works.
01:17:56.000 And he gets through and he wins.
01:17:57.000 And so you've got him who seems like he's so greasy, like nothing sticks to him.
01:18:03.000 And then you have everyone else who's terrified of any slight misstep.
01:18:08.000 Yeah, totally.
01:18:09.000 And you can't replicate the way Trump does this.
01:18:13.000 You know, Trump is – he was born this way.
01:18:15.000 There's like a thing going on in his head.
01:18:16.000 Like he is – you know, pathologically driven to behave in a certain way, and he's not going to be cowed the way, you know, people are by social media, because he just doesn't think that way.
01:18:27.000 No.
01:18:27.000 You know, he's – but that's – no one else is going to behave like that.
01:18:30.000 What do you think about him and speed?
01:18:33.000 What do you think about all that?
01:18:34.000 Does he take speed, you mean?
01:18:35.000 Yeah.
01:18:36.000 So did you ever see his speech after Super Tuesday?
01:18:41.000 Mm-hmm.
01:18:42.000 Yeah, that's the one where he was slurry?
01:18:44.000 That was the one where he was ramped up?
01:18:46.000 He was very...
01:18:48.000 I just say, watch that speech.
01:18:50.000 We're not supposed to draw conclusions about what might be going on pharmaceutically with somebody, but I would say just watch Donald Trump's performance after the results of the Super Tuesday rolled in in 2016. Let's hear some of that.
01:19:06.000 First of all, the Chris Christie is hilarious.
01:19:07.000 ...watch Hillary's speech and she's talking about wages have been poor and everything's poor and everything's doing badly but we're going to make it.
01:19:14.000 She's been there for so long.
01:19:15.000 I mean, if she hasn't straightened it out by now, she's not going to straighten it out in the next four years.
01:19:21.000 It's just going to become worse and worse.
01:19:23.000 She wants to make America whole again, and I'm trying to figure out what is that all about.
01:19:27.000 Is this it?
01:19:27.000 Yeah, I mean, it's just...
01:19:28.000 I have to go back and look, but yeah, but he went on and on.
01:19:32.000 Also, the Christie factor was really funny with that because he was...
01:19:35.000 Look at him.
01:19:36.000 He's just sitting back there going, what am I doing?
01:19:38.000 What am I doing with my life?
01:19:40.000 Look at his face.
01:19:42.000 Literally, you could see his brain wandering.
01:19:44.000 Well, how the fuck did this happen?
01:19:45.000 I was going to be the man.
01:19:47.000 Like, I was the goddamn president.
01:19:49.000 It was going to happen for me.
01:19:51.000 I could see it happening.
01:19:53.000 I saw him in Ames, Iowa, basically standing alone in a park waiting for people to try to shake his hand.
01:20:00.000 Yeah, it was pretty bad, like you see that.
01:20:02.000 But yeah, do you have a theory about Trump and speed?
01:20:05.000 Yeah.
01:20:05.000 Yeah, I think he's on some stuff.
01:21:07.000 I think, first of all, I know so many journalists that are on speed.
01:20:11.000 I know so many people that are on Adderall.
01:20:14.000 And it's very effective.
01:20:15.000 It gives you confidence.
01:20:17.000 It gives you a delusional perspective.
01:20:19.000 You get a delusional state of confidence.
01:20:21.000 It makes people think they can do anything.
01:20:23.000 It's basically a low-level meth.
01:20:25.000 It's very similar to methamphetamine chemically.
01:20:28.000 Sure.
01:20:29.000 I've done it.
01:20:30.000 Tell me what it's like, because I haven't done it.
01:20:33.000 Yeah, I mean, I've done speed, too.
01:20:35.000 I mean, you know, all those drugs are, yeah, they're like baby speed, basically.
01:20:40.000 And you're absolutely right.
01:20:42.000 I think people who – it's not good for a writer because writing is one of these things where one of the most important things is being able to step back and ask, am I full of shit here?
01:20:54.000 Are my jokes as funny as I think they are?
01:20:56.000 Once that mechanism starts to go wrong, you're really lost as a writer, right?
01:21:03.000 Because you're not in front of an audience.
01:21:05.000 You're with yourself in front of a computer.
01:21:07.000 So I don't think speed is a great drug.
01:21:10.000 I mean, you get a lot of stuff done.
01:21:13.000 So that's good.
01:21:15.000 But yeah, no, I think there's a lot of people who are on it now.
01:21:19.000 And also a lot of this because Kids come up through school, and they're on it, too.
01:21:24.000 And they get used to it.
01:21:27.000 I have kids.
01:21:28.000 I wouldn't dream of giving them any of those drugs.
01:21:30.000 I think it's crazy.
01:21:32.000 I do, too.
01:21:33.000 I'm sure you saw the Sudafed picture, too, right?
01:21:35.000 No.
01:21:36.000 What was that?
01:21:37.000 Trump was sitting in his office eating a...
01:21:39.000 It was that famous photo where he's like, I love Hispanics, where he's eating a taco bowl at Trump Tower, and behind him there's an open drawer, and in that open drawer is boxes of Sudafed.
01:21:50.000 And Sudafed gives you a low-level buzz.
01:21:58.000 This is why you used to have to go to CVS to buy this stuff.
01:22:03.000 You used to have to give your driver's license because they want to make sure you're not cooking meth.
01:22:08.000 You're not buying 10 boxes of it at a time and cooking up a batch.
01:22:12.000 Yeah, if you're in a holler in Kentucky and you go in and get 20 boxes of Sudafed, I think pretty much people know what you're doing there.
01:22:19.000 That's really funny.
01:22:20.000 So he had a bunch of Sudafed behind him?
01:22:22.000 Yeah, in his box.
01:22:24.000 And there was that one reporter that...
01:22:28.000 What was that guy's name again?
01:22:29.000 He wrote a series of tweets, which he eventually wound up taking down, by the way, Jamie.
01:22:35.000 I can't find those fucking tweets.
01:22:37.000 He wrote a series of tweets that there was a very specific Duane Reade pharmacy where Trump got amphetamines for something that was, in quotes, called metabolic disorder.
01:22:49.000 Kurt Eichenwald.
01:22:50.000 Fun fact.
01:22:50.000 Oh, Kurt, yeah.
01:22:51.000 1982, Trump started taking amphetamine derivatives, abused them, only supposed to take two for 25 days, stayed on them for eight years.
01:22:58.000 Really.
01:22:58.000 Now, is he full of shit?
01:23:00.000 So, yeah, Kurt Eichenwald is interesting because he's written some really good books about finance.
01:23:07.000 He wrote a book about Enron.
01:23:10.000 He wrote a book about Prudential.
01:23:13.000 It was really good.
01:23:15.000 And when I was starting out writing about Wall Street, I was like, wow, these books are really incredibly well-researched.
01:23:15.000 But he had some stuff in 2016 where, like, that's an example of something as a reporter.
01:23:29.000 I see that and I'm like, well, where's that coming from?
01:23:32.000 Because in journalism, you can't really accuse somebody of certain things unless it's backed up to the nth degree.
01:23:40.000 So he had a couple of things that I would be concerned about.
01:23:43.000 He took a leap.
01:23:45.000 I don't know.
01:23:45.000 I mean, look...
01:23:46.000 That's what I'm saying.
01:23:47.000 Stepped outside of the journalistic boundaries of what you can absolutely prove and not prove and took a leap.
01:23:54.000 And that's why I think he took down the Duane Reade pharmacy one.
01:23:56.000 He didn't take it down?
01:23:57.000 Oh, it's still there as well?
01:23:58.000 There was...
01:23:59.000 Oh, okay.
01:24:00.000 There it is.
01:24:01.000 There was another thing about a...
01:24:03.000 Oh, he's got the milligrams per day.
01:24:05.000 Wow.
01:24:06.000 Where is this from?
01:24:08.000 I don't know.
01:24:08.000 He doesn't show it or anything, but I believe he got a copy of it from someone, or he talked to the doctor.
01:24:13.000 Drug was diethylpropion, 75 mg a day, prescription filled at Duane Reade on 57th Street in Manhattan.
01:24:19.000 Not that I know things.
01:24:21.000 So, you know...
01:24:22.000 He's got the doctor's name, too.
01:24:24.000 Dr. Joseph Greenberg.
01:24:26.000 I countered with medical records.
01:24:29.000 The White House admitted to me he only took it a short time, for diet, but he took it when he was not overweight.
01:24:35.000 He says, I countered with medical records.
01:24:37.000 They cut me off.
01:24:39.000 Yeah, I mean, you know, one thing I will say is that when you're covering stories, sometimes you hear things and you know they're pretty solid, but it's not quite reportable because the person won't put their name on it, or, you know, you're not 100% sure that the document is a real document,
01:24:57.000 maybe it's a photocopy, and that can be very, very tough for reporters, because they know something's true, but they can't report it. And social media has eliminated a barrier that we used to have.
01:25:08.000 We used to have to go through editors and fact checkers.
01:25:11.000 And now, you know, you're on Twitter, you can just kind of, you know, or you can hint at something, you know, and I think that's something you don't want to get into as a reporter too much.
01:25:22.000 Yeah, that's a weird use of social media, right?
01:25:24.000 It's like sort of a slippery escape from journalistic rules.
01:25:29.000 Yeah, exactly.
01:25:31.000 Or you can insinuate that somebody did X, Y, and Z, or you can use terms that are a little bit sloppy.
01:25:40.000 But it seems like they did admit that he took that stuff for diet.
01:25:44.000 Yeah, so if you have the White House spokesperson saying that he took it for a short time for a diet, then you find that's a reportable story.
01:25:50.000 Right.
01:25:51.000 Yeah.
01:25:51.000 Yeah, well, I think when people get into that shit, it's very hard for them to get out of that shit.
01:25:56.000 That's the speed train, and I've seen many people hop on it.
01:26:01.000 It's got a lot of stops.
01:26:02.000 Nobody seems to get off.
01:26:04.000 Yeah, not with their teeth intact, right?
01:26:07.000 Yeah, no, that's not a good one.
01:26:09.000 Also, he's so old.
01:26:11.000 He's so old, he doesn't exercise, he eats fast food, and he's got so much fucking energy.
01:26:15.000 I know.
01:26:16.000 I mean, people want to think he's this super person, you know, but maybe he's on speed.
01:26:21.000 Maybe, yeah.
01:26:22.000 Maybe he's just going to collapse, turn over, and collapse one day.
01:26:25.000 Or not.
01:26:25.000 Maybe you can go a lot longer on speed than people think.
01:26:29.000 Maybe if you just do it the right way.
01:26:30.000 But isn't that kind of the way history always works?
01:26:33.000 It's like, again, not to go back to the Russian thing, but all the various terrible leaders of Russia, they all died of natural causes when they were 85, right?
01:26:41.000 Whereas in a country where people get murdered and die of industrial accidents and bad health when they're 30 all the time.
01:26:47.000 Right, right.
01:26:48.000 But the worst people in the country make it to very old age and die and they're alcoholics.
01:26:54.000 And maybe that's a thing, right?
01:26:56.000 Maybe he has the worst diet in the world and maybe he's on speed.
01:27:02.000 Maybe it's also...
01:27:03.000 Your perception of how you interface with the world.
01:27:06.000 Maybe because he's not this introspective guy that's really worried about how people see him and feel about him.
01:27:11.000 Maybe he doesn't feel, you know, whether it's sociopathy or whatever it is, he doesn't feel the bad feelings.
01:27:19.000 They don't get in there.
01:27:20.000 Yeah, and he doesn't have the stress impact, right?
01:27:23.000 And that's the thing about speed, apparently, because of the fact that it makes you feel delusional, and it makes you feel like you're the fucking man.
01:27:30.000 Like, you don't worry about what other people think.
01:27:32.000 These fucking losers, who cares?
01:27:33.000 Right, right, yeah, exactly.
01:27:34.000 Let's buy Greenland!
01:27:37.000 You know, that was, why not buy Greenland?
01:27:39.000 Why not buy Greenland?
01:27:40.000 Yeah, and then when that came out, I thought, well, what's wrong with that?
01:27:42.000 We bought Alaska.
01:27:43.000 Well, we leased Alaska, sort of.
01:27:45.000 Yeah, we were supposed to give it back, but we didn't.
01:27:47.000 It seems like Greenland would be a good place to scoop up, especially as things get warmer.
01:27:51.000 Right?
01:27:52.000 Yeah, exactly.
01:27:53.000 The fucking tweet that he made when he put the Trump Tower, I promise not to do this, with a giant Trump Tower in the middle of Greenland, I was laughing my ass off.
01:28:00.000 I'm like, love or hate, that is hilarious.
01:28:04.000 His trolling skills are very good.
01:28:07.000 They're fantastic.
01:28:08.000 Oh, he knows how to fuck with people.
01:28:09.000 When he starts calling people crazy or gives them a nickname, it's so good because it sticks.
01:28:16.000 It sticks.
01:28:17.000 I mean, part of me wants to see a Trump-Biden race next year just for that reason.
01:28:23.000 Just because the abuse will be unbelievable.
01:28:26.000 I mean, not that I'm encouraging that necessarily, but just as a spectacle, it's going to be unbelievable.
01:28:31.000 You can tell that he...
01:28:32.000 He's salivating at the idea of Biden.
01:28:35.000 Of course.
01:28:36.000 Biden, to me, is like having a flashlight with a dying battery and going for a long hike in the woods.
01:28:43.000 It is not going to work out.
01:28:46.000 It's not going to make it.
01:28:50.000 He's so faded.
01:28:52.000 He has these moments on the campaign trail where he'll be speaking, and these guys do the same speech over and over again, so they can kind of do it on cruise control.
01:29:02.000 But every now and then, he'll stop in the middle of it, and this look of terror comes over, like, where am I? What town am I in?
01:29:12.000 He got confused.
01:29:13.000 He thought he was in Vermont when he was in New Hampshire.
01:29:16.000 I'm sorry.
01:29:17.000 Yeah, he got those states confused.
01:29:20.000 He was like, what's not to love about Vermont?
01:29:21.000 He was in New Hampshire.
01:29:23.000 You know, that can happen, obviously, but it happens to him a lot.
01:29:26.000 Well, he's clearly old.
01:29:30.000 Yeah.
01:29:30.000 You know, I mean, he's not much older than Trump.
01:29:32.000 Right.
01:29:33.000 But he needs to get on the same pills.
01:29:35.000 Yeah, yeah.
01:29:36.000 Actually, that would be interesting.
01:29:37.000 We should get a GoFundMe to buy Speed.
01:29:39.000 Can you imagine?
01:29:39.000 If they just filled him up with steroids and just jacked him up with amphetamines and had him going after Trump.
01:29:47.000 Because I really think he needs something like that.
01:29:50.000 Whatever he's doing on the natch, it's not working.
01:29:52.000 Right.
01:29:52.000 Yeah, yeah.
01:29:53.000 He's too tired.
01:29:54.000 Needs a little bit of enhancement.
01:29:55.000 It's not going to work.
01:29:56.000 If he gets the nomination, the Democrats are fucked.
01:29:59.000 I don't see him withstanding the barrage that Trump's going to throw at him.
01:30:06.000 Trump's going to take him out like Tyson took out Marvis Frazier.
01:30:10.000 That was a bad fight.
01:30:11.000 That was a bad fight.
01:30:13.000 But it's going to be that kind of fight.
01:30:14.000 He's just going to bomb on him.
01:30:16.000 He doesn't have a chance.
01:30:17.000 He can't stand with that guy.
01:30:19.000 He doesn't have a chance.
01:30:20.000 He's also too impressed with himself.
01:30:24.000 Yes, he's too used to people deferring to him.
01:30:26.000 He thinks the things he says make sense and are cool and are profound when they're just bland.
01:30:34.000 He's just serving bad meatloaf.
01:30:36.000 And he's like, ta-da!
01:30:38.000 And you're like, no, this is bad meatloaf.
01:30:40.000 Yeah, that's how he got to be vice president, by being just bland enough to get whatever constituency Obama was trying to get.
01:30:48.000 But you saw that exchange when he called Trump an existential threat earlier this year, and Trump basically, he just went off on him.
01:30:57.000 Joe's a dummy.
01:30:58.000 He's not the guy he used to be.
01:31:00.000 Yeah.
01:31:00.000 That's going to be every day.
01:31:02.000 Yep.
01:31:02.000 You know, every minute of every day.
01:31:05.000 And then other people are going to chime in because they love it.
01:31:07.000 People love piling on.
01:31:08.000 Oh, yeah.
01:31:08.000 And his fans, oh my God.
01:31:10.000 He's the asshole king where people never had a representative before.
01:31:14.000 There's a lot of assholes out there like, ah, where's my guy?
01:31:17.000 Right.
01:31:17.000 And then finally, bam, look at this.
01:31:19.000 There he is.
01:31:19.000 The asshole made it to the White House.
01:31:22.000 Holy shit, I can be an asshole now?
01:31:24.000 The president's an asshole?
01:31:25.000 He wants me to be an asshole?
01:31:27.000 Lock her up!
01:31:28.000 Lock her up!
01:31:29.000 Yeah, lock her up!
01:31:31.000 Yeah, totally.
01:31:32.000 I mean, that's going to wear on a guy.
01:31:35.000 I mean, have you been to one of Trump's rallies?
01:31:36.000 No chance.
01:31:38.000 I can't.
01:31:38.000 I have to wear a rubber nose and fucking...
01:31:40.000 I've covered them.
01:31:41.000 What's it like?
01:31:42.000 They're unbelievable.
01:31:43.000 First of all, the t-shirts are amazing.
01:31:48.000 You know, like, Trump 2020, fuck your feelings.
01:31:50.000 You know what I mean?
01:31:51.000 Nice.
01:31:53.000 Trump is the Punisher.
01:31:54.000 You know, it's like the Punisher skull with the thing.
01:31:57.000 It's amazing.
01:31:59.000 In the crowds, it's totally out of idiocracy.
01:32:02.000 Is there a fucking Punisher skull with a Trump wig on it?
01:32:06.000 Yeah, yeah.
01:32:07.000 Oh my goodness.
01:32:08.000 I might have to get one of those.
01:32:09.000 I mean, there's the t-shirts.
01:32:14.000 Do we have one?
01:32:16.000 Jamie, that was such a loud laugh.
01:32:22.000 I've never seen that.
01:32:24.000 It's a red, white, and blue American flag skull Punisher style with a Trump wig on it.
01:32:30.000 I need that shirt.
01:32:32.000 It wasn't the red, white, and blue one.
01:32:34.000 It was the one with the black.
01:32:36.000 And I saw that on an eight-year-old kid.
01:32:41.000 It was like a mother with her little kids in the Trump Punisher skull.
01:32:45.000 Do they sell that shirt on Amazon?
01:32:48.000 I'm sure it's being sold everywhere.
01:32:51.000 It is now!
01:32:52.000 These are stickers, and these are being sold all over, Walmart, eBay.
01:32:56.000 Oh God, these fucking people.
01:32:58.000 I mean, the merch is...
01:33:00.000 He's the most t-shirtable president in history.
01:33:03.000 I mean, Trump 2020 grabbing by the pussy again.
01:33:08.000 Oh boy.
01:33:08.000 I mean, they like embrace that shit.
01:33:12.000 The trolling aspect of all of it is like the fun part for his crowds.
01:33:18.000 Sure.
01:33:18.000 What they get off on...
01:33:41.000 It's crazy.
01:33:42.000 Well, it's dumb, and that's the thing that he's sort of, like, captured, is this place where you can be dumb.
01:33:49.000 Like, it's fun to be dumb and say, grab her by the pussy.
01:33:52.000 Like, everybody knows that's kind of a dumb thing to say publicly.
01:33:54.000 Of course.
01:33:55.000 But you can say it there, because he said it.
01:33:57.000 Yay!
01:33:58.000 You know, build that wall, build that wall, yay!
01:34:01.000 Yay!
01:34:01.000 Right.
01:34:02.000 Like, it's like it's this chance to, like, shut off any possibility of getting over, like, 70 RPM. Like, we're gonna cut this bitch off at 70. There's no high function here.
01:34:12.000 We're gonna cut it off at 70 and just let it rip.
01:34:15.000 Right.
01:34:16.000 Yeah, no, totally, totally.
01:34:17.000 And...
01:34:19.000 It's funny, the way you say that, everybody knows it's a dumb thing to say, right?
01:34:25.000 I would talk to people at the crowds, and I'll talk to a 65-year-old grandmother, and you say, do you agree with everything that Trump says?
01:34:34.000 Almost to the last, they all say, well, I wish he hadn't said this particular thing, but they're all there chanting, you know what I mean?
01:34:40.000 They're all into it.
01:34:42.000 And the crowds are so huge.
01:34:45.000 I was in Cincinnati, and I was late to one of his events, and I made the mistake of trying to drive in, but I couldn't because they blocked off all the bridges, if you've ever been there, right?
01:34:53.000 I was on the Kentucky side.
01:34:55.000 So I had to walk, like, three miles away and, like, walk over a bridge, and I thought I was going to be the only person there.
01:35:01.000 And it was like something out of a sci-fi movie.
01:35:03.000 It was just like a line of MAGA hats, like, extending over a bridge all the way into Kentucky, like a mile down a road.
01:35:10.000 I mean, they had to turn away thousands of people to get into this event.
01:35:13.000 It was incredible.
01:35:14.000 How many people did it seat?
01:35:15.000 It was like 17,000 or 18,000.
01:35:17.000 It was the...
01:35:18.000 I forget what arena that is.
01:35:22.000 It's the indoor one.
01:35:23.000 Look at the size of those places.
01:35:25.000 He's the only one that can pull those kind of crowds.
01:35:27.000 Period.
01:35:28.000 Oh, yeah.
01:35:28.000 No one can do that.
01:35:30.000 You know, Bernie and Warren have had big crowds.
01:35:34.000 Bernie had a 25,000-person crowd in Queens a couple of weeks ago.
01:35:40.000 You'll see crowds that big, but Trump's crowds are just...
01:35:44.000 Dating back to 2016, they're just consistently huge everywhere.
01:35:48.000 And again, this gets back to what I was saying before, all the reporters saw this and they all saw that Hillary was having real trouble getting four and five thousand people into her events.
01:35:59.000 And so we all, you know, we were all talking to each other like, that's got to be a thing that's going to, you know, play a role in the election eventually.
01:36:07.000 But nobody kind of brought it up or they explained it away.
01:36:09.000 Well, I think they felt like if you discussed it and brought it up, that somehow or another you were contributing to Trump winning.
01:36:20.000 Right, but that's a fallacious way to look at it.
01:36:23.000 Because covering up the reality of the situation, I think, created a false sense of security for Democrats.
01:36:28.000 Sure.
01:36:29.000 They thought they were going to win by a landslide, right?
01:36:31.000 That's what everybody was saying, but it wasn't true.
01:36:34.000 I mean, there were serious red flags throughout the campaign for Hillary, and people, I think, were too afraid to bring up a lot of this stuff because they didn't want to be seen as helping Trump.
01:36:44.000 But that's not what the business is about.
01:36:46.000 We're not supposed to be, you know...
01:36:47.000 Helping people.
01:36:48.000 Facts don't have, you know, political inclinations.
01:36:51.000 We're just supposed to tell you what we see.
01:36:53.000 How do you get journalism back on track?
01:36:55.000 Is it possible at this point?
01:36:57.000 Is it a lost art?
01:36:58.000 Is it going to be like calligraphy?
01:37:00.000 Yeah, exactly.
01:37:04.000 Japanese calligraphy.
01:37:05.000 You have to pass it down through masters.
01:37:08.000 Maybe that's going to be what journalism is like.
01:37:12.000 There's two things that could happen.
01:37:13.000 One is that if you created something like Neither Side News right now.
01:37:18.000 That's a great name.
01:37:20.000 Yeah, like a network where it was a bunch of people who just kind of did the job without the editorializing.
01:37:26.000 I think it would probably have a lot of followers right away.
01:37:31.000 It would make money.
01:37:31.000 And nobody has clued into that yet.
01:37:33.000 Like, if some canny entrepreneur were to do that and that were to bring back the business, that or, you know, journalism has always been kind of quasi-subsidized in this country.
01:37:42.000 You know, going back to the Pony Express, newspapers were carried free across to the West, right?
01:37:46.000 The U.S. Postal Service did that.
01:37:49.000 The Communications Act in 1934, the idea was you could lease the public airways, but you had to do something in the public interest.
01:37:58.000 So you could make money doing sports and entertainment, but you could take a loss on news.
01:38:04.000 And so it was kind of quasi-subsidized in that way.
01:38:07.000 But that doesn't exist anymore.
01:38:08.000 There's no subsidy really for news anymore.
01:38:10.000 I'm not necessarily sure I agree with that being the way to go, but there has to be something, because right now the financial pressure to be bad is just too great.
01:38:23.000 Sorry to go on this, but when I came from the business, when the money started getting tighter, the first thing they got rid of were the long-form investigative reporters.
01:38:33.000 You couldn't just hire somebody to work on a story for three months anymore because you needed them to do content all the time.
01:38:38.000 Then they got rid of the fact checkers, which caused another serious problem.
01:38:43.000 And so now the money's so tight that they just have these people doing clickbait all the time and they're not doing real reporting.
01:38:50.000 And so they have to fix the money problem.
01:38:53.000 I don't know how they would do that.
01:38:54.000 How much has it changed recently?
01:38:55.000 Because the stuff that you wrote about the banking crisis was my favorite coverage of it, and the most relatable and understandable, and the way you spelled everything out.
01:39:07.000 Could you do that today?
01:39:09.000 Yeah, but I think it would be harder.
01:39:11.000 That's not that long ago.
01:39:13.000 It really isn't.
01:39:15.000 I really stopped doing that in 2014 or so.
01:39:19.000 Yeah, so we're five years out.
01:39:20.000 But the big difference is social media has had a huge impact on attention span.
01:39:26.000 So I was writing like 7,000 word articles about credit default swaps and stuff like that.
01:39:33.000 And I was trying really hard to make it interesting for people.
01:39:35.000 You use jokes and humor and stuff like that.
01:39:38.000 But now, people would not have the energy to really fight through that.
01:39:43.000 You'd have to make it shorter.
01:39:51.000 You can't do the kind of process reporting where you're teaching people something, because people just tune out right away.
01:39:57.000 They need just a quick hit, a headline, and a couple of facts.
01:40:01.000 So, yeah, there's a big problem with audience, right?
01:40:05.000 We've trained audiences to consume the news differently, and all they really want to get is a take now.
01:40:12.000 Everything's like an ESPN hot take on things, you know?
01:40:15.000 So that's not good.
01:40:16.000 The counter to that, though, is this, what we're doing right now.
01:40:20.000 These are always these long-ass conversations.
01:40:23.000 They're hours and hours long.
01:40:24.000 And there's a bunch of them out there now.
01:40:26.000 It's not like mine is an isolated one.
01:40:28.000 And there's so many podcasts that cover, and some of them cover them in a serial form, like The Dropout.
01:40:35.000 Was that what they called it?
01:40:37.000 Yes.
01:40:38.000 The Dropout was the one about that woman who created that fake blood company.
01:40:43.000 Oh, yes, right.
01:40:44.000 Susan, what was her name?
01:40:47.000 Elizabeth...
01:40:50.000 What is her name?
01:40:52.000 Elizabeth Holmes.
01:40:53.000 Elizabeth Holmes.
01:40:54.000 That's right.
01:40:55.000 That's right.
01:40:56.000 Theranos.
01:40:56.000 Yeah.
01:40:57.000 The completely fraudulent company.
01:40:59.000 That was an amazing podcast series that if I read it, you're right.
01:41:04.000 I probably would have like, oh, boring.
01:41:06.000 Right.
01:41:07.000 I probably would have abandoned it earlier.
01:41:08.000 But listening to it in podcast form, listening to actual conversations from these people, listening to people's interpretations of these conversations, listening to people that were there at the time, telling stories about when they knew things were weird, when they started noticing there's tests that were incorrect,
01:41:28.000 that they were covering up, that kind of shit.
01:41:31.000 You can do that now with something like this, and I think that one of the good things about podcasts, too, is you don't need anybody to tell you that you could publish this.
01:41:42.000 Yeah, absolutely.
01:41:43.000 I think you're right.
01:41:47.000 Formats like this reveal that the news companies are wrong about some things, about audiences.
01:41:55.000 They think that people can't handle an in-depth discussion about things.
01:41:58.000 They think that audiences only want to watch 30 seconds of something.
01:42:02.000 They don't.
01:42:02.000 They're interested.
01:42:03.000 They do have curiosity about things.
01:42:07.000 It's very difficult to convince people, in the news business especially...
01:42:34.000 I don't know.
01:42:42.000 But the flip side of that is that they're not investing in stuff like international news in the way they used to.
01:42:50.000 When I came up in the business, every bureau, every big network had bureaus in every major city around the world, Rome, Berlin, Moscow, whatever it is, right?
01:43:00.000 And they had...
01:43:13.000 I think the news is getting worse.
01:43:20.000 Podcasts are getting more interesting.
01:43:22.000 Maybe there's a happy medium they can find in between.
01:43:25.000 Well, documentaries as well.
01:43:27.000 Documentaries are commercially viable if it's a great subject.
01:43:29.000 Like a good example is that Wild Wild Country one.
01:43:33.000 I didn't even know that that cult existed.
01:43:37.000 I had no idea what happened up there.
01:43:39.000 So this documentary sheds light on it.
01:43:42.000 It does it over, I think it was like six episodes or something like that.
01:43:45.000 It's fucking amazing.
01:43:47.000 It made a shit ton of money.
01:43:48.000 Or Making a Murderer was another one I think was really good.
01:43:52.000 That's something that happens all over the place.
01:43:54.000 You have these criminal justice cases and terrible injustices happen.
01:43:59.000 And if you really tell the whole story and make characters out of people and invest the time and energy to tell it well, people still like really good storytelling.
01:44:12.000 But I think within the news business, they have this belief, their hard-headed belief, that people can't handle difficult material, and I don't know why that is.
01:44:22.000 Yeah, I don't know why it is either.
01:44:25.000 I mean, I think there's a large number of people that aren't satisfied intellectually by a lot of the stuff they're being spoon-fed.
01:44:32.000 And they think that because the vast majority of things that are commercially viable are short attention span things, I think it's like this real sloppy way of thinking, non-risk-taking way of thinking.
01:44:45.000 They're like, listen, this is how people consume things.
01:44:47.000 You've got to give them like a music video style editing or they just tune out.
01:44:56.000 Yeah.
01:45:16.000 Right, yeah.
01:45:17.000 And you're absolutely right about the thirst for something else.
01:45:23.000 And again, I think when people turn on most news products, they're getting this predictable set of things and that doesn't quench that thirst for them.
01:45:34.000 They're not being challenged in any way.
01:45:36.000 They're not seeing different sides of a topic.
01:45:39.000 You're not approaching covering a subject honestly by genuinely exploring the idea that people you may have thought were bad are right, or people you may have thought were good are wrong.
01:45:51.000 It's just all predictable.
01:45:53.000 So I think people are fleeing to other things now.
01:45:55.000 They want to just get the story.
01:45:59.000 They don't want to have a whole lot of editorializing on top of it.
01:46:05.000 Yeah, and I think also there's a lot of underestimating of audiences going out there.
01:46:12.000 We just think that they can't handle stuff, and they can.
01:46:16.000 They're interested, but we just take it for granted that they can't do it.
01:46:20.000 Maybe I'm guilty of that too, because I've been doing this for so long, but yeah, it does happen.
01:46:26.000 I don't think people have changed that much.
01:46:28.000 Yeah, no, probably not.
01:46:30.000 It's just difficult.
01:46:33.000 Maybe it's also we don't have the stamina to stick with a story in the same way that we used to.
01:46:39.000 Like now, if a story doesn't get a million hits right away, we don't return to the subject.
01:46:43.000 You think about stories like Watergate.
01:46:47.000 When Woodward and Bernstein first did those stories, they were complete duds.
01:46:52.000 Everybody thought they were on the wrong path.
01:46:54.000 They were the only people who were covering it.
01:46:57.000 And a lot of those stories kind of flailed around.
01:47:01.000 They didn't get the big response.
01:47:03.000 And it wasn't until much later that it became this hot thing that everybody was watching.
01:47:08.000 And you wouldn't, so that wouldn't happen now, right?
01:47:10.000 Like if reporters were on a story, if it didn't catch fire within the first couple of passes, your editor's probably going to take you off it now.
01:47:20.000 What was that story that the New York Times worked on about Trump and they worked on it for a long time and it was released and went in and out of the news cycle in a matter of days and nobody gave a fuck?
01:47:29.000 Yeah, the one about his finances.
01:47:32.000 Yes.
01:47:32.000 And it was like a 36,000-word story.
01:47:34.000 It was like unbelievable.
01:47:36.000 It was like six times as big as the biggest story I've ever written in my life.
01:47:40.000 They thought it was a giant takedown.
01:47:42.000 Right, yeah.
01:47:43.000 And it was.
01:47:44.000 It was like a 36-hour thing, if that, right?
01:47:49.000 Maybe.
01:47:50.000 Maybe, yeah.
01:47:51.000 People kind of said, oh, this is amazing.
01:47:54.000 It's got all this information in it.
01:47:55.000 And it just fell flat.
01:47:57.000 And the important thing about that is that news companies see this and they say, wow, we invested all this time and money.
01:48:06.000 We put our really good reporters on this.
01:48:09.000 We gave them six months to work on something.
01:48:11.000 And it got the same amount of hits as, you know, some story about, you know, a carp with a human face that was filmed in China.
01:48:20.000 You know what I mean?
01:48:21.000 Like some thing that we, you know, we picked off the wires and we stuck it in page 11, whatever it was.
01:48:26.000 So then what that tells them, the incentives now are, let's not bother.
01:48:31.000 Let's not do six months' investigations of anything anymore because What's the point?
01:48:36.000 We're going to get as many hits doing something dumb.
01:48:39.000 So they just don't take the risk anymore.
01:48:41.000 God, it's so crazy that that's the incentive now that it's all clicks.
01:48:45.000 Totally.
01:48:45.000 It's such a strange trap to fall into.
01:48:48.000 And there's also the other thing, which is the litigation problem.
01:48:54.000 And this is another thing I wrote about in the book, is that there was a series of cases in the 80s and 90s where reporters kind of took on big companies.
01:49:01.000 The Chiquita Banana thing that the Cincinnati Enquirer did.
01:49:05.000 Remember the movie The Insider, about Brown & Williamson, the tobacco company, CBS, right?
01:49:12.000 There was another one with Monsanto in Florida where some Fox reporters went after Monsanto.
01:49:17.000 So they all got sued.
01:49:19.000 And it costs their companies a ton of money and reputational risk.
01:49:23.000 And so after that, what news companies said is, why take on a big company that can fight back and throw a lawsuit at us?
01:49:34.000 And what do we win by that?
01:49:36.000 We're not going to get more audience from that, you know?
01:49:38.000 So, now if you watch consumer reporting, like a small TV station, usually they're gonna bang on some little Chinese restaurant that has roaches or something like that.
01:49:49.000 They're not gonna go after Monsanto or Chiquita Banana because there's no point.
01:49:57.000 It's too much of a risk, so they just don't do it.
01:50:00.000 And that's another thing that's gone wrong with reporting.
01:50:04.000 The economic benefit of going after a powerful adversary isn't there anymore.
01:50:10.000 So they don't do it.
01:50:12.000 And that's a problem.
01:50:13.000 Now clearly you've seen a giant change in journalism from when you first started to where we are now.
01:50:19.000 Do you have any fears or concerns about the future of it?
01:50:23.000 I mean this is what you do for a living.
01:50:25.000 What are your thoughts on it?
01:50:27.000 Where do you think it's going?
01:50:29.000 I mean, I'm really worried about it because you need the journalists to kind of exist apart from politics and to be a check on everything.
01:50:43.000 The whole idea of having a fourth estate is that it's separate from the political parties.
01:50:48.000 I mean, I don't work for the DNC. It's not my job to write bad news about Donald Trump.
01:50:53.000 That's the DNC's job.
01:50:55.000 They put up press releases about them.
01:50:57.000 And if people see us as being indistinguishable from political parties or being all editorial, then we don't have any power anymore.
01:51:05.000 That's the first thing.
01:51:07.000 The press doesn't have any ability to influence people if people don't see us as independent and truthful and all those things.
01:51:15.000 And so that's what I really worry about right now.
01:51:19.000 People will stop listening to the media.
01:51:22.000 They'll just tune us out.
01:51:23.000 They don't trust us anymore.
01:51:25.000 Walter Cronkite, in 1972, the Gallup polling agency found that he was the most trusted man in America.
01:51:32.000 And that was true also in 1985. Like for 13 consecutive years, he was the most trusted.
01:51:38.000 There's no reporter in America who's trusted.
01:51:41.000 The most trusted man in America?
01:51:42.000 That doesn't exist.
01:51:43.000 Yeah, it doesn't exist.
01:51:45.000 Good luck.
01:51:45.000 Yeah, exactly.
01:51:46.000 So people think of us as clowns.
01:51:48.000 You know, entertainment figures.
01:51:50.000 And so how are you going to impact the world if people think you're a joke, you know?
01:51:55.000 And that's what I really worry about.
01:51:57.000 We don't have any institutional self-respect anymore.
01:52:00.000 You know, we don't feel like we have to, you know, challenge audiences, challenge powerful people.
01:52:07.000 You know, it's just a bunch of talking points, and that's not what the business is about.
01:52:12.000 So I worry about it.
01:52:15.000 And, you know, I think there are a lot of journalists who kind of say the same thing.
01:52:19.000 We all kind of talk amongst ourselves, which is, you know, the job as we knew it is kind of being phased out and changed into something else.
01:52:28.000 And that's not a good thing, you know, because people do need, in tough times, people need the press, you know, as ridiculous as that sounds now.
01:52:38.000 But it's true.
01:52:39.000 And I don't know where we go from here.
01:52:46.000 Yeah.
01:53:03.000 It's so hard to find.
01:53:04.000 And I think it's one of the reasons why we're so lost.
01:53:06.000 And it's one of the more insidious aspects of the term fake news.
01:53:11.000 Because god damn that's so easy to throw around.
01:53:13.000 It's like it's so easy to call someone a bigot.
01:53:16.000 It's so easy to call someone a racist.
01:53:17.000 And it's so easy to say fake news.
01:53:19.000 And they all have the same sort of effect.
01:53:22.000 They just diminish anything that you have to say almost instantaneously.
01:53:25.000 Totally.
01:53:26.000 And there's...
01:53:29.000 When you can cast the entire news as being fake, people can tune it out.
01:53:34.000 But a lot of that has to do with who's doing the news reading now, right?
01:53:38.000 Like, in the 60s and 70s, maybe before, reporters, a lot of them came from the middle and lower classes.
01:53:45.000 Like, you know, they were...
01:53:46.000 The job was originally kind of like being a plumber, right?
01:53:50.000 It was more of a trade than a profession.
01:53:53.000 And so you had a lot of people who...
01:53:55.000 who went into the job, and they had this kind of attitude of just wanting to stick it to the man.
01:54:01.000 They didn't want to be close to power.
01:54:05.000 They wanted to take it on.
01:54:06.000 People like Seymour Hersh, right?
01:54:08.000 You see that kind of personality who just wants to take the truth and rub it in somebody's face.
01:54:13.000 But then after All the President's Men, it became this sexy thing to be a journalist.
01:54:19.000 And you saw a lot of people from my generation who...
01:54:22.000 went into journalism because they wanted to be close to politicians and hang out with them. It's kind of like the Primary Colors thing, right?
01:54:29.000 Where you see people who they just want to have a beer with the presidential candidate.
01:54:34.000 And that's totally different from what it used to be.
01:54:37.000 So now we're on the wrong side of the rope line.
01:54:40.000 You see what I'm saying?
01:54:41.000 We used to be outside of power, like taking it on.
01:54:45.000 And now we're more upper class in the press.
01:54:51.000 And we're kind of in bed with the same people we're supposed to be covering.
01:54:55.000 And that's... that's not a good thing.
01:54:57.000 When people see that, that's one of the reasons why they call us fake news is because they see us as doing PR for rich people.
01:55:07.000 One of my favorite books ever about politics is Fear and Loathing on the Campaign Trail.
01:55:10.000 Oh, yeah.
01:55:11.000 I wrote the introduction to that.
01:55:12.000 Did you?
01:55:12.000 Yeah, the last edition of that.
01:55:15.000 Oh.
01:55:15.000 Greatest book, yeah.
01:55:16.000 It's a fantastic book.
01:55:17.000 And it's a great example of someone who knew that they weren't a part of that system so they could talk about it as an outsider.
01:55:25.000 He knew he was only going to be covering it for a year, so he just went in, guns blazing, got everybody fucked up, drinking on the bus, making everybody do acid.
01:55:32.000 Burned all of them.
01:55:33.000 Yeah, and he says that in the book.
01:55:35.000 He's like, look, this isn't my beat.
01:55:37.000 I don't have any friends I have to keep, you know?
01:55:40.000 So I'm going to tell you everything that I see, and fuck it.
01:55:44.000 And that's a real problem in reporting.
01:55:46.000 When you're in a beat for too long, you end up developing unhealthy relationships with sources, and you end up in a position where you're not going to burn the people who you're dependent on to get your information out.
01:55:58.000 And when that happens to reporters, I think that's one of the reasons it's good to kind of cycle through different topics over the course of your career.
01:56:07.000 If you get stuck in the same beat too long, eventually you fall into that trap.
01:56:12.000 And Thompson, of course, never did that.
01:56:14.000 Every story that he covered was, he let it all hang out and just said whatever the hell he thought and he let the chips fall where they may.
01:56:23.000 And that's kind of the way, I mean, you can't do that all the time probably, but I think that's the thing.
01:56:27.000 That was great.
01:56:28.000 It was amazing.
01:56:29.000 And there's no other examples of it.
01:56:31.000 No, no.
01:56:33.000 Not like that.
01:56:34.000 Yeah, yeah, yeah.
01:56:35.000 I mean, that book was so great on so many levels.
01:56:38.000 I always thought of it as being also kind of like a novel, because it's this story about this person who's obsessed with finding meaning and truth, but he goes to the most fake place on earth, which is the campaign trail, to look for it.
01:56:54.000 And so all these depictions of all these terrible lying people, they're just so hilarious.
01:56:59.000 And so it's kind of, you know, it's almost like a Franz Kafka novel.
01:57:04.000 It's amazing.
01:57:05.000 And then it's great journalism at the same time.
01:57:07.000 Like, he's telling you how the system works and how elections work, and it's really valuable for that.
01:57:12.000 So, yeah, that was brilliant.
01:57:13.000 He also changed a lot.
01:57:15.000 I mean, he actually affected politicians.
01:57:18.000 Like, the shit that he did with Ed Muskie.
01:57:20.000 Oh, my God.
01:57:20.000 Yeah.
01:57:22.000 That was fantastic.
01:57:23.000 When he was on the Dick Cavett show, and Dick Cavett asked him about it, he goes, well, there's a rumor that he was on Ibogaine, and I started that rumor.
01:57:37.000 I mean, it's just, he, like, literally, that he got in that guy's head.
01:57:42.000 Oh, yeah.
01:57:43.000 And I remember he put that picture of Muskie, and he just found a picture of Muskie, and it's, he's basically going, like that.
01:57:50.000 Yes.
01:57:50.000 And the caption is, Muskie in the throes of an Ibogaine frenzy, right?
01:57:54.000 Yeah.
01:57:54.000 And you couldn't really get away with that now.
01:57:58.000 Well, it's a crazy drug to choose, too, because it's a drug that gets you off addictions.
01:58:02.000 Right, yeah, exactly.
01:58:03.000 It's one of the more hilarious aspects of his choice.
01:58:05.000 But it sounded great.
01:58:07.000 Yeah.
01:58:07.000 And with the witch doctor and all that stuff.
01:58:09.000 Brazilian witch doctor.
01:58:10.000 Yeah, yeah, yeah.
01:58:11.000 It was fantastic.
01:58:13.000 Oh, so good.
01:58:15.000 Yeah, but, you know, that kind of stuff probably wouldn't go over all that well right now.
01:58:20.000 No, he'd get sued.
01:58:21.000 Yeah, but also he had this very, very sort of aggressively caricaturing way of looking at politics and politicians, and that wouldn't go over that well now either.
01:58:36.000 Like, people don't want you to rip on the process as much as he did in that book, so it was great.
01:58:41.000 It was just a fantastic book.
01:58:42.000 Yeah, I mean, he had a bunch of them that were great, but that one particularly, you can sort of reread it.
01:58:51.000 You could reread it every time we get to an election cycle.
01:58:55.000 It sort of goes, oh, it lets you know these are repeating cycles.
01:59:00.000 This is just like the same shit that he was dealing with in, you know, various different forms.
01:59:05.000 But you can see it all today.
01:59:07.000 And it's funny, the reporters, everybody's read that book, everybody who covers campaigns.
01:59:13.000 You know, I'm on my fifth right now for Rolling Stone.
01:59:16.000 Like, I have his old job.
01:59:20.000 Everybody has read that book, and so they unconsciously try to make the same characters in each election cycle.
01:59:26.000 So there's always like a Christ-like McGovern figure.
01:59:30.000 There's a turncoat, quisling, spineless Muskie figure.
01:59:37.000 There's the villain, Nixon.
01:59:40.000 Trump kind of fills that role for a lot of reporters now.
01:59:43.000 And then a lot of them try to behave in the same way that their characters behaved in that book.
01:59:49.000 So you remember Frank Mankiewicz was McGovern's sort of handler, and he was having beers with Thompson after the events and kind of...
02:00:00.000 strategizing with him.
02:00:01.000 Reporters try to do that.
02:00:02.000 They all try to do that with the candidates and their handlers.
02:00:05.000 Now they try to develop those same relationships.
02:00:07.000 It's just interesting.
02:00:07.000 It's like they're reliving the book.
02:00:09.000 That's a problem with someone that's really good.
02:00:12.000 They take on so many imitators.
02:00:16.000 So many imitators take on their demeanor and their thought process.
02:00:20.000 And Hunter was just such an iconic version of a writer that it's so difficult, if you're a fan of his, to not want to be like that guy.
02:00:30.000 Oh, totally.
02:00:30.000 I mean, I know that.
02:00:35.000 Especially because I'm writing for the same magazine and covering a lot of the same topics, you have to immediately realize that you can't do what he did.
02:00:45.000 Thompson's writing was incredibly ambitious and unique.
02:00:48.000 He was using a lot of the same techniques that the great fiction writers use.
02:00:54.000 He was creating...
02:00:55.000 almost like this four-dimensional story, but at the same time it was also journalism.
02:01:04.000 Most people couldn't get away with that.
02:01:06.000 You have to be a great, great writer.
02:01:08.000 I'm talking like a rare Mark Twain-level talent to do what he did, which is to kind of mix the ambition of great fiction with journalism.
02:01:19.000 So if you try to do that stuff, it's going to be terrible.
02:01:22.000 And I've certainly...
02:01:24.000 If you go back and look at my writing, you'll find a lot of shitty Thompson imitations.
02:01:30.000 And so I learned to not do that pretty early.
02:01:34.000 But yeah, it's one of those don't try this at home things for young writers, if you can avoid that, for sure.
02:01:43.000 Do you have any...
02:01:45.000 Do you have any hope?
02:01:47.000 Is there anything that you look to and go, maybe this is going to be where this turns around in terms of journalism?
02:01:57.000 Yes.
02:01:58.000 I mean, oddly enough, I think shows like yours and the kind of proliferation of what you're talking about with podcasts. The great thing about the internet, there are lots of bad things, but the great thing about it is that it's provided a way for people to just have an audience if they're good,
02:02:18.000 right?
02:02:19.000 And if people have a demand for it, if there's a demand for it, you can exist.
02:02:24.000 You can have a platform.
02:02:26.000 And so that's what I think is going to happen, is that people are going to crack the code of what kind of journalism people want.
02:02:33.000 And they're going to create something that people are going to flock to.
02:02:37.000 And I don't have a lot of faith that CBS, MSNBC, ABC, CNN, that they're going to figure it out.
02:02:46.000 Like, I think it's going to be some independent kind of voice that is going to come up with something, a new formula.
02:02:52.000 And that is what's going to rise up, you know?
02:02:55.000 I mean, you've seen it a little bit with things like the Young Turks, you know, although they've changed a little bit.
02:03:04.000 But they figured out that if you provide something that's an alternative from the usual thing, that you can get a viable functioning business a lot faster than you used to be able to.
02:03:14.000 What do you mean by they changed?
02:03:17.000 You know, I think they've kind of become a little bit more in the direction of a traditional news organization than they were originally, maybe.
02:03:29.000 I don't know.
02:03:30.000 I don't watch it as much as I used to, so maybe I shouldn't say that.
02:03:34.000 But, you know, again, the ability to do that is a lot different than it used to be.
02:03:40.000 In order to have an independent journalism outlet, you used to have to, for instance, put out your own newspaper, do your own distribution, do your own printing, do your own design.
02:03:49.000 All that stuff cost a ton of money, and it was very, very hard to do it without big corporate sponsors.
02:03:56.000 Now anybody with a good idea can pretty much do something, so I have a lot of hope that somebody's going to figure it out.
02:04:06.000 It's just we're not there yet.
02:04:08.000 I agree with you.
02:04:10.000 I'm optimistic.
02:04:10.000 I have a lot of hope too, but I'm always like, fuck, hurry up already.
02:04:14.000 Yeah, I know.
02:04:15.000 I know.
02:04:16.000 And it's just, until we get there, the remnants of the old system of media, they're just, you know, it's just so tough to watch...
02:04:27.000 They're flailing.
02:04:28.000 They don't really know what to do.
02:04:29.000 They're kind of caught between just purely chasing the money and trying to adhere to what they thought the news looked like in the past.
02:04:37.000 So it's like not entertaining.
02:04:39.000 If they were just chasing the money, if they had just come up organically today, they would have had a different product entirely.
02:04:45.000 But they're trying to sound like legitimate news, but they're also completely selling out at the same time, and it's just not working.
02:04:54.000 We'll see where all that goes.
02:04:57.000 You're right.
02:04:58.000 They're flailing right now.
02:04:59.000 Matt Taibbi, I appreciate you, man.
02:05:01.000 Thanks a lot, Joe.
02:05:01.000 I really do.
02:05:02.000 It's always an honor to talk to you.
02:05:03.000 No, likewise.
02:05:04.000 Your book, tell people. Hate Inc.
02:05:06.000 It's called Hate Inc.
02:05:08.000 It's by OR Books.
02:05:09.000 It's out now.
02:05:10.000 You can buy it on Amazon.
02:05:11.000 My podcast is called Useful Idiots with Katie Halper, Rollingstone.com.
02:05:16.000 Check that out once a week.
02:05:17.000 Thank you.
02:05:18.000 Thanks, Joe.
02:05:19.000 Appreciate it.
02:05:19.000 Bye, everybody.
02:05:23.000 Awesome.