The Joe Rogan Experience - June 05, 2019


Joe Rogan Experience #1311- David Pakman


Episode Stats

Length

2 hours and 5 minutes

Words per Minute

176.66176

Word Count

22,121

Sentence Count

1,757

Misogynist Sentences

15

Hate Speech Sentences

25
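

For reference, here is a minimal sketch (Python) of how the "Words per Minute" figure above relates to the listed word count and runtime; the exact second-level duration isn't published, so the implied length is an estimate rather than an official stat.

# Words-per-minute sanity check for the stats above (illustrative only).
word_count = 22_121            # "Word Count" above
words_per_minute = 176.66176   # "Words per Minute" above

# Duration implied by the two listed figures:
implied_minutes = word_count / words_per_minute
print(f"Implied length: {implied_minutes:.1f} minutes")       # ~125.2 min, i.e. about 2 hours 5 minutes

# Cross-check the other way, assuming a flat 125-minute runtime:
print(f"WPM at exactly 125 minutes: {word_count / 125:.2f}")   # ~176.97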


Summary

In this episode of the podcast, I sit down with David Pakman to talk about how to deal with the toxicity in the world of politics and social media, and why it's important to have a healthy dose of common sense in order to make the world a better place. We talk about the dangers of "dunking on people" and why we should be careful not to get carried away by the amount of negativity on social media platforms like YouTube. We also talk about what it means to be a good ally and a good friend to one another, and how we can work together to make things better for people on both sides of the political aisle. I really enjoyed this episode and hope you do too!


Transcript

00:00:01.000 Hello, David.
00:00:02.000 Here we go.
00:00:03.000 We're live.
00:00:03.000 We're doing it.
00:00:04.000 Yes, we're doing it.
00:00:05.000 What's going on, man?
00:00:06.000 I'm nervous.
00:00:06.000 I'm nervous.
00:00:06.000 Don't be.
00:00:07.000 I enjoy your show.
00:00:08.000 I really do.
00:00:09.000 Thank you.
00:00:09.000 It's a pleasure to watch.
00:00:10.000 You're a very smart guy, man.
00:00:12.000 And like I said, we were just talking about it.
00:00:14.000 You're very reasonable.
00:00:15.000 In this world, I think there's so much of this, the YouTube political world, the YouTube commentary world, where people are so fucking toxic...
00:00:25.000 You know, there's so much negativity.
00:00:27.000 There's so much what they call dunking on people.
00:00:30.000 There's so much dunking.
00:00:31.000 You do a little dunking.
00:00:32.000 Some of it's warranted.
00:00:33.000 It is warranted, yes.
00:00:34.000 But I don't know if it's beneficial.
00:00:37.000 To the people doing the dunking?
00:00:39.000 Yes.
00:00:39.000 Or even to the cause.
00:00:41.000 I think it is temporarily – well, sometimes it's good because it shows – it mocks people's positions and it makes people realize, yeah, that is a ridiculous position.
00:00:50.000 So if you're on the fence, or if you're not really quite sure how you feel about things, and you see someone get mocked for a ridiculous position that maybe you've even shared for a little bit, maybe you haven't explored it deeply, and you see someone who has explored it deeply sort of expose all the flaws in this line of thinking,
00:01:05.000 it's good.
00:01:06.000 But my thing, what I'm...
00:01:09.000 I interview a lot of people on the right and a lot of people on the left, and I just hate all this conflict.
00:01:17.000 The unnecessary conflict, I think, is when you watch television today and you see Antifa fighting with Trump supporters and all this weird conflict.
00:01:34.000 I don't necessarily think that most of it is necessary.
00:01:40.000 Necessary?
00:01:41.000 Well, I think the devil's in the details.
00:01:43.000 So as an example, if you want to bring together, I don't know, people who are on opposite sides of the climate debate, for example...
00:01:52.000 Sure.
00:01:53.000 Right.
00:01:53.000 Well, part of that, you could argue, is if one side just does not accept science, how can you really bring those people together?
00:02:01.000 It doesn't mean you need physical conflict to resolve it.
00:02:03.000 In fact, I completely agree with you.
00:02:05.000 The physical conflict is totally counterproductive.
00:02:07.000 But at a certain point on some issues, I understand why...
00:02:12.000 There's like an intractability to the debate where it seems completely impossible to move forward because whichever side you're on, I would argue that I'm on the right side of these issues and others would disagree.
00:02:24.000 When you're far apart in a way that you can't even agree as to what the starting point facts are about the conversation, how do you start?
00:02:32.000 Right.
00:02:33.000 I have some ideas as to how I try to do it, but it's very tough.
00:02:37.000 It is very tough.
00:02:38.000 I just don't think dunking on people always, like constantly shitting on people, is necessarily the way to do it.
00:02:44.000 Yeah, and I think it's important to distinguish between just straight up ad hominems where someone is wrong and bad because I think they're a bad person or they're an idiot or whatever.
00:02:55.000 To recognizing when somebody is a participant in bad faith in a conversation, to when someone has maybe fallen prey to audience capture or whatever else might be kind of influencing what and how they're doing.
00:03:10.000 I think that those criticisms are legitimate, but you got to stay away from just ad hominem.
00:03:15.000 Yes, yes, I agree.
00:03:17.000 And I think that it's just so common today.
00:03:19.000 It's also extremely attractive.
00:03:22.000 The YouTube algorithm, you know, as far as comments go, I mean, it actually kind of encourages it.
00:03:29.000 And so does Facebook.
00:03:31.000 So does, you know, anytime there's a social media platform that is ad-dependent.
00:03:36.000 One of the best ways to get people to engage is to have something they disagree with so they can get angry.
00:03:44.000 Yes, until it becomes no longer brand safe according to whoever's running the platform.
00:03:49.000 Right, right.
00:03:59.000 And I text Kyle Kulinski and I say, I think there's like a glitch.
00:04:03.000 It says I made 19 cents and he says, it says I made 35 cents or something like that.
00:04:08.000 Something's going on.
00:04:09.000 And it was the beginning of like Adpocalypse 1.0.
00:04:12.000 And that was a rough three-week period.
00:04:15.000 And so it's, you know, encourage the debate and the battle of ideas, so to speak, and all of this stuff until advertisers get worried and they say, oh, you know, our ads are showing up on stuff that's a little bit touch and go for us.
00:04:28.000 That's a weird one to me because YouTube has always been a secondary thought for me.
00:04:35.000 The first thought was the audio version of the podcast.
00:04:39.000 And in fact, when we were uploading it to YouTube at first, I was like, why are we even doing this?
00:04:43.000 I guess, why not?
00:04:44.000 Some people probably want to watch it.
00:04:45.000 And then somewhere along the line, it became at least close to as big as the audio version of it.
00:04:53.000 And then maybe even more significant because...
00:04:57.000 One of the things that the YouTube version has is the comment section, which is often a fucking dumpster fire, but at least there is some sort of a community engagement aspect of it that doesn't really exist in iTunes.
00:05:10.000 In iTunes, it's in a vacuum, right?
00:05:12.000 Sure.
00:05:13.000 But when the adpocalypse thing happened, I was like, hmm, what is going on here?
00:05:19.000 It wasn't my primary focus, so it wasn't terrifying.
00:05:22.000 But people that only did YouTube...
00:05:26.000 And people that relied on that for their living.
00:05:28.000 I mean, it's a huge blow.
00:05:29.000 It was huge.
00:05:29.000 And at the time, I'm trying to think back, I think maybe like around 30% of my entire show's revenue was coming from YouTube at the time.
00:05:38.000 So it was not everything, but it was still significant, right?
00:05:42.000 I mean, I have staff and overhead and all of that stuff.
00:05:44.000 So just overnight, 30% going away is huge.
00:05:47.000 And that's why I've tried to move to the model of telling my audience...
00:05:51.000 You can skip all of this stuff.
00:05:53.000 You know, even some of these other, you know, super chats and all of this other stuff.
00:05:57.000 Like, we run a membership program on my website.
00:05:59.000 I control 100% of it.
00:06:01.000 So it's not a Patreon deal or anything?
00:06:03.000 We're on Patreon, but it's not big for us.
00:06:06.000 The way I think about it is as long as, I mean, listen, yeah, there's, you know, Yeah.
00:06:18.000 Yeah.
00:06:35.000 I think?
00:06:49.000 I obviously don't agree with Richard Spencer.
00:06:52.000 But can an algorithm figure out that there's a difference between an interview I do with Richard Spencer and white nationalist propaganda?
00:07:00.000 I don't know, but we can kind of get around all of that if you just go directly to me.
00:07:05.000 And that's why my focus has been growing those direct memberships.
00:07:07.000 Did you interview Richard Spencer?
00:07:09.000 Yeah.
00:07:10.000 Did you get shit for that?
00:07:11.000 Yes.
00:07:12.000 Yeah.
00:07:12.000 That's a weird one, right?
00:07:13.000 You know, I'm sure you're aware that, what is it called?
00:07:18.000 Data & Society?
00:07:20.000 Where there was a woman who made a bunch of connections.
00:07:24.000 Like, Joe Rogan knows David Pakman, and Joe Rogan also knows Alex Jones.
00:07:28.000 Alex Jones must be friends with David Pakman.
00:07:30.000 Like the map?
00:07:31.000 Yeah.
00:07:31.000 It's like one of those mind maps, you know?
00:07:33.000 And it was really weird.
00:07:36.000 It's like guilt by association.
00:07:38.000 I saw a couple of them.
00:07:39.000 There was like an initial one which maybe you're thinking of.
00:07:41.000 Then there was a map of like the YouTube sphere specifically, left, middle, and right or something like that.
00:07:46.000 Yeah, yeah, yeah.
00:07:46.000 And this idea that everyone's like a part of a grand conspiracy to help each other out and push right ideology.
00:07:53.000 Even though, you know, a lot of people that were labeled as right aren't right.
00:07:57.000 Like who?
00:07:58.000 Like me.
00:07:59.000 Oh.
00:07:59.000 I'm not right at all.
00:08:01.000 My sense is your politics are pretty left on most stuff.
00:08:06.000 Although I don't know you personally beyond just seeing your shows.
00:08:11.000 But maybe the critique is based on...
00:08:13.000 Because I think that those maps were based on what is the YouTube algorithm suggesting.
00:08:19.000 And so that may not be in line with your personal politics.
00:08:22.000 Right.
00:08:22.000 It's just maybe what we're talking about.
00:08:24.000 Like, if you're interested in conflict, if you're trying to get engagement, that's the way to do it.
00:08:29.000 Like, if YouTube algorithm...
00:08:31.000 Is constantly suggesting people like Ben Shapiro or Gavin McInnes or whatever.
00:08:35.000 And those videos come up over and over again.
00:08:37.000 Sure.
00:08:37.000 And I mean, so a lot of those people's channels do really well on YouTube.
00:08:42.000 So if you interview someone who has a channel themselves, there's a very good chance that the algorithm, if they're watching your interview with that person...
00:08:50.000 We'll say, well, here's a lot of their stuff.
00:08:52.000 And then once you click there, the algorithm very quickly starts to build a picture of every individual user.
00:08:59.000 If you watch your interview with Ben Shapiro, and then it takes you to a Daily Wire video, then it takes you to the Daily Wire second stringer guy, and then you're off who knows where.
00:09:09.000 It's all machine learning, right?
00:09:11.000 For the most part.
00:09:11.000 Yeah.
00:09:13.000 It is...
00:09:15.000 It's a troubling aspect of that thing that they do where they suggest the next videos, which didn't used to be a thing.
00:09:22.000 It used to be you would go to YouTube, you would watch a video, and then you would go find another video.
00:09:28.000 They didn't suggest anything.
00:09:29.000 And then somewhere along the line, I don't remember what year it was, but this started happening.
00:09:34.000 And then they started auto-playing the next video.
00:09:37.000 Yeah, I think there was some kind of recommendation...
00:09:41.000 Yeah.
00:09:57.000 From that recommendations feed from other stuff.
00:10:01.000 But it's significant for a lot of YouTube channels.
00:10:04.000 The tagging your videos and getting the right metadata on them in order to bring an audience is an important thing.
00:10:11.000 So it's a double-edged sword in some sense, it sounds like.
00:10:14.000 But to get back to what you were saying about...
00:10:17.000 Richard Spencer.
00:10:18.000 I was going to say they labeled you as right but you're not right.
00:10:22.000 Well, it's disingenuous.
00:10:24.000 I mean, I've said it over and over and over again.
00:10:25.000 I've never voted for a Republican in my life.
00:10:28.000 I voted independent for Gary Johnson just because he did my podcast.
00:10:31.000 And I wasn't happy with Clinton, and I wasn't happy with Trump.
00:10:34.000 I was like, this is just gross.
00:10:35.000 I'm just going to vote for Gary Johnson.
00:10:37.000 I mean, I didn't think he was going to win.
00:10:40.000 He had almost no chance when he didn't know what Aleppo was.
00:10:44.000 I was like, that was his scream.
00:10:47.000 What's his face from New Hampshire?
00:10:49.000 Howard Dean.
00:10:50.000 Howard Dean, yeah.
00:10:50.000 Ah!
00:10:51.000 Well, voting in California also, I assume you vote in California.
00:10:54.000 It wasn't going to.
00:10:55.000 It's a joke, yeah.
00:10:57.000 But people conveniently will just, or they'll say that you're a Trojan horse.
00:11:03.000 You're a pretend left-wing person who's really just pushing right-wing ideologies.
00:11:08.000 I'm like, well, which one?
00:11:09.000 Which right-wing ideology?
00:11:11.000 Is it gay marriage?
00:11:12.000 What is it?
00:11:14.000 I'm on the left on everything except maybe the Second Amendment.
00:11:17.000 Right.
00:11:18.000 I think the criticism that could be levied if one wanted to make it into a criticism would be if you engage with right-wing ideas that you don't agree with, right?
00:11:31.000 Like I take you at your face value that you don't agree with a lot of the stuff that your right-wing guests say.
00:11:36.000 One could make the argument.
00:11:38.000 That by not challenging those ideas, it's implicitly lending them more credibility than maybe you think they should have.
00:11:46.000 That's interesting because what I try to do with people, unless someone's saying something egregious, I try to let them talk.
00:12:01.000 Right.
00:12:03.000 Right.
00:12:03.000 Right.
00:12:06.000 Right.
00:12:13.000 They've actually used their thoughts and they've really calculated and thought, this is the position I take and this is why.
00:12:20.000 A lot of people don't.
00:12:22.000 A lot of the times when you challenge people on their positions, you find out they don't really know what the fuck they're talking about.
00:12:28.000 The best way to find that out is to let them talk.
00:12:31.000 Like Candace Owens on climate change.
00:12:33.000 Right.
00:12:34.000 There's the Socratic method of questioning.
00:12:37.000 Which is, why do you think that?
00:12:38.000 And how do you know that that's true?
00:12:40.000 Et cetera, et cetera.
00:12:41.000 And sort of some other questions that come from it.
00:12:44.000 Which I do as well.
00:12:46.000 I mean, I think, I don't know, to tie it to the Richard Spencer interview that I did, some of the criticism I received after was from people on the left.
00:12:54.000 I mean, the people on the right.
00:12:55.000 I would assume it was most of the people on the left.
00:12:57.000 For doing the interview.
00:12:58.000 Or for what I said in the interview.
00:13:00.000 For doing it.
00:13:00.000 Okay, yeah.
00:13:01.000 For doing the interview at all, the criticism was more from the left.
00:13:03.000 For what I said in the interview, the criticism was more from the right, from people who just agreed with Richard Spencer.
00:13:09.000 Like, what things did they agree with?
00:13:11.000 That it is inevitable that people with different ethnic or religious backgrounds simply will not be able to coexist together peacefully and we're better off trying to figure out how can we separate people.
00:13:30.000 God, that's sad.
00:13:46.000 Yeah, well, they have a series of, you know, decades of what they call scholarship supporting their view.
00:13:53.000 But for the context of my interview, I made it abundantly clear that I didn't agree with that stuff, right?
00:13:59.000 And my view is, and everybody can have a different view about how they do interviews, my view is, if I just allow what I consider to be disgusting views to be spread out, right?
00:14:10.000 You know, like a spray bottle, just spray them everywhere, not do anything else.
00:14:15.000 I can't say that I'm doing something that I think is valuable.
00:14:17.000 I don't feel like it's valuable.
00:14:19.000 So my approach is, are the ideas known enough to be worth refuting?
00:14:26.000 That's number one.
00:14:27.000 If it's some weird conspiracy theory that has not even any following whatsoever, I'm probably not going to choose to even entertain it because it's irrelevant in sort of all ways.
00:14:38.000 So my first question is, was Richard Spencer relevant at the time?
00:14:42.000 Alt-right was rising.
00:14:43.000 This guy was considered by many the sort of creator of the alt-right.
00:14:47.000 He was growing a following in the context of the Trump candidacy at the time or maybe administration.
00:14:52.000 I don't remember when exactly it was.
00:14:54.000 It was, I think, 2016. I'm trying to remember when it was.
00:14:58.000 I don't remember when I first heard his name.
00:15:00.000 Yeah.
00:15:01.000 How did he come to prominence?
00:15:04.000 I don't know the sequence, but I think he had an alt-right website that had articles of some kind.
00:15:11.000 And then that website became more known.
00:15:16.000 That fucking term is so...
00:15:18.000 Alt-right, alt-left...
00:15:21.000 The centrist.
00:15:22.000 Yeah.
00:15:23.000 All these different labels.
00:15:24.000 I'd rather talk about issues.
00:15:25.000 I agree with you there.
00:15:26.000 It's so clunky.
00:15:27.000 But so, you know, first thing was, I did want to interview him, but if I had felt that I wouldn't be prepared to make it abundantly clear that I don't agree with the guy, and I think his ideas are terrible, I wouldn't have done the interview.
00:15:38.000 Right, right, right.
00:15:39.000 The problem I had with the critiques from the left of me doing that, some who said, the last thing we need to be doing is giving this guy a voice, that's often how they say it, or a platform.
00:15:49.000 My response was, this guy's getting interviewed in lots of other places that aren't even challenging him.
00:15:54.000 I'm at least making an attempt here to get something in the record that there are arguments against these ideas, these are bad ideas, and I don't want to be part of the diffusion of just the ideas themselves.
00:16:06.000 I want to be pushing back.
00:16:07.000 I'm going to have to watch that now.
00:16:08.000 Now, when you did do that, what was his response?
00:16:11.000 During the interview?
00:16:12.000 What was his response to your pushback?
00:16:14.000 I mean, he had answers.
00:16:15.000 He was well prepared.
00:16:17.000 I don't know if there were unique or new arguments that I was making, but there was no argument.
00:16:25.000 To be made that I was letting him just parrot white nationalist talking points unopposed, which I just wouldn't feel good about that.
00:16:30.000 It's not how I do interviews.
00:16:32.000 Yeah.
00:16:32.000 And then the left was upset that you were giving him, air quotes, a platform.
00:16:36.000 A very small portion of the left.
00:16:37.000 I want to be super clear.
00:16:39.000 I mean, my audience is very left.
00:16:41.000 Almost everybody understood what I was doing.
00:16:44.000 Ten years ago, I was interviewing the Westboro Baptist Church.
00:16:47.000 Most people understood what I was doing.
00:16:49.000 They were more prominent at the time.
00:16:50.000 But there was this sliver of the left that just didn't want the conversation to take place.
00:16:55.000 And I always struggle with this because as you can see, I have no problem criticizing that sliver of the left.
00:17:01.000 My concern is getting like overly wrapped up in criticisms of the left that are only held by these like niche slices.
00:17:10.000 Yes.
00:17:11.000 And that's why I try to avoid going further than necessary into those criticisms.
00:17:18.000 Like I think there are more serious critiques of the left to be made beyond being anti-speech or wanting to limit speech or whatever.
00:17:27.000 That's a pretty big issue though.
00:17:29.000 Well, I don't actually agree that it exists on a significant portion of the left.
00:17:33.000 Like I think a bigger issue, for example, like if you said what is like a serious issue that the left needs to contend with right now?
00:17:38.000 I would say a more serious issue is if you look at the progressive accomplishments of the early 20th century, for example, like 1905 to 1925, and the New Deal accomplishments that the left had in the time of FDR – what was different, I think, then than the left now is that you didn't have to be completely in line with a specific set of policies or ideas.
00:18:06.000 And I worry that now there's a little bit of the left maybe having this idea that if you're not in line on all of these issues, whatever the checklist is, so to speak, you're not really worthy of being a participant in what is clearly a leftward move in sort of the average American's political orientation.
00:18:28.000 I don't want to see that prevent progress.
00:18:31.000 Yeah, that's the hard tribalism, right?
00:18:34.000 That's where the line gets drawn, you're with us or against us.
00:18:38.000 There's one way to think.
00:18:39.000 There is a lot of that.
00:18:41.000 I mean, I saw it with healthcare recently.
00:18:43.000 With healthcare, I don't think that you can make any serious case from the left that healthcare is fine and the for-profit, employer-connected system that we have is working.
00:18:53.000 I don't think there's any progressive case to be made for that.
00:18:56.000 Where people will differ...
00:18:59.000 Is what about Medicare for All versus some other system?
00:19:03.000 System that looks more like Canada's or the UK or Germany or whatever.
00:19:08.000 And I've already started to see – like when I say on my show, I'm kind of agnostic on this.
00:19:12.000 Like the system we have is a disaster.
00:19:14.000 We need a system that will get coverage to everybody – the numbers can be made to work any number of different ways.
00:19:20.000 We've looked at it.
00:19:21.000 But 80% of people on Medicare, I believe it is, have some additional coverage.
00:19:25.000 They either are still working part-time or full-time and get coverage that way or they're poor enough to be on Medicaid.
00:19:30.000 The point is Medicare for All doesn't solve every issue.
00:19:33.000 It's way better than what we have.
00:19:35.000 But here's like a dozen other possibilities looking at other countries.
00:19:39.000 There is a portion of the left that doesn't like that because I'm saying I'm against Medicare for All.
00:19:44.000 I'm not saying that.
00:19:45.000 What I'm saying is there are a number of different ways to improve upon the system we have, all of which sever this relationship between usually your employer and these for-profit insurance companies.
00:19:57.000 Why can't we be open to that?
00:19:58.000 I really don't understand private citizens that don't want easy access to quality healthcare for everybody.
00:20:06.000 That confuses the shit out of me.
00:20:08.000 Like, have you ever been hurt?
00:20:09.000 Have you ever been sick?
00:20:10.000 Yeah.
00:20:11.000 Have you ever been broke?
00:20:12.000 Right.
00:20:13.000 Do you want to be broke and have no access to healthcare?
00:20:16.000 No one does.
00:20:17.000 No one wants anybody they care about to not have access to healthcare.
00:20:20.000 Of all the things that we concentrate on in this country, there's two things that drive me fucking crazy that people just dismiss.
00:20:27.000 Education and healthcare.
00:20:28.000 The idea that you have to, like my buddy Greg, Greg Fitzsimmons, he's sending his kid off to school.
00:20:33.000 How much did he say that it was?
00:20:36.000 $65.
00:20:38.000 $65,000 a year for both of his kids, for each of his kids.
00:20:41.000 So, you know, he's got two kids.
00:20:44.000 That hurts my head to even think about spending $130,000 a year just on...
00:20:53.000 If you're a regular person with a regular job, how the fuck do you do that?
00:20:56.000 Impossible.
00:20:57.000 It's impossible.
00:20:58.000 It's so much fucking money.
00:20:59.000 And then that's not even paying for housing and food and transportation and books and everything else that you're going to need, too.
00:21:07.000 And to make it more difficult for young people to succeed is one of the worst ways to make a stronger country.
00:21:14.000 If you want a strong country, you want educated people that get to pursue their dreams.
00:21:19.000 And the idea that we are so willing to spend so much money on these costly regime change wars and flying troops overseas to these places that they don't want to go.
00:21:30.000 No one wants it to happen.
00:21:34.000 It's trillions of dollars, and people are fine with that.
00:21:36.000 But you talk to them about some sort of socialized education system, and people freak out and think you want to turn us into communists.
00:21:43.000 Well, I think what is really important to understand is that the facts you just laid out don't matter to people who see this as an issue of what do people deserve.
00:21:52.000 What do they deserve?
00:21:53.000 So if you say to a fiscal conservative, you know, if you consider the amount that the employee pays for premiums, plus the employer, plus your co-pays, plus co-insurance, you put it all together into some amount.
00:22:08.000 And you explain to them there's lots of great analyses that have been done which tell us that with roughly the same amount of money, maybe a small payroll tax in addition, with roughly that same amount, it all could be done with a single-payer system that covers everybody.
00:22:22.000 It's the same.
00:22:23.000 You're taking all of these individual risk pools where you have different for-profit insurers and then you have systems for people that don't have enough money, Medicaid.
00:22:31.000 You have systems for people that are over 65, Medicare.
00:22:33.000 You put it all together.
00:22:35.000 You spread the risk far wider.
00:22:37.000 The employer no longer has to pay their part of the premium.
00:22:39.000 The employee no longer pays part of their premium to the for-profit insurance company.
00:22:44.000 The numbers work.
00:22:45.000 They're still not going to say, you know what?
00:22:47.000 That sounds great.
00:22:48.000 It's actually pretty fiscally conservative.
00:22:50.000 Let's do it.
00:22:51.000 At some point, there is a portion of the right that just doesn't think people have earned healthcare.
00:22:57.000 They just haven't earned it.
00:22:58.000 Or education.
00:22:58.000 Or education.
00:22:59.000 That's right.
00:23:00.000 And it's very hard to change people's minds when that's their view.
00:23:05.000 I think it might be George Lakoff, who I believe calls it strict father morality, which is like, how would a really strict father treat a child who comes to them and says, hey, you know what?
00:23:18.000 I figured out a way that we can all have healthcare.
00:23:21.000 The strict father, even if the numbers make sense, would say, I'm going to teach you a lesson.
00:23:26.000 You haven't earned that healthcare, either because you don't work or you don't make enough money or you're on disability, whatever the case may be.
00:23:33.000 How do you convince someone to change their mind when that's their worldview?
00:23:36.000 Yeah, how do you when it's an ideologically based decision and you're on team R or team L, which group of ideas do you adopt?
00:23:47.000 Right.
00:23:50.000 The UK system sucks.
00:23:51.000 You talk to people that get healthcare over in the UK, it sucks.
00:23:54.000 But at least they have a system.
00:23:55.000 It's just not the same quality healthcare that you get in America.
00:23:58.000 Same with my friends in Canada.
00:23:59.000 I have friends in Canada that have come down here to get surgery because they find better doctors over here.
00:24:04.000 Rand Paul was going to Canada to get his operation.
00:24:07.000 Why was that?
00:24:08.000 Why did he do that?
00:24:09.000 The best place is in Canada.
00:24:10.000 The best place for hernias?
00:24:11.000 For that type of hernia, I believe, yeah.
00:24:14.000 I mean, here's the thing.
00:24:15.000 Even in saying the UK system and the Canadian system, neither one is that good.
00:24:20.000 Right.
00:24:20.000 Those two systems are totally different.
00:24:22.000 Right.
00:24:22.000 So I feel like… But they're both socialized medicine.
00:24:25.000 They are both… Well, yes, in some sense.
00:24:27.000 I mean, the Canadian system is administered at the province level.
00:24:30.000 So the province is sort of like the market.
00:24:33.000 Instead of having all these sub-markets attached to individual for-profit insurers at the provincial level… That's how it's organized.
00:24:40.000 The UK has the National Health Service, where they don't actually run the healthcare facilities, but they're the ones who are contracting them.
00:24:50.000 So it's sort of like the healthcare facility still is its own entity.
00:24:55.000 It's not that you're going and the government is the employer of the doctor, so to speak, but they're contracting with the healthcare facilities.
00:25:02.000 But the point I want to make is that there are criticisms of all of these systems, but they're different ones.
00:25:08.000 So when we say the British and Canadian systems aren't that good, let's figure out in what ways each is not that good because they're different ways, whether you're talking about health outcomes, early detection, cost per treatment, whatever.
00:25:23.000 You really have to drill down and figure out in what way are we saying it's not as good.
00:25:28.000 Yeah, what I'm saying is that there's no perfect system.
00:25:32.000 There's no perfect system, right.
00:25:34.000 But I believe that most of the best doctors, in terms of like North America at least, are in the United States.
00:25:41.000 I'm sure there's probably some very good doctors in Canada that do specialized medicine, but I think...
00:25:47.000 Really good doctors are incentivized by profit.
00:25:49.000 I really do.
00:25:50.000 I think there is some motivation for...
00:25:52.000 If you spend so much money for medical school and you bust your ass, you want to make a lot of money.
00:25:57.000 And some of the best doctors earn a really good living.
00:26:00.000 That may be.
00:26:01.000 I think...
00:26:02.000 Limiting their ability to earn that money won't incentivize people to be excellent.
00:26:08.000 So, a couple different things.
00:26:10.000 I mean, number one, to be clear, we're now starting to get into a little bit of broader economic philosophy.
00:26:14.000 I'm a capitalist.
00:26:15.000 I'm for social democracy, which is a mixed system.
00:26:18.000 That's a capitalist system that says we're going to invest tax revenue in a particular way.
00:26:24.000 To make sure that no one falls below a certain level.
00:26:27.000 So just to contextualize that my point of view is not from one of becoming a socialist country.
00:26:34.000 I think we share those views.
00:26:35.000 I think so.
00:26:36.000 A lot of doctors will say that even though sort of on paper in a socialized medicine system they might make less for a particular procedure, for example, or something like that, a lot of them are still in favor of those systems because it would drastically reduce their overhead.
00:26:56.000 So there's all of this apparatus that includes medical billing and coding both on the insurer end and at the healthcare provider end.
00:27:06.000 The hospital and the insurance company both are battling over what is it that was done?
00:27:10.000 What are the codes that apply here and what are our reimbursement rates?
00:27:15.000 Right.
00:27:15.000 There's fraud when it comes to that and that requires an apparatus for investigating and adjudicating that.
00:27:22.000 That adds more and more cost.
00:27:24.000 So I don't think it's as obvious that under those systems, at the end of the day, a doctor that owns a PCP group, for example, or an orthopedic clinic or whatever the case may be, I don't know that it's that clear that they end up taking home less money.
00:27:42.000 Hmm.
00:27:43.000 That's interesting.
00:27:44.000 I wonder if in practice that would play out that way.
00:27:48.000 Maybe I'm talking about just like high-end orthopedic surgeons that do...
00:27:52.000 Knee surgeries for athletes and things along those lines.
00:27:56.000 They often would be outside of whatever insurance apparatus we're talking about anyway.
00:28:00.000 A lot of those folks are often being paid out of pocket anyway.
00:28:03.000 Yes.
00:28:04.000 At least some.
00:28:05.000 At least some.
00:28:06.000 So for the average person's experience, I think it's less relevant.
00:28:09.000 Yeah.
00:28:10.000 Then there's also liability insurance, which is extremely expensive.
00:28:13.000 That's a giant issue with doctors.
00:28:15.000 It's a huge expense.
00:28:17.000 It is, yeah.
00:28:18.000 I mean, I think it is necessary.
00:28:20.000 There's a question as to whether it's organized in the best way.
00:28:24.000 I know less about that component than some of the other ones.
00:28:27.000 Yeah.
00:28:29.000 Education and healthcare, those are the two things that I think we can both agree we need to invest money on, and we need to figure out some way to make that more accessible to people.
00:28:38.000 Yes, yes.
00:28:39.000 And I don't understand people that don't think that.
00:28:41.000 And if that's what that is, the strict father mentality, The only thing that makes sense to me is that you don't want people who are kind of half-assing college that can just get in.
00:28:53.000 I think that that comes up a lot when you hear about so-called free college, which isn't free.
00:28:58.000 We're saying we're paying for it through taxation.
00:29:00.000 Really important to point that out.
00:29:02.000 It's just not for everybody.
00:29:04.000 And that's okay.
00:29:05.000 I mean, I think that that sometimes gets lost.
00:29:08.000 And yes, there are more and more jobs that require college degrees, even though you could make the case, maybe the college degree is not actually necessary, but it's a way to sort of thin the herd of applicants.
00:29:19.000 In order to just make hiring more practical.
00:29:23.000 But I do think that it's okay to say that college isn't for everybody.
00:29:28.000 But the same ideas that apply to so-called free college, meaning college paid for through taxation, could apply to trade school.
00:29:35.000 They could apply to retraining programs.
00:29:36.000 There's a whole bunch of other ways that it could be done.
00:29:38.000 Yeah, no, I agree.
00:29:41.000 The college is not for everyone thing is more true now than ever before, particularly with certain technology studies.
00:29:49.000 You're learning things during your four years at a university that are just going to be completely outdated by the time you graduate.
00:29:56.000 In what kind of program, for example?
00:29:58.000 Well, Jamie, what he did with audio engineering.
00:30:00.000 Oh, I see.
00:30:01.000 He went to school for audio engineering.
00:30:03.000 By the time he got out, it was all useless.
00:30:04.000 Yeah.
00:30:05.000 But that was not a four-year bachelor's program, right?
00:30:08.000 It is now.
00:30:08.000 Oh, it is now.
00:30:09.000 Okay.
00:30:09.000 Yeah.
00:30:10.000 When I went there, it wasn't available for that, but since then, they have made that available, and that's also in the time that YouTube has made education basically free for a lot of people.
00:30:19.000 Sure.
00:30:20.000 I mean I think with that, the issue is in my mind that when you consider the cost relative to the earnings potential, as you pointed out when you talk about $68,000 a year – or, I guess, where I taught at Boston College I think it was like $64,000,
00:30:39.000 something like that.
00:30:44.000 Yeah.
00:31:02.000 A free market capitalist, a social democrat like myself, and actual socialism.
00:31:07.000 Like what should happen with the gains that come from those technological advancements?
00:31:10.000 But as far as the education piece is concerned, it's completely unsustainable the way it is now.
00:31:16.000 I knew about you before this happened, but then I really kind of got on board with you when someone was trying to get you fired from Boston University.
00:31:26.000 Boston College.
00:31:27.000 Boston College, sorry.
00:31:28.000 And I tweeted it, and I was like, what is this?
00:31:31.000 This is craziness.
00:31:32.000 So it's a woman named Amy Siskind, who I don't know other than that incident.
00:31:38.000 And you had a disagreement about something.
00:31:40.000 It wasn't toxic.
00:31:42.000 It wasn't hostile.
00:31:43.000 I didn't think so.
00:31:44.000 Explain what you said and what she said that you disagree with.
00:31:48.000 So we may be able to even find the tweet, but she tweeted something, the gist being...
00:31:56.000 That she would not be supporting any candidate in 2020 who's white or male.
00:32:01.000 I think that that was the gist of it.
00:32:04.000 And I responded—I'm going from memory here—the gist was something like, isn't that the definition of racism?
00:32:10.000 You're sort of preemptively excluding someone from consideration on the basis of race, and in that case, gender, if it was white and male.
00:32:18.000 There it is right there.
00:32:19.000 Yeah, there it is.
00:32:20.000 I will not support white male candidates in the Dem primary.
00:32:24.000 Unless you slept through midterms, women were our most successful candidate.
00:32:27.000 Biggest Dem vote-getters in history.
00:32:29.000 Obama, 08. Hillary, 16. White male is not where our party is at, and it is our least safe option in 2020. Right.
00:32:38.000 So I said, isn't there something not progressive about preemptively dismissing a candidate based on their race and gender?
00:32:44.000 I feel like there's a word to describe that.
00:32:46.000 As a progressive, I won't be jumping on board.
00:32:48.000 Yeah, so it exploded.
00:32:50.000 Yeah, well, you basically didn't even say it's racist.
00:32:52.000 Right.
00:32:52.000 You said there's a word to describe that.
00:32:54.000 Yeah.
00:32:54.000 And that's a very polite way of disagreeing with someone.
00:32:56.000 I thought it was polite.
00:32:57.000 And she tried to get you fired.
00:32:59.000 Yeah, she contacted – as far as I know – so, okay, I don't – I'm going by what she said.
00:33:05.000 She said she contacted Boston College and told them not to allow me to teach there.
00:33:10.000 That's insane.
00:33:11.000 And Boston College, since I'm just an adjunct, I'm not on staff when I'm not teaching.
00:33:17.000 Like, during the three months of the semester, I'm employed there, and the other nine months I'm not.
00:33:22.000 So I think Boston College said he's not currently employed here.
00:33:25.000 And I think that that's basically as far as it went.
00:33:28.000 But I did talk to some other faculty there who were aware of the thing that was going on.
00:33:35.000 That's a crazy thing to do.
00:33:38.000 I think so.
00:33:39.000 It's a nasty, mean thing to do, too.
00:33:42.000 Someone can't disagree with you, and it's a very good point.
00:33:45.000 I think she probably got upset because you made a very good point and tweets started coming her way.
00:33:50.000 And a lot of people, they read those fucking comments, and people get toxic in those comments.
00:33:55.000 Random, strange people that you don't know, and then you're forced to look at their opinions and their criticisms and their insults.
00:34:03.000 That incident started me down the path.
00:34:06.000 Good for you.
00:34:26.000 It's extraordinarily toxic and horrible negative stuff that is only a distraction to what I'm trying to do.
00:34:32.000 And most of it probably isn't, but I had Naval on yesterday, Naval Ravikant.
00:34:36.000 Yeah.
00:34:37.000 And one of the things that he brought up that's so huge, it's so true, is that you can have 10 positive things, but that one negative will outweigh the 10 positives.
00:34:47.000 In your mind, you mean?
00:34:47.000 In your mind.
00:34:48.000 Oh, absolutely.
00:34:48.000 Yeah.
00:34:48.000 Absolutely.
00:34:49.000 Especially if you're a person who's self-critical or self-objective.
00:34:54.000 You're analyzing your behavior.
00:34:55.000 Was that good?
00:34:56.000 Was that bad?
00:34:57.000 And then you read that one bad comment.
00:34:58.000 Fuck, are they right?
00:34:59.000 You don't read all the people that say you're great.
00:35:01.000 Oh, brilliant.
00:35:02.000 Loved it.
00:35:02.000 Fuck you, loser.
00:35:03.000 Oh, I'm a loser?
00:35:05.000 Absolutely.
00:35:05.000 I mean, when I announced I was going to be doing your show, if you look at what the comments were, almost all: this is awesome, great left-wing voice talking to Joe Rogan.
00:35:15.000 Go get him, David.
00:35:17.000 This is such a great opportunity.
00:35:18.000 Can't wait to watch you faceplant.
00:35:20.000 Oh, Jesus Christ.
00:35:21.000 And that's the one where I'm like, man, are they right?
00:35:24.000 Am I going to face plant?
00:35:26.000 Fuck!
00:35:27.000 So, I don't know.
00:35:27.000 I'm just trying to limit the amount that I'm on social media.
00:35:30.000 One thing I am doing, though, because, I mean, my show is in part as successful as it is because of social media.
00:35:35.000 So, I can't ignore that.
00:35:37.000 Right, but you don't have to engage.
00:35:38.000 I don't have to engage, and I also can just say, I will check our networks in the morning, then I'll spend the whole day, I'll do my show, I'll do what I need to do, and then before I sign off for the evening, I'll check it.
00:35:50.000 Oh, that's a terrible idea.
00:35:51.000 Because what if you read the worst shit right before you eat dinner or go to bed?
00:35:54.000 Well, no.
00:35:54.000 It'll be like 5 p.m.
00:35:56.000 So you'll have five hours to recover?
00:35:57.000 I'll have five hours to cool off, yeah.
00:35:59.000 And it's way better.
00:36:00.000 And weekends, I'm almost, I mean, people are like, David, you're still tweeting on the weekends.
00:36:03.000 A lot of those are like pre-scheduled tweets where I'll just sit and schedule some stuff.
00:36:07.000 And I am trying to stay off it, and it's been really great.
00:36:10.000 I mean, it's been a fantastic experience.
00:36:11.000 We looked at our phones yesterday.
00:36:13.000 We did a Sober October podcast with my four friends.
00:36:16.000 We looked at our phones to look at phone usage, and I use my phone four hours a day.
00:36:21.000 I'm on my phone four hours a day.
00:36:22.000 I'm like, fuck, that's a lot.
00:36:24.000 What app did you use to measure it?
00:36:26.000 It's something on your iPhone.
00:36:27.000 Oh, okay.
00:36:28.000 Yeah, you have an Android.
00:36:29.000 I'm sure they have a similar thing.
00:36:30.000 Yeah, yeah.
00:36:31.000 Yeah, four hours of screen time.
00:36:33.000 I'm like, ooh.
00:36:34.000 That's not good.
00:36:35.000 They're adding that to the computer, too, so it's going to combine.
00:36:37.000 You'll know how much you're looking at all screens soon.
00:36:40.000 Yeah, but what if you're writing?
00:36:41.000 Well...
00:36:42.000 Is that going to count?
00:36:43.000 Yeah, sure.
00:36:43.000 I mean, you're just staring at the screen.
00:36:45.000 That's how it works, bitch.
00:36:46.000 Hey, that's the argument you guys were making for the phone yesterday.
00:36:49.000 That's Bert's argument.
00:36:49.000 Yeah, well, Bert's argument's not good because he doesn't even write.
00:36:52.000 That's ridiculous.
00:36:53.000 One thing I did that actually is useful is I used to have my social apps on the home screen.
00:36:59.000 And Cal Newport and some others have said, you've got to get rid of those.
00:37:03.000 He actually advocates getting rid of the apps altogether so that you have to go on a computer and choose to go to Facebook.com or Twitter.
00:37:11.000 I haven't gotten there yet, but even just removing them from the home screen makes me significantly less likely to even pull them up.
00:37:18.000 It's two clicks to swipe up and scroll over to the app, but even just getting them off the home screen keeps me off of them significantly.
00:37:26.000 That's smart.
00:37:27.000 That makes sense.
00:37:28.000 Yeah, I need a certain amount of access to those things with my business.
00:37:34.000 Scheduling shows and things along those lines.
00:37:37.000 But yeah, it's not good for you.
00:37:39.000 I think one of the biggest realizations is that people don't really miss you that much.
00:37:46.000 They don't hear from you for a couple days.
00:37:48.000 That's one of the things where the idea of needing constant engagement comes from sort of like a slightly narcissistic point of view where like people are going to notice if I don't tweet from Thursday night until Monday morning or do anything like that.
00:38:01.000 And they really don't.
00:38:02.000 They don't notice.
00:38:03.000 There's a lot of other people to pay attention to.
00:38:05.000 Yes, there are.
00:38:06.000 There are a few people that tweet at you that are kind of crazy and that want to hear from you all day long.
00:38:11.000 Yeah.
00:38:13.000 They'll get used to it.
00:38:14.000 They'll get used to you vanishing.
00:38:17.000 And I don't know, I mean, Cal Newport, have you had him on?
00:38:19.000 No, I haven't.
00:38:20.000 He wrote Deep Work, and then more recently he wrote Digital Minimalism.
00:38:24.000 And he goes into detail about just the effect of...
00:38:28.000 I was going to write it down on my phone, but it feels sacrilegious to put that as a note.
00:38:32.000 To take it out right now.
00:38:33.000 Yeah, I mean, he goes into detail about this stuff and just about, you know, we need more uninterrupted periods of concentration.
00:38:40.000 What is the book called again?
00:38:41.000 Deep Work.
00:38:42.000 And then digital minimalism.
00:38:46.000 And they're both...
00:38:47.000 I interviewed them recently.
00:38:49.000 Really just solid, very solid stuff.
00:38:51.000 Awesome.
00:38:51.000 Yeah.
00:38:53.000 What were we just about to get into?
00:38:55.000 Amy Siskind.
00:38:56.000 This woman, Amy Siskind.
00:38:58.000 Did you reach out to her when she did that?
00:39:00.000 Privately?
00:39:00.000 Yeah.
00:39:01.000 No.
00:39:01.000 No?
00:39:02.000 Did you reach out publicly?
00:39:04.000 Well...
00:39:04.000 You did publicly declare that she tried to get you fired, right?
00:39:07.000 Yes, I did.
00:39:07.000 Did she respond?
00:39:09.000 I don't know, because she blocked me.
00:39:10.000 Oh, goddammit!
00:39:12.000 She blocked you over that?
00:39:13.000 Mm-hmm.
00:39:15.000 Jesus!
00:39:17.000 That is so sensitive.
00:39:20.000 What is Twitter for?
00:39:22.000 Is it just to fall in line?
00:39:25.000 Is it just to agree with everything someone says with no questioning whatsoever?
00:39:29.000 What's extra interesting about it is she blocked me on Twitter, but then I treat my Facebook profile basically as public, so I post stuff on there.
00:39:39.000 It's the same whether you're friends with me or not.
00:39:42.000 And I had posted something totally innocuous about I was at a restaurant or drinking an espresso on it.
00:39:48.000 I don't even know what it was.
00:39:49.000 She showed up there and commented that she had called Boston College and told them not to hire me or to fire me or whatever.
00:39:58.000 On a post about you having an espresso?
00:40:00.000 It was just a personal post, right.
00:40:02.000 But the point is, she determined that the exchange was worthy of blocking me on Twitter.
00:40:07.000 But then she came to my personal Facebook page and said, I'm calling Boston College and telling them to fire you.
00:40:16.000 There's a word for that.
00:40:22.000 God, there's a gang, but there's a big one.
00:40:25.000 There's a four-letter one.
00:40:26.000 I just don't understand why someone would want to do that to someone.
00:40:30.000 Why can't you disagree?
00:40:31.000 And she's just upset that you pointed out a glaring problem with what she was saying.
00:40:37.000 Yeah, apparently.
00:40:38.000 And you know, the thing is, the way I operate, I don't think even necessarily that she's a bad person.
00:40:45.000 I just assume that she has some emotional thing going on.
00:40:49.000 She could have had a terrible day.
00:40:50.000 As far as I know, someone near and dear to her died that day.
00:40:53.000 If someone near and dear to me died, I wouldn't go to your Facebook.
00:40:57.000 You wouldn't necessarily be on Twitter.
00:40:57.000 I wouldn't stalk your Facebook or post about you drinking an espresso.
00:41:01.000 Fair, fair.
00:41:02.000 Tell people I'd try to get you fired.
00:41:03.000 But my approach is, I really do assume most people are pretty good people, and even when we have disagreements, I tend to give the benefit of the doubt that if we could only talk the way we're doing, we could figure out 95% of the disagreement.
00:41:18.000 I agree with you.
00:41:19.000 Maybe not all of it.
00:41:20.000 Right.
00:41:20.000 But most of it.
00:41:21.000 So I don't begrudge her.
00:41:23.000 I mean, yeah, I don't...
00:41:24.000 She behaved in a way I wouldn't behave, but who knows what she had going on, you know?
00:41:27.000 I mean, it's...
00:41:28.000 Yeah.
00:41:29.000 Well, it didn't get you fired, so it's not that bad.
00:41:31.000 But if it did, that would have been horrible.
00:41:33.000 It would have been a different situation.
00:41:35.000 Yeah.
00:41:35.000 Would have been probably good publicity.
00:41:37.000 Yeah.
00:41:37.000 Probably would have helped you.
00:41:38.000 I think so.
00:41:39.000 You got excited about that.
00:41:40.000 Yeah, yeah, yeah.
00:41:40.000 You got a little twinkle in your eye.
00:41:42.000 The reason I'm thinking back, actually, to a conversation I had at the time where someone said to me, If you do get fired, it's the best possible thing that'll happen.
00:41:51.000 It would just be fantastic.
00:41:53.000 And it didn't.
00:41:54.000 Because I wasn't actually employed there at the time.
00:41:56.000 That's the irony of it.
00:41:57.000 Well, this is the thing, the falling in line, the no room for deviation from the ideology.
00:42:04.000 Sure.
00:42:04.000 This is a giant issue that I have with both parties.
00:42:08.000 And I think it's one of the reasons why people are in these parties to begin with.
00:42:12.000 I don't necessarily think that people have clearly thought out every single aspect of whatever party they align with.
00:42:20.000 I think they fall in line and they adopt a predetermined pattern of behavior that seems to be attractive at the time and then they fall in line with whatever that party is saying.
00:42:29.000 I think that is a giant percentage of people.
00:42:31.000 When someone deviates from that like you did, someone who is also clearly a progressive and clearly a left-wing person and you're criticizing something and very Right.
00:42:46.000 Right.
00:42:59.000 Are you responsible for the reaction to what you post?
00:43:02.000 Because if you look at what Steven Crowder said, for people who don't know the story, Steven Crowder got into it with this guy who is a writer for Vox, who is gay.
00:43:16.000 His Twitter handle is GayWonk.
00:43:18.000 Carlos Maza.
00:43:19.000 Yeah, so it's not that he's hiding that he's gay.
00:43:22.000 He talks about it all the time.
00:43:24.000 He's kind of effeminate and Steven Crowder mocked that and he mocked that in these videos where he was criticizing Carlos's position on Antifa, specifically what I saw.
00:43:39.000 And in doing that, he called him this queer Mexican.
00:43:45.000 He's doing it in a ribbing way.
00:43:48.000 He's doing it in a joking way.
00:43:49.000 And then Carlos Maza posts all these horrible tweets that came his way, and apparently he got doxxed, so people got his phone number, and they were saying, debate Steven Crowder.
00:44:00.000 He was getting all these text messages in and all this hateful stuff that was coming his way.
00:44:05.000 So the question is, who's responsible for that hateful stuff?
00:44:10.000 If Steven Crowder calls him queer, what is queer?
00:44:15.000 Is LBGTQ? What do we do there?
00:44:18.000 What do we do if the Q is in...
00:44:21.000 Is it okay to call someone gay who identifies as gay if he calls him the gay little Mexican?
00:44:27.000 Is that bad?
00:44:29.000 Like, what is...
00:44:30.000 How bad is that?
00:44:31.000 Like, what is that?
00:44:32.000 You know what I'm saying?
00:44:32.000 But do you feel what I'm saying here?
00:44:34.000 I know where you're getting at.
00:44:35.000 Let's zoom out a little bit.
00:44:36.000 Right.
00:44:36.000 And then we'll get into this.
00:44:38.000 Man, where do you even start with this?
00:44:40.000 Because there's a lot to unpack here.
00:44:41.000 Right.
00:44:43.000 We'll analyze the specifics in a second, maybe.
00:44:46.000 But first, if you look at the policy, the terms of service of YouTube, there's a Verge article from yesterday, before, a few days ago, earlier this week, before YouTube had made the decision to demonetize Steven Crowder.
00:44:57.000 Well, they made the decision to not act.
00:44:59.000 And just say that it didn't violate the terms of services.
00:45:02.000 And then today, as I got in here, Jamie informed me that they made a decision to demontize.
00:45:06.000 That's right.
00:45:07.000 So in the article where they made the decision not to act, they actually put what YouTube's terms of service are with regard to bullying and harassment.
00:45:17.000 My reading of it, and we could go through them, if we could pull them up, we could go through it line by line if we wanted.
00:45:22.000 My reading was that that definitely did break the terms and conditions.
00:45:26.000 That was my view as I looked at what it was that was done by Steven Crowder and what the terms of service are.
00:45:32.000 Just matching it up, not looking at the comments from either person.
00:45:35.000 What was it specifically?
00:45:36.000 It was specifically targeting an individual on the basis of sexual orientation.
00:45:43.000 But he wasn't targeting them on the basis of it.
00:45:45.000 He was mentioning that with his bad ideas.
00:45:48.000 He was targeting his bad ideas in regards to Antifa.
00:45:52.000 That he was dismissing Antifa.
00:45:53.000 But if you look at Crowder's video, and I can't believe I spent so much time doing this, but I spent like a whole hour on this.
00:45:59.000 Two days ago.
00:46:01.000 He was talking about how Carlos just dismisses Antifa as being not that big a deal, and that there's bias in the media whenever there's anything negative that happens, but if you look at the overall picture.
00:46:12.000 And then Crowder goes on to talk about all the assaults, all the murders, that there were sexual assaults, there was rapes, there was all these things that happened with Antifa.
00:46:20.000 He was talking about all these different people that got maced in the face, all these people that got hurt.
00:46:25.000 And he's...
00:46:27.000 Highlighting, this is not something to easily dismiss, and that the FBI had labeled Antifa a terrorist organization.
00:46:33.000 So far, it's just politics.
00:46:35.000 It's just, what does he think, what do I think?
00:46:37.000 So far, it's just that part of it, and along the way, he's like, yeah, but the queer little Latino says this.
00:46:43.000 And when he does that...
00:46:45.000 That's where it's like, okay, what is he doing?
00:46:47.000 He's kind of mocking him, right, and he's mocking him by saying he's queer, but he says he's queer, or he says he's gay.
00:46:53.000 Yeah, but that's like saying, I mean, listen, just because the N-word is in rap songs doesn't mean that it's fine to go – Right, but the N-word is not in, like – it's not like the N is in LGBTQ. You know what I'm saying?
00:47:06.000 It's not like a part of their – I think the principle though is you're suggesting that because a certain word is sometimes used self-referentially by members of a group, that any use of it from the outside is – By definition,
00:47:23.000 not problematic.
00:47:23.000 And I'm just saying it's more complicated and you've got to look at the specifics.
00:47:26.000 It's certainly more complicated.
00:47:27.000 You do have to look at specifics.
00:47:28.000 I'm going from memory, but wasn't Steven Crowder also wearing a shirt that said fags with the A with an asterisk?
00:47:35.000 It said figs.
00:47:36.000 Wink, wink, nudge, nudge.
00:47:37.000 It said socialism is for figs.
00:47:40.000 While he's calling a gay guy.
00:47:42.000 The A is a fig instead of an I. As he's calling a gay guy a queer Mexican.
00:47:48.000 Yes.
00:47:49.000 I mean, in total, it's not crazy.
00:47:51.000 No, there's certainly an argument that...
00:47:56.000 I don't necessarily think the t-shirt is for Carlos Maza.
00:47:59.000 I think that's a t-shirt that he just has because he thinks it's funny.
00:48:02.000 And because Che Guevara, who's on the shirt, is...
00:48:06.000 That is one of the weirdest things that people worship that guy.
00:48:08.000 He was a horrific human being, a mass murderer, a terrible sociopath, a psychopath – and people worship him because he looks good.
00:48:16.000 Involved in the Cuban Revolution.
00:48:17.000 Looks good with a beret on.
00:48:19.000 He became, for a long time, I mean, it's kind of died off, but he became the woke...
00:48:26.000 Poster boy.
00:48:27.000 I'm from Argentina.
00:48:28.000 I know...
00:48:29.000 Are you really?
00:48:29.000 Yeah.
00:48:30.000 Were you born there?
00:48:31.000 Yeah.
00:48:31.000 No kidding.
00:48:32.000 Yeah, yeah.
00:48:33.000 So, I mean, listen, here's the thing...
00:48:34.000 Welcome to my country, bro.
00:48:35.000 I made it.
00:48:37.000 So listen, I think that...
00:48:38.000 I do appreciate what you're saying, and I agree with you to a certain extent.
00:48:42.000 I believe that when YouTube yesterday said, we looked at the content in total, and we don't think it violates our terms and conditions, I disagreed with them.
00:48:51.000 I thought it very clearly violated their terms and conditions.
00:48:54.000 Where I am thinking about it now is the application of those terms and conditions violations because a similar thing happened with Alex Jones as well, which was there's lots of way smaller players that are violating these same terms and conditions,
00:49:11.000 but nobody knows about them.
00:49:13.000 YouTube doesn't know about them.
00:49:14.000 They don't get any attention because they have no audience.
00:49:16.000 So I think there's the question of the application of these terms and conditions in a way that's sort of fair and is not ultimately going by the public blowback or reaction to situations, because that's how Adpocalypse 1.0 happened.
00:49:33.000 I think it was a Coke ad appeared on an obviously racist video on a channel with like 800 or 1,000 subscribers.
00:49:42.000 The Wall Street Journal, I think it was, did an article saying, look at these screenshots of these advertisers on these crazy racist videos.
00:49:50.000 That led to blowback because YouTube didn't want to lose money.
00:49:54.000 And ultimately, that's what this is about.
00:49:56.000 I know that there are people who say YouTube has an inherently left-wing bias.
00:50:00.000 Others say YouTube has a right-wing, whatever.
00:50:04.000 YouTube's bias is towards corporatism and profit.
00:50:07.000 That's fundamentally what it is.
00:50:09.000 But as a company, they have a left-wing bias.
00:50:11.000 I don't know that.
00:50:13.000 In what sense?
00:50:15.000 Well, in the sense that the woman who is the CEO of YouTube has talked about it pretty openly.
00:50:21.000 Like, the fact that she doesn't...
00:50:22.000 What was it that she had gotten into?
00:50:25.000 Oh, well, first of all, it was the James Damore thing.
00:50:28.000 You know, she was talking about the Google memo, and she was talking about how it was incredibly damaging, damaging stereotypes against women, which it just wasn't.
00:50:40.000 It's not accurate.
00:50:41.000 Is Home Depot a right-wing company?
00:50:44.000 Because the CEO supports Trump?
00:50:47.000 That's a good question.
00:50:48.000 I'm basing it on that they're a part of Facebook, and Facebook is pretty clearly left-wing.
00:50:53.000 Who's a part of Facebook?
00:50:54.000 Google.
00:50:54.000 Oh, sorry, Google.
00:50:55.000 They're a part of Google.
00:50:56.000 I meant Google.
00:50:56.000 Google is a very, very left-wing group, and it's all Silicon Valley, which is almost entirely left-wing biased.
00:51:04.000 So I think we have to distinguish between the personal political biases of Silicon Valley entrepreneurs and the broader place that Google has in the sort of corporate sphere.
00:51:16.000 Google is part of the group of huge multinational corporations that lobbies for particular tax policy to avoid paying taxes legally.
00:51:29.000 That is not a particularly left-wing thing to do.
00:51:32.000 Google is part of the large tech companies that, in order to avoid serious regulation of their businesses, have come up with this idea of regulating themselves, which I know is a topic of self-regulation that's come up before on your program in a variety of ways.
00:51:50.000 So those are not left-wing things, and if you want to make the case that, as a company, it has left-wing politics in the outward-facing world.
00:52:01.000 You have to have something more than just a lot of their engineers live in Palo Alto and are hipsters who go to coffee shops.
00:52:07.000 What do you think?
00:52:09.000 I think that in terms of the place that it occupies within the economic system we have, they are not very different from all of the large corporations that are pushing against regulation, pushing for ways to avoid taxes,
00:52:27.000 period.
00:52:27.000 So in terms of economic decisions?
00:52:29.000 Yeah, I mean, listen, if we want to talk about how the personal politics of the employees translate to policy, we can do that, but we need to be able to make some specific claims about how it does.
00:52:43.000 What I'm saying is, we know the way in which the structure that Google is a part of leads to it advocating for things that are center-right, corporatist, capitalist.
00:52:55.000 The status quo of – Well, that's what's interesting about this Crowder thing is that ultimately the decision was to allow him to have his freedom to post videos on there, but the punitive aspect of it is they're going to reduce his ability or eliminate his ability to make money from it.
00:53:16.000 Well, I should say reduce, right?
00:53:17.000 Because he could...
00:53:18.000 Couldn't he put videos, put ads up in his video?
00:53:22.000 Yeah.
00:53:22.000 His own ads?
00:53:23.000 Sure.
00:53:24.000 So one thing that I do is we kind of split off the ad sales for my show into an ad agency.
00:53:31.000 And we're doing ad sales not just for my show but for other shows as well.
00:53:35.000 And those include ad placements that are not like the pre-roll ads on YouTube.
00:53:40.000 It's the host is actually talking about a product or whatever.
00:53:43.000 It's a live read sort of thing.
00:53:45.000 Unbox Therapy does a lot of that.
00:53:47.000 Yeah, that's where I first saw those.
00:53:49.000 He does some pretty extensive ones.
00:53:51.000 Okay, so of course you can do that.
00:53:53.000 I mean, yeah, there's nothing...
00:53:54.000 So he can do that.
00:53:55.000 But he can't just collect revenue like I assume he's been doing before.
00:54:00.000 He has a significant number of followers.
00:54:02.000 I think his YouTube subscribers are more than three and a half million.
00:54:08.000 It's very high.
00:54:09.000 Yeah, more than me.
00:54:11.000 And they've just eliminated his income that comes out of YouTube.
00:54:17.000 And this was their decision based on his way of talking about Carlos Maza.
00:54:24.000 That's what happened.
00:54:26.000 Yeah.
00:54:26.000 I mean, so what are the concerns to me?
00:54:29.000 It's not that he didn't violate terms and conditions.
00:54:32.000 Like I said, I think he pretty clearly did.
00:54:33.000 The concerns to me are, is YouTube only going to even look into these circumstances or instances when there is a public outcry?
00:54:46.000 The answer is probably yes, because why would they look into stuff nobody's paying attention to?
00:54:50.000 Well, it seems like they changed their decision based on public outcry, based on Carlos Maza's reaction to their initial decision.
00:54:57.000 I happen to think their initial decision was the wrong one, but I have a sort of broader concern here, which is about the fairness of the application, and also the distinguishing between content that is promoting whatever falls under any of our definitions of hateful content and those who are fighting against it.
00:55:17.000 So is it because he mentioned his sexual orientation, and that he called him a lispy little queer or whatever he called him, or a queer Mexican? If he just called him a fucking idiot and he received the exact same amount of hate, would you still think that that was a good move?
00:55:36.000 No.
00:55:36.000 I mean, I think that it would not fall under what they are now claiming is the justification for the demonetization.
00:55:44.000 It would be different because – It would have the same result.
00:55:47.000 The only difference would be they wouldn't be attacking his sexual orientation specifically because of Crowder.
00:55:52.000 That's the only difference.
00:55:55.000 It's policing speech.
00:55:57.000 You know, it really is.
00:55:58.000 So, who gets to decide, if not the private businesses, what their rules are?
00:56:05.000 That's where the real question comes up, right?
00:56:07.000 Tulsi Gabbard believes that it's a First Amendment issue, and she believes that everyone should have the freedom of expression.
00:56:12.000 And that as long as you're not doing anything illegal, you're not putting anyone in danger by giving up their address or doxing them or something along those lines or making overt physical threats, that you should be allowed to do that because that's what the freedom of speech is all about.
00:56:26.000 And freedom of speech, when you eliminate social media in this country, your freedom is basically just yelling out in public.
00:56:36.000 I mean, we're in this weird place as a culture.
00:56:38.000 It's a weird place.
00:56:39.000 As a culture, it's unprecedented really in terms of the waters we're navigating right now.
00:56:44.000 There's a couple different things to – so I like the principle.
00:56:48.000 Like my principle is we do almost no moderation on any of our platforms that my program is on.
00:56:54.000 My only thing that I tell my team is if you see something that really seems to be illegal, it's calling for violence, whatever.
00:57:04.000 We have a very, very high bar before we will remove anything.
00:57:08.000 And quite frankly, we're just too busy.
00:57:09.000 What do you mean by that?
00:57:10.000 I'm confused.
00:57:12.000 Not your videos.
00:57:13.000 Whose videos?
00:57:14.000 Yeah, so if we find out that on our videos someone is posting endless comments, for example, my personal view is if it's not illegal, I just let it all be there and sit.
00:57:26.000 That's my personal view, and that's a great principle to have.
00:57:28.000 Well, we don't touch them.
00:57:30.000 We leave them alone, even though we get accused of it.
00:57:32.000 But the question is, YouTube at one point in time had thrown out there that they were going to make people responsible for the things that were in their comments.
00:57:40.000 I vaguely remember that, but it didn't ultimately happen.
00:57:42.000 I think they backed out of it very quickly when they realized that places like yours, which like your average video gets how many thousands of comments?
00:57:49.000 A lot and many of them anti-Semitic.
00:57:51.000 And how would you even be able to look at all those?
00:57:54.000 I mean, you would have to be 24-7 monitoring them because you've also got people that are watching your videos from overseas at all times in the night.
00:58:00.000 Yep.
00:58:01.000 So I think that the principle of only illegal content will be removed is great.
00:58:06.000 That's my personal principle.
00:58:10.000 However, I think that there is no serious case to be made that a private company can't say, these are our terms of service.
00:58:20.000 And if you want to, I mean, it's sort of almost a conservative principle, right?
00:58:23.000 The idea that unless illegal things are going on, we are not going to tell a business how it is that it should be run.
00:58:31.000 And that's where I think a lot of right-wingers start to stumble on this issue because they're calling for a very invasive form of government regulation.
00:58:40.000 They're calling for the government to step in and even break up these organizations because they've gotten too large.
00:58:46.000 But you're hearing that from the left as well.
00:58:47.000 Yeah, well, I think there's a difference though between Elizabeth Warren saying we should separate the social platform, Facebook.
00:58:55.000 From the ad sales, revenue-generating piece of it.
00:59:00.000 That's one thing that falls under antitrust.
00:59:02.000 That's different than saying the government should come in and it should tell anybody who runs a social network that you can't do anything unless the content is illegal.
00:59:13.000 Because there are financial considerations, right?
00:59:16.000 I mean, there's lots of content that would not be illegal, but it would make a platform, a video platform like YouTube, not financially viable.
00:59:25.000 Right.
00:59:49.000 Then, okay, maybe you could pass a law that changes how they would be regulated.
00:59:54.000 But that's typically the type of stuff the right is against because it is more regulation.
00:59:59.000 It is more regulation, but it's regulation to keep a private company from regulating against free speech.
01:00:05.000 Do you see what I'm saying?
01:00:05.000 It's a sneaky kind of regulation.
01:00:07.000 It's a regulation that's enforcing the First Amendment and the people's ability to freely express themselves.
01:00:13.000 If we're admitting, or if we're agreeing, that we are entering into this new world where this is – that's my position, is that it is a town square.
01:00:22.000 I feel like everybody should be able to communicate.
01:00:26.000 The really unfortunate, unsavory aspect of it is when someone gets harassed, like Carlos Maza was because of this, where people are sending him all these homophobic tweets and he's getting text messages and all this shit.
01:00:39.000 That's the unsavory and unfortunate aspect of it.
01:00:43.000 How do you stop that?
01:00:45.000 I don't know how you stop it.
01:00:46.000 I don't know exactly how you stop it, but I think it would be useful.
01:00:49.000 I mean, one thing is, when does a platform get big enough, in your mind, that it would qualify for this town square designation?
01:00:56.000 Well, for sure, YouTube.
01:00:58.000 Let's talk about that one, because that's the one we're on.
01:01:00.000 I mean, fucking goddammit, it's huge.
01:01:02.000 It's gigantic.
01:01:03.000 So, are there other types of businesses through which communication happens that you think should be regulated in the same way?
01:01:11.000 That's not really clear, so I'll give an example.
01:01:13.000 Okay.
01:01:14.000 If you start regularly sending people via UPS similar things to some of the content that exists on YouTube, and UPS says, we're getting reports that you're sending people harassing stuff,
01:01:30.000 we don't want you as a customer anymore.
01:01:32.000 Here's the question, though.
01:01:33.000 Isn't there a difference between someone sending something to a physical address and someone sending something, let's say, to you, when your social media apps are on the third page of your phone and you have to swipe all the way over to get them and open them up, and you have to go find them if you want to read them?
01:01:47.000 You don't necessarily have to read them.
01:01:49.000 There's a difference in a practical sense, but I guess the question is, would you similarly want the government to enforce for telephone companies?
01:01:58.000 If you are getting harassing texts and you report it and report it.
01:02:02.000 That's a different thing, I think.
01:02:04.000 I think when it's coming to your phone and the phone is ringing, I think that's another step.
01:02:09.000 It's another step towards invasive.
01:02:12.000 It's a big gray area.
01:02:13.000 Is the phone ringing?
01:02:15.000 Is it a phone call?
01:02:15.000 Is it a text?
01:02:16.000 Is it a WhatsApp message?
01:02:18.000 You don't have to read that text.
01:02:18.000 No, I feel you.
01:02:20.000 I guess where I hesitate, and again, speaking as someone from the left who believes regulation of businesses is an important thing, I would want to be really sure about how exactly it is that the government would step in and mandate essentially that their view has to be listened to over the terms of service that a private company would wish to have.
01:02:44.000 Yes.
01:02:45.000 I feel like when you give people a gun, they start looking for targets.
01:02:49.000 And that is a very common thing.
01:02:50.000 If you give people the ability to censor, and if you give people the ability to censor based on their political ideology or based on what they feel is offensive where other people don't, it's a slippery slope.
01:03:00.000 And I think that that can lead to all sorts...
01:03:04.000 Look, that woman...
01:03:06.000 What is her name again?
01:03:08.000 The one who tried to get you fired?
01:03:09.000 Amy Siskind.
01:03:10.000 Imagine her being in charge of a social media platform.
01:03:13.000 She tried to get you fired from Boston College for something that was incredibly polite.
01:03:16.000 Right.
01:03:17.000 That is what I'm talking about, is that very action, that very same type of thinking that she tried to impose on you.
01:03:24.000 That's what I'm worried about.
01:03:26.000 And I'm worried about people that are really strictly trying to promote their ideologies and what they think is okay and not okay.
01:03:33.000 And it's very slippery, because there's a lot of weird people out there that believe a lot of weird things and want other people to conform to those weird things.
01:03:41.000 And we sort of have to decide.
01:03:44.000 That's why I'm bringing up this Crowder thing.
01:03:46.000 Do I think that what he said was good?
01:03:48.000 No, it's not nice to call someone a little lispy queer.
01:03:51.000 It's not nice.
01:03:52.000 It's kind of mean.
01:03:53.000 Especially when that guy wasn't even engaging with him.
01:03:56.000 But he's making fun of him.
01:03:57.000 He does a comedy show.
01:03:58.000 He's mocking him.
01:03:58.000 So the question becomes, when is that mocking considered homophobic?
01:04:03.000 And when is it just ribbing?
01:04:05.000 And that's his position.
01:04:06.000 His position is that it's just ribbing.
01:04:07.000 This is the problem with...
01:04:10.000 A discussion that is only about the principles.
01:04:13.000 So a lot of our conversation for the last 15 minutes has been, what is our principle about what types of business regulation is okay for the government to do and is not okay?
01:04:23.000 Or when we talk about free speech, do we have a principle of anything short of illegal content versus something that is more strict?
01:04:32.000 The reality is that there's a more gray area.
01:04:36.000 Yeah, we're trying to sort of regulate the way people communicate with each other.
01:04:41.000 So it's not – if someone said that to someone in a bar, a cop would not arrest them.
01:04:46.000 Like, yeah, you lispy little queer.
01:04:49.000 That would be like, oh, that guy's an asshole.
01:04:51.000 But the bar would be perfectly within their legal right to say, we don't want you in here.
01:04:55.000 You're making our customers uncomfortable.
01:04:56.000 And nobody would say that it would be against the law for the bar to say, you got to go.
01:05:00.000 That's a good point.
01:05:01.000 If they were doing it to their face.
01:05:03.000 But what if he was in a corner talking about this guy that wasn't there?
01:05:08.000 And he was saying, yeah, so he's talking about Antifa, this lispy little Mexican queer.
01:05:12.000 If you came along and decided to kick the guy out of the bar then...
01:05:16.000 I mean, listen, at some bars, if you go into the corner and you yell about a lispy Mexican queer, they're going to ask you to leave, and it still would not be illegal, and the bar would still not be doing anything wrong.
01:05:25.000 Right, but that's a bar, right?
01:05:27.000 That's a private business where people are physically there.
01:05:30.000 Isn't there a difference between that and something like YouTube, which again, falls more in line with like a town square?
01:05:35.000 Maybe that's what we need to revisit, because so much...
01:05:38.000 Human communication is now happening across these platforms.
01:05:41.000 I would imagine most of it.
01:05:42.000 Or most of it.
01:05:43.000 We need to maybe stop drawing this arbitrary distinction that in person is a completely different thing than over the internet.
01:05:52.000 I mean maybe it's not increasingly.
01:05:53.000 Maybe it's more the same.
01:05:55.000 Yeah.
01:05:58.000 So I'm torn here, right?
01:06:00.000 On one side, I say, well, it seems like they still allow him to have his freedom of expression because he's still on YouTube.
01:06:10.000 He still is able to upload his show on YouTube.
01:06:13.000 He will have to find other ways to make money.
01:06:14.000 Sure.
01:06:15.000 So one part of me looks at it that way.
01:06:17.000 And no one has a right to monetize on YouTube.
01:06:19.000 Right.
01:06:20.000 Right.
01:06:20.000 So, in a sense, they haven't violated his First Amendment rights because he's still able to express himself.
01:06:26.000 But then you go, as a company, they've made a punitive decision to eliminate his ability or radically reduce his ability to make an income off of their platform.
01:06:38.000 That seems like, and I'm not supporting that they did it, but that seems more reasonable as a decision.
01:06:46.000 Right?
01:06:46.000 To say we're gonna demonetize you.
01:06:48.000 That seems more reasonable.
01:06:51.000 But the problem is there's no alternatives.
01:06:55.000 There's nothing remotely like YouTube.
01:06:58.000 There's no alternative to YouTube for him to regain that same level of monetization.
01:07:02.000 Yes.
01:07:03.000 Or for people that share his viewpoint and share his ideology and share his positions.
01:07:10.000 There's no right-wing YouTube is my point.
01:07:12.000 I would challenge the idea that YouTube is left-wing.
01:07:16.000 I mean in terms of enforcing its policies.
01:07:20.000 How so?
01:07:20.000 I mean, they have...
01:07:21.000 Well, just this.
01:07:21.000 Just this particular issue.
01:07:23.000 But this isn't a left-wing...
01:07:25.000 How is this a left-wing enforcement?
01:07:27.000 I mean, they have a...
01:07:28.000 Well, I think it is because Carlos Maza is progressive and because the argument that he was making is a very left-wing progressive argument, and this is what Crowder was going after.
01:07:36.000 He was going after the argument.
01:07:38.000 In the process of going after the argument, he mocked his sexuality and his appearance.
01:07:42.000 I can assure you, if it was focused merely on how much of a problem Antifa is...
01:07:47.000 This would not have happened.
01:07:48.000 I mean, I think we both agree to that.
01:07:49.000 Oh yeah, for sure.
01:07:50.000 Yeah, yeah.
01:07:51.000 No, it's all about mocking the guy's sexual orientation and looks.
01:07:54.000 If Carlos Maza were a gay Republican...
01:07:57.000 And the exact same thing happened.
01:07:59.000 Do you think the outcome would have been different?
01:08:01.000 Yes.
01:08:02.000 Why?
01:08:02.000 I just don't think people would be interested.
01:08:05.000 So that gets to the real crux of it, which is my real concern with this is YouTube only getting involved in even publicly saying what they're doing about a channel when it becomes very public and it starts to have the possibility of impacting their bottom line and brands saying,
01:08:23.000 Yes.
01:08:23.000 This is too hot.
01:08:24.000 We're getting out.
01:08:25.000 Well, in that sense, what Carlos did once it was revealed that YouTube was not going to take action was very effective.
01:08:32.000 Absolutely.
01:08:33.000 I mean, he started tweeting like crazy and people jumped on board.
01:08:37.000 He connected it to the LGBT movement and then it became this thing.
01:08:42.000 I mean, the other side of this is...
01:08:45.000 I mean, I don't know if we even want to go into identity politics, so to speak, but there has – I've read some comments on some of the few articles that have been written about this that are saying that this is effectively YouTube enforcing a defense of identity politics,
01:09:03.000 so to speak.
01:09:04.000 And I think that that's just, again, opening up the door to the incredibly broad application of that term identity politics.
01:09:11.000 I don't even really fully understand that, and I don't even know if that's a path we want to go down to talk about the identity politics component of what's going on with a lot of this regulation.
01:09:20.000 Well, define what you mean by the identity politics component of it.
01:09:23.000 I mean, listen, so I guess in order to define it, it would be good to point out that I have been critical of, quote, identity politics on the left in a very limited way, where I think it is actually damaging, while at the same time recognizing that identity is a really important thing to consider when we think about sort of how the world should be organized.
01:09:47.000 So like for your audience who may not know...
01:09:50.000 When identity politics is used like a knife to enforce that because of someone's identity, their opinion supersedes and is the opinion that is the valid one over everybody else because of membership in some kind of group,
01:10:07.000 I'm against that.
01:10:08.000 I think it's extremely destructive.
01:10:10.000 It would be very incorrect to believe though that identity doesn't play a role and that we shouldn't understand how one's identity might make us think differently about certain issues.
01:10:24.000 I mean, any example would make that pretty clear.
01:10:28.000 You know, I as an immigrant to the United States.
01:10:30.000 Do I get some privileged position to decide what policy should be over all native-born Americans because I immigrated here?
01:10:38.000 No, that would be me using identity politics as like a mallet or a cudgel or whatever.
01:10:43.000 But as someone who did immigrate here, we should recognize that I may have things to say about it which would be valuable and worthy and important to sort of think about.
01:10:51.000 That's my view on identity politics.
01:10:53.000 But you're just not interested in the hierarchy of oppressed people.
01:10:58.000 I'm not interested in the Oppression Olympics and I'm not interested in using identity to silence ideas that could be perfectly good coming from someone who is not a member or checking a certain box.
01:11:08.000 Exactly.
01:11:08.000 Nor am I. I strongly believe in the individual and I think it's one of the most important parts of a collective group of human beings like our country.
01:11:16.000 We recognize that we're all different and there's a lot of weirdness amongst us, but we're individuals.
01:11:22.000 I like to treat people based on who they are, not what classification they fall under.
01:11:27.000 Now, do you think that that bad version of identity politics that I mentioned is a big problem on the left or not a big problem?
01:11:36.000 I'm curious.
01:11:37.000 I think it's certainly a problem, but I think it's a vocal minority problem.
01:11:41.000 That's what I think.
01:11:42.000 I think if you look at just regular people that are on the left, that are working jobs and having families and doing their hobbies, and they just have left-wing ideas, I don't think the vast majority of them hold those positions.
01:11:56.000 I think those positions are things that people use as revenue.
01:12:00.000 I mean, not as revenue, but it's like they get points from it.
01:12:03.000 They get points from certain types of behavior that they support, certain types of thinking that they support, and you've got woke social justice points.
01:12:13.000 Then we agree.
01:12:14.000 I asked because I genuinely didn't know.
01:12:18.000 I mean, I've heard you talk about identity politics.
01:12:20.000 It's a dangerous number, though, in terms of college campuses, when you look at what happened at Evergreen State with Bret Weinstein.
01:12:26.000 It's very disruptive.
01:12:28.000 Yes and no.
01:12:29.000 I mean, I do think that it's disproportionately – I think it's a small problem, like you're saying.
01:12:34.000 I think a lot of the problem exists in – I mean, even at Boston College, you know, I had sort of maybe been incorrectly indoctrinated into the idea that this was really a problem everywhere on college campuses.
01:12:49.000 And I had an incident, the details of which wouldn't be appropriate to talk about, but with a student when I taught at Boston College.
01:12:56.000 Because of the circumstances and the identities involved, I was ready for it to go into, this is going to be resolved the wrong way on the basis of the toxic identity politics I'm hearing is existing on college campuses, and it was not.
01:13:11.000 It was the exact opposite.
01:13:12.000 So I think the same way that when you look at Yelp reviews, people who had a bad experience are way more likely to go and write about it, these individual stories get way more attention than the percentage of the problem that they represent.
01:13:26.000 I believe you're probably correct about that, but when you see videos like Nick Christakis getting just shouted down at Yale by a group of students and that they supported the students and that kind of shit, you say, well, it is real and it does exist.
01:13:41.000 It's real.
01:13:42.000 It exists.
01:13:42.000 I think that sensible people on the left like me call it out, but I want to be careful.
01:13:50.000 Imagine that you had someone from Cato on the show, which is sort of like a traditional conservative, or American Enterprise Institute maybe is like a better example, and a lot of the conversation was about getting them to talk about or denounce the alt-right, for example.
01:14:05.000 I'm sure they would do it, but how much should AEI denounce the alt-right when that's like a different thing?
01:14:11.000 That's a very good example.
01:14:12.000 Yeah, it's a very good analogy.
01:14:13.000 Yeah, I think we oftentimes are responding to this very vocal minority.
01:14:19.000 And those are the people that are most invested in getting these ideas pushed through.
01:14:25.000 It's also people that, for lack of a better term, they're probably mentally ill.
01:14:29.000 And I don't mean mentally ill in terms of having legitimate diseases, but in terms of their thought patterns.
01:14:34.000 They're probably obsessive.
01:14:36.000 I mean, I've had friends that were, especially friends that were heavily involved in this kind of stuff before, and it was very damaging to their mental health.
01:14:46.000 This type of stuff being politics?
01:14:48.000 Being woke, left-wing, shout-at-people, attack-people politics.
01:14:53.000 Okay, but I mean...
01:14:54.000 And then they realized somewhere on the line...
01:14:55.000 And then one of them, my friend Jamie Kilstein, they turned on him and then devastated his life.
01:15:01.000 And he realized along the way, like, oh, Jesus Christ, what was I doing?
01:15:05.000 I was checking my Twitter every five seconds and insulting people left and right and attacking people just to get everybody to say, yeah, go get them.
01:15:12.000 And showing everybody how woke I am and how progressive I am.
01:15:16.000 And it becomes a weird sort of...
01:15:19.000 A point system.
01:15:21.000 You're trying to score points.
01:15:23.000 You're trying to gain favor with your party.
01:15:25.000 There's a lot of that.
01:15:26.000 I think it's really important, though.
01:15:28.000 So there's people on the left and right who get pulled into political wokeness, whether it's "I'm now Tea Party" in 2010, people that got sucked into the Tea Party on the right, Antifa, whatever.
01:15:39.000 These are all groups with different sort of followings, they're not all the same, whatever.
01:15:44.000 I do think that there is a difference between getting extremely passionate about the idea that everybody should have access to just basic healthcare than getting extremely passionate about the idea that we need to go out of our way to shut down every abortion clinic in the country.
01:16:01.000 I think that there's a difference.
01:16:02.000 And so I don't want to participate in a false equivalency between, well, you got very far left and very far right people and they're the same.
01:16:12.000 And you've got center left and center right and they're the same.
01:16:16.000 It's just two sides of the same coin.
01:16:17.000 Like obviously I have a perspective that is based on my politics.
01:16:22.000 I'm glad to debate any of these issues with anybody who wants to on the merits, but I don't want to make the false equivalency.
01:16:29.000 I mean, listen, when you look at Anti-Defamation League numbers, for example, the vast majority of hate incidents in the United States are coming from the right.
01:16:41.000 We could talk about other ways that the left is active.
01:16:44.000 We could talk about what it means or how things should be categorized.
01:16:49.000 But that's the reality and so I want to make sure I don't play a false equivalency game.
01:16:53.000 My audience would crush me if I did that, number one.
01:16:56.000 But I think it's just wrong.
01:16:58.000 I think it's wrong to do that.
01:16:59.000 I don't think the facts bear it out.
01:17:01.000 I think you're right there and I also think that these false equivalency kind of conversations are – they're ridiculous because each individual conversation about each individual issue deserves its own discussion.
01:17:12.000 Yes.
01:17:13.000 And to say, what about this?
01:17:15.000 Or what about that?
01:17:16.000 Those whataboutisms, those are the death of any real rational discussion.
01:17:21.000 Sure.
01:17:21.000 Because they go on forever.
01:17:23.000 They go on forever.
01:17:24.000 I mean, this is why scrolling Twitter endlessly is a problem, because there's really no end.
01:17:30.000 You could always scroll a little more.
01:17:31.000 To the end.
01:17:33.000 Right, yeah.
01:17:34.000 I mean, the new tweets are coming fast.
01:17:36.000 It's the same with a lot of those conversations.
01:17:38.000 I wonder if anyone's ever done that, just scrolled until their phone died?
01:17:41.000 Just charge it, wake up in the morning and just scroll down all day.
01:17:46.000 How long does it take?
01:17:47.000 I think you wouldn't because the new content appears faster.
01:17:50.000 Right, because of the algorithm.
01:17:51.000 Yeah.
01:17:51.000 But you would still never run out.
01:17:53.000 No.
01:17:53.000 You would just keep going.
01:17:54.000 No.
01:17:55.000 There's a few things, I don't know if this would be interesting to go into, but there's a few things that I've found have been somewhat successful in conversations with people who really disagree with me.
01:18:05.000 And at least like lowering the temperature a little bit and getting people to maybe engage in a good faith way.
01:18:12.000 One of them is, how do you think I came to my position?
01:18:17.000 So you might be for total free market for-profit healthcare.
01:18:22.000 I am for a system where the government is more involved and even if you can't pay, you get care.
01:18:28.000 Before we even start, if I say, how do you think I arrived at my position?
01:18:32.000 That has been pretty useful.
01:18:35.000 Another example is, I think this came from Peter Boghossian, who I think you've had on.
01:18:41.000 The defeasibility question, which is, what evidence if I presented it to you would bring you over to my side?
01:18:46.000 I'm not saying I have that evidence or that it exists, but give me a framework as to what is keeping you from seeing this my way.
01:18:53.000 Because sometimes that exists, the person just doesn't know about it.
01:18:56.000 Those are two tools that I have found super useful in trying to make some headway with people who are hyper-partisan and very escalated with a lot of these issues.
01:19:09.000 Yeah, that makes sense.
01:19:10.000 Yeah, it's very difficult to have good faith conversations with people when you disagree with them.
01:19:15.000 You have to have discipline and you have to have some sort of a sense of self.
01:19:19.000 And you have to know how to be calm and kind.
01:19:24.000 You know, the descent into insults and dunking on people is one of the reasons why at the beginning of the conversation I was saying that one of the things I enjoy about your YouTube videos is you're a very reasonable, rational person and you don't get crazy and animated and insulting.
01:19:41.000 And I think we need more of that.
01:19:43.000 Because I think even though you're not going to convert some people, there's just a certain section of the population that disagrees with you that's just going to.
01:19:51.000 But there's a significant number that are going to go, hey, this David Pakman guy, he's reasonable.
01:19:56.000 He's making a lot of sense.
01:19:57.000 He's intelligent and articulate.
01:19:58.000 Well, my goal is, and I think it's sort of working in that we have a lot of Trump supporters who are paid subscribers to my show.
01:20:06.000 A lot.
01:20:07.000 We have some.
01:20:08.000 Are they taking notes?
01:20:09.000 Maybe they are.
01:20:10.000 Right to the boss?
01:20:10.000 But my goal is, yeah, that would be an interesting day if I wake up and Trump has responded to one of my videos about him.
01:20:19.000 What would you do?
01:20:21.000 I think it'd be a good day.
01:20:23.000 Well, that's what Colbert did.
01:20:25.000 How funny was that?
01:20:26.000 When Colbert was on TV, he's like, Donald, how did you not know that you shouldn't respond to me?
01:20:32.000 Right.
01:20:33.000 Yeah, yeah, yeah.
01:20:33.000 That's rule number one!
01:20:35.000 It's an important one unless you want to create a shitstorm of a very certain kind.
01:20:39.000 My goal is, I don't pretend to be neutral.
01:20:44.000 I think neutrality is almost always false because on most issues, people are not indifferent.
01:20:49.000 I mean, neutral is another way of saying indifferent.
01:20:51.000 You could be conflicted and neutral.
01:20:53.000 You could be conflicted and neutral, but I try to at least be objective and transparent in how I arrived at what I believe.
01:20:58.000 So you can disagree with my conclusion.
01:21:01.000 You can even come to me and tell me the facts I've used to reach the conclusion are incomplete or wrong, but I'm completely genuine in how I arrive there.
01:21:08.000 And I think that that is why we have some – I mean, yeah, there's – obviously, if you look at YouTube comments, there are right-wingers that watch my show.
01:21:17.000 But choosing to support it financially is a different thing and I get emails from conservatives who say, I don't agree with your conclusions but I do find that you're at least reasoning through the issues in a way that resonates with me and I want to support the fact that you're doing that.
01:21:29.000 That's outstanding.
01:21:30.000 That's a huge victory.
01:21:32.000 It really is.
01:21:33.000 In a sense.
01:21:34.000 In my eyes, in this day and age, I think this is the most polarized time I can remember as a 51-year-old man looking back at my history of paying attention to social issues and the way we communicate with each other.
01:21:49.000 Just the partisan attitudes that people seem to have.
01:21:53.000 I think it's probably because of Trump.
01:21:55.000 That's a giant part of it.
01:21:56.000 But it's also just a sign of the times of social media.
01:22:00.000 I think it's in part engineered by the algorithms of Facebook and the other social media platforms.
01:22:06.000 Yeah.
01:22:28.000 Yeah.
01:22:52.000 Right.
01:22:53.000 Right.
01:22:56.000 Right.
01:23:04.000 You see that, and that's a factor.
01:23:06.000 That's a giant factor.
01:23:07.000 That kind of shit is a factor, and that has sort of become part of the sport of social media, has been arguing.
01:23:15.000 I don't do it, I don't engage, but I do go on Facebook sometimes, and someone makes an abortion post, and I just watch the chaos, like, oh my god!
01:23:24.000 Or anything having anything to do with Trump, or anything having anything to do with the Second Amendment, or anything that has anything to do with the wall, or immigration.
01:23:33.000 So I don't know that people are actually in larger disagreements than they were previously.
01:23:39.000 I think that, yes, Trump has coarsened the language and the way in which it's now acceptable to talk about a lot of these things.
01:23:45.000 That's number one.
01:23:46.000 I think the social media algorithms, like you're pointing out, reward the most extreme and polarizing comments and reactions in a never-ending feedback loop where the most polarizing initial tweet...
01:23:59.000 Generates more responses than less polarizing tweets and then the sub responses that are most polarizing and aggressive do the exact same thing in this never-ending feedback loop.
01:24:11.000 I think it's all those things, but I don't know that people are having bigger disagreements than in times past.
01:24:17.000 I just think that they're public in a different way.
01:24:20.000 Well, there's more disagreements because people have more opportunity to disagree.
01:24:24.000 So they have more opportunity to engage.
01:24:26.000 Particularly when you're talking about people that are addicted to their phones.
01:24:29.000 And this is coming from a guy who uses his fucking phone four hours a day.
01:24:32.000 I'd like to think that one hour of that is productive.
01:24:34.000 But I know that three hours of it is me staring at butts on Instagram.
01:24:38.000 Right.
01:24:39.000 Looking at muscle cars and watching crazy videos.
01:24:42.000 Four hours a day.
01:24:42.000 And then how much are you on a computer?
01:24:44.000 I don't know.
01:24:45.000 I don't have that data.
01:24:47.000 But it's not as much.
01:24:49.000 And the good thing about it is most of my bullshit I'm doing on the phone.
01:24:54.000 Most of my computer work, unless I'm laying in bed and I just watch...
01:24:58.000 Embarrassingly enough, I watch YouTube videos on pool.
01:25:00.000 That's what I watch before I go to bed.
01:25:02.000 When I play pool.
01:25:03.000 So I watch professional pool matches before I go to bed because it's calming.
01:25:07.000 It's relaxing and I analyze positions.
01:25:10.000 That's interesting.
01:25:10.000 I do the same thing with chess.
01:25:13.000 There's chess streamers that I watch and it's similar.
01:25:16.000 You could kick back and you're engaged but it's nothing crazy and it's also kind of stimulating in an intellectual way.
01:25:24.000 Right.
01:25:25.000 And it's different than politics.
01:25:27.000 Yes, yes, yes, yes.
01:25:28.000 Like on weekends when people, you know, like my mom will, you know, want to talk to me about politics.
01:25:33.000 And I'm just like, I did this all week, mom.
01:25:35.000 Yeah.
01:25:36.000 Did your mom watch your show?
01:25:37.000 She does, and she watches other shows.
01:25:39.000 My family's super political.
01:25:41.000 Oh, really?
01:25:41.000 Yeah, so on the weekends, if it's Saturday, I'm right in the middle of my break period.
01:25:45.000 Are they left wing?
01:25:46.000 Oh, yeah.
01:25:46.000 Thank God.
01:25:47.000 Jesus, if they weren't.
01:25:48.000 Can you imagine?
01:25:48.000 Could you imagine?
01:25:50.000 Some mean dad calling you out.
01:25:52.000 What the fuck is wrong with you, David?
01:25:54.000 I might have had to defoo.
01:25:55.000 Ah, defoo.
01:25:57.000 Yeah.
01:25:59.000 Anyway.
01:26:00.000 Anyway, indeed.
01:26:01.000 I think that there's more opportunity, as we're saying, to disagree with people, more opportunity to argue.
01:26:09.000 And in those more opportunities, you're seeing more conflict and I think more polarization.
01:26:16.000 And I think, again, the social media algorithms and all the other nonsense that gets – I think there's – I really do believe that the feeling that I get – but it also might be because a big part of my job is being on the internet.
01:26:27.000 So maybe I'm more engaged with it than other folks are.
01:26:30.000 Our view skews it a little bit.
01:26:31.000 But I think – so in practice, let's imagine that the disagreements are equal to what they've always been.
01:26:37.000 But there's more opportunities to disagree, and the algorithm favors more escalated disagreement than rational conversation.
01:26:45.000 The effect is that you might meet someone with whom you have 80% in common in terms of your political views, but the circumstances in which you engage with that person are going to be on the 20% that you don't.
01:26:57.000 So it makes it seem as though you just have very little common ground with anybody.
01:27:02.000 Because the 80% agreement becomes background.
01:27:05.000 And the social media platforms, the debates happening on YouTube, elsewhere, are focused only on the most divisive fraction of one's entire political views.
01:27:16.000 And that's, I think, what the problem is.
01:27:18.000 But it makes sense because most people agree that – I don't know – gas stations – I mean just to pick something innocuous.
01:27:27.000 Most people agree that it's good to have a regulatory system that makes sure that when you think you've pumped five gallons of gas, you've gotten five gallons of gas.
01:27:35.000 It's so uncontroversial that nobody is going to talk about it.
01:27:38.000 Like it makes sense that the focus is going to be on the disagreements.
01:27:41.000 Where it's damaging is then when you meet people in real life.
01:27:45.000 And it's hard to relate or even be in the same room because only those differences are sort of like played up or relevant.
01:27:52.000 Yes, yes, yes, yeah.
01:27:53.000 Yeah, that conflict gets highlighted.
01:27:56.000 You have conflict bias.
01:27:57.000 Yeah.
01:27:58.000 Yeah, I... I don't know where I see this going.
01:28:05.000 That's one of the more interesting things about, particularly with social media and things when you come to this Crowder situation.
01:28:12.000 I don't know where this is going because I didn't know this was ever going to be a thing.
01:28:17.000 I had never really considered that there was going to be some digital town square that we were all going to be enjoying, whether it's Twitter or YouTube or whatever it is.
01:28:25.000 That might even need regulation.
01:28:26.000 Yeah, that might even need regulation.
01:28:28.000 But getting back to the Crowder thing, the issue...
01:28:34.000 So you agree with it in the sense that he was...
01:28:41.000 Yeah, I agree.
01:29:04.000 In terms of what the terms of service say is not allowed.
01:29:10.000 Is that different in your opinion than someone singling something out for what you believe is their mental incompetency?
01:29:19.000 Well, mentally incompetent – do you mean that they're ignorant or that they're mentally ill or cognitively limited?
01:29:25.000 Cognitively limited.
01:29:26.000 Mocking their ability to think, mocking their intelligence, mocking their decisions, mocking the way they talk, and then encouraging other people to do the same thing.
01:29:35.000 And then that person gets harassed.
01:29:37.000 Based on their intelligence, based on their performance on particular YouTube videos and conversations, and there's active harassers.
01:29:45.000 There's people that do that.
01:29:46.000 Is there a difference between, say, what Sam Seder does to Dave Rubin?
01:29:52.000 What does Sam do to Dave Rubin?
01:29:53.000 I don't know that I've seen that video.
01:29:56.000 Dozens of videos.
01:29:57.000 Don't say that video.
01:29:58.000 He has dozens of videos where he's just dunking on Dave Rubin.
01:30:03.000 So, I mean, I have some as well.
01:30:04.000 I believe that they are substantive.
01:30:06.000 My view is that my videos about Dave Rubin are substantive.
01:30:09.000 I don't really watch any left-wing stuff because I want to try to isolate myself enough to make sure that what I'm saying are my ideas and that I'm not taking them.
01:30:18.000 So Sam's a friend of mine.
01:30:19.000 Sounds like a comic.
01:30:20.000 Oh, that's interesting.
01:30:21.000 Comics do that.
01:30:22.000 I don't know, but if there are some specific examples, we can comment about them.
01:30:27.000 But I think that to your first question, there is a difference between going after someone for sexual orientation.
01:30:35.000 Right.
01:30:35.000 And going after them for the fact that they say things that are wrong or don't know stuff, unless you're making fun of someone who has an actual handicap of some kind, some kind of cognitive limitation that would be a disability of some kind.
01:30:50.000 Then you are mocking someone for a disability.
01:30:52.000 But the resulting effect of the harassment...
01:30:57.000 See, this is what I was getting at before with Crowder.
01:30:59.000 What Crowder said was one thing, but one of the things that Carlos Maza was discussing was what the people that had watched Crowder, what they were doing, how they were going after him.
01:31:09.000 See, that is a real discussion.
01:31:11.000 What happens when you say something about someone and then your fans agree and then they take action?
01:31:19.000 Which...
01:31:21.000 I didn't see that in the Steven Crowder decision that the reaction was part of YouTube's evaluation.
01:31:29.000 Now, I may just have missed that, but I didn't see YouTube say that part of the calculation had to do with what other people were doing.
01:31:36.000 I don't think they did say that.
01:31:37.000 I don't think they would, but I think Carlos Maza did say that.
01:31:40.000 It was one of the things that he was talking about, this endless assault that he's experienced.
01:31:45.000 Well, he's right to call it deplorable.
01:31:49.000 I think we would agree with that.
01:31:50.000 I think your question is more about whether YouTube...
01:31:52.000 Who's responsible for it, right?
01:31:54.000 Who's responsible for it, yeah.
01:31:54.000 These anonymous people that can just lash out at someone and insult them out of nowhere.
01:31:59.000 Ultimately, they are responsible.
01:32:00.000 Those people are responsible.
01:32:01.000 Those people are responsible.
01:32:03.000 However, so there's this term stochastic terrorism.
01:32:07.000 I don't know if you're familiar with it.
01:32:08.000 No, I'm not.
01:32:09.000 Stochastic terrorism is the idea that if you have a big enough audience...
01:32:14.000 And you go and every day you're talking about someone should really do something about a particular politician.
01:32:24.000 You're doing it every day.
01:32:25.000 You're doing it every day.
01:32:26.000 At a certain point, given a large enough audience and enough repetition of that, and the fact that there's like a distribution of people's emotional states, cognitive capacity, etc., it is statistically probable that someone from that audience is going to go and try to do something about whoever it is that you're targeting.
01:32:45.000 That individual who has the show and is hammering on this person day after day after day, they're not going to be legally responsible for that person from their audience who went and did something.
01:32:58.000 There's no way that you're going to hold them legally responsible under the current legal system that we have.
01:33:03.000 But you could argue that it is irresponsible in some way not to understand that your actions have consequences.
01:33:10.000 Of course, the person who goes and does the violent act is the primary person who is responsible.
01:33:16.000 Right.
01:33:16.000 But as long as you're not calling out for that act, how do we make this distinction that someone is encouraging that act or someone is at least inspiring that act?
01:33:27.000 They're judgment calls.
01:33:28.000 I mean, listen, I can go on my show and I can speak in vague terminology or specific terminology.
01:33:33.000 You know, imagine that there's a local business that I don't like.
01:33:36.000 I could go on my show and I could say this business did this and I need everybody in my audience to show up there and to make it impossible to get in and patronize that business.
01:33:48.000 That's very clearly on one side of the gray area.
01:33:51.000 I could instead say, you know, there's a business – I could say the type of business, but not name it.
01:33:58.000 If it's a small enough town, people would know exactly what business I'm talking about.
01:34:02.000 And I really don't like the way I was treated there.
01:34:04.000 And if only there was some way that someone could do something about it.
01:34:08.000 The effect could be the exact same one.
01:34:10.000 I don't know how you measure when it's on one side or the other.
01:34:13.000 Yeah, it's like, right, you could somehow or another remove your – you could somehow or another make it so that it's – yeah, I'm agreeing with you.
01:34:25.000 You could remove your responsibility for the action in some sort of way.
01:34:28.000 This just got interesting.
01:34:31.000 I was trying to find a tweet from Maza about him sending or asking people to flag Crowder's videos.
01:34:38.000 Did you get the one where he asked people to go assault people with milkshakes and humiliate them at every turn?
01:34:43.000 YouTube tweeted an hour ago, or yeah, at 12:30, that to clarify, this is responding to Carlos Maza, to clarify, in order to reinstate monetization on his channel, he will need to remove the link to his t-shirts.
01:34:56.000 Oh, so it's the Figs t-shirt.
01:34:57.000 Yeah.
01:34:58.000 Oh.
01:34:58.000 Well, that's all he has to do?
01:34:59.000 He specifically asked about that, and then they responded.
01:35:02.000 That's all he has to do.
01:35:03.000 Wow.
01:35:03.000 That's pretty easy.
01:35:04.000 Well, that's pretty straightforward.
01:35:05.000 That's pretty straightforward.
01:35:06.000 Yeah, this shirt's stupid.
01:35:08.000 But, you know, he's a comedian.
01:35:10.000 I mean, that's what Crowder's doing.
01:35:11.000 And in doing the thing about Maza, he's mocking him for his appearance.
01:35:16.000 But Carlos Maza specifically encouraged people to throw milkshakes at people that disagree with him.
01:35:24.000 And to harass them publicly and humiliate them.
01:35:28.000 So one thing doesn't justify the other.
01:35:30.000 No, it doesn't.
01:35:30.000 But that is more egregious.
01:35:33.000 Asking people to assault people and asking people to physically humiliate people in person, in my opinion, is more egregious.
01:35:42.000 I don't agree with mocking his physical – well, his physical appearance is – that's just what it is.
01:35:49.000 You know, but the sexual orientation aspect of it is like, yeah, I get it.
01:35:53.000 I get it.
01:35:53.000 It's not nice.
01:35:55.000 How far do you think the "this is just a comedian" thing goes?
01:35:59.000 Because I hear that a lot in just excusing things that are said.
01:36:02.000 Well, he's trying to do comedy, right?
01:36:04.000 So he's trying to make fun.
01:36:05.000 Is he?
01:36:06.000 Well, hold on.
01:36:06.000 But comedy and making fun of someone are two different things.
01:36:09.000 Like, I don't do comedy, but I will sometimes make fun of things people say.
01:36:12.000 Right, but he's doing it to be funny.
01:36:15.000 He's making fun of things specifically to be funny.
01:36:17.000 And sometimes, you know, when you do that, you go too far.
01:36:22.000 You cross lines.
01:36:23.000 I genuinely did not realize that Crowder does a comedy show.
01:36:25.000 Oh, yeah.
01:36:26.000 His show is a comedy show.
01:36:28.000 Wow.
01:36:29.000 Yeah, a lot of it is funny.
01:36:31.000 He does some funny shit.
01:36:32.000 He really does.
01:36:33.000 Whether you agree with him or disagree with him.
01:36:34.000 He's done some hilarious bits.
01:36:36.000 I genuinely...
01:36:37.000 I'm reacting in real time because I had no idea.
01:36:40.000 He has this bit he does about this French socialist.
01:36:42.000 He puts on a wig and pretends to be this different person.
01:36:44.000 He's pretended to be a transgender person.
01:36:47.000 He's pretended to...
01:36:48.000 He's done a bunch of these infiltration videos where he'll go into these ridiculous organizations and ask them questions.
01:36:54.000 But it's very much a comedy show.
01:36:57.000 Dressing up like a trans person is funny?
01:36:59.000 If you're funny at it, if you're good.
01:37:01.000 I mean, Mrs. Doubtfire.
01:37:02.000 Isn't that funny?
01:37:03.000 And that's what he was doing.
01:37:04.000 He was dressing up as a woman.
01:37:06.000 He was dressing up as a woman.
01:37:07.000 Right.
01:37:08.000 He was dressing up as a woman.
01:37:09.000 And that's a great movie.
01:37:10.000 I agree with you there.
01:37:11.000 There's some funny shit.
01:37:13.000 Yeah.
01:37:14.000 Look, how many times in In Living Color did they dress up like women?
01:37:18.000 There's some humor to someone who is a man who's dressing up like a woman.
01:37:23.000 Sure.
01:37:24.000 That can be, and I shouldn't comment specifically on Crowder doing it if I haven't seen it.
01:37:28.000 No, he's got some funny shit.
01:37:29.000 He does.
01:37:30.000 And I'll take shit for that, for saying that.
01:37:33.000 He's funny.
01:37:33.000 He makes me laugh.
01:37:34.000 That's why I'm staying quiet.
01:37:35.000 Yeah, I know.
01:37:36.000 I understand.
01:37:37.000 I get it.
01:37:38.000 I don't agree with him constantly going on and on about this guy being queer or calling him a lispy little queer, but he's doing it to try to be funny.
01:37:47.000 So the question is, when can you do that to be funny?
01:37:50.000 And apparently with YouTube, you can do that and be funny.
01:37:52.000 As long as you remove the t-shirt.
01:37:54.000 Yeah.
01:37:55.000 Interesting.
01:37:56.000 That's even weirder now.
01:37:58.000 It's weird.
01:37:59.000 That they just, if he removes a t-shirt.
01:38:01.000 A link, a link to the t-shirt.
01:38:02.000 Oh, he can still sell it, he just can't link to it from YouTube.
01:38:05.000 I think that's what they're saying.
01:38:06.000 That's so minor that it's hard to, I mean, it's mind-blowing.
01:38:09.000 It is, but it isn't, because it's sort of encouraging people to buy it, and then YouTube would say, well, if you have an ad on that, then you're encouraging homophobic behavior, and we can't allow that with our monetization policy.
01:38:22.000 I mean it's minor in the context of everything else that's wrapped up in this.
01:38:25.000 It might be an important revenue-generating t-shirt.
01:38:29.000 I mean, I think so much of this, again, these disagreements on issues...
01:38:34.000 It comes down to what you and I were talking about before.
01:38:37.000 That if two people are in a room together, 95% of what they're talking about you're going to agree on.
01:38:41.000 When someone's making a video on someone, if they just say, like, fucking David Pakman, man.
01:38:46.000 Here's my deal with that guy.
01:38:48.000 And then you're just ranting thing.
01:38:49.000 I hate his fucking neck.
01:38:51.000 I don't like his shirt.
01:38:52.000 And his face is stupid.
01:38:54.000 When people do stuff like that, it's a terrible way to communicate.
01:38:59.000 First of all, you'd have to be a real asshole to say most of the things that people say when they're dunking on someone, in person.
01:39:05.000 In person.
01:39:06.000 You'd have to be a bad person.
01:39:08.000 Yeah.
01:39:08.000 So you know the person's going to see it, so you're just deciding, I'm going to be a bad person, but I'm going to pretend I'm not a bad person because I'm going to do it in a way where they're not in the room, so I'm just going to shit all over them and give them my real opinion.
01:39:21.000 Sure.
01:39:22.000 But it's not like you and I are at dinner.
01:39:25.000 And you're like, you know, fuck this guy.
01:39:28.000 That's how people talk.
01:39:30.000 But when you're doing that, but you're doing it, you're broadcasting it.
01:39:33.000 I think we're all learning in this process of doing podcasts and video blogs and all this stuff.
01:39:40.000 We're all learning that you're not alone.
01:39:44.000 You are doing this and you're saying it in a way that that person's going to see.
01:39:49.000 And the same could be applied to Dave Rubin and Sam Seder dunking on them all the time.
01:39:56.000 It's kind of the same thing.
01:39:58.000 And Michael Brooks as well.
01:39:59.000 It's the same sort of thing.
01:40:01.000 That's what they're doing.
01:40:02.000 Yeah.
01:40:03.000 And they're saying things that they wouldn't say if he was there.
01:40:06.000 In person.
01:40:07.000 Right, but they would say if they were sitting around having lunch together, talking shit about some stupid thing that he said the night before.
01:40:14.000 Yeah, that's fine.
01:40:15.000 And I don't think, I mean, whether or not you would say something in person doesn't tell us whether it's a fair or unfair critique, I think it's fair to say.
01:40:33.000 Yes.
01:40:35.000 Yes.
01:40:36.000 Yes.
01:40:47.000 In-person conversations usually will not lend themselves to, like, screaming or violence or whatever.
01:40:54.000 If that part is the focus.
01:40:55.000 I completely agree with you on that.
01:40:57.000 Yeah.
01:40:57.000 I think we'd be better off if we did try to communicate with...
01:41:01.000 But when you're doing comedy, that goes out the window.
01:41:04.000 It does.
01:41:05.000 But even comedy aside, I agree with the principle.
01:41:07.000 Communicate.
01:41:08.000 Battle of ideas.
01:41:10.000 Marketplace of ideas.
01:41:11.000 Very, very big ideas we all want to hear about and what are the best ideas and let's rank the ideas.
01:41:16.000 There are people whose views are so extreme that you can't really bring them to the table as reasonable negotiating partners for figuring something out.
01:41:26.000 Right.
01:41:26.000 Like Richard Spencer.
01:41:27.000 Sure.
01:41:28.000 Or even, I mean, okay.
01:41:30.000 Louis Farrakhan.
01:41:32.000 Louis Farrakhan, who I've spoken out about many times.
01:41:35.000 Imagine that we want to figure out what the tax rate should be.
01:41:38.000 Something that politicians have to do all the time.
01:41:41.000 If you have a group of people who believe that we need a 25% flat tax and a group of people who want, you know, like an escalating progressive tax that gets as high as 70% on income over 10 million, whatever, right?
01:41:52.000 Like fill it all in.
01:41:53.000 All those people are going to be able to have a conversation.
01:41:56.000 If someone comes in who says any taxes that the government collects are a form of slavery, how do you integrate that into the conversation about how to set tax rates?
01:42:08.000 Hmm.
01:42:10.000 You can't.
01:42:11.000 Right.
01:42:12.000 Yeah.
01:42:13.000 So all of this stuff, you know, there's this new movement now, which I think is great, about long-form conversations, going in-depth, figuring out what our disagreements are.
01:42:23.000 Like, I'm for all of it.
01:42:24.000 I'm absolutely for all of it.
01:42:26.000 Well, you do it.
01:42:27.000 I do it.
01:42:27.000 I do it.
01:42:28.000 Yeah, sure.
01:42:28.000 Okay.
01:42:29.000 But where I do think that there's like a lack of pragmatic reality to it is some people's ideas are so extreme that they can't in any sensible way be incorporated into an actual good faith discussion of how society should be organized.
01:42:43.000 That is the problem with having conversations in scale, right?
01:42:47.000 And that's the problem with Twitter and with YouTube that you're dealing with millions and millions and millions of human beings.
01:42:52.000 And when you have that broad spectrum of humans, you're going to have people on the far ends of both sides.
01:42:57.000 And at a certain point, a decision has to be made about who actually gets to participate in the decision-making conversations.
01:43:05.000 It's great for everybody to have a voice on taxation on Twitter, but imagine if there was a significant portion of our elected officials who straight up think taxes are slavery.
01:43:15.000 I just don't know how that becomes integrated into a decision about tax policy.
01:43:19.000 Right.
01:43:19.000 I think the argument would be that bad ideas should be combated with good ideas, not with silencing someone.
01:43:27.000 And that when you do silence someone, you just sort of create this blockade where the idea builds up behind it, and then the opposition to your perspective builds up, and then people start picking teams and picking sides.
01:43:39.000 And I honestly think that that's something that's going to be going on right now with this whole Crowder-Vox thing.
01:43:46.000 Mm.
01:43:46.000 I think people are going to pick sides and they fucking love it.
01:43:49.000 People love a good conflict to get into.
01:43:51.000 There's a lot of people in their cubicles right now that are weighing in and firing up and there's people that want to dox him again and there's people who want to infiltrate his Facebook and his Twitter.
01:44:01.000 That's what people do.
01:44:02.000 You're dealing with millions.
01:44:04.000 What does Crowder have?
01:44:05.000 3.5?
01:44:06.000 3.8 million?
01:44:07.000 Something like that?
01:44:08.000 I mean, I think what you have to also remember is it's not just the reactions that are sort of like tailored to continue the escalation.
01:44:17.000 I mean, in the end, maybe Crowder personally in his personal life does refer to people he perceives to be gay or who are gay as queers.
01:44:25.000 I don't know.
01:44:26.000 Or he uses the word fags.
01:44:27.000 I have no idea.
01:44:28.000 But he didn't use that word.
01:44:30.000 The t-shirt has the asterisk.
01:44:31.000 I get it.
01:44:32.000 It's a goof.
01:44:32.000 It actually has a fig.
01:44:35.000 It doesn't have an asterisk.
01:44:37.000 Oh, okay.
01:44:58.000 It has a very specific path and set of reactions that it's going to trigger.
01:45:04.000 You mean specifically that shirt?
01:45:06.000 The shirt and referring to Carlos Maza as a queer Mexican or whatever the phrase is.
01:45:12.000 See, that's a weird one.
01:45:14.000 The queer one is a weird one.
01:45:17.000 With LGBTQ, here's a good one, right?
01:45:21.000 National Association for the Advancement of Colored People.
01:45:24.000 Okay.
01:45:25.000 NAACP. You can't call people colored people.
01:45:27.000 Sure, but that organization was named a long time ago.
01:45:30.000 Sure, sure, sure, sure.
01:45:31.000 And it's more an acronym than anything else at this point.
01:45:33.000 I understand, but it's not, right?
01:45:35.000 We both know what the individual letters or words in that acronym stand for.
01:45:43.000 I think...
01:45:44.000 The word queer is not a derogatory word.
01:45:48.000 It can be or it cannot be, depending on how it's used.
01:45:50.000 It is if you go, you fucking queer.
01:45:52.000 Yeah.
01:45:52.000 Right.
01:45:53.000 Sure.
01:45:53.000 I mean, listen, it's the same way with Jew.
01:45:56.000 If I'm in a family thing and it's a bunch of Jews or whatever, that's a word that can be used in a way that if someone shows up, if Richard Spencer shows up or one of his followers and goes to a bar mitzvah and talks about this room full of Jews,
01:46:12.000 the word is the same word, but we're talking about two very different things.
01:46:16.000 That's a good point.
01:46:17.000 But should he be allowed to say this room full of Jews?
01:46:20.000 Allowed?
01:46:21.000 I mean, it's not illegal.
01:46:22.000 No, it's not illegal.
01:46:23.000 He is allowed.
01:46:24.000 Right, he is allowed, but where does it, like, this room full of, where does it get toxic?
01:46:29.000 Well, if Richard Spencer shows up at a bar mitzvah and yells about this room full of Jews, I think it's gotten toxic.
01:46:35.000 That's a good subject to break this stalemate of this subject.
01:46:40.000 Not stalemate, but, you know, sort of end this.
01:46:44.000 Anti-Semitism seems to be ridiculously on the rise, and that's stunning to me.
01:46:50.000 That shocked me.
01:46:52.000 Why?
01:46:53.000 Because the internet.
01:46:54.000 The internet sort of exposed anti-Semitism that I didn't necessarily know existed at the levels it existed at.
01:47:00.000 I knew there was anti-Semites, but I didn't know they were so brazen and overt.
01:47:04.000 Well, they've gotten brazen since January of 2017.
01:47:04.000 Oh, okay.
01:47:09.000 I don't know that Donald Trump has created anti-Semites.
01:47:14.000 In fact, he probably hasn't.
01:47:16.000 Well, his son-in-law is Jewish.
01:47:17.000 His son-in-law is Jewish, his daughter converted to Judaism, etc.
01:47:20.000 Yeah.
01:47:20.000 But I think that... Richard Spencer told me:
01:47:24.000 We know that Trump is not literally a white nationalist who is going to talk about, let's take control back from the Jews.
01:47:30.000 But we see him as the closest thing to what we would like.
01:47:35.000 He talks about...
01:47:36.000 People from Mexico, he talks about shithole countries, etc.
01:47:41.000 So it's just emboldened the movement.
01:47:44.000 It doesn't necessarily create...
01:47:46.000 Right, but people from Mexico and shithole countries, that doesn't necessarily really equate with Israel.
01:47:53.000 No, well, anti-Semitism and Israel also are two totally separate things.
01:47:56.000 You could be against the current Israeli administration, as I am, like Benjamin Netanyahu, and still call out anti-Semitism against Jews in the United States, for example, or whatever.
01:48:07.000 I see what you're saying.
01:48:08.000 One is not directly linked to the other, but if you're a group that already has these views, and then you see a guy who opens his campaign talking about, "They're sending rapists and criminals, but some, I'm sure, are good people."
01:48:23.000 And I don't want people coming here from shithole countries.
01:48:25.000 What about Norwegians?
01:48:26.000 Whatever.
01:48:27.000 It's a signal.
01:48:28.000 It's a signal.
01:48:29.000 And I've spoken to former KKK people, some of whom are really interesting people to talk to, and they know exactly why it's appealing because they see the signals and the vocabulary and the dog whistling.
01:48:41.000 So I think it's just brought it out into the forefront.
01:48:44.000 I don't know that new anti-Semitism has necessarily been generated, although it being in the forefront probably does start to get some people kind of curious, like, oh, maybe all the problems are because of the Jews.
01:48:54.000 I don't know.
01:48:56.000 It's just, I guess they find groups of like-minded folks and they join along, right?
01:49:02.000 Is that?
01:49:03.000 The anti-Semites?
01:49:04.000 Yeah, they find them online and then you can stumble into it where you ordinarily wouldn't be around people that are having those discussions.
01:49:12.000 That can happen and a lot of the people that I've talked to that got into those beliefs and then out of them said that they got in usually on a community level.
01:49:21.000 There was something about the community that was appealing to them.
01:49:24.000 Like gangs.
01:49:31.000 Yeah.
01:49:33.000 Yeah.
01:49:44.000 It's just – so you think the rise of it in 2017, there's more anti-Semitism or you think it's more overt?
01:49:52.000 I believe it's more overt.
01:49:53.000 Because Trump is the president.
01:49:55.000 Yeah.
01:49:55.000 And groups that track these incidents like the Anti-Defamation League and others, they have the data and there have been increases.
01:50:04.000 Yeah, it's stunning to me.
01:50:07.000 You know, you see it online in so many different places now, and I just don't remember seeing it before.
01:50:12.000 Or not like that.
01:50:14.000 You run into it so often, or people calling people Zionist shills.
01:50:18.000 Yeah, I mean, that's an important thing to talk about.
01:50:21.000 I mean, people call me that.
01:50:23.000 All the time.
01:50:24.000 And, you know, I feel like that is an issue where I try to speak.
01:50:30.000 I mean, shill to me suggests that you're saying one thing.
01:50:35.000 But with some kind of other agenda that you're trying to push in some way.
01:50:40.000 In other words, you are being in some way deceptive about your actual intentions and what you say.
01:50:47.000 So I think when people call me a Zionist shill, what they mean is I'm talking about one thing with the secret goal or below-the-surface goal of actually promoting some action by the state of Israel.
01:51:00.000 I think that's the idea of a shill.
01:51:03.000 But, you know, I mean, I'm opposed to the current prime minister in Israel.
01:51:08.000 I've made clear that...
01:51:10.000 Isn't he in trouble right now?
01:51:11.000 Yeah, I mean, he's been in tentative trouble for a long time.
01:51:15.000 His wife is in trouble as well, I believe.
01:51:18.000 But that, I mean, the problem is, and I know that there are people on the left and right that when I say this will...
01:51:25.000 I mean, I'm going to get crushed from what I'm about to say.
01:51:28.000 Sometimes when someone says...
01:51:31.000 It's related to your view on the Israeli-Palestinian conflict.
01:51:35.000 Sometimes when someone says Zionist shill, it's cover for just wanting to insult someone for being Jewish or anti-Semitism.
01:51:41.000 You got to look at every instance one by one.
01:51:43.000 Yeah, and sometimes people just like saying things too.
01:51:47.000 Yeah, it's a popular thing to say.
01:51:49.000 Especially if they find out that you're Jewish.
01:51:50.000 Absolutely.
01:51:51.000 It's like a thing.
01:51:52.000 It's a thing to say.
01:51:53.000 It's a little weapon to use.
01:51:55.000 It absolutely is.
01:51:57.000 Do you find, this is sort of an abstract question, but overall, doing this show and having this ever-increased exposure, do you enjoy it?
01:52:11.000 Are you weirded out by the interactions with all the people?
01:52:14.000 Do you feel pressure by all the comments?
01:52:17.000 Do you feel a little bit of anxiety from all the social media aspects of it?
01:52:23.000 I do.
01:52:24.000 So...
01:52:25.000 I enjoy the idea that people are listening to my ideas and either agreeing or disagreeing, but they're considering them and then integrating it into how they figure out what they think about the world around them.
01:52:37.000 That's awesome.
01:52:38.000 I do get weirded out by...
01:52:42.000 Sort of like safety security stuff that sometimes comes up, which I try not to even like put too much attention on because I feel like it just feeds and gives people ideas.
01:52:53.000 And people who, you know, come up to me and – I mean I'm more curious to actually hear your thoughts about this – come up to me and, you know, they may not necessarily see the world the way I see it and I'm unsure.
01:53:06.000 Sort of like what are their intentions type of thing.
01:53:09.000 I mean, it gives me anxiety, and it gives people that are close to me anxiety, for sure.
01:53:14.000 Yeah, because your profile is just, if you keep doing this, you're very good at it.
01:53:18.000 Thank you.
01:53:18.000 You're going to continue to get more and more popular.
01:53:21.000 Yeah.
01:53:22.000 And, I mean, I guess it's a double-edged sword.
01:53:24.000 I mean, I don't know.
01:53:24.000 Like, when you do a comedy show, afterwards, is it kind of like a free-for-all where people can come up and chat with you?
01:53:31.000 Sometimes.
01:53:32.000 Yeah.
01:53:32.000 Yeah.
01:53:32.000 And do you get skittish?
01:53:34.000 No.
01:53:34.000 You don't?
01:53:35.000 Nah.
01:53:35.000 Most people are nice.
01:53:37.000 I agree.
01:53:37.000 The vast majority of people are nice.
01:53:39.000 I agree.
01:53:39.000 They come to see you.
01:53:40.000 They're usually fans and...
01:53:43.000 I just want to take pictures and say what's up.
01:53:45.000 I guess it's a little different when what you do is like overtly political versus other areas.
01:53:50.000 Like if you're an actor, comedian, doing other things, race car driver.
01:53:54.000 Right.
01:53:54.000 You are in a much more conflict-driven profession in a sense.
01:53:59.000 Yeah.
01:53:59.000 I mean, I have political people on like you, but I'm not entirely engaged in politics like you guys are.
01:54:06.000 Right.
01:54:07.000 Yeah, I don't know.
01:54:08.000 I mean, I do worry that no matter what happens in the next few elections, I don't know how we reverse the radicalization and polarization effects of the social media echo chambers that we've been talking about.
01:54:23.000 And I only see that as further.
01:54:26.000 I mean, we could still accomplish good things while that's going on.
01:54:29.000 Like, I think if we elect the right people, maybe we can get good things done.
01:54:32.000 But in parallel, there is this hyper-radicalized, polarized narrative that's going on, and I don't see any way that that's going to turn around.
01:54:40.000 I wonder myself.
01:54:41.000 I do.
01:54:42.000 And I'm very confused by it because I don't see any long-term solution for this other than some radical change in the way human beings communicate with each other.
01:54:55.000 And I've contemplated that and hypothesized and theorized.
01:54:59.000 I really think that if...
01:55:02.000 What has changed the way we communicate is technology and the immersive aspect of social media technology, the fact that we carry these devices with us all the time that allow us to communicate and allow us to read other people's communications or watch other people's communications.
01:55:19.000 I have a concern that this is going to escalate with each expansion and each innovation in terms of, and I don't know what it would be, because no one saw the internet coming.
01:55:33.000 If you go back 30 years ago, no one ever thought anything was going to be anything like it is now.
01:55:38.000 Well, Al Gore did.
01:55:39.000 Haha!
01:55:40.000 I bet he did.
01:55:42.000 But if you go 30 years from now, what are we really looking at?
01:55:47.000 What is this world going to be?
01:55:50.000 I don't think anybody has an idea.
01:55:52.000 I think we have no idea.
01:55:53.000 And I think it's going to be, if you look at the trend, the trend is not towards calming people down and giving people space and allowing people to meditate more.
01:56:02.000 No, the trend is to get more and more immersed.
01:56:04.000 Right.
01:56:05.000 The trend is for us to get closer and closer to each other, to remove boundaries, remove boundaries for information and ideas.
01:56:12.000 Even in long-term contemplations of this, I've often thought that everything, right, all of our communication is basically ones and zeros.
01:56:22.000 It's all information.
01:56:23.000 It's all words and thoughts and videos.
01:56:26.000 And now you're getting into cryptocurrency.
01:56:28.000 Now, cryptocurrency is essentially ones and zeros.
01:56:31.000 It's all digital.
01:56:32.000 Everything's digital.
01:56:33.000 And the bottlenecks, if any bottlenecks are there at all, the bottlenecks...
01:56:33.000 Yeah.
01:56:56.000 Yeah.
01:57:02.000 I used to have more of like a techno-utopian view, and it started to sort of change, partially because of some of the sci-fi I read.
01:57:15.000 Everything, but most recently, so like 15 years ago, I read the Richard K. Morgan book, Altered Carbon.
01:57:22.000 And at the time, I was like, this has to be made into something.
01:57:26.000 That's the one that's on Netflix now, right?
01:57:27.000 And then like a year ago, Joel Kinnaman was in the series and it was just awesome.
01:57:32.000 Is the series good?
01:57:33.000 The series is quite good.
01:57:34.000 The series is quite good, yeah.
01:57:35.000 And I really like Joel Kinnaman, and I interviewed Richard K. Morgan, who wrote the book, years ago.
01:57:39.000 But that genre started to move me away from techno-utopianism and technology is just going to solve so many problems.
01:57:48.000 Because it also is going to create new problems that we don't even yet know about.
01:57:52.000 So as an example, I went all the way back to the beginning when humans went from hunter-gatherers and figured out we can domesticate some crops, we can start agriculture and settle and be in one place.
01:58:06.000 That was the acceleration of what we know of as wealth, ownership.
01:58:10.000 It was like the start, right?
01:58:12.000 So much of what we had.
01:58:13.000 I mean, agriculture allowed people to be able to live and do stuff other than find food, which developed specialists who created technology, which created armies.
01:58:23.000 It all came from agriculture in that way.
01:58:25.000 But tons of bad stuff came from it as well, right?
01:58:28.000 The beginning of the concept of a sedentary lifestyle came from agriculture.
01:58:32.000 Diseases that we got from animals and then that we brought other places and they killed tons of people.
01:58:38.000 So I've kind of adopted that view to technology now, which is, yeah, all the cool stuff we can imagine and improvements I'm sure will be there, but problems we aren't even aware of yet are also going to be there.
01:58:49.000 Yeah, I agree with you 100%.
01:58:50.000 That's what I meant by looking back at this social media problem.
01:58:54.000 I think we're going to have a far more invasive problem.
01:58:57.000 I think we're going to probably have some sort of a wearable thing that allows us to communicate through thoughts.
01:59:02.000 Sure.
01:59:03.000 Well, thoughts would be a next step, but at minimum, I mean, replacing, you don't need the screen on your phone.
01:59:08.000 You have contacts that are connected to something and everything is just displayed.
01:59:12.000 I mean, there will be steps.
01:59:13.000 Jamie, whatever happened with that Microsoft thing that we were looking at?
01:59:16.000 Remember when they had the little mouse that was dancing in your hand or the elephant that was dancing in your hand?
01:59:20.000 It was an augmented...
01:59:21.000 Magic Leap?
01:59:22.000 Yes.
01:59:22.000 It's available.
01:59:23.000 Microsoft has HoloLens, and they're on HoloLens 2 now, but they've moved more towards commercial applications for it as opposed to consumer availability.
01:59:33.000 There are consumer availability AR things coming out right now.
01:59:38.000 What Apple just showed at their WWDC event this month, or actually on Monday...
01:59:43.000 It's really cool.
01:59:44.000 It's still just like watching through that phone though.
01:59:47.000 I don't think anyone's made the device like a glasses type AR thing yet because the field of view isn't right.
01:59:55.000 They haven't mastered that.
01:59:56.000 Either projecting light into your eye, which is what Magic Leap does, or projecting onto the glass that you're then looking at, which is what I think HoloLens and...
02:00:05.000 The other thing does.
02:00:06.000 They haven't figured it out yet.
02:00:07.000 Betamax versus VHS race to see who figures it out?
02:00:10.000 I think so, yeah.
02:00:11.000 But that Oculus Quest, which is different, also just came out, is really cool.
02:00:15.000 And they're so much closer.
02:00:18.000 They could be within a year or two or something could come out at the end of this year that hasn't been announced.
02:00:22.000 We're very close.
02:00:23.000 The question is, how much is that going to affect daily life with augmented reality?
02:00:28.000 Yeah.
02:00:30.000 The other one that relates to that also is right now you at least can put your phone away.
02:00:37.000 Right.
02:00:39.000 What happens when the line between the technology and the body is...
02:00:43.000 Yeah, I have a bit about it.
02:00:44.000 I'm very concerned.
02:00:45.000 I really am.
02:00:46.000 I think we're giving up agency to something that has no feelings for us at all.
02:00:52.000 I think the problems people have in practice often are different than the ones – I mean there's no transparency with a lot of the companies that are developing these technologies and setting up the algorithms and whatever.
02:01:06.000 There's really no transparency about what it is that's going on, what the end goals are, what the broader effects on society are going to be.
02:01:14.000 I know you've had Jonathan Haidt on who has talked a lot about the disproportionate effect of social media on suicidality, particularly in young girls relative to boys.
02:01:24.000 It's been years now that this stuff has been around and we're now kind of figuring that out.
02:01:29.000 So it's inevitably we're behind always in figuring out what the effects are because you need time to measure it.
02:01:36.000 And that as things advance more and more quickly, whatever damage is potentially going to be done will happen even faster.
02:01:42.000 Yes.
02:01:43.000 Yeah, that's what the concern is, that we are always behind.
02:01:45.000 And that it's sneaking into our lives before we have any idea of how dangerous it is.
02:01:51.000 Sure.
02:01:52.000 I mean, this happened with, you know, the food, the canned and processed food revolution of the 50s and 60s.
02:01:58.000 It was slower, but...
02:02:01.000 It was the same type of thing where all of these advancements and being able to make food last longer via how it was processed and stored, it all sounded awesome in a time when food would just go bad.
02:02:10.000 Then we started learning about all the bad things that came with it.
02:02:14.000 Exactly.
02:02:14.000 Yeah.
02:02:16.000 Anything more before we wrap it up?
02:02:18.000 I think that's it.
02:02:18.000 Oh, so two things I wanted to mention.
02:02:20.000 One, when I announced that I was going to be on the show, companies started contacting me saying, we will give you money if you work our name, our product, into the conversation.
02:02:31.000 What's the product?
02:02:32.000 I'm not going to say.
02:02:33.000 But I do want to talk about car insurance briefly.
02:02:37.000 Have you heard that that's happened to other guests?
02:02:39.000 No.
02:02:40.000 You haven't?
02:02:40.000 Interesting.
02:02:41.000 That's interesting.
02:02:42.000 Yeah.
02:02:43.000 Wow, that's a weird sneaky thing.
02:02:45.000 Yeah.
02:02:46.000 And no one's ever paid me to do that.
02:02:48.000 No one's paid you.
02:02:49.000 No.
02:02:50.000 No, no one's ever paid me to have a conversation on a podcast.
02:02:55.000 Oh.
02:02:56.000 But one company did want to advertise and they wanted their CEO to come on the podcast and discuss their product.
02:03:07.000 And I was like, give me like an infomercial.
02:03:10.000 I was like, no.
02:03:10.000 And they're like, well, you've talked about people before that have had products before.
02:03:13.000 I go, yeah, because I like their product.
02:03:15.000 Right.
02:03:15.000 And I think what they're doing is cool.
02:03:17.000 Right.
02:03:18.000 Zero financial investment in their product.
02:03:21.000 Right.
02:03:21.000 I only did it because I like it.
02:03:23.000 Sure.
02:03:24.000 Well, I mean, my audience knows that we do sponsored stuff.
02:03:28.000 Sure.
02:03:28.000 I disclose it.
02:03:30.000 I'm clear.
02:03:30.000 And my approach is I'm super upfront with my audience, which is, listen.
02:03:34.000 Me too.
02:03:35.000 Only like half a percent of you are paying for a membership.
02:03:38.000 The memberships are six bucks a month.
02:03:39.000 I know like 80% of you can afford it.
02:03:42.000 Only like half a percent are doing it.
02:03:44.000 That's fine.
02:03:44.000 I'm going to keep doing the show, but I'm going to put some sponsored content up.
02:03:48.000 You don't have to watch it.
02:03:49.000 I'm going to mark it as such, period.
02:03:51.000 And I feel like for the most part, we have kind of an understanding of how it all works.
02:03:54.000 There's nothing wrong with it as long as it's products that you actually enjoy and, again, that you maintain that transparency and that honesty.
02:04:01.000 Absolutely.
02:04:01.000 There's nothing wrong with that.
02:04:02.000 If I don't have that with the audience, I don't have anything.
02:04:05.000 Right.
02:04:06.000 And I've been asked to compromise it.
02:04:09.000 You have?
02:04:10.000 Yeah, for sure.
02:04:13.000 It would be worth so much.
02:04:15.000 Yeah, I mean, there's this moral hazard sort of situation that exists with insurance where the people who don't really need the insurance are the ones that the insurance companies want to insure.
02:04:24.000 And the people that are more likely to use the insurance, the insurance companies are like, we're going to have to charge you six times as much type of thing.
02:04:30.000 It's easier to get the sponsorship money from stuff that's less interesting or less aligned with it.
02:04:37.000 Or whatever.
02:04:38.000 And I don't know.
02:04:39.000 I mean, it's an ongoing battle.
02:04:40.000 I don't talk to any of our advertisers.
02:04:42.000 Like, we have a team that handles all of that, and that is great.
02:04:47.000 But there are still calls to make about, like, what is on this side of the line, what's on that side of the line.
02:04:52.000 I try to make the right calls.
02:04:54.000 No, I think you're doing a great job.
02:04:55.000 I appreciate your show.
02:04:57.000 I appreciate your time.
02:04:58.000 Thank you.
02:04:58.000 Thanks for coming down here.
02:04:59.000 My pleasure.
02:05:00.000 Tell everybody where they can find you.
02:05:01.000 dpakman, that's P-A-K-M-A-N. I'm on Twitter at dpakman.
02:05:05.000 I'm on...
02:05:06.000 Where am I? I'm on Instagram at David.Pakman.
02:05:09.000 And my website, DavidPakman.com.
02:05:11.000 All right.
02:05:12.000 Thank you, David.
02:05:12.000 Appreciate it, man.
02:05:12.000 My pleasure.
02:05:13.000 Thank you for having me.