The Joe Rogan Experience - April 09, 2020


Joe Rogan Experience #1456 - Michael Shermer


Episode Stats

Length

1 hour and 53 minutes

Words per Minute

168.16

Word Count

19,027

Sentence Count

1,346

Misogynist Sentences

15


Summary

In this episode, Joe Rogan sits down with skeptic and author Michael Shermer to talk about the COVID-19 pandemic and its fallout. They discuss how social distancing, remote work, and online education may permanently change daily life; the economic damage to theaters, gyms, and small businesses; and the uncomfortable trade-offs between reopening the economy and saving lives. They compare the pandemic responses of South Korea and Germany, talk about diet, exercise, and immune health, and Shermer describes his brief experience taking hydroxychloroquine as a prophylactic. The conversation closes with political polarization, the civil-liberties costs of contact-tracing technology, and lessons from the Snowden revelations, the Pentagon Papers, and Operation Northwoods.


Transcript

00:00:03.000 Okay, here we go.
00:00:04.000 Three, two, one, boom, and we're live.
00:00:07.000 Mr. Shermer, how are you, sir?
00:00:09.000 I'm fine, thank you.
00:00:10.000 I'm still breathing.
00:00:11.000 It's good to see you again.
00:00:13.000 We were just saying before we got started that the last time we saw each other was we went to dinner about six weeks ago, and you're thinking that that might be the end of that kind of stuff.
00:00:25.000 That was my last time I've been in a restaurant, actually.
00:00:28.000 Well, I think restaurants, of course, will reopen, but I think the kind of social distancing we're seeing now, it's not going to go all the way back to the way it used to be.
00:00:36.000 I think we may quit shaking hands and hugging to the extent that we used to, although I don't think we'll ever go all the way to, say, the Japanese model of social distancing.
00:00:46.000 But I think there'll be modifications like that.
00:00:49.000 The other thing I've been thinking about is the change of remote, say, meetings and education.
00:00:55.000 I mean, I'm in the studio here in Santa Barbara where I've been recording lectures for my Chapman University class, Skepticism 101. And I just upload them and share them with the students, and then they watch them, and then I send them a quiz, they take the quiz, they send them back.
00:01:09.000 Now, that's not a complete replacement of a brick-and-mortar building with a small-class seminar discussion, say, but it does adequately replace a lot of traditional education that you don't really need to be in a classroom for.
00:01:24.000 Do you think that this is preparing us for the ultimate, where we embrace the symbiotic relationship that we have with computers and become one with the machine?
00:01:33.000 I mean, it seems like we're becoming closer and closer to some sort of an electronic community.
00:01:39.000 It's weird.
00:01:39.000 Yeah, I think it was happening slowly already, and this is kind of a jump-starting it.
00:01:45.000 I mean, already tech companies like Zoom are having to, you know, ramp up their game because, you know, the systems are crashing because pretty much everybody's doing Zoom meetings now.
00:01:53.000 Yeah.
00:01:53.000 And then they have to adjust to Zoom bombing because, of course, there's, you know, people like that out there that just want to screw with you.
00:02:00.000 Yeah.
00:02:02.000 And then I was also thinking about things like theaters.
00:02:06.000 Why do we need to go to theaters anymore?
00:02:08.000 I mean, I love watching a movie on a big screen, but the screens we have at home now, big television screens, super high def, why not just watch movies at home?
00:02:18.000 Well, I don't think we're going to have much of a choice.
00:02:20.000 I was reading an article this morning about AMC theaters.
00:02:22.000 They might have to go under.
00:02:24.000 Because of this.
00:02:25.000 Really?
00:02:25.000 Yeah, it's not good.
00:02:27.000 I mean, you got to think, these companies are accustomed to having a certain amount of money coming in every month, and they never, no one anticipated anything like this, where all businesses are just going to shut down.
00:02:38.000 Gyms, I mean, how many gyms are going to go under?
00:02:41.000 How many yoga studios?
00:02:43.000 I mean, it's a strange and trying time for people who have small businesses, for sure.
00:02:49.000 Yeah, one of my cycling buddies owns the La Cañada Theater Complex and, of course, rents out the space to different retailers, including the theater managers.
00:03:01.000 Anyway, he was telling me that they normally pay $93,000 a month in rent, but they bring in like $7.5 million a year or something, so it all balances out.
00:03:10.000 But they just told him, we're not going to make our rent this month.
00:03:14.000 So he has to go to his mortgage company, you know, the bank, where he pays off his mortgage and say, I can't pay you this month because these guys can't pay me.
00:03:21.000 Okay, so multiply that by, you know, 10 million or 100 million or something, and that's kind of what we've been going through.
00:03:27.000 Yeah, and I don't really understand the economics of this stimulus package of how they're going to be able to distribute it and sort of balance people out.
00:03:34.000 It seems like it's just a small band-aid on a very large wound.
00:03:38.000 Yeah, well, of course, the government can't just print money indefinitely.
00:03:43.000 Then we're going to get huge inflation, and that could be catastrophic.
00:03:47.000 You know, so this conversation that people have been wanting to have, but they get hammered every time they bring it up, I think at some point we're going to have to have in the next few weeks, is the economic trade-off and costs to people's lives.
00:04:02.000 Yeah.
00:04:15.000 Well, but at some point, you know, there's an economic calculation.
00:04:19.000 Like, how many people are going to die, say, in the next year if we never open the economy?
00:04:23.000 Of course, we will.
00:04:24.000 But, you know, at what point do you do that?
00:04:26.000 You know, the supply chain dries up.
00:04:29.000 You can't get not just toilet paper, but, you know, food supplies start to dry up, and then you get social unrest.
00:04:34.000 And, you know, there's risks there, too.
00:04:36.000 And the idea of putting a dollar figure on a human life is repulsive to most of us, I think, intuitively.
00:04:43.000 Yeah.
00:04:44.000 In this context, but in fact, we do it all the time.
00:04:47.000 You know, in terms of like an automobile company has to pay off the family of somebody who died in their car.
00:04:53.000 Well, there are people who do those calculations.
00:04:56.000 Like, what's the value of a human life?
00:04:59.000 And the figure is, well, the high-end figure is about $10 million.
00:05:03.000 And after 9-11, the families got paid off, I think it was $250,000 a person times the 3,000-something victims.
00:05:12.000 And, you know, so it sounds so cold.
00:05:14.000 Like, who does those calculations?
00:05:16.000 Well, statisticians do that sort of thing, and attorneys and accountants work on that, and judges and juries have to...
00:05:34.000 Do you think that's another way to do this?
00:05:38.000 There's been some talk of isolating the people that are high risk, isolating the people with underlying conditions, people that are elderly, things of that nature.
00:05:46.000 Do you think that that's a way that they can move forward?
00:05:50.000 Yeah, for sure.
00:05:51.000 But again, we have this egalitarian mode.
00:05:55.000 That doesn't sound right.
00:05:56.000 It sounds like a death panel.
00:05:57.000 Some group of government agents are going to tell us who's going to live and who's going to die.
00:06:04.000 And that feels like we're sliding into conspiracy mongering.
00:06:07.000 But in fact, that is what you have to do, is kind of a triage.
00:06:12.000 And, you know, South Korea has been pretty good about this, you know, testing everybody.
00:06:16.000 They jumped on it right away.
00:06:17.000 They did that track and trace.
00:06:20.000 You know, they got it right down to, I think it was the 31st patient they found who had gone to two church services, and then she was in a car accident and taken to the hospital, and that's when it spread from there.
00:06:34.000 That one day, I think it was in late February when that happened, and they just jumped all over it.
00:06:40.000 Total deaths in South Korea, I think, is just a couple hundred compared to most other countries.
00:06:46.000 That's pretty impressive.
00:06:47.000 It is, and they've been, you know, super careful about isolating people and targeting the people that most need the tests and so on.
00:06:54.000 And, you know, that's just the kind of thing I think we have to do.
00:06:56.000 What do you think about what's going on in Germany?
00:06:59.000 Because Germany is very fascinating, right?
00:07:00.000 It's, I mean, some of these European countries, particularly Italy, are experiencing this very high death rate.
00:07:05.000 But Germany, I mean, they must have exemplary healthcare.
00:07:09.000 They must be doing something right or be robust and healthy individuals.
00:07:13.000 Is it a genetic thing you think?
00:07:15.000 Is it a healthcare thing?
00:07:16.000 Because they have a very low death rate.
00:07:19.000 I think they have a high, tight culture, a very tight culture.
00:07:22.000 That is, my wife's from Cologne, Germany, so I know this from personal experience, but also there are studies.
00:07:29.000 Michele Gelfand does these studies on loose and tight cultures, and Germany is a very tight culture.
00:07:33.000 That is to say, very law and order, law abiding, and when the German government says, all right, this is what we're going to do, people do it.
00:07:40.000 And Americans are not that.
00:07:42.000 We're a much looser culture, more freedom-oriented.
00:07:45.000 And if the government says you can't go to the beach, well, the hell with it.
00:07:48.000 I'm going to the beach anyway.
00:07:49.000 Germans don't do that.
00:07:50.000 And they do have a really good health care system, and they jump right on it.
00:07:54.000 And I think that's one explanation.
00:07:57.000 We've seen yesterday this rise in deaths of African Americans versus white Americans and having to do with income.
00:08:04.000 But, of course, money is just a proxy for something else, which has to do with the quality of the health care they get, the food that they eat, how healthy and exercise-prone they are or not, diabetes, obesity, these sorts of things.
00:08:20.000 Down the line, when you're attacked by a virus like that, those can have an effect on your immune system and therefore on the response to the disease.
00:08:28.000 I think those are the kinds of things we're going to have to target to save lives.
00:08:35.000 And I think countries like South Korea and Germany have been doing that pretty well without pushback.
00:08:40.000 Yeah, I'm hoping.
00:08:42.000 My best hope out of this is that it's a wake-up call for people that don't take care of their bodies.
00:08:47.000 The people that get through this, like you dodged the bullet.
00:08:50.000 Now, let's clean that diet up.
00:08:52.000 Let's get you moving.
00:08:53.000 Let's start some exercise on a regular weekly basis.
00:08:57.000 Get some nutrients into your system.
00:08:59.000 Eliminate all this sugar and bullshit that people eat.
00:09:01.000 And, you know, take care of your body.
00:09:04.000 Take care of your immune system.
00:09:05.000 Let's pump everything back up to sustainable levels.
00:09:09.000 I mean, that very well could be the difference between people who contract this virus and survive versus people who contract this virus and don't.
00:09:17.000 Absolutely.
00:09:18.000 I mean, how could it not hurt to be healthy, fit, and have a good immune system?
00:09:23.000 Even if, for some reason, we can't find the exact connection to this particular virus, just as a global thing, even if you do all that, and it turns out there's no connection to this particular virus, it's still a good thing to do.
00:09:34.000 You've been cycling, right?
00:09:36.000 I know you're a cycler, and so you were actually riding your bike today, right?
00:09:39.000 Yeah, this morning.
00:09:41.000 Actually, this is really funny, this kind of world we live.
00:09:43.000 There's no more group rides, of course.
00:09:45.000 And all the big tours, like the Tour de France, have been canceled.
00:09:49.000 So there, I'm riding along this morning, and I see up ahead of me TJ van Garderen, who's the top American pro right now.
00:09:56.000 And apparently, he lives over in the Santa Ynez Valley area.
00:10:00.000 And here he is cycling along in Santa Barbara by himself.
00:10:02.000 So I'm chasing along to see how long I can stay up with this guy.
00:10:07.000 And of course, he's much younger and faster than me.
00:10:09.000 But all of a sudden, he just stops.
00:10:11.000 And he picks something up from the ground.
00:10:12.000 So I pull up and he holds up a $5 bill and he goes, look, I found five bucks!
00:10:17.000 Man, it's the little things in life that just kind of make your day.
00:10:21.000 He was happy with the five bucks and I got to say hi to the great TJ van Garderen.
00:10:25.000 So that was kind of funny.
00:10:26.000 That's awesome.
00:10:26.000 I don't follow professional cycling, but I can understand your enthusiasm if he's the top guy.
00:10:31.000 That's pretty cool.
00:10:32.000 Yeah, I was.
00:10:33.000 So, yeah, so I think, you know, just working out every day, I mean, you mentioned every week, I think people need to work out every day.
00:10:40.000 Yeah, for sure.
00:10:40.000 And I find the more you work out, you know, you don't have to eat as much because your body becomes more efficient at processing fuel, so I have less desire to eat.
00:10:50.000 Really?
00:10:51.000 Yeah.
00:10:52.000 Not me, baby.
00:10:53.000 Yeah.
00:10:54.000 When I work out, oh my god, I get so hungry.
00:10:58.000 Yeah, I don't know.
00:10:59.000 I've been doing that after you were promoting that, the daily fast, you know, don't eat from dinner time.
00:11:05.000 And I try to make it to like 11 or 12. And if I work out in the morning, if I go for a couple hour bike ride, I can make it.
00:11:11.000 Because you're doing something.
00:11:12.000 You have to be active.
00:11:13.000 Because if you're sitting around, then you're going to get hungry.
00:11:15.000 I do that.
00:11:16.000 Yeah, I do intermittent fasting.
00:11:17.000 I try to do 16 hours.
00:11:19.000 The only time I don't do that is when I have explosive work that I'm doing in the morning, like if I'm doing Muay Thai or something like that that requires a lot of it.
00:11:26.000 Then I'll just have a couple pieces of fruit, but I keep it pretty light.
00:11:30.000 But other than that, I try to do 16 hours.
00:11:32.000 I do a 16-hour non-feeding phase and then just eat in the other time.
00:11:37.000 And your body gets accustomed to it.
00:11:39.000 It's not hard.
00:11:40.000 It's not hard to do because it becomes normal.
00:11:44.000 The other thing I did that I've been recommending to people that are not into cycling or something like that is I got these wrist and ankle weights just at the local sporting goods store, but you can order them on Amazon.
00:11:55.000 Just five pounds each wrist, five pounds each ankle, and just walk briskly.
00:12:00.000 I just take my dog to the local park, and we just go up this hill and down, up and down this hill with these weights.
00:12:05.000 And you don't have to run.
00:12:07.000 I know people don't like running.
00:12:08.000 I don't like running all that much.
00:12:10.000 But with the weights, you get extra...
00:12:13.000 Extra upper body and it works the big muscles of your legs.
00:12:17.000 I know a lot of people hate running, so you don't have to run.
00:12:21.000 Just walk, get the heart rate up, work the big muscles, and circulate the blood and all the bodily fluids and so on.
00:12:29.000 That's just good for general health, and I do think that has to help for response to the coronavirus.
00:12:35.000 Again, in these populations that are more targeted, you know, there's a lot of obesity and diabetes and some of these other secondary conditions, you know, these preconditions, as they call them.
00:12:48.000 And, you know, there's this peculiar thing we're all doing now, like what I've done for years, is you look in the obits and you see, okay, well, this guy, he was older than me or he had this or he had that.
00:12:56.000 It doesn't apply to me.
00:12:57.000 You know, we all do that with the coronavirus.
00:12:59.000 Okay.
00:12:59.000 Oh, that person was old or they had this precondition and so on.
00:13:02.000 You're hoping that somehow I'm going to dodge this bullet because of that.
00:13:06.000 It may not be.
00:13:07.000 Right.
00:13:08.000 It's all a game of probabilities of just stacking the odds in our favor.
00:13:13.000 Who knows if that'll make a difference for you or me personally, but on average, it's got to make a difference.
00:13:18.000 It's such a strange virus, isn't it, in terms of the way so many people are asymptomatic?
00:13:25.000 Yeah, so this idea that it came to America, what, January 28th or something in Seattle, I have a feeling it's going to turn out to have been earlier, like in December.
00:13:37.000 Right.
00:13:56.000 But it looked like, I've read this article twice, I don't understand it because this isn't what I do, but they really showed that it very, very likely made the leap from probably bats.
00:14:06.000 You know, bats are mammals.
00:14:08.000 Bats are very respiratory.
00:14:10.000 They're very social.
00:14:12.000 You know, there's this cave in Texas where there's like 20 million bats that live in this cave.
00:14:17.000 I mean, they're just pressed in.
00:14:19.000 It's like the same population of Mexico City.
00:14:21.000 And they come out at night and so on.
00:14:23.000 And, you know, these wet markets in Wuhan, China, it's not that wet markets by themselves are bad, but, you know, say you have dead fish.
00:14:30.000 It's a wet market.
00:14:31.000 But live animals, and particularly mammals like bats... Yeah, I think...
00:14:57.000 I think we're going to have probably a mutated, more modest strain of it forever.
00:15:02.000 And we'll just have to get our flu shots for that one every year and just kind of mitigate it that way.
00:15:09.000 Yeah.
00:15:10.000 Have you been paying attention to these potential remedies like hydroxychloroquine mixed with a Z-Pak and zinc?
00:15:17.000 What are your thoughts on that?
00:15:18.000 I took it for two days.
00:15:20.000 Did you?
00:15:20.000 Yeah, I did, yeah.
00:15:21.000 Well, a couple, I don't know, maybe a month ago now, my doc, who's also a good friend and a fellow cyclist, he's the guy that did my neck surgery.
00:15:28.000 You know, I had a fusion on my neck after I had a bad bike crash last year.
00:15:31.000 And so he's a good friend, and so he just texted me out of the blue and goes, Hey, have you heard about this hydroxychloroquine?
00:15:38.000 No, I never heard of it.
00:15:39.000 Oh, yeah, the, you know, the malaria drug.
00:15:41.000 Okay.
00:15:42.000 All right.
00:15:42.000 So, you know, he works at a big hospital in L.A., Huntington Hospital.
00:15:46.000 And, you know, so I could understand why he, you know, was doing it as a precautionary thing, as a prophylactic against it.
00:15:53.000 And there's some evidence anecdotal, you know, but anyway, so he wrote me a script.
00:15:58.000 I tried it for two days.
00:15:59.000 It's pretty toxic.
00:16:00.000 You know, if you follow the prescription exactly, you know, the chances of having bad side effects are pretty low.
00:16:08.000 So when Trump says, you know, what have you got to lose?
00:16:11.000 Of course, nothing's risk-free.
00:16:12.000 But if you follow the exact prescription, then the chances of having bad effects are pretty low.
00:16:20.000 Unlike that guy in Arizona that found it in his fish tank cleaner.
00:16:24.000 It was some fish tank antifungal chemical, and he drank it and died.
00:16:29.000 Okay.
00:16:29.000 You can't do that kind of stuff, but that would apply to any kind of medication.
00:16:32.000 Yeah.
00:16:33.000 Right?
00:16:33.000 Anyway, so I tried it for two days, and I didn't feel good.
00:16:36.000 I worked out, and then I came back one day, and my wife says, you know, you stink like a toxin, like poison.
00:16:41.000 I'm like, ooh, okay, yeah.
00:16:43.000 I think...
00:16:44.000 That's not good.
00:16:46.000 Well, you're not supposed to take it unless you get in contact, right?
00:16:50.000 Were you thinking that you could have been in contact?
00:16:52.000 Well...
00:16:53.000 Well, you know, I don't know.
00:16:55.000 Again, it could be that lots of us have it.
00:16:57.000 That's the weird part, right?
00:16:59.000 The I don't know.
00:16:59.000 Symptom-free.
00:17:00.000 The I don't know, right?
00:17:01.000 The I don't know.
00:17:02.000 That's what keeps people up at night.
00:17:04.000 You're lying in bed.
00:17:05.000 You're like, what about that guy?
00:17:06.000 He was close to me.
00:17:07.000 What about this?
00:17:08.000 What about that?
00:17:09.000 What if someone at the grocery stores got it and I touched the cart?
00:17:13.000 Two of my cycling buddies had really bad colds in December.
00:17:17.000 Dry cough, fever, you know, all the symptoms.
00:17:20.000 And they're now saying, huh, I wonder if I had it in December.
00:17:24.000 And the reason this is important to know is because that would increase the size of the denominator of the equation, where you have the number of deaths divided by the number of people that got it.
00:17:33.000 And it's that bottom number we just don't know because the testing has just been ramped up.
00:17:38.000 So there might be lots of people that had it in January and February and they got better.
00:17:41.000 We just don't know.
00:17:42.000 Or they were symptom-free and they didn't know they had it.
00:17:44.000 Or maybe even earlier, say December.
00:17:47.000 That Nature paper I referenced, they trace it back.
00:17:50.000 I don't know how they do this genetically with mutations or whatever, but to say mid-November in China.
00:17:55.000 So people were coming from China to the United States throughout November and December, so it's entirely possible it's been here longer, and therefore the death rate is not nearly as catastrophic as it seems like it could be.
00:18:09.000 Is there a way to test whether or not you've had it for antibodies?
00:18:14.000 I think there's a test now.
00:18:16.000 I don't think it's in the United States.
00:18:18.000 Where was this?
00:18:19.000 Maybe it was South Korea.
00:18:21.000 Just a pinprick.
00:18:44.000 That would be something like a vaccine.
00:18:48.000 Yeah, that's all very promising.
00:18:50.000 You know, it's really interesting, too, because this has become such a hot political topic.
00:18:55.000 You know, there's so many people that are angry at Trump, but they were angry at Trump back when he was closing the travel from China, which turned out to be a great idea.
00:19:05.000 And, you know, Donald Trump Jr. tweeted today a compilation of CNN and all these other different networks giving out bad information way back in January.
00:19:17.000 Bad information saying, this is going to be fine.
00:19:20.000 Don't worry.
00:19:21.000 It's not as deadly as the flu.
00:19:22.000 You should worry about the flu.
00:19:24.000 Don't change your plans.
00:19:25.000 Don't do anything.
00:19:26.000 So a lot of people got this wrong.
00:19:29.000 But so many people are trying to make this a political point right now.
00:19:33.000 And it's really...
00:19:34.000 It's so...
00:19:35.000 So useless.
00:19:38.000 Pointing fingers and everything at this point in time.
00:19:40.000 What they need to concentrate on now is just getting masks, getting PPE equipment, keeping people healthy if they can, and then educating people on how to keep your immune system strong.
00:19:51.000 Let's try to get people to understand the consequences of not taking care of your body.
00:19:59.000 This has to be the worst job in the world, President.
00:20:01.000 No matter what you do, half the people are going to hate you.
00:20:05.000 Who would want that job?
00:20:07.000 I don't know, because it doesn't even pay that well compared to other professions, at the top end of other professions.
00:20:13.000 There's a lot of articles now about how autocrats around the world have been taking advantage of the... Yes.
00:20:42.000 So, let's say, do the counterfactual.
00:20:44.000 Let's say he closed the borders in late January or early February or something like this.
00:20:49.000 I mean, just clamp down on all travel and so on.
00:20:53.000 He would have been totally accused of being an autocrat.
00:20:57.000 He wants to be a dictator, and look what he's doing.
00:20:59.000 So, he doesn't do that, and then he's accused of not doing enough when it looks like we should have done more.
00:21:06.000 And then, you know, the other day when he said, well, I'm not going to tell all the governors what to do.
00:21:11.000 You know, I'm going to honor states' rights for now.
00:21:13.000 And, of course, he gets hammered for that.
00:21:15.000 And it's like, you know, but that's actually...
00:21:18.000 What are you supposed to do?
00:21:19.000 That's not what an autocrat would do.
00:21:21.000 An autocrat would say, yeah, I'm telling everybody what to do.
00:21:23.000 Right, exactly.
00:21:24.000 Exactly.
00:21:26.000 You know, I think maybe we should, you know, drop all the polarization politically.
00:21:31.000 Right.
00:21:54.000 Including Hillary, right?
00:21:56.000 So, you know, maybe we ought to do that.
00:22:00.000 I know people just can't stand Trump, and just the idea like saying something nice or supportive or not being critical seems hard to do, but maybe this is way worse than 9-11.
00:22:10.000 Yeah, and what you're saying is totally correct.
00:22:14.000 It seems like the polarization is even worse, though, than it was in 2001. It seems like it just keeps ramping up, and Trump is such a naturally polarizing figure that it's gotten the left versus right...
00:22:38.000 I was just showing this graph of the people that self-identify as centrist versus now, which is more polarized.
00:22:49.000 You have this two-hump camel.
00:22:51.000 Here, and that's from 1994, 2004, and then close to today.
00:22:57.000 When did that shift?
00:22:58.000 Is that a Trump shift?
00:22:59.000 That little dip in the middle?
00:23:00.000 No, no, no, no.
00:23:02.000 About 2000, well, really under Obama.
00:23:05.000 About 2008, the polarization got worse and worse.
00:23:08.000 I mean, we can speculate why, but that's pretty much when it happened.
00:23:11.000 Around 2004, 2005, and it gets ramped up.
00:23:14.000 So just pollsters asking people, you know, how do you self-identify?
00:23:17.000 You know, centrist, far left, far right, strong Republican, strong Democrat, whatever.
00:23:22.000 And so that middle ground has been shrinking.
00:23:25.000 The centrist has been shrinking, and the polls have been increasing.
00:23:28.000 So more and more people are polarized.
00:23:30.000 Now, you know, conservative talk radio...
00:23:32.000 And television or MSNBC, whatever, you want to accuse the media.
00:23:38.000 But in general, I think we've just been more polarized in the sense of not just saying, well, I disagree with you, I think you're wrong, but that you're evil, you're immoral, this is the worst thing that's ever happened to us, and so on.
00:23:53.000 This kind of ramping up of the catastrophism is not healthy.
00:23:58.000 Yeah.
00:23:58.000 No, no, it's not.
00:24:00.000 You know, another thing I wanted to talk to you about, Michael, is there's an article today in The Atlantic, which is really interesting.
00:24:04.000 It's about technology.
00:24:07.000 It's contact tracking technology.
00:24:10.000 And there's a real concern about this stuff.
00:24:13.000 First of all, the idea is great that this could free America from quarantine.
00:24:18.000 So this is always the risk, right?
00:24:20.000 The risk is just give up a little bit of your civil liberties, give up a little bit of your freedom, and we're going to keep you safe.
00:24:26.000 And, you know, it brings you to the old Benjamin Franklin quote, you know, he who would give up liberty for freedom deserves neither, or liberty for safety.
00:24:37.000 I'm sure I fucked up that quote.
00:24:39.000 But this technology is very interesting, because they're using it in South Korea, and they're using it in Singapore.
00:24:48.000 The title of the article in The Atlantic is The Technology That Could Free America From Quarantine, and it's out today.
00:24:54.000 And they bring up this conundrum.
00:24:57.000 I mean, nobody wants to give up civil liberties, and civil liberties, once lost, are rarely regained.
00:25:02.000 And this is the real concern here, that if you do allow people to track who you're in contact with, and make sure that, okay, you're testing negative, and you're in contact with people that also test negative, so you're okay.
00:25:15.000 You're okay to travel now.
00:25:17.000 This is a very weird thing, and it gets us into a very gray area.
00:25:22.000 How do you feel about this?
00:25:24.000 I feel about it this way.
00:25:26.000 In general, I'm against that sort of thing.
00:25:29.000 I like the idea of privacy and that I do have a right to not be tracked and you can't have cameras in my home or my yard and so on.
00:25:40.000 In general, I think across the board, that's a good principle, and it follows the Constitution.
00:25:45.000 I think there are times, say, national emergency like this, of course, there's always the risk that, you know, any autocrat can declare a national emergency, grab the power, never give it back.
00:25:55.000 I mentioned examples of this before in Turkey, say.
00:25:59.000 But the difference here, I think, is we do have a constitution.
00:26:01.000 We do have states' rights.
00:26:03.000 We do have courts that litigate these sorts of things.
00:26:05.000 I could see a reasonable measure being taken for, let's say, we're going to do the following for six months.
00:26:12.000 Until we see what happens with this pandemic, and then once that's over, then we're going to revert back.
00:26:18.000 Now, let's say the governor or the president says, well, I'm not going back.
00:26:21.000 Well, then you have courts, and you sue the state or you sue the federal government for violations of civil liberties, and then you can get them back.
00:26:29.000 Right, but that's never happened.
00:26:30.000 We've never gotten back.
00:26:31.000 What happened with the NSA when Edward Snowden revealed how much tracking is actually going on?
00:26:37.000 I mean, that's never been reversed.
00:26:39.000 I know, I know, yeah, I know.
00:26:42.000 I watched that show when you had him on, and oh boy, that was pretty disturbing.
00:26:46.000 Very disturbing.
00:26:47.000 And what was also disturbing was that now it's been proven that the Obama administration lied.
00:26:52.000 They lied about what, you know, it's just metadata.
00:26:55.000 There's no concern.
00:26:56.000 It was not just metadata.
00:26:58.000 They were able to read people's emails.
00:27:01.000 Right.
00:27:02.000 Yes, and this program was started under Bush, and so supposedly when Obama became president, it's like the transparent president, so we're going to stop doing that.
00:27:11.000 Well, that's not the case.
00:27:13.000 So here's a good argument for WikiLeaks and the Pentagon Papers that I recognize as valuable, that we wouldn't have known that without Snowden or the Pentagon Papers.
00:27:27.000 It's good to know what your government is up to.
00:27:31.000 Our mutual favorite subjects of conspiracy theories, we didn't know about a lot of the things Kennedy was doing and Johnson, all the way back to Eisenhower, lying about the Vietnam War, for example, until the Pentagon Papers came out.
00:27:43.000 And then in the 90s, a lot of those documents from the Church Committee hearings on conspiracies in the 70s were released. And there was that business about Operation Northwoods, where Kennedy administration people brought to him this idea of a false flag operation over Cuba: make it look like the Russians were harassing our aircraft or our airports as an excuse for invading Cuba or assassinating Castro and so on.
00:28:07.000 It's like when you had Alex Jones on, he talks about false flag operations, and most of us skeptics go, oh, that's a bunch of nonsense.
00:28:13.000 And then you read these documents that are revealed in these released secret files, like, wow, okay, so we did do that.
00:28:22.000 Not just that, it was signed by the Joint Chiefs of Staff and vetoed by Kennedy, who was like, what the hell are you doing?
00:28:27.000 You know, and then finds himself dead less than a year later.
00:28:32.000 Right.
00:28:32.000 And then all the shenanigans of American intelligence agents manipulating elections in South American democracies.
00:28:45.000 Yeah.
00:28:47.000 Right.
00:29:01.000 Okay, so, you know, this is one reason people believe conspiracy theories is because a lot of them are true.
00:29:06.000 Yeah, that's what's scary.
00:29:07.000 Not all of them.
00:29:08.000 But I point to with you, with the Epstein case, like you were one of the first people, I mean, as a, literally, you're a professional skeptic.
00:29:19.000 And you looked at some of the evidence.
00:29:21.000 You're like, oh, well, you know what?
00:29:23.000 This might be a conspiracy.
00:29:25.000 And I said when Michael Shermer thinks it might be a conspiracy, it's probably a goddamn conspiracy.
00:29:31.000 There's been enough of them.
00:29:32.000 I'm still not sure about that one because after I posted something about the two cameras broke or whatever, somebody wrote me from that prison saying, oh, those cameras are always breaking.
00:29:41.000 It's a little convenient, though.
00:29:42.000 He winds up strangling himself in a way that Michael Baden, the famous autopsy doctor, says is completely inconsistent with hanging, and much more consistent with someone strangling you, including the actual area where he was hanging from,
00:29:59.000 supposedly.
00:29:59.000 It's consistent with someone strangling you from behind, not consistent with you hanging by your own weight.
00:30:06.000 Yeah, after Weinstein got his conviction, I thought, oh boy, they better have a real suicide watch on this guy, because he surely has a black book just as big as Epstein's.
00:30:17.000 Oh, I'm sure.
00:30:18.000 Well, I think what he's got is probably more incriminating to him, though.
00:30:23.000 I think what he's got is probably, hey, I had sex with all these starlets and turned them into big celebrities.
00:30:31.000 And I bet he probably doesn't want that out, especially at this stage of the game.
00:30:36.000 I don't think anything he's got is going to make him look good.
00:30:39.000 And I think the thing with Epstein is he knew way too much about too many powerful people.
00:30:45.000 There's just so many connections that could be made with that guy.
00:30:50.000 And to this day, people are asking questions that people like Bill Gates don't want to answer or Prince Andrew or any of these people.
00:30:57.000 They're like, you know, I don't want to talk about this.
00:31:00.000 Yeah.
00:31:00.000 I didn't hear the one about Gates, but Prince Andrew, of course.
00:31:04.000 Gates apparently flew on the Lolita Express four years after he was convicted.
00:31:10.000 Oh, okay.
00:31:11.000 Wow.
00:31:11.000 All right.
00:31:12.000 According to the Daily Mirror or whatever the fuck it was.
00:31:15.000 I would ask Jamie to look that up, but I've got his computer right now.
00:31:20.000 Look it up on your iPad.
00:31:21.000 Find out if it's true, if Bill Gates flew in the Lolita Express.
00:31:24.000 Because that's what I was reading today.
00:31:26.000 People were trying to ask Bill Gates.
00:31:28.000 But it's so hard to know what's true and what's not true today.
00:31:32.000 That's the thing is there's so much data.
00:31:34.000 And so, I mean, one of the things that's really sad about the loss of respect for mainstream journalism and mainstream media is, well, if we can't count on them, then who's regulating the independents?
00:31:47.000 Who's regulating these websites?
00:31:48.000 Who's regulating these people that are just, you know, so-called independent journalists that are just tweeting things and finding things and putting things up on their websites?
00:31:56.000 It's so hard to tell who's telling the truth and who's not.
00:31:59.000 And who's right.
00:32:01.000 Yeah, when I was working on Giving the Devil His Due, this was kind of a challenge to me because I feel like...
00:32:07.000 There's so much fake news, real fake news, and just bogus theories, particularly in my areas of quack medicine and cancer cures, and now coronavirus cures.
00:32:19.000 The old televangelist Jim Bakker was selling those silver derivative pills that were supposed to fight the coronavirus.
00:32:27.000 So a lot of that stuff is dangerous to have out there.
00:32:30.000 But as a civil libertarian, I feel like, well, but I'm a free speech fundamentalist.
00:32:35.000 I really believe, you know, people, short of just lying about somebody or giving away the nuclear codes or something like that, you know, just let a thousand flowers bloom and just see what, you know, shine sunlight on all of them and see which ones rise to the top because they're supported by evidence.
00:32:49.000 There's a risk.
00:32:50.000 That is to say, people will take bad information and they'll go shoot up a pizzeria or something like that.
00:32:56.000 Now this conspiracy theory about 5G related to the coronavirus, that is to say, the theory is that 5G is causing people to feel ill, to take ill, and that the government made up, the corporations made up this story about the coronavirus as a distraction from 5G. Okay,
00:33:14.000 this is nonsense.
00:33:15.000 That's so criminally stupid.
00:33:17.000 Do you know who Lil Duval is?
00:33:19.000 No.
00:33:20.000 Of course not.
00:33:21.000 He's a hilarious comedian who also has a great Instagram page, and he put something up today.
00:33:27.000 He retweeted something that says, if 184 countries have corona and only five countries have 5G towers, why the fuck would you dummies, why the fuck would you idiots think that 5G towers are causing COVID-19?
00:33:42.000 It's such a great thing.
00:33:44.000 And so many people are like, oh, yeah.
00:33:47.000 Oh, well.
00:33:48.000 And then I literally heard someone say, well, maybe, maybe the 5G causes the coronavirus and then they spread it to other countries.
00:33:56.000 I'm like, oh, God, you don't even understand viruses.
00:33:59.000 Yes.
00:34:00.000 Well, of course, that was accused with 4G and 3G and cell phones back in the late 90s and early 2000s.
00:34:07.000 There was a scare about holding the phone up to your ear.
00:34:10.000 We had all started doing that, and the worry was that this was maybe causing brain tumors.
00:34:13.000 And, of course, you can find anecdotes.
00:34:16.000 This guy spent a lot of time on his cell phone and he got a brain tumor over here on his left temporal lobe or whatever.
00:34:24.000 Right.
00:34:50.000 Yeah.
00:35:00.000 So you have to look at all the different options, and our focus tends to be on the one cell.
00:35:06.000 So I always use this heuristic for my class on teaching skepticism: a two-by-two matrix where you have four cells.
00:35:14.000 So I did this with this documentary film coming out about skepticism.
00:35:19.000 Horror films that are haunted or cursed.
00:35:23.000 Horror films that are cursed.
00:35:26.000 Which is the one where the actors died in the helicopter accident?
00:35:33.000 Oh, the Twilight Zone, the movie.
00:35:35.000 Yeah.
00:35:35.000 And, you know, The Exorcist and these other films where bad things happen to the actors that are in horror films, okay?
00:35:44.000 The problem with that is you're only focusing on one cell.
00:35:47.000 That is, horror films that are cursed.
00:35:49.000 Then there's horror films that are not cursed.
00:35:52.000 Nothing bad happened to the actors in those.
00:35:54.000 And then there's non-horror films, regular films, in which bad things happened to the actors.
00:36:00.000 And then non-horror films that are not cursed, right?
00:36:04.000 So when we just focus on the one cell, it's easy to find examples that fit it.
00:36:09.000 But something like The Shining, which is a super horror-scary film, nothing bad happened to the actors.
00:36:15.000 Or just take some other film like The Godfather, whatever, that's not a horror film and nothing bad happened to the actors and so on.
00:36:22.000 So when you look at all the different options, this is just a way to think about any particular claim.
00:36:28.000 Then there's really nothing left to explain, because you're just plucking out anecdotes.
00:36:34.000 Well, people love coincidences.
00:36:35.000 They really, really love coincidences.
00:36:37.000 They're fun, because they love to believe in spiritual connections, and they love to believe in clairvoyance, and they love to believe in haunted things.
00:36:45.000 Jamie pulled up the article.
00:36:46.000 It's actually in The Sun.
00:36:47.000 It says, Bill Gates breaks silence on Epstein, admitting he made a mistake in judgment by meeting with the pedo tycoon, it says.
00:36:54.000 So there's a little story about it.
00:36:57.000 Who knows?
00:36:58.000 But he is quoted in there.
00:37:01.000 Yeah.
00:37:02.000 I don't know.
00:37:02.000 I know some scientists.
00:37:06.000 A lot of scientists met with him, right?
00:37:08.000 Yeah, because he had a way of attracting famous scientists because he had a lot of money, and he said, look, I can fund your lab, help fund your lab to the tune of millions of dollars.
00:37:18.000 It's hard to resist that, and then maybe you go down that road a little bit, and then you start hearing these rumors about his personal life, and you're like, ugh.
00:37:25.000 Well, yeah, but the money's good for the lab.
00:37:28.000 Somewhere down the line, it becomes obvious it was a bad thing, but it's too late.
00:37:32.000 You already went down that road.
00:37:33.000 It's hard to judge people after the fact, with the hindsight bias.
00:37:36.000 We look back and go, how could anybody have ever had any association with them?
00:37:40.000 It's like, yeah, but we know stuff now that maybe not everybody knew the extent of it back years ago.
00:37:47.000 Now, this book that you wrote, Giving the Devil His Due, the idea is talking to people whose opinions you disagree with and that there's a lot of value in that.
00:37:58.000 Why did you write that and what were you trying to get out of this?
00:38:02.000 Well, in general, I've been kind of a civil libertarian most of my life in that respect.
00:38:05.000 But to be honest, I was kind of inspired after the episode we did of your podcast with Graham Hancock.
00:38:11.000 And I've since gotten to know him and I thought, you know, I was not really fair to that guy.
00:38:15.000 I really didn't give him a fair shake.
00:38:17.000 And there's value in people like him who challenge the mainstream.
00:38:23.000 Now, it's not that outsiders can't make contributions.
00:38:27.000 They can.
00:38:28.000 And we generally tend to be skeptical of outsiders because they're mostly wrong most of the time.
00:38:34.000 But so are scientists.
00:38:35.000 Yeah, I think.
00:39:18.000 And then you've had an opportunity to change your mind.
00:39:22.000 But more importantly still, if you silence people, you refuse to listen to them, then what happens when you take up a contrary position?
00:39:33.000 You come up with some idea that goes against the grain and the norms, or worse, laws are in place to silence you.
00:39:40.000 Now, you've just given up your opportunity to be heard because you've previously endorsed the idea of silencing people.
00:39:47.000 And I don't just mean legally, like passing laws, although that's disturbing enough.
00:39:52.000 Like in many countries like Canada, Austria, Germany, Switzerland, France, Australia, and New Zealand, it's illegal to deny the Holocaust.
00:40:01.000 By which I mean if you say, well, I think one million Jews died, not six million.
00:40:05.000 Therefore, I don't think the gas chambers were used the way we think they were.
00:40:08.000 Therefore, I don't think the Nazis had an intentional plan to exterminate European Jewry.
00:40:14.000 That's now illegal to say that.
00:40:17.000 Now, I've debunked all those claims.
00:40:18.000 I think they're completely wrong.
00:40:20.000 And even if the people who claim it are themselves anti-Semites, you know, I can't know what's in somebody's heart or mind, but let's assume the worst just for the sake of our argument.
00:40:28.000 I would still defend their right to say it because, let's say by analogy, I'm in the middle of a debate about how many Native Americans died since Columbus came here.
00:40:38.000 And now the figure is, I don't know, 90 million, 70 million, 50 million, it's debatable.
00:40:42.000 But let's say I'm a historian and I say, I think it was 10 million.
00:40:47.000 And I think it was mostly by germs, not by guns and steel.
00:40:51.000 Am I a Holocaust denier?
00:40:53.000 And therefore, I should be silenced or worse, jailed for my illegal hate speech, you know?
00:40:58.000 And so this is why I wrote a letter to the judge in David Irving's case in Austria.
00:41:05.000 David Irving's a notorious Holocaust denier in England.
00:41:07.000 So he flew to Vienna from London to give a speech at one of these kind of far-right groups in a hotel somewhere.
00:41:14.000 And he got flagged at the airport.
00:41:16.000 You know, they scanned the passport and boom, he's arrested.
00:41:20.000 And he was put on trial and convicted and sent to jail.
00:41:24.000 He didn't even speak.
00:41:25.000 He was just going to speak.
00:41:27.000 So that essentially is a thought crime.
00:41:30.000 So even though I completely disagree with their arguments, and maybe I don't even want to like these guys because of their attitudes about Jews, I don't like that, but still, I would defend them.
00:41:40.000 So I would apply that to...
00:41:59.000 I agree with that, but can I give you the counter-argument?
00:42:01.000 The counter-argument, particularly online, is that people develop these bubbles.
00:42:06.000 They develop these bubbles where everyone agrees with your perspective.
00:42:13.000 You isolate or self-isolate in these bubbles.
00:42:16.000 And there's this theory that you can indoctrinate young, impressionable people into hateful or racist or ideologically disturbing ideas.
00:42:30.000 By finding them isolated in these thought bubbles.
00:42:33.000 If they get onto particular message boards or a particular website where they subscribe to a YouTube channel or some video channel, then they all meet up in the comments and they agree with each other, but they're all wrong.
00:42:47.000 But they can find confirmation bias in these large groups of people that are also wrong, and they feed off of each other.
00:42:54.000 What do you think about that?
00:42:57.000 It does happen, for sure, and my first response is to encourage people to get out of their bubble, so if you read the New York Times, you should read the Wall Street Journal, and vice versa.
00:43:07.000 Now, of course, that doesn't apply to most people online, but there's new research now since the 2016 election by a number of political scientists and cognitive scientists, nicely summarized in Hugo Mercier's book called Not Born Yesterday.
00:43:22.000 And he shows that those Facebook and online bubbles against Hillary, say, or for Trump or vice versa, probably had next to no effect on the actual election.
00:43:34.000 That is to say, if you believe that Hillary Clinton was running a pedophile ring out of a pizzeria, whether I convince you that that's not true, you're very likely not going to vote for Hillary no matter what.
00:43:47.000 Somebody that believes that is already so far down the rabbit hole, say, down the spectrum of where they are politically, they're never going to switch positions.
00:43:56.000 And even the idea of just sort of slightly negative stories about Hillary or slightly negative stories about Trump that might nudge people, it doesn't look like it had much effect at all.
00:44:05.000 In fact, Hugo shows that most political advertising is a complete waste of money.
00:44:10.000 It does nothing.
00:44:10.000 It doesn't change people's votes.
00:44:12.000 All it does is reinforce to your team, say in the primaries, that you're the best candidate.
00:44:17.000 So it might work for that.
00:44:18.000 But in terms of getting Republicans to vote, say, centrist or Republicans to vote Democrat, the advertising probably has no effect at all.
00:44:28.000 And the same thing with corporate advertising and things like that.
00:44:31.000 It probably doesn't really work.
00:44:32.000 And so I've been thinking about this with the Nazis because I've written a lot about that.
00:44:37.000 The problem to explain is how do you convert an entire nation of people from this, you know, highly cultured, educated, intelligent, Western civilization-leading culture into Nazis that are willing to exterminate Jews and other people.
00:44:55.000 And the answer, I think, is, no, you don't.
00:44:58.000 You have to.
00:44:59.000 You don't have to.
00:44:59.000 Most of them didn't endorse the Nazi ideology.
00:45:02.000 They liked some of it.
00:45:04.000 There were economic policies in the 30s that got Germany out of the Depression.
00:45:08.000 Hitler built the Autobahn and all that stuff, trains ran on time, whatnot.
00:45:12.000 But the exterminationist ideology that the Nazis had, most Germans did not go that far.
00:45:19.000 Now, anti-Semitism was rampant in Europe, including Germany and Poland and Russia especially.
00:46:24.000 But they don't.
00:46:26.000 You silence the people who would have dissented, the people who would tell the rest of us, who think this is the way everybody believes, that they don't.
00:46:34.000 We'll never know because we don't hear those voices.
00:46:36.000 They're silenced.
00:46:37.000 So with those two things, pluralistic ignorance and the punishment of dissenters, you can have this Nazi ideology or Stalinist ideology hover in midair even though no one really believes it.
00:46:49.000 And it's just think of like North Korea when Kim Jong-un's father died.
00:46:54.000 You saw those videos of people just weeping in the streets for days on end.
00:47:00.000 Who actually believes that they feel this way?
00:47:02.000 Well, we don't believe it.
00:47:24.000 They didn't care if people believed.
00:47:26.000 They just wanted compliance, right?
00:47:28.000 They wanted to make sure that people – I mean, they had a long period of time where they forced people to mourn.
00:47:33.000 They wanted them to weep in the streets, and they jailed people for as much as six months for not mourning enough.
00:47:39.000 That's crazy.
00:47:41.000 Yeah, it's horrific.
00:47:41.000 But that's how you run a dictatorship, right?
00:47:44.000 Under fear.
00:47:45.000 There's a story where Stalin gave a speech and then got a standing ovation that went on for like three minutes and then six minutes and eight minutes, nine minutes, ten minutes, eleven minutes.
00:47:56.000 Everybody's going, oh crap, please, somebody sit down.
00:47:58.000 Finally, some apparatchik sat down and he was promptly arrested the next day and sent off to the gulag.
00:48:04.000 Really?
00:48:05.000 Yeah.
00:48:06.000 Wow, 11 minutes, not enough.
00:48:08.000 Now, it's not just him they want to silence, of course.
00:48:11.000 It's a signal.
00:48:12.000 Of course.
00:48:12.000 Like, this is what happens if you don't, you know, maintain this charade.
00:48:17.000 We all know it's a charade.
00:48:18.000 Isn't that...
00:48:18.000 That is an issue with social media, right?
00:48:20.000 I mean, there's...
00:48:21.000 There's people that are writing hateful things on social media, but then there's people that are writing things that are just disagreeable.
00:48:30.000 And when they get silenced, this is oftentimes something that sends a signal to other people to not say disagreeable things, not say questionable things, not say things that is contrary to the orthodoxy.
00:48:45.000 Right.
00:48:46.000 That's right.
00:48:47.000 So even though we don't have censorship laws like other countries we've been discussing, there is this self-censorship that happens out of fear of being canceled in the so-called cancel culture or just squelched by the language police, the politically correct police.
00:49:02.000 So when I ask a show of hands of my students every semester, how many of you self-censor?
00:49:06.000 That is, you want to say something but you don't on abortion or immigration or any kind of politically charged issue.
00:49:12.000 They all raise their hand.
00:49:13.000 Oh, yeah.
00:49:13.000 Every one.
00:49:14.000 I would never say something, not just in class, but in the dorm rooms or just wherever students are gathering.
00:49:20.000 That's the chilling effect.
00:49:22.000 So Giving the Devil His Due, it's pushing back against that.
00:49:25.000 I know you don't want to.
00:49:27.000 Give your devil his due.
00:49:28.000 The devil is whoever you disagree with.
00:49:31.000 I know you don't want to.
00:49:32.000 I don't want to either.
00:49:32.000 But we have to, for our own safety's sake.
00:49:35.000 If I want to be heard, and I want you to take me seriously and listen to what I have to say, I have to respond in kind.
00:49:41.000 I have to practice the principle of reciprocity or interchangeable perspectives.
00:49:46.000 I have to see it from your perspective.
00:49:47.000 He wants to have his voice, so do I. And so as a principle, it doesn't feel intuitive, like, no, I don't want to give everybody a voice, but you know what?
00:49:56.000 I'm going to override that impulse and do it anyway, if nothing else, selfishly, for my own safety's sake.
00:50:02.000 So my other case chapter in the book besides Graham Hancock is Jordan Peterson.
00:50:07.000 Now, you know, after I saw him on your show and then I saw him getting hammered... He's a wonderful guy.
00:50:23.000 He's the most misrepresented person I've ever met in my life.
00:50:29.000 Willfully, willingly misrepresented.
00:50:31.000 They do it on purpose.
00:50:32.000 They know what they're doing.
00:50:34.000 They want to paint him out with just a series of very quick, easy-to-use adjectives that turn him into a monster.
00:50:41.000 And they don't have anything to back that up.
00:50:44.000 Anything.
00:50:45.000 And it's really strange.
00:50:47.000 It's so disturbing.
00:50:48.000 But it's a very strange left-wing characteristic.
00:50:52.000 And again, this is coming from someone who's on the left.
00:50:55.000 But it is a left-wing characteristic.
00:50:57.000 This need to misrepresent someone, paint them in a straw man fashion as some sort of an evil person so that you can dismiss everything they say that is uncomfortable or that...
00:51:13.000 is contrary to your accepted ideology, the ideology that you subscribe to and that you're defending and that you've identified with.
00:51:21.000 And I think this is a real problem.
00:51:23.000 A real problem that we're having is that people identify with their ideas.
00:51:28.000 If their ideas fall apart, somehow or another they're falling apart.
00:51:32.000 They are a part of the ideas.
00:51:34.000 They're not just a person who has a thought and they can, like if you and I disagreed on something, I would hope that we could just talk about these ideas as if they are separate from us.
00:51:45.000 But oftentimes that's not the case.
00:51:47.000 Oftentimes people, they so identify with those ideas that when those ideas are challenged, they are challenged.
00:51:54.000 They get emotional.
00:51:55.000 They get angry.
00:51:56.000 And they will lie.
00:51:57.000 They will willfully misrepresent you in order to strengthen their position.
00:52:02.000 And this is a terrible, terrible thing that I see.
00:52:05.000 And I see it so much from my side.
00:52:07.000 I see so much of this from the left.
00:52:09.000 And it's so discouraging.
00:52:11.000 And it's so infuriating.
00:52:13.000 And this is one of the things that I love about the concept of your book.
00:52:18.000 I love about this idea that we need open discourse and discussion.
00:52:24.000 And I think we're dealing with a couple of things here, and one of the things I think we're dealing with is the limited kind of communication that's available through social media.
00:52:31.000 It's very limited.
00:52:32.000 You know, writing something in text, if someone responds in text, we're missing on so much nuance.
00:52:38.000 We're missing so much of what it means to interact with someone socially.
00:52:42.000 If you and I are sitting across from each other, person to person, if...
00:52:46.000 I say something insulting to you.
00:52:48.000 I have to see you get upset.
00:52:50.000 I have to feel it.
00:52:51.000 I have to look at you.
00:52:52.000 I have to feel like, what kind of an asshole am I that I said that to you?
00:52:55.000 Why did I hurt your feelings?
00:52:57.000 There's all these things that happen when people are interacting with each other socially, looking at each other in the eye, these cues.
00:53:03.000 This is what made us human.
00:53:04.000 I mean, this is what found community.
00:53:08.000 This is one of the basic tenets of rational discourse, is the ability to communicate with each other in a comprehensive way, in a nuanced way.
00:53:21.000 And so much of that is eliminated entirely when you put things to 140 or 280 characters.
00:53:29.000 Yeah.
00:53:29.000 Yeah, I'll occasionally get really nasty letters.
00:53:32.000 Somebody will email me and, I mean, really nasty, like, you're a piece of shit, you fucking million, just on and on like this.
00:53:38.000 And sometimes I'll write them back and go, hey, you know, are you having a hard day?
00:53:42.000 Because, you know, I didn't mean to be offensive.
00:53:44.000 I was just trying to make this point.
00:53:45.000 And they always write back, oh, my God, I'm so sorry.
00:53:48.000 I didn't know anybody was going to respond.
00:53:50.000 Oh, wow.
00:53:51.000 Yeah, no, I take it back.
00:53:53.000 I didn't mean to be offensive.
00:53:55.000 What you're just saying is so common.
00:53:57.000 I hear that from so many people that are in the public eye that say something back to someone who says something rude to them, and the person's sort of like, well, I didn't mean it.
00:54:06.000 Because it's just a shitty way to communicate.
00:54:08.000 Sending someone an email or writing a blog about someone or tweeting some nasty shit about someone, it's a terrible way to communicate.
00:54:16.000 It's such a one-way, you know, it's a very limited way.
00:54:22.000 To communicate.
00:54:22.000 It's far inferior to actual person-to-person communication.
00:54:28.000 So my chapter on Jordan, you know, I present his views, and then I go, here's where I agree with him, here's where I disagree with him.
00:54:35.000 And to his credit, I sent him a copy of the chapter, and I said, would you blurb my book?
00:54:40.000 And he did.
00:54:41.000 He wrote me that nice long blurb, and he says, this is a rather difficult book for me to blurb, given that an entire chapter is devoted to criticizing my claims about... He's a great guy.
00:55:05.000 People just don't know him.
00:55:06.000 They just don't.
00:55:07.000 I love that guy.
00:55:30.000 And he's become this lightning rod for hate from the left.
00:55:35.000 I mean, I don't even think he's really right-wing.
00:55:38.000 He thinks of himself as a classical liberal, which is a very weird definition.
00:55:42.000 I mean, it's more of a centrist than anything, but he's not right-wing.
00:55:45.000 No.
00:55:46.000 No, definitely not.
00:55:48.000 The problem you identified, though, just a moment ago was that if people identify with their beliefs, that is, the specific, say, political platforms like on immigration, abortion, civil rights, whatever,
00:56:04.000 those are sort of secondary to the deeper core moral values that people hold.
00:56:09.000 I define myself as a liberal.
00:56:11.000 I define myself as a conservative, Republican, whatever.
00:56:14.000 And so when you attack one little thing here, well, you know, I agree with you on this and this and this, but, you know, on the abortion thing, I think you're wrong, and here's why.
00:56:21.000 You know, the impulse is, well, but if I give up on that one, then I'm going to lose all these other ones, and then I've given up my identity, right?
00:56:29.000 So, like, when I used to debate creationists, intelligent design theorists, and so on, you know, I could tell that if I give people a choice, like, you have to choose between Jesus and Darwin, right?
00:56:39.000 For your life, you know, they're not picking Darwin, okay?
00:56:41.000 Because, you know, this sort of belief in their, you know, Christian dogmas about Jesus, that is their core being.
00:56:48.000 Who cares about Darwin and, you know, whoever this scientist was?
00:56:51.000 But if I say, keep Jesus, keep your whole religion, I don't care what you believe, but the science is really good on this, and here's why you should follow the facts, and you don't have to give up anything for it, then it's like, oh, okay, I'll listen, right?
00:57:03.000 So, like with climate change more recently... Most of us don't know much about climate science.
00:57:10.000 It's a technical science.
00:57:11.000 The models are super complex.
00:57:13.000 People send me these papers.
00:57:14.000 I don't really understand them.
00:57:15.000 But if you self-identify, say, as a conservative, then climate change is just a proxy for something else.
00:57:24.000 I believe in free markets and free enterprise, and I'm pro-business, and those guys over there, they want to...
00:58:03.000 Yeah, the polarization is the thing.
00:58:06.000 If you are on one side, you have to subscribe to the whole menu of ideas.
00:58:12.000 And if you're left-wing, you can't really be pro-life.
00:58:16.000 And if you're right-wing, you're supposed to have a certain amount of skepticism about climate change.
00:58:23.000 Right.
00:58:23.000 So when somebody publicly signals where they stand on, say, climate change, what they're really saying is, look, I am publicly declaring my commitment to my team.
00:58:35.000 Yes.
00:58:36.000 Yes, that's a problem, right?
00:58:38.000 That's a problem.
00:58:39.000 The virtue signaling, the saying, I have loyalty to this position because of this is my tribe.
00:58:49.000 That's right.
00:58:50.000 And so a lot of cognitive science studies of reasoning show that we generally don't reason toward finding the truth, but toward defending what we already believe.
00:59:14.000 I'm in that team.
00:59:17.000 And, you know, okay, fine.
00:59:18.000 We're all on teams.
00:59:19.000 That's fine.
00:59:20.000 Defend your team.
00:59:21.000 But, you know, what I try to do in the book is disentangle the specific issues.
00:59:26.000 Let's just take them one by one.
00:59:27.000 Like, why can't I be personally against abortion?
00:59:30.000 I don't want to do that.
00:59:32.000 And I recognize, say, Ben Shapiro's arguments for the rights of the fetus, but I also think we have conflicting moral values there, the rights of a woman.
00:59:41.000 And the history of the way women have been treated and men have always tried to lord it over women's reproductive choices historically and this has always led to bad things like infanticide and back alley abortions and so on.
00:59:54.000 So I got to err on one side or the other.
00:59:56.000 I recognize and acknowledge your arguments are really good.
01:00:00.000 Ben or whoever is a pro-lifer, but I still hold this position.
01:00:03.000 I think there's a lot of progress that can be made socially to kind of reduce the tension when you say, I acknowledge your position, I understand it, you know, steel manning the argument, and then the person on the other side feels like, well, at least this guy's listening to me.
01:00:18.000 Yeah.
01:00:19.000 Well, I think that is the best topic when it comes to that, because particularly when you get to late-term abortions, boy, that's a very hard thing to defend morally and ethically.
01:00:33.000 And it's also one of the things about the abortion topic is that it's so uniquely human in that it's such a messy topic.
01:00:43.000 Here's a clear one.
01:00:45.000 Don't murder people, right?
01:00:47.000 Don't just go up to people and murder.
01:00:49.000 And everyone's like, yeah, that's clean.
01:00:51.000 That's a clean subject.
01:00:54.000 Abortion is not that clean.
01:00:56.000 When is it okay?
01:00:57.000 Is it okay when the fetus is not a fetus, when it's just a bundle of cells?
01:01:02.000 Most people are like, yeah, well, it's not really anything then.
01:01:06.000 Well, it will become a person, though.
01:01:08.000 When do we decide?
01:01:10.000 Well, that's such a messy subject, and it's such a human subject.
01:01:14.000 And I, like you, I am on the side of pro-choice, and I think that it is the woman's choice to decide whether or not she wants to keep the baby.
01:01:24.000 But...
01:01:25.000 I also recognize at a certain point in time, that choice becomes very different.
01:01:29.000 The choice becomes very different when it's a six-month-old fetus.
01:01:32.000 Like, what are we saying there?
01:01:34.000 If you are just, I am pro-choice, period.
01:01:37.000 Okay.
01:01:38.000 Are you pro-choice up until the day of birth?
01:01:41.000 When do you back it off?
01:01:43.000 When do you back it off?
01:01:44.000 And it is a subject that people do not want to broach.
01:01:47.000 They don't want to touch it.
01:01:49.000 And particularly people on the left, when it comes to deciding when it's okay and when it's not okay, because they feel like this is angling towards an elimination of a woman's right to choose.
01:02:00.000 And it angles towards this...
01:02:02.000 It's a very difficult conversation where you recognize that there is a difference between someone who's seven months pregnant and someone who's seven days pregnant.
01:02:12.000 There's a very, very big difference.
01:02:13.000 And if we can't acknowledge that, then we're being tribal.
01:02:17.000 We're being ideologically driven.
01:02:19.000 We're sticking to our position because we feel like if we concede that this is a complex issue, then we open up the door to possibly losing a woman's right to choose and losing these reproductive rights.
01:02:33.000 Yeah, I think part of the problem also is that we tend to dichotomize most moral issues as right or wrong, good or evil.
01:02:40.000 And the problem is that the law has to draw the line somewhere.
01:02:44.000 We have to have a law to get along and so forth.
01:02:47.000 So we have to say the drinking age is this instead of that, or driving age is this.
01:02:53.000 The point at which you can have abortion is right here.
01:02:56.000 But most of life is much more on a spectrum, a continuum.
01:03:00.000 So here I make the distinction in the book between binary thinking and continuous thinking.
01:03:05.000 Most moral issues are on a continuum.
01:03:07.000 You know, like immigration.
01:03:09.000 You know, it's like, close the borders.
01:03:10.000 What, don't let anybody in, ever?
01:03:12.000 Well, no, no, no.
01:03:13.000 We've got to let some in.
01:03:14.000 Okay, then we should open the borders.
01:03:16.000 You mean you want to just open the borders up and let everybody in?
01:03:19.000 No, no.
01:03:20.000 No, I'm not saying everybody.
01:03:21.000 Okay, where do you draw the line?
01:03:23.000 It's another messy human subject.
01:03:26.000 Yeah, yeah.
01:03:27.000 But if you think of it like, well, it's a continuum instead of a binary choice.
01:03:31.000 And, you know, whatever answer, it's not just right or wrong, good or evil.
01:03:34.000 There's, you know, different places to set the...
01:03:48.000 You know, they've slid it way down here.
01:03:51.000 They let almost nobody in.
01:03:52.000 Australia is a little looser but tighter than us and so on.
01:03:56.000 And you can kind of look at the consequences of letting this many people in or that many people and see what it does.
01:04:01.000 Of course, all countries are different.
01:04:02.000 Some are more diverse.
01:04:03.000 Some are more homogeneous.
01:04:05.000 You have to account for that and on and on.
01:04:07.000 So here, I think, you know, instead of thinking of it in these kind of polarized black and white, you know, it's either this or that, and if you're on this side, then you're on the bad side.
01:04:16.000 You know, that's not helpful.
01:04:18.000 So instead of binary thinking, continuous thinking.
01:04:20.000 Abortion, certainly.
01:04:22.000 You just articulated it perfectly.
01:04:24.000 I mean, seven days?
01:04:25.000 Oh, come on.
01:04:26.000 It's just a bundle of cells.
01:04:28.000 But now it looks like by 20 weeks or so, you feel pain.
01:04:32.000 Some consciousness comes online around 24 weeks, 25 weeks.
01:04:37.000 At some point, you've got to draw the line somewhere around there.
01:04:40.000 Now, scientists, of course, they don't want to put lines anywhere.
01:04:43.000 It's week by week, day by day, even hour by hour, the development of the fetus. Yeah.
01:04:59.000 And it creates this, this is like the line in the sand, this polarization line between these two sides.
01:05:06.000 And I think that so much of what people subscribe to when they do choose an ideology, once they choose an ideology, they have this conglomeration of ideas that they adopt.
01:05:20.000 And they adopt in order to be accepted by the tribe.
01:05:23.000 And this is also a very unique aspect of human communication and civilization.
01:05:27.000 We have to adhere to the principles and the ideologies of that tribe.
01:05:33.000 So you just take on all these thoughts.
01:05:35.000 And it's one of the real problems with only having two choices in this country when it comes to politics and when it comes to just styles of life, you know?
01:05:44.000 And there's so many people that take great relish in switching teams, too, which is interesting, right?
01:05:49.000 It's like, I was a liberal my whole life, and then one day I woke up and realized I was being a moron, you know?
01:05:55.000 And now I'm a pro-Second Amendment, pro-Trump, MAGA, make America great, keep America great.
01:06:01.000 It's interesting, because those are sometimes the most...
01:06:05.000 The most passionate supporters of the new side, whether they're newly liberal or newly conservative, or, you know, some of the people that are the most interesting to talk to are people that used to be vegans and are now carnivore.
01:06:20.000 They just eat meat and I was realizing I was being a fool and like, oh my god.
01:06:25.000 It's the same thing.
01:06:27.000 It's with almost every style of living.
01:06:31.000 You can find a contrary style that people find appealing.
01:06:34.000 You know, there's people that used to be atheists that become Muslims and they wear, you know, the hijab and they fully adhere to the Quran.
01:06:42.000 It's really, really interesting because I've spent a lot of time watching religious scholars online talk and watching them preach.
01:06:55.000 And there's something, and as a person who's very agnostic, when I watch that, it's appealing to me.
01:07:02.000 There's a certain aspect of the confidence that they have when they're talking about what God wants or what Allah has in store for you when you die or what you should do because it's written in this particular religious text.
01:07:17.000 The confidence that they have when they describe these things is very alluring, even to me.
01:07:22.000 It's not like I'm going to join, but I'm sitting there in front of my computer and I'm recognizing, oh, I see the appeal here.
01:07:31.000 It's not that it's working on me, but it's attractive to me.
01:07:35.000 I see it.
01:07:36.000 I see how this works on people.
01:07:39.000 And I find it incredibly fascinating, and I think it has to have some sort of an evolutionary reason.
01:07:47.000 There's some sort of an evolutionary benefit to adhering to and being accepting of the morals and the ethics and the ideology of the tribe. That's how you stay alive.
01:07:59.000 That's how you find other like-minded people that stick with you.
01:08:03.000 Yeah, I'm glad you do that because that's really the only way to figure out why people believe whatever it is they believe.
01:08:09.000 So monitor your blood pressure when I say this.
01:08:12.000 See it from Hitler's perspective.
01:08:15.000 It's like, what?
01:08:16.000 Well, he had a perspective.
01:08:18.000 His perspective was fueled by meth and testosterone shots and cocaine.
01:08:25.000 His doctor, Morell, probably fucked him up pretty good in his brain.
01:08:29.000 The stories of Hitler and Hitler's use of recreational drugs in order to fuel his escapades.
01:08:39.000 Yeah, there was that book about that a couple years ago.
01:08:41.000 I like that.
01:08:41.000 But just in general, I mean, a good Hitler biography, like the definitive one by Ian Kershaw, it's two massive volumes, each like 600 pages long.
01:08:49.000 I mean, it really gives you insight what he was thinking, why he did what he did, why the people responded the way they did, and so on.
01:08:55.000 And, but we should be able to do that without somebody saying, how can you take Hitler's perspective?
01:09:01.000 Because I just want to understand, you know, why evil happens.
01:09:04.000 I mean, my friend and colleague Roy Baumeister wrote that great book on evil, in which he actually went and interviewed serial killers and rapists in prison and said, you know, why'd you do it?
01:09:14.000 And, you know, he discovered that they all had this perspective like, well, this is why I did it.
01:09:20.000 You know, I had a crappy childhood or, you know, I felt that, you know, that it was totally justified.
01:09:24.000 That guy dissed me or she cheated on me or they all had justifications.
01:09:29.000 And it was kind of interesting to see the rationalizations behind their arguments.
01:09:34.000 Now, from the victim's perspective, the perpetrator is just pure evil.
01:09:38.000 He did it because he enjoys the suffering of other people.
01:09:41.000 Now, there are some, you know, psychopaths or sadists that do that, but they're very small in number.
01:09:47.000 Very tiny percentage of the population.
01:09:49.000 Most people in prison that are killers, they did it for moralistic reasons.
01:09:53.000 You know, he took my parking spot, so we got in a fight and then I killed him.
01:09:57.000 Or, you know, this guy slept with my girlfriend and so I had to do something and defend my honor and one thing led to another and here I am in prison.
01:10:04.000 They almost all have good moralistic reasoning.
01:10:07.000 The problem is not that we don't have enough morality.
01:10:10.000 Actually, we have too much morality, too much moralizing about other people that are harming us.
01:10:15.000 So, you know, back to the free speech issue.
01:10:18.000 The moment you say, we're going to create a category called hate speech.
01:10:23.000 Okay, what goes in that bin?
01:10:25.000 Right.
01:10:25.000 Well, you know, so I document in the opening page that this really begins in the United States in 1919 with the Schenck versus the United States decision by the United States Supreme Court.
01:10:52.000 Because the 14th Amendment protects your right to bodily autonomy, and when the government says, we're drafting you into the military and we're sending you to Europe, in this case for the European Great War, you know, we now own your body for the next four years.
01:11:08.000 Okay, so this is what, and so here's the famous lines from Oliver Wendell Holmes, Supreme Court Justice Oliver Wendell Holmes, Schenck versus the United States, that we're all familiar with.
01:11:30.000 So, clear and present danger...
01:11:46.000 Okay, so you might say, okay, so somebody incites a group to riot and cause violence or something like that, so that's going to be called hate speech.
01:11:55.000 But note what he considered at the time a clear and present danger.
01:12:00.000 Protesters of the draft. Mm-hmm.
01:12:24.000 Category creep or category expansion happened where more and more things got put into the bin of clear and present danger.
01:12:30.000 So you're now doing something that I consider to be a clear and present danger, a threat to our nation, our state, our community or whatever.
01:12:37.000 So that category just got bigger and bigger.
01:12:39.000 So, back to why liberals used to defend free speech and now it's more conservatives doing it, and liberals are in favor of censorship: it begins with something like in the 60s, when we began to become sensitive to the words we use to describe other people.
01:12:53.000 So, the N-word to describe African Americans is obviously the one we'd all agree with.
01:12:58.000 Yeah, that's bad.
01:12:58.000 We shouldn't do that.
01:13:06.000 Okay, what about the C-word to describe women, or calling Jews kikes, or calling Vietnamese people gooks, or whatever?
01:13:06.000 Yeah, yeah, those are all hate speech, so the bin starts getting larger and larger, and then all of a sudden you end up with these lists of microaggressions.
01:13:12.000 I reprint one in the book from UCLA. The entire University of California system in 2014 issued this long list of things you can't say, like Where are you from?
01:13:22.000 Or, wow, you're good at math, to someone who's not Asian.
01:13:25.000 Or, wow, you speak English so well.
01:13:27.000 These are now considered hate speech that could trigger people's feelings of being hurt, and that is a form of clear and present danger to the sort of serenity of our community.
01:13:39.000 And all of a sudden, this category is now huge.
01:13:42.000 Is where are you from really on that list?
01:13:44.000 Yeah, it is.
01:13:45.000 Wow.
01:13:46.000 Yeah.
01:13:47.000 My wife gets this all the time because she's from Germany and she speaks perfect English.
01:13:51.000 She has no accent.
01:13:52.000 And people go, wow, I can't believe you speak English so well.
01:13:55.000 Or, wow, I can't believe you're not from America or something like that.
01:13:58.000 Instead of being offended, she just says, thank you.
01:14:01.000 I paid attention in school.
01:14:03.000 But it's just bizarre that where are you from would be considered a microaggression.
01:14:07.000 We can't make things so sensitive. And this is one of the things that I hope comes out of this horrible tragedy that we're experiencing: that people realize what actually is important, and we spend much less time concentrating on stuff that's not really important, because there's no real problems.
01:14:26.000 One of the problems is society being so good. And this is arguably the best time ever in human history.
01:14:34.000 And Pinker points to that with statistics and gets criticized, like harshly criticized by people on the left that say, no, it's not.
01:14:44.000 It's the best time for white men.
01:14:46.000 And then they'll start going crazy about all the things that are wrong in the world.
01:14:49.000 And he's saying like, yeah, no one's denying the things that are wrong in the world.
01:14:52.000 There's always been things wrong in the world.
01:14:55.000 There are less things wrong in the world today than ever before.
01:14:59.000 And one of the reasons why people can get upset about these things that many people consider to be not that significant, like asking someone, where are you from, is because there's no war.
01:15:13.000 There's no real threat on our beaches.
01:15:15.000 There's no real horrible tragedy that's taking place every day in our communities.
01:15:20.000 You know, I had a friend, my friend Shuki was from Israel.
01:15:24.000 And I went over to his house, and they'd be playing bongos and laughing and dancing and singing, and I'm like, I go, why are Israelis that come to America, like, why are you guys so fun?
01:15:39.000 Like, it's so, they like...
01:15:42.000 They're like laughing and singing.
01:15:44.000 And he goes, you know, in his crazy accent, he was like, hey, where I'm from, he goes, you could die any day.
01:15:51.000 He goes, so when you're alive, it's party, party, party.
01:15:55.000 And that was his take on things.
01:15:56.000 He just wanted a party.
01:15:58.000 You know, he was a really fun-loving guy because he'd experienced some tragedy and because he'd experienced this horrific condition in the Middle East.
01:16:08.000 Whereas here in America, when things get better, we find more shit that's not that important to complain about.
01:16:16.000 I think in a way it's a sign of moral progress in as much as It used to be people would protest really horrific inequalities and prejudices and bigotry against African Americans and so on.
01:16:28.000 Well, we've improved so much on that, and I wish the left would take more credit for that because liberals were drivers of the civil rights movement.
01:16:36.000 So when they now say, you know, things are as bad as they've ever been or worse than they've ever been for African Americans and so on, in a way they're saying, you know, our immediate ancestors who supported these...
01:17:01.000 Well, it's also statistically foolish because it's inaccurate.
01:17:08.000 You know, I mean, we should reinforce the positive.
01:17:11.000 Yes, of course.
01:17:12.000 Yes, right, right.
01:17:13.000 But when you see, like, a campus eruption at Yale over, you know, the Halloween costumes business with Nicholas Christakis, you know, I mean, you know, we all looked at that and went, oh, my God, this is ridiculous.
01:17:24.000 But in a way, it's a sign of progress.
01:17:26.000 Like, students in the 60s used to, you know, protest the Vietnam War or, you know, the way blacks were treated in the South.
01:17:31.000 Those are, you know, those are really legitimate things to complain about and protest about.
01:17:36.000 Yeah.
01:17:38.000 There aren't as many of those around anymore for students to get all riled up about, and they still have those moral impulses, like, I want to promote what's right, and I want to be against evil, and I'm all fired up here with my moral module dialed up to 11, and I'm going to go out on the streets.
01:17:52.000 What am I going to protest?
01:17:54.000 Those Halloween costumes!
01:17:55.000 You know, people, that's cultural appropriation, and so on.
01:17:59.000 I remember there was a – I tell the story in the book about this Taco Tuesday at Cal State Fullerton.
01:18:05.000 I was invited to give a speech there years ago about protecting free speech over this issue that caused the campus to just erupt in protest about Taco Tuesday.
01:18:14.000 It's like, Taco Tuesday?
01:18:16.000 Yeah, that's cultural appropriation.
01:18:18.000 The Mexican community is being appropriated by these whites eating tacos.
01:18:23.000 We're in Southern California.
01:18:25.000 Where are there no tacos?
01:18:26.000 I mean, this is – Not only that, you know the guy who tried to patent Taco Tuesday?
01:18:33.000 LeBron James.
01:18:36.000 LeBron James.
01:18:37.000 Didn't he, Jamie?
01:18:38.000 Isn't that what happened?
01:18:39.000 Yeah, he talks about Taco Tuesday on his Instagram.
01:18:42.000 I was on his Instagram, and he's like, you know what today is?
01:18:46.000 And then he shows his talk, Taco Tuesday!
01:18:48.000 And I'm like, okay, is he allowed to do it?
01:18:52.000 I mean, the whole thing is preposterous.
01:18:54.000 It's delicious food.
01:18:55.000 It's not like we're saying that white people created it or anything like that.
01:19:00.000 And cultural appropriation is so ridiculous in so many ways, but one of the most ridiculous ways it is is that It prevents people from enjoying some amazing aspects of the diverse cultures that we all coexist with, especially here in America.
01:19:15.000 I mean, this is such a legitimate melting pot.
01:19:18.000 I think it's amazing that you can go to all these different places.
01:19:22.000 There's a guy, I'm trying to remember his name, Rick Bayless, I believe his name is.
01:19:27.000 And he's a famous Mexican chef, but he's not Mexican.
01:19:32.000 But he cooks Mexican food and he loves Mexican cuisine.
01:19:35.000 He takes all these trips to Mexico to learn with the Mexican masters.
01:19:39.000 And he has a famous restaurant in Chicago where he has a full, authentic Mexican menu.
01:19:48.000 And people are furious at him.
01:19:51.000 Because here he is, this white guy, selling Mexican food.
01:19:54.000 Like, what do you expect him to do?
01:19:57.000 Like, do you have to be born in a certain patch of dirt to enjoy a style of food?
01:20:01.000 And don't you think that he is actually boosting the signal and letting people know that there's some amazing things that come out of Mexico?
01:20:08.000 This is an homage to Mexican cuisine.
01:20:11.000 He's not trying to claim it.
01:20:12.000 You know, hey, this was invented in Chicago.
01:20:14.000 You know, this is not really Mexican.
01:20:16.000 No, he's saying this is from Mexico.
01:20:18.000 He talks openly about the various parts of the country of Mexico, where this style of cooking came from, and how it emanates from the traditional ingredients, and he cooks them in traditional ways.
01:20:31.000 And it's fantastic, like, really widely praised restaurant with amazing food.
01:20:37.000 And this guy gets shit on for it.
01:20:39.000 It's crazy.
01:20:40.000 It's another one of these perverse reversals, because it used to be, like in the 19th century, there was this idea of kind of a pure European culture, and other cultures were somehow not as good, and they defined culture in a very specific way.
01:20:56.000 And then, you know, kind of liberals then were pushing back against that and saying, no, no, no, you know, all cultures are equal and culture is a whole blend of different, you know, migrations and people mixing and that's what culture, that's what makes culture rich.
01:21:10.000 It's fluid and changing and so on.
01:21:12.000 And now all of a sudden, liberals are saying, no, no, there's a pure, correct culture that only the people born there can use, you know, adopt those cultural features.
01:21:21.000 That's the complete opposite of what liberals used to argue.
01:21:24.000 Starting with anthropologists saying, no, no, no, this crazy idea that whites have, that white supremacists have, about a pure European culture, is a bunch of nonsense.
01:21:33.000 Europeans are just as amalgamated with lots of different cultures as anybody else.
01:21:37.000 There is no real European culture.
01:21:40.000 Yeah, it's a weird time, and again, I connect this to the fact that we didn't have as many real problems as we used to.
01:21:48.000 And when you talk about Yale with Nick Christakis, I think there's this thing that kids do when they're coming of age.
01:21:54.000 They're separated from their parents, and they want to establish that so many of these older people were wrong about the way life is, and they're wrong, and we're going to show them what's right, and we have a new way of living, we have a new way of thinking, and we want this campus to be safe.
01:22:10.000 We want safe spaces and a lot of it is about taking control of their environment and enforcing their ideology and creating something that's in a lot of ways is very ego driven because they're trying to show that they're making a change in the environment around them and they have good intentions while they're doing it.
01:22:31.000 It's just their brains haven't fully formed yet, and they don't have a lot of life experience, and this pattern shows itself over and over and over again.
01:22:41.000 It's a constantly repeating pattern where these kids go away to college and become self-righteous and then try to impose their viewpoints on the older people.
01:22:50.000 It's very, very common.
01:22:51.000 And it has these psychological building blocks to it that you can kind of see why they're doing this.
01:22:59.000 Yeah, I mean, if you want a safe space, go to a college campus.
01:23:02.000 These are about the safest environment you could be in in all of America, which itself is safer than it's ever been.
01:23:08.000 At Chapman University, we have a safe space group, and so I went to one of their meetings once just to see what it was all about.
01:23:14.000 This is mostly LGBTQ people that were kind of concerned about being insulted or assaulted.
01:23:24.000 Anyway, so I said, well...
01:23:25.000 How many incidences have there been on this very white, very pleasant campus here in Orange County?
01:23:32.000 Well, we don't have any numbers because we're not allowed to ask and keep track of how many incidences there are.
01:23:38.000 Now, the police can do this, but this Safe Space Group or the administration can't do that.
01:23:43.000 So, well, then how do you know it's worse than it was, say, five years ago?
01:23:46.000 Or it's the same or it's better?
01:23:48.000 Well, we don't know.
01:23:49.000 And then I said, okay, give me some examples.
01:23:51.000 What are we talking about here?
01:23:52.000 What's the issues?
01:23:53.000 And like one of them was, for example, a gay couple were, I think it was two guys walking along a sidewalk.
01:23:59.000 And now Chapman's is in the middle of the city of Orange.
01:24:02.000 So it's ringed with houses and just the regular city.
01:24:05.000 So some guy in a pickup truck drove by and said something like, fucking faggots.
01:24:10.000 And I said, okay, yeah, that guy's a dick.
01:24:13.000 He's an asshole.
01:24:13.000 So what are you going to do now?
01:24:16.000 I mean, are you going to give the power to that guy in the pickup truck in which now you see yourself as a victim?
01:24:24.000 Now, technically, yeah, you're a victim of sort of a hate crime or whatever.
01:24:28.000 You know, he said something nasty.
01:24:29.000 But then what?
01:24:31.000 Why not just say, fuck off, asshole, or don't say anything?
01:24:35.000 Just ignore it and just move on with your life, because there will always be assholes.
01:24:39.000 There are fewer assholes than there used to be.
01:24:41.000 As I like to say, conservatives are more liberal now than liberals were in the 1950s.
01:24:45.000 We've all had our consciousness raised, the moral sphere has expanded, and so on.
01:24:50.000 That's a very good point, the way you just said that.
01:24:52.000 Conservatives are more liberal today than liberals were in the 1950s.
01:24:56.000 Socially, just think about that.
01:24:57.000 I mean, it used to be where...
01:24:59.000 Even most liberals were against gay marriage until around 2011, really, when the switch began, and then in 2015 it changed completely with the Supreme Court decision.
01:25:07.000 But if you look at interracial marriage, that was illegal until 1967, and pretty much most Americans, including liberals, were against it.
01:25:16.000 Now, conservatives are all—no one objects to interracial marriage by conservatives.
01:25:21.000 They've all shifted in that liberal direction.
01:25:25.000 And I think the gay marriage thing, I think that's pretty much fallen off the radar of anybody's discussions after the cake baker incident in Colorado.
01:25:33.000 I think nobody's really talking about that anymore.
01:25:37.000 Gay marriage, it's like the Seinfeld episode.
01:25:39.000 Yeah, whatever, dude.
01:25:41.000 Who cares?
01:25:42.000 I mean, not that there's anything wrong with that.
01:25:44.000 It's become kind of just a Seinfeld-level joke now.
01:25:48.000 And I think pollsters won't even ask that question anymore.
01:25:51.000 Are you in favor or against gay marriage?
01:25:54.000 In a couple more years, it'll just fall off our social radar.
01:25:57.000 Well, not totally, though.
01:25:59.000 Remember when Pete Buttigieg was running for president before he had dropped out?
01:26:03.000 There was this one woman that found out that he was married to a man, and she tried to take back her vote.
01:26:10.000 Oh, no, I didn't hear about that.
01:26:12.000 Oh, it's a crazy video.
01:26:13.000 It's a crazy video to watch.
01:26:17.000 You know, I pity her.
01:26:20.000 First of all, I pity her that she cares.
01:26:23.000 And that she's developed this ideology or she's been subjected to this ideology that she thinks there's something wrong with two gay people that are married.
01:26:33.000 But she was like, no way.
01:26:34.000 I am not.
01:26:35.000 There's no way.
01:26:36.000 And she was trying to take her vote back.
01:26:38.000 She didn't know.
01:26:39.000 She's like, I didn't know he was gay.
01:26:41.000 And she was upset by it.
01:26:43.000 Yeah.
01:26:44.000 They're out there.
01:26:46.000 There's a few of them out there.
01:26:47.000 I would say this would be in my category of there's always going to be a few assholes driving around in their pickup trucks, even in fucking Vegas.
01:26:53.000 And, you know, what do you do about it?
01:26:55.000 Again, just, you know, you can't give those kind of people that power over you.
01:27:00.000 Well, I feel pity for them.
01:27:02.000 It's sad.
01:27:03.000 It's sad that someone would care.
01:27:05.000 It really is sad that you would care about someone's sexual preference or any of those things.
01:27:10.000 And as a comedian, it's a real pain in the ass because you can't even make fun of gay people.
01:27:15.000 You can't make fun of anything that gay people do that's legitimately funny because it would be considered hateful.
01:27:20.000 But all people are funny.
01:27:23.000 People are funny, and you should be able to make fun of all of us.
01:27:26.000 We're so silly.
01:27:28.000 We're the weirdest thing on this planet, as far as I can see, in terms of how complex we are.
01:27:34.000 And some gay people are funny.
01:27:37.000 They do funny things, but if you make fun of them, in our world today, it's considered homophobic.
01:27:45.000 I just think it's so crazy coming from a person who's not even remotely homophobic.
01:27:51.000 I've been called homophobic because I've made fun of certain things that gay people do that I think are silly.
01:27:58.000 Again, a perverse reversal, the way it used to be.
01:28:01.000 Liberals were always in favor of comedians poking at the power structure and the prejudices of our time.
01:28:08.000 The thing is, it's punching down now.
01:28:11.000 The idea is that you're punching down on people that are maligned and people that find themselves in a position in society where many people on the left consider them a protected class.
01:28:23.000 But my position is we're all okay.
01:28:27.000 Everyone's fine.
01:28:28.000 But we all have our own idiosyncrasies and our own behavior patterns and our old things that are pretty fucking funny.
01:28:35.000 There's a lot of humor to it.
01:28:37.000 But we...
01:28:38.000 With love.
01:28:39.000 You know, that all of this is with love.
01:28:41.000 That even making fun of Boys Town and how raucous it is on a Saturday night, it doesn't mean you hate gay people.
01:28:48.000 It's an observation.
01:28:49.000 I mean, it's a fucking fantastic place to be if you're a young gay guy looking to get laid.
01:28:53.000 But it's a hilarious place to be if you're a straight person driving through.
01:28:57.000 It doesn't mean you hate anyone.
01:28:59.000 And the problem is that those people, like the guy in the pickup truck, yelling the slurs to make people feel bad, those people still exist.
01:29:07.000 That's the problem.
01:29:08.000 Yeah.
01:29:09.000 But my point is, not as many.
01:29:11.000 Not as many.
01:29:12.000 Much better.
01:29:12.000 Much better.
01:29:13.000 Yeah.
01:29:13.000 And the denial of that, it seems silly.
01:29:15.000 Yeah.
01:29:16.000 Exactly.
01:29:16.000 Because it denies that we've made moral progress, as if people like Dr. King and the other civil rights leaders didn't have any effect.
01:29:23.000 And that's not—first of all, it's wrong.
01:29:25.000 They did have an effect.
01:29:26.000 We've made a lot of progress.
01:29:27.000 But also, the effect I worry about is that it's going to tell younger people today, don't bother trying because nothing ever changes.
01:29:37.000 Right.
01:29:37.000 Right.
01:29:37.000 Yeah, there's definitely some of that.
01:29:39.000 You know, it would be great if all of the hate went away.
01:29:45.000 It would be great if there was no prejudice, no racism, no sexism, no homophobia, none of those things.
01:29:53.000 When you see someone acting foolish in a way that is discriminatory, one of the things about it is you recognize that this is a pattern of behavior that human beings can fall into, and it's some of the worst aspects of tribal behavior.
01:30:10.000 And recognizing the folly of others is very beneficial to your own personal growth.
01:30:17.000 If you're there when you see someone yell out a hateful slur to a gay couple, it's terrible that it happens, but you can experience how stupid that person has to be and how sad it is that those people exist, and recognize that,
01:30:37.000 okay, this is what it's like to be these people.
01:30:40.000 This is what it's like.
01:30:43.000 It's like when you see people making mistakes and doing terrible things and doing dumb things, the good thing about it is you can learn from other people's foolish thoughts.
01:30:55.000 Yeah, well, research shows that when you know somebody, say, who's gay, you're less likely to be homophobic, just the exposure to them.
01:31:02.000 And so part of the cause of moral progress is this bottom-up effect.
01:31:07.000 Now, sometimes you have to pass laws to get people to change, like to abolish slavery in the United States.
01:31:12.000 We needed a war.
01:31:13.000 And 750,000 people died in that war.
01:31:16.000 And, you know, sometimes you have to send in the federal troops, like I think it was Eisenhower did that, to desegregate Alabama schools.
01:31:25.000 And you remember the governor said, you know, segregation now, segregation tomorrow, segregation forever.
01:31:31.000 And I forget who was president at the time, he said, no, you're integrating your schools, and we're sending in federal troops, men with guns, to make sure you do it.
01:31:42.000 Sometimes you have to do that.
01:31:43.000 But most of the change happens from the bottom up of just oppressed peoples saying, you know what?
01:31:49.000 Stop that.
01:31:50.000 Don't do that.
01:31:51.000 I don't like it when you say that.
01:31:52.000 And it began with the N-word and just kept expanding.
01:31:55.000 So there's a logic to where we ended up today, where you have this big bin of microaggressions and so on.
01:32:01.000 But there was a logic to it, like just saying it's hurtful to do that.
01:32:05.000 And most of the effects have been good.
01:32:07.000 They've been positive.
01:32:08.000 Just, you know, when Ellen comes out on her TV show, that's just a little thing there.
01:32:14.000 Or when South Park makes fun of all different religions, you know, that gets perspective on things.
01:32:19.000 Humor is good.
01:32:21.000 Television scripts, movie scripts, the way characters talk.
01:32:24.000 Richard Dawkins makes this point about you could pinpoint to the decade when a novel was written based on the words that are used to describe Jews, blacks, and women.
01:32:34.000 But no one said, okay, we're going to pass laws to say you can't use these words to describe Jews, blacks, and women.
01:32:40.000 We all just change the way we talk about other people in a way that's more liberal, that's more all-encompassing, that's more egalitarian in that sense.
01:32:50.000 And it's not clear how exactly that happened, just incrementally, little bit by bit.
01:32:55.000 It's like trying to figure out when a word started to be used.
01:33:00.000 It's really hard.
01:33:01.000 Like 9-11, or gays.
01:33:03.000 I remember in the late 90s, there was a couple of atheists that wanted to quit using the word atheist and call us the brights.
01:33:11.000 We are the brights.
01:33:13.000 Thank God that didn't stick.
01:33:15.000 Yeah, it was pretty obvious what the antonym to the Brights was: the people that believe in God, they're the dims.
01:33:22.000 Right.
01:33:22.000 Okay, so the dims are not going to be fond of that.
01:33:25.000 Yeah, oh my God.
01:33:27.000 So that never took off.
01:33:28.000 By trying to change language by fiat from the top down, okay, here's the new rule, we're all going to use this word.
01:33:35.000 That doesn't work.
01:33:36.000 It's just expanding our consciousness, expanding the moral sphere, just including more people in your honorary...
01:33:43.000 circle of friends and family members, or honorary family members, or people that you will treat with respect.
01:33:50.000 That has been happening just tiny bits every day, a little bit here and there, and over the decades you see it when you look back at the numbers like Steve Pinker does.
01:34:00.000 But it's hard to pinpoint the day that that happened.
01:34:04.000 Do you know who Daryl Davis is?
01:34:06.000 Yes, the L.A. Sheriff, right?
01:34:09.000 No, no.
01:34:09.000 No, he is, this man right here, he's a blues musician, and he's converted over 200 KKK members and Nazis.
01:34:23.000 That guy, yes.
01:34:24.000 He's amazing.
01:34:25.000 I talked to him on the podcast.
01:34:27.000 He's an amazing guy, and it all came from him doing a show and talking to a guy who said, I've never had a drink with a black man before.
01:34:38.000 And he's like, how is that possible?
01:34:39.000 And the guy said, I'm in the KKK. He thought the guy was joking.
01:34:43.000 And then the guy pulls out his card.
01:34:45.000 And then he's like, what?
01:34:46.000 And so he tells this guy, hey, here's my phone number.
01:34:52.000 I'll be in town again.
01:34:54.000 Let's sit down and have a conversation.
01:34:56.000 So the guy comes to see him again when he's in town.
01:34:59.000 He strikes up a friendship with this guy.
01:35:01.000 And four months later, this guy hands him his robe.
01:35:05.000 And he says, I'm leaving.
01:35:06.000 I'm leaving.
01:35:07.000 And this guy was like a grand wizard of the KKK. He said, I'm stepping down.
01:35:11.000 And he's had that effect on 200 people.
01:35:15.000 And done it on a person-to-person basis.
01:35:18.000 First of all, he's an incredibly articulate guy.
01:35:20.000 He's very intelligent.
01:35:22.000 And probably more articulate and intelligent than the people that he's talking to, who consider themselves superior.
01:35:27.000 So they talked to him over a long period of time and they realized, like, this guy is so well-read and he's so fucking smart.
01:35:33.000 Like, I'm dumber than him and I'm a white guy.
01:35:36.000 What is wrong?
01:35:37.000 So they eventually wind up giving up on their racism and they're friends with him now.
01:35:43.000 And it was more important to them to be friends with him and to continue their friendship with him than it was for them to stay in the KKK. And he's had, you know, these guys who are like henchmen for the KKK quit,
01:35:58.000 give it up and become his friend.
01:36:01.000 And he brought in all these robes that these guys have given him, including like Nazi flags that they gave him and the bands they wear around their sleeves and Nazi uniforms.
01:36:14.000 And it's amazing.
01:36:15.000 But it's that one-on-one thing that you were talking about.
01:36:18.000 You're less likely to be homophobic if you know a gay person and you like them.
01:36:22.000 You're like, well, that's crazy.
01:36:23.000 They're just people.
01:36:24.000 You're less likely to be racist if you're around black people and you meet them and you get to know them.
01:36:29.000 And you're like, well, we're just people.
01:36:31.000 We're just people who look different.
01:36:33.000 That's it.
01:36:34.000 That's it.
01:36:35.000 That's right.
01:36:37.000 Isn't that that movie, BlacKkKlansman?
01:36:39.000 The Spike Lee movie.
01:36:40.000 Isn't it about this guy, Davis?
01:36:43.000 Oh, I don't know.
01:36:44.000 I was trying to remember.
01:36:45.000 I don't think so.
01:36:46.000 Or maybe it was somebody else.
01:36:47.000 Maybe it was a different example.
01:36:49.000 Yeah, I don't think there's a movie made about him.
01:36:51.000 I mean, there should be.
01:36:53.000 He's just doing it like old school.
01:36:57.000 Door to door.
01:36:58.000 I mean, he's really doing it.
01:37:00.000 You know, what is that term when someone does it from the bottom up?
01:37:05.000 I mean, that's really what he's doing.
01:37:07.000 Grassroots?
01:37:08.000 Yeah, grassroots.
01:37:09.000 Thank you.
01:37:09.000 That's really what he's doing.
01:37:10.000 Like, grassroots converting people that were racist to realize the error of their ways.
01:37:41.000 And so the way this is tested is they measure the kinds of things that people read or they actually have them read passages, like from a Jane Austen novel.
01:37:50.000 And then they take this eye test, this test where you look at just a block of eyes, like I would show you a picture of just this, where you can kind of see the way the corner of my eyes is squinting or not or whatever, what the emotions are.
01:38:05.000 And they have like six different emotions.
01:38:07.000 And then you have to guess what the emotion is of this picture you're looking at.
01:38:11.000 They have hundreds of them you go through.
01:38:13.000 And anyway, the correlation was that people that read a lot of novels or that kind of fiction that has that interchangeable perspective are better at mind reading.
01:38:25.000 They're better at reading emotions in the eyes.
01:38:28.000 Anyway, a lot of this hasn't been replicated yet, but it's kind of new, but still the idea is that the rise of the novel since the Enlightenment, in which just common people become more literate, and literacy rates were going up over the centuries.
01:38:43.000 It used to be like 10% of the population was literate.
01:38:45.000 Now it's 99%, whatever.
01:38:47.000 So there's a curve going up there.
01:38:49.000 So as people start reading more, and then they start reading novels, they start taking the perspective of others.
01:38:55.000 So, you know, Harriet Beecher Stowe's Uncle Tom's Cabin, it was the first time most whites had ever read anything about what it was like to be a slave.
01:39:05.000 And they were horrified, like, oh my god, I had no idea that this is what it's like.
01:39:09.000 And Abraham Lincoln famously said when he met her, so you're the little woman that started this great war.
01:39:15.000 You know, in a way, that's right, because if you're in the North, a lot of people said, I've never met a Black person, I have no idea what a slave's life even is.
01:39:23.000 Now I see why this is so abhorrent, right?
01:39:26.000 Right.
01:39:27.000 And, you know, back to this idea of hate speech, again, once you go down that road, the argument I make in the book is that in the 1850s, there were Southern congressmen who fought against Northern abolitionists coming down to give speeches in the South,
01:39:45.000 or publish articles in newspapers in the South, or get books published and distributed in the South that were pro-abolitionists.
01:39:54.000 This could lead to slave revolts and riots and violence.
01:39:58.000 Therefore, we have to silence that.
01:39:59.000 They didn't use the word hate speech, but that's what today it would be called.
01:40:04.000 Abolitionist speech to abolish slavery is hate speech?
01:40:07.000 That's insane!
01:40:09.000 Right?
01:40:10.000 And the same thing in the Civil Rights Movement in the 1960s, there were people in the South that said, you know, when Malcolm X comes down here or Martin Luther King comes down here and they give a speech, this is not good.
01:40:20.000 This is, to use the Supreme Court justice's words, a clear and present danger to the peaceful nature of our society.
01:40:27.000 We have to silence them.
01:40:29.000 Yeah, that's a similar conversation I had with a friend who's very progressive.
01:40:33.000 There was a thing that was going on for a while where people were saying, punch Nazis.
01:40:40.000 And what I think they were really saying is they were sort of – they were proclaiming their connection to this progressive ideology, proclaiming it so much so that, hey, man, I'm willing to fucking draw blood.
01:40:54.000 I'm willing to punch Nazis.
01:40:56.000 And so I was saying that this is a dangerous thing, and they said, well, why do you think it's dangerous?
01:41:02.000 I said, well, first of all, what if they punch you back?
01:41:04.000 Then you've got a real fucking problem.
01:41:07.000 You're espousing violence.
01:41:10.000 That's always a bad idea.
01:41:11.000 And second of all...
01:41:13.000 I've heard a lot of people called Nazis that I don't think are really Nazis.
01:41:17.000 Like Ben Shapiro.
01:41:19.000 He's Jewish.
01:41:20.000 I've seen people call him Nazis.
01:41:23.000 I've seen it written.
01:41:25.000 He's wearing a yarmulke.
01:41:27.000 Yes!
01:41:28.000 And people still call him a Nazi.
01:41:30.000 So the point is, who is to decide who is a Nazi?
01:41:34.000 If you're saying, like, there's a guy and he's running the gas chamber at a concentration camp and he's relishing the fact that he's going to put these Jews to death, that's a Nazi.
01:41:44.000 Well, yeah, punch that guy.
01:41:46.000 Okay?
01:41:46.000 Yeah.
01:41:47.000 But that's not what you're talking about.
01:41:49.000 You're using this word in this very sort of flippant way, and it becomes very dangerous to just say punch Nazis, because you're just deciding people are Nazis who are definitely not Nazis, and a lot of them are actually Jewish, which is patently insane.
01:42:08.000 I think that meme started after somebody punched Richard Spencer.
01:42:34.000 You know, and emotionally, I feel like I would just like to punch that motherfucker because that is wrong.
01:42:39.000 But this is why we can't go down that road, because the whole point of a civilized society is we can't just have people punching each other.
01:42:45.000 What we need to do is put that guy on a vacation with Daryl Davis.
01:42:49.000 That's right.
01:42:50.000 Really?
01:42:50.000 Make him so he can't go anywhere.
01:42:52.000 He's got to hang out with Daryl for many days at a time.
01:42:55.000 And by the end of it, hopefully he would get it.
01:42:59.000 You know, then hopefully he's not so, as we were talking about earlier, it's not so connected with those ideas that those ideas are him.
01:43:08.000 You know, we've got to be, as a society, more flexible in the way we hang on to ideas.
01:43:16.000 And I think that's something that needs to be taught to people, because there's a sort of built-in reaction that we have to defend our ideas. When you're young and you're learning how to debate things, how to argue things, the sting of losing is very personal, and so people build these defense mechanisms into their personality, into their vernacular, into the way they communicate, where you do think of your
01:43:47.000 ideas as a part of you. But if you really care about you, you should care about objective truth, and you should care about recognizing when an idea that you're holding onto sucks.
01:43:58.000 You know, if you are valuable, if you care about yourself, you should recognize when an idea that you're clinging to is not a good one.
01:44:07.000 Yeah.
01:44:09.000 Yeah, there was that documentary film about white supremacists by that...
01:44:15.000 I forget the name of the film and her name.
01:44:17.000 I'm sorry.
01:44:18.000 But she's basically hanging out with, I think it was Richard Spencer, Jared Taylor, and a few of the others, particularly after Charlottesville.
01:44:27.000 And they're talking to her.
01:44:30.000 You can tell they really like her.
01:44:31.000 First of all, she's an attractive woman.
01:44:33.000 So you see these guys are like, oh boy.
01:44:36.000 This attractive woman's paying attention to me, and so she's very disarming in this way, and she says, well, just tell me what you believe.
01:44:43.000 So they articulate all their beliefs about why brown people are inferior to white people and so on, and she's like, you know, I'm brown.
01:44:49.000 They're like, oh, this doesn't apply to you.
01:44:53.000 Oh, I see.
01:44:54.000 Okay, why not?
01:44:56.000 Well, because, you know, I know you.
01:44:58.000 Like, right.
01:44:59.000 I think there was a line in there where he said, well, I know you, so these things don't apply to you.
01:45:03.000 It's like, right, okay.
01:45:05.000 Therein lies the problem.
01:45:06.000 We just don't know these other people, okay?
01:45:08.000 So we stereotype them, and, you know, that's part of our cognition, stereotyping.
01:45:13.000 It's a perfectly normal thing.
01:45:14.000 We stereotype all kinds of things.
01:45:16.000 We put them into bins, cognitive bins, so we can keep track of them and distinguish them from others.
01:45:22.000 Unfortunately, we do that with people.
01:45:24.000 Yeah, and it's a normal thing that people have done since the beginning of time to sort of recognize who's in your tribe and who's not.
01:45:32.000 Back when we were these little groups of 50 people, and we'd get invaded by another group of 50 people, and you had to be loyal to your group.
01:45:40.000 Yeah.
01:45:41.000 My friend Jared Diamond tells a story. You know, he goes to Papua New Guinea every year to go birding, and now
01:45:48.000 he's been doing anthropology work as well.
01:45:50.000 But back in the day, he said they would go out birding and he'd have one of his Papua New Guinean hunter-gatherers with him on some hiking trail somewhere and they've got their binoculars and so on.
01:45:58.000 And they encounter some stranger from another tribe and Jared's like, hey, let's go talk to that guy.
01:46:03.000 And his buddy is like, are you out of your mind?
01:46:07.000 We could be killed.
01:46:08.000 We don't know who that guy is.
01:46:09.000 And in Papua New Guinea, you might be eaten as well.
01:46:12.000 That's right.
01:46:13.000 Not just killed.
01:46:15.000 But the point of Jared's story is that that's the environment in which we evolved.
01:46:21.000 There's a kind of a logic to xenophobia.
01:46:23.000 Like, other people are dangerous.
01:46:26.000 You know, like my favorite line from A Few Good Men, where Jack Nicholson is schooling the Tom Cruise character, you know, about, you know, you can't handle the truth.
01:46:34.000 What's the truth?
01:46:34.000 The truth is, we live in a world with walls, and on those walls are men with guns.
01:46:38.000 And you want me on that wall.
01:46:39.000 You need me on that wall.
01:46:41.000 When you're at parties, enjoying your freedom, I'm on the wall, right?
01:46:44.000 And I remember seeing that thing, and yeah, that's actually true.
01:46:47.000 That's a very good description of the way we evolved.
01:46:49.000 It's a very walled-off tribal tradition.
01:47:20.000 Well, these ideologies that people subscribe to, I saw a lot of them evaporate when this lockdown took place.
01:47:30.000 Because a lot of my friends that are anti-gun were asking me how to get a gun.
01:47:34.000 Like, what do you do?
01:47:35.000 How do you get a gun?
01:47:36.000 I'm like, you want a gun?
01:47:37.000 My wife says I should get a gun.
01:47:38.000 A friend of mine who says his wife is always like, you're never getting a gun.
01:47:42.000 We're not having a gun in this house.
01:47:43.000 The moment the shit hit the fan, she goes, we gotta get a gun.
01:47:47.000 He was laughing.
01:47:48.000 He's like...
01:47:48.000 He told me to get a gun.
01:47:50.000 You believe this?
01:47:52.000 And he grew up in the South, so he's used to being around guns.
01:47:55.000 But this is when people realize there's a reason why people are preppers.
01:48:02.000 Some of them are insane.
01:48:03.000 Yeah, but it's also, hey, if the fucking grid goes down and there's no power, you should have at least a certain amount of food to sustain you for a little while.
01:48:12.000 It's a good idea.
01:48:14.000 It's a good idea to have a method of filtering water.
01:48:16.000 It's a good idea.
01:48:18.000 And then I've also gotten a ton of questions from people on how do you get into hunting?
01:48:23.000 How do you get into hunting?
01:48:25.000 It's a complex question.
01:48:27.000 It's a long road.
01:48:28.000 It's not an easy thing to learn how to hunt.
01:48:31.000 But from people that never had any interest in it before.
01:48:34.000 But now they realize, like, hey, I went to the grocery store today and there's no fucking meat.
01:48:38.000 Like, I want meat.
01:48:40.000 Like, what do I do for food?
01:48:41.000 Like, oh, there's animals roaming around.
01:48:43.000 That's what, okay, how do you get these animals?
01:48:45.000 Like, what are you doing?
01:48:47.000 You realize why people hold on to certain beliefs that some people find distasteful.
01:48:54.000 Yeah, there's a certain logic to hoarding.
01:48:56.000 Even if no one wants to do it, again, it's like this pluralistic ignorance problem.
01:49:00.000 Everybody thinks that everybody else is going to do it, so of course they end up doing it too.
01:49:04.000 The solution is you just put limits.
01:49:06.000 It's like hunting licenses and you have a limit to how many you can shoot and so on.
01:49:11.000 And lots of industries have adopted that to solve the tragedy of the commons problem.
01:49:17.000 My local Vons solved it by saying you can only buy one packet of toilet paper per shopper per day.
01:49:24.000 That's it.
01:49:25.000 And I think everybody was kind of glad about that.
01:49:27.000 Like, okay, good.
01:49:28.000 Now, there's toilet paper on the shelves.
01:49:30.000 I don't have to worry about hoarding.
01:49:31.000 I don't have to worry about that asshole taking two and I get none.
01:49:35.000 And really, even if you say, well, we're just selfish creatures.
01:49:41.000 Yeah, we are.
01:49:42.000 That's right.
01:49:42.000 But we also want to do the right thing.
01:49:45.000 But we need some kind of norms and laws and customs or whatever in place to kind of attenuate the inner demons and accentuate the better angels and that everybody can see that.
01:49:55.000 And then they feel better about, like, okay, I'm just going to buy one and I'm not going to hoard or whatever.
01:50:00.000 Now, you mentioned the South.
01:50:02.000 There's interesting research, Richard Nisbet and his colleagues, on the culture of honor that's more common in the South than in the North.
01:50:09.000 In that, in a culture of honor, you solve your own problems, you don't turn to authorities or the state, kind of in general on average.
01:50:17.000 Also in the South, particularly in African American communities, law enforcement and the judicial system has not been very fair.
01:50:26.000 So you can't really trust them.
01:50:28.000 So you do kind of have to take the law into your own hands.
01:50:31.000 Therefore, there's more guns, more gun violence in the South.
01:50:34.000 There's kind of a logic to it.
01:50:36.000 So Nisbet did these famous experiments that are kind of amusing now, where he'd have subjects come in and fill out a form for some fake experiment they were doing.
01:50:45.000 And then you have to walk down the hall and give the form to the person in that room at the end of the hall.
01:50:50.000 In the hallway, there's like a bank of lockers or filing cabinets or something, and there's somebody working there.
01:51:16.000 Students from the North were like, you know, whatever.
01:51:18.000 I don't care.
01:51:19.000 They would apologize to the guy that said asshole.
01:51:22.000 People from the South, they're like, that motherfucker called me an asshole?
01:51:25.000 And they were mad.
01:51:26.000 And then when they drew the blood, their stress hormones were higher, testosterone was higher.
01:51:31.000 And so, anyway, his theory is that democracy came late to the southern United States.
01:51:38.000 The rule of law...
01:51:40.000 And the idea that the state has a monopoly on the legitimate use of force, which is how government is defined.
01:51:47.000 And therefore, Southerners had to kind of take the law into their own hands.
01:51:52.000 That is to say, you know, we're going to develop a culture of honor and we're going to take care of matters ourselves.
01:51:57.000 Because everybody wants justice.
01:51:59.000 Everybody wants right to be done and wrong to be punished.
01:52:01.000 That's normal.
01:52:02.000 The state says, we're going to do that so you don't have to, so we're going to disarm the citizenry, and we're going to take care of it through a court system and a police system and so on.
01:52:12.000 But if they're not doing it, or they're doing it unjustly, and some communities like African-American communities are treated differently, then of course they're going to push back.
01:52:22.000 So that's why there's more gun sales and more homicides in the South than in the North.
01:52:27.000 Anyway, I just thought of that when you mentioned that.
01:52:29.000 It's a very interesting subject.
01:52:31.000 So your book comes out when?
01:52:33.000 Is it out now?
01:52:34.000 Tomorrow?
01:52:34.000 Yeah, it's out tomorrow.
01:52:36.000 Tomorrow's the official date, but you can actually order it now.
01:52:37.000 I just checked on Amazon this morning.
01:52:39.000 It's like, oh, it's already for sale.
01:52:40.000 Okay, so tomorrow will actually be today, because this is going to be released tomorrow.
01:52:44.000 Okay, perfect.
01:52:45.000 And it's called Give the Devil His Due.
01:52:47.000 Giving the Devil His Due.
01:52:48.000 Giving the Devil His Due, yes.
01:52:50.000 So the little chess piece there, the little devil, is by our art director.
01:52:54.000 Excellent.
01:52:55.000 All right.
01:52:55.000 Well, I always enjoy talking to you, Michael.
01:52:57.000 I really appreciate you.
01:52:58.000 Thanks, Joe.
01:52:58.000 And hopefully next time I see you, we can actually have dinner together again.
01:53:02.000 We will do.
01:53:02.000 And I'll give you a big hug.
01:53:03.000 We'll do it in the studio next time.
01:53:04.000 That's right.
01:53:04.000 All right.
01:53:05.000 We'll be back to that.
01:53:05.000 All right, man.
01:53:06.000 Take care.
01:53:06.000 Thank you.
01:53:07.000 All right.
01:53:07.000 Bye-bye.
01:53:08.000 Bye, everybody.
01:53:09.000 Bye-bye.