Making Sense - Sam Harris - January 04, 2016


#24 — Ask Me Anything 2


Episode Stats

Length: 45 minutes

Words per Minute: 162.4

Word Count: 7,397

Sentence Count: 302

Misogynist Sentences: 5

Hate Speech Sentences: 49


Summary

In this Ask Me Anything episode, recorded on New Year's Day, Sam Harris previews his upcoming guests, several of whom are coming on as a result of Twitter exchanges: Navy SEAL Jocko Willink, former LAPD SWAT weapons and tactics instructor Scotty Reitz, UK-based ex-Muslim reformer Maryam Namazie, Omer, a Yale law student who wrote a withering Salon review of Sam's book with Maajid Nawaz, psychologist Jonathan Haidt, and Steven Pinker. Sam then takes up Fareed Zakaria's endorsement of "Who Speaks for Islam? What a Billion Muslims Really Think" by John Esposito and Dahlia Mogahed, arguing that the book's polling, especially its claim that 93% of Muslims worldwide are "moderates," is designed to mislead, and he dissects what he calls "the narrative narrative," the fear that any honest focus on Islam will drive otherwise peaceful Muslims to support jihadists. He closes with listener questions: his progress toward a vegetarian or vegan diet following his conversation with Paul Bloom, Faisal Saeed Al Mutar's question about what happens if Islamic reform fails, and a listener's criticism of his reliance on thought experiments and hypotheticals, including his remarks about Ben Carson and Noam Chomsky.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.240 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.860 Okay, well I'm going to do an Ask Me Anything podcast this time around.
00:00:52.140 Now, as some of you know, I often set out to do these and then answer one question for
00:00:58.940 20 minutes or more and don't get very far.
00:01:02.800 And I'm afraid that's what's going to happen this time around.
00:01:07.040 I'll answer a few questions in a rather long-winded way and then hopefully do some rapid-fire answers
00:01:14.740 and get through many of your questions.
00:01:16.200 I must say, I really appreciate the response I get whenever I go out for questions.
00:01:21.720 I get hundreds.
00:01:24.040 Honestly, I can't even read all of your questions.
00:01:27.380 But I've read many of them and I just, you know, it's very hard to choose.
00:01:32.560 I'll just work through them.
00:01:34.360 But I also have to do a little housekeeping in this podcast, so I will start with that.
00:01:39.320 First, I'm recording this podcast on New Year's Day.
00:01:42.400 So, Happy New Year, everyone.
00:01:44.260 And I ended the year in contentious fashion on Twitter.
00:01:50.080 I don't know why I use Twitter.
00:01:52.160 If I treated Twitter the way I treat Facebook, I would never have any of these entanglements
00:01:58.200 that I so often discuss with you.
00:02:00.340 I would never notice what was being said about me or what sort of skirmishes I was being dragged
00:02:07.100 into.
00:02:08.060 And therefore, I'd never be tempted to respond.
00:02:10.120 So, perhaps it deserves some rethinking, whether it's worth my paying attention.
00:02:16.860 But as I'll tell you in a moment, at least two of my upcoming guests are coming on the
00:02:22.360 podcast simply because of some Twitter incident.
00:02:25.760 And so, it's certainly interesting.
00:02:28.180 It keeps me forever uncertain whether I should be using this technology.
00:02:32.480 But in any case, as for my upcoming guests, actually three of them are coming
00:02:37.980 up entirely as a result of Twitter.
00:02:43.060 So, you could figure out whether it's a blessing or a curse at this point.
00:02:47.740 I'll soon be speaking with Jocko Willink, the Navy SEAL who was recently interviewed on Tim
00:02:53.200 Ferriss's podcast first and then Joe Rogan's.
00:02:56.040 And I encourage you to listen to both those interviews.
00:02:58.180 That's literally five hours of interview with Jocko.
00:03:02.260 He is just a fascinating guy.
00:03:04.580 And I certainly would have always loved to have him on my podcast.
00:03:08.800 And it occurred to me to reach out to him except, you know, I just heard literally five
00:03:14.040 hours of him on two of my friends' podcasts.
00:03:17.480 And it just seemed, I wasn't quite sure what there would be to add to that conversation.
00:03:23.960 And Jocko now has his own podcast.
00:03:26.280 At Joe Rogan's insistence, he started a podcast immediately and is banging those out.
00:03:31.260 And the guy is great.
00:03:32.700 But, you know, on Twitter, we got thrown together and people were encouraging him to come on
00:03:36.940 my podcast.
00:03:37.760 And he said he'd be happy to do it.
00:03:39.440 And so, here we are.
00:03:41.020 So, I'll be speaking to him in the coming weeks.
00:03:43.420 And I look forward to that.
00:03:44.480 And I'll try to find a fundamentally new line through the conversation so that we can get
00:03:49.060 to some of his insight and experience that he didn't have a chance to share with Tim and
00:03:54.300 Joe.
00:03:54.940 And that should be fun.
00:03:56.280 I'll also be speaking with Scotty Reitz, who's a former SWAT operator.
00:04:03.480 He was the lead weapons and tactics instructor for LAPD SWAT and now trains people in the use
00:04:10.560 of firearms.
00:04:11.120 And I'm going to talk to Scotty about violence and self-defense and firearms and gun control
00:04:17.380 and also what it's like to be a cop and the misuses of force that we've all seen of late
00:04:24.660 from cops and just to get into all of those politically sensitive but interesting areas.
00:04:32.580 And Scotty, I think, will be a great person to do it with.
00:04:36.060 Jocko and Scotty will come close together, and we can call that violence week
00:04:41.120 on the Waking Up podcast.
00:04:43.260 I'll also be speaking with Maryam Namazie, the ex-Muslim reformer based in the UK who many
00:04:51.340 of you probably know.
00:04:52.540 You've probably seen video of her.
00:04:54.560 I've circulated video of her before.
00:04:56.820 And this was also Twitter-born.
00:04:59.060 There was an incident of what I considered friendly fire where she went after me for what I think
00:05:06.020 is a misunderstanding of my views on profiling.
00:05:08.920 And I've always thought Maryam was great, but she more or less slammed me as a bigot as
00:05:15.280 far as I could tell on Twitter.
00:05:16.360 And so I reached out to her and she's agreed to come on the podcast and we'll try to rectify
00:05:22.160 that situation.
00:05:23.160 Whether or not we succeed, Maryam's is a voice you all should hear and I will bring her to
00:05:28.920 you.
00:05:29.180 There was also another Twitter-born collision, not so friendly fire, with someone
00:05:35.980 I'd never heard of, a young Muslim, probably soon to be lawyer.
00:05:39.900 He's getting his JD at Yale.
00:05:42.620 He's a writer who wrote a truly withering book review in Salon about my book with
00:05:49.360 Maajid, and he hated the book, seems to hate Maajid, but especially hates me.
00:05:58.400 And hatred really isn't too strong a word.
00:06:01.880 People were hurling this review at me on Twitter.
00:06:04.700 As you know, I don't tend to read Salon.
00:06:07.440 So anyway, I read it.
00:06:10.260 Needless to say, I don't agree with it.
00:06:12.200 But I reached out to him, and I don't know if his name is pronounced Omer or Omar.
00:06:17.340 I will find out from him.
00:06:18.800 Let's call him Omer.
00:06:20.240 That's how it's spelled.
00:06:21.500 I reached out to him on Twitter and he agreed to come on the podcast.
00:06:24.620 And so I anticipate that being a difficult conversation.
00:06:29.480 And my interest is, as I've said before, in trying to figure out how to have hard conversations,
00:06:36.020 how to start very far apart in a conversation and figure out how to converge, or at the very
00:06:42.200 least, agree to disagree on specific points in a way that is not entangled with personal
00:06:50.400 hostility and misunderstanding.
00:06:53.920 And that, admittedly, is a challenge that not everyone is up to.
00:06:57.900 And so, you know, I tried it with Noam Chomsky and I'll be trying it with other people.
00:07:04.060 My conversation with Maajid was also an example of that.
00:07:06.640 And I didn't know how it was going to turn out and it became hugely productive.
00:07:10.660 So I'm going to be running similar psychological and conversational experiments on my podcast.
00:07:18.000 And Omer will be one.
00:07:20.400 Also, I'll be speaking with Jonathan Haidt, who many of you know.
00:07:22.900 He's a very influential psychologist with whom I've disagreed in the past and in a none
00:07:28.500 too friendly way, I might add.
00:07:30.040 And so that is another instance of my reaching out to somebody who has taken some very hard
00:07:36.020 shots at me in the past and I've returned fire and we're going to see if we can have
00:07:41.560 a civil and useful conversation on important topics.
00:07:45.680 That'll be coming sometime in February, I think.
00:07:48.720 And I will also have Steven Pinker on at some point, and I have a few other guests
00:07:53.640 lined up so that there will be interesting conversations coming your way.
00:07:56.360 I feel the need to apologize once again for the level of congestion I'm bringing to the
00:08:01.540 mic now.
00:08:02.420 I have two young girls, each of whom seems to be striving to win the patient zero award
00:08:09.580 for bringing new colds into the world.
00:08:12.820 I don't know if they're out there playing with ducks in a pond or where they're getting
00:08:17.860 these viruses, but they're bringing them to daddy.
00:08:21.360 So I bring them to you in a substandard audio performance.
00:08:26.360 So bear with me there.
00:08:28.200 So many of you noticed various Twitter controversies and wanted me to address them.
00:08:34.080 Here's the first with Fareed Zakaria, the CNN and Washington Post journalist.
00:08:39.120 He sent out a tweet about a week ago endorsing a truly terrible piece of Islamist propaganda.
00:08:47.480 And so the tweet read: my book of the week, Who Speaks for Islam?
00:08:52.580 What a Billion Muslims Really Think.
00:08:54.400 That's the title of the book.
00:08:56.340 And then he calls it an essential voice of reason.
00:08:59.180 Now this book was written by John Esposito and Dahlia Mogahed.
00:09:04.200 And apologies again.
00:09:06.080 I don't know exactly how to pronounce Dahlia's last name.
00:09:08.940 I will say Mogahed.
00:09:10.120 It's probably not right.
00:09:12.580 Esposito runs a Middle East studies center at Georgetown.
00:09:15.540 And Mogahed works for Gallup, the famous polling organization that published this book that Zakaria was recommending.
00:09:23.720 So I tweeted in response: witness the capture of academia (Esposito), polling (Gallup), and journalism (Zakaria) by rank Islamism.
00:09:34.680 Now, many people took this tweet to be a sign that I had gone off the deep end by alleging some kind of stealth Islamist takeover of our institutions.
00:09:43.280 Well, there is an attempted Islamist takeover of our institutions.
00:09:47.600 And it's not especially stealthy.
00:09:50.340 But to be clear, I wasn't claiming that Zakaria is an Islamist.
00:09:54.080 Rather, I think he's probably been deceived by Islamist misinformation, of which there's an endless supply.
00:10:00.020 And there is no question he's spreading such misinformation by pushing this book.
00:10:06.160 And nor do I think Esposito is an Islamist, because to my knowledge, he's not even a Muslim.
00:10:10.840 But everything I've seen him publish about Islam has been, if not a lie, a half-truth.
00:10:17.260 He's someone who I've called a Muslim apologist in the past.
00:10:19.720 And his center at Georgetown is funded with tens of millions of dollars by the Saudi government.
00:10:25.260 It's the Prince Al-Walid bin Talal Center for Muslim Christian Understanding.
00:10:29.660 And his function appears to be to whitewash Islam in general, and the obscenity of Saudi Wahhabi Islam in particular.
00:10:36.500 Now, I was frankly unfamiliar with Mogahed.
00:10:38.520 She's the executive director of the Gallup Center for Muslim Studies.
00:10:42.600 I just happened to have seen her on Meet the Press a few days before, sitting beside my friend Asra Nomani,
00:10:48.040 the eminently rational journalist and Muslim reformer.
00:10:51.020 And more or less every word out of Mogahed's mouth was, again, a lie or a half-truth
00:10:56.720 that seemed calculated to deceive a secular audience.
00:11:00.840 She was saying things like the members of ISIS aren't religious,
00:11:03.620 and that they have no theological or popular support,
00:11:06.620 and that there's no correlation between being a religious Muslim and being a jihadist.
00:11:11.180 In fact, the correlation is negative, according to Mogahed.
00:11:14.080 You're more likely to be a jihadist if you're not a devout Muslim.
00:11:17.900 These statements are completely dishonest.
00:11:21.060 I did a little digging on Mogahed, and from what I can tell,
00:11:24.000 it seems that she has some affinity for, if not direct connection, to the Muslim Brotherhood,
00:11:28.940 as do many people who claim to be Muslim moderates in our society.
00:11:32.120 And it's very annoying that only people on the political right,
00:11:36.220 and many of whom are dogmatic Christians or Jews,
00:11:39.860 seem to have the time or the temperament to point this out.
00:11:43.180 A group like the Council on American-Islamic Relations, CAIR,
00:11:46.740 which seems to be the most influential Muslim civil rights organization,
00:11:50.500 was a direct offshoot of the Muslim Brotherhood,
00:11:53.380 and has been supportive of terrorist organizations like Hamas.
00:11:56.860 Is it connected with these organizations now? I don't know.
00:11:59.740 Do its members even know? It could be like Scientology,
00:12:03.160 where you don't know how crazy the organization is until you're deep in it.
00:12:06.600 But I can tell you one thing for certain.
00:12:08.660 This is an organization that systematically lies about Islam and demonizes its critics,
00:12:14.660 and tries to make life as difficult as possible for people like Ayaan Hirsi Ali.
00:12:19.360 And again, this group is treated like the Muslim ACLU by the press.
00:12:23.180 It's insane.
00:12:24.760 Both Esposito and Mogahed are darlings of this organization,
00:12:28.320 as is Glenn Greenwald, as you know.
00:12:31.480 So whatever her connections,
00:12:33.980 Mogahed practices some of the worst forms of Islamist obscurantism and identity politics.
00:12:40.600 She describes jihadism as a purely political phenomenon
00:12:43.740 that has no connection to religious doctrine or belief.
00:12:47.000 And needless to say, it's always arising out of that vast reservoir
00:12:50.040 of, quote, legitimate grievances that Muslims have against the West.
00:12:54.040 And she's also an Obama appointee.
00:12:56.640 She sits on the President's Advisory Council on Faith-Based and Neighborhood Partnerships.
00:13:01.380 And she's one of the people who had a hand in writing President Obama's famous Cairo speech.
00:13:06.260 Again, I'm painfully aware that despairing over facts like these
00:13:11.020 makes one sound like a right-wing crackpot.
00:13:13.940 So to be clear, I am an Obama supporter.
00:13:17.100 I've voted for him both times.
00:13:19.540 I will almost certainly vote for Hillary Clinton in the fall,
00:13:22.240 because I don't see any other conceivable choice,
00:13:25.620 though I have to hold my nose over her obscurantism on this issue in particular.
00:13:29.540 But it isn't crazy to worry that Islamists are gaming our system,
00:13:33.980 because they quite obviously are.
00:13:36.060 Now, again, I don't know for sure whether Mogahed is an Islamist.
00:13:41.520 She wears the hijab and says some very dishonest things about Islam in general
00:13:45.920 and Sharia law in particular.
00:13:47.780 Still, she may just be a useful idiot like her colleague John Esposito.
00:13:51.560 The line here can be difficult to find.
00:13:54.460 And it may not even be important to find it.
00:13:56.980 It's the ideas and their influence,
00:13:59.060 rather than the people conveying the ideas that I'm worried about.
00:14:02.120 But this is also not to say that ideologically motivated people,
00:14:08.420 even Islamists, couldn't produce an honest poll,
00:14:11.920 or a book based on such a poll.
00:14:14.380 And I certainly wasn't discounting the contents of the book,
00:14:18.280 because its authors strike me as nefarious and dishonest people.
00:14:22.220 For instance, let's just flip this around.
00:14:24.400 If I were to produce a poll of religious public opinion,
00:14:27.440 perhaps hiring an organization like Gallup or Pew to run it,
00:14:30.660 here are two things about which I am certain.
00:14:34.080 I'm certain I could do this honestly and make every effort to produce a poll
00:14:37.440 that was well-designed and scientifically valid.
00:14:40.400 I'm also certain that religious people and their apologists
00:14:43.840 would reject its findings, whatever they happen to be,
00:14:47.440 because of my history as a critic of religion.
00:14:50.000 I have made no secret of my views on religious faith.
00:14:53.460 I think religion, I think faith-based religion,
00:14:56.180 is dangerous and divisive bullshit.
00:14:58.620 And I think Islam is the worst of the lot.
00:15:01.400 So it would be totally understandable and also wrong
00:15:04.720 for a religious person to reject a Gallup poll of religious opinion
00:15:08.740 that I was associated with.
00:15:10.800 So to be clear, I am not doing that in reverse.
00:15:14.380 When Esposito and Mogahed's book came out in 2007,
00:15:17.780 I bought it and I read it.
00:15:20.160 And I found it so obviously misleading
00:15:22.800 as to not even be worth discussing.
00:15:25.080 But now Fareed Zakaria is pushing it eight years later
00:15:28.220 as his pick of the week and as a, quote,
00:15:30.440 essential voice of reason.
00:15:32.540 And it should be disturbing that Zakaria can't see the flaws in this book
00:15:37.200 because, again, they are so obvious.
00:15:39.540 So what's wrong with the book?
00:15:41.440 Well, first it purports to be an unprecedented
00:15:43.700 and thoroughly scientific poll of Muslim public opinion.
00:15:47.460 But the authors don't show any data.
00:15:49.500 And the ways they discuss their data,
00:15:52.560 along with the kinds of questions they thought to ask in their poll,
00:15:56.120 and the questions they declined to ask,
00:15:58.560 prove that they were after a certain result,
00:16:01.660 which was to make Muslim opinion look totally benign.
00:16:05.080 They want you to believe that Islam is just like any other religion
00:16:08.660 and that Muslims worldwide are just like any other group of religious people.
00:16:13.540 Now, the book isn't entirely filled with lies.
00:16:16.080 The authors admit, for instance,
00:16:18.560 that the imagined link between poverty and lack of education
00:16:21.720 and terrorism, or support for terrorism, is a myth.
00:16:26.240 They admit that the most radicalized people in the Muslim world
00:16:29.420 tend to be middle class and educated.
00:16:32.500 In fact, according to Esposito and Mogahed,
00:16:34.240 the politically radicalized tend to be more satisfied
00:16:36.820 with their financial situations
00:16:38.340 and believe their standard of living is improving
00:16:41.140 and are more optimistic about their futures in general
00:16:44.580 than the so-called moderates are.
00:16:46.940 Which proves that the remedies
00:16:48.300 that many secular liberals imagine exist
00:16:50.740 for extremism in the Muslim world,
00:16:52.780 that is, more education and economic opportunity,
00:16:55.600 are not remedies at all.
00:16:57.280 As I've been saying for years,
00:16:58.600 I don't know how many more engineers
00:16:59.920 have to fly planes into buildings
00:17:01.740 or devote their lives to waging jihad in other ways
00:17:05.040 for us to get it through our heads
00:17:06.540 that the lack of education and economic opportunity
00:17:09.180 isn't the cause of Muslim extremism.
00:17:11.680 But even in making this concession,
00:17:14.920 Esposito and Mogahed reveal
00:17:16.220 that getting Islam off the hook is their goal.
00:17:19.240 Their point is to say that the backgrounds of terrorists
00:17:21.920 are so diverse as to fully exonerate religion.
00:17:26.300 There's the usual tendentious nonsense
00:17:27.840 about how the 9-11 hijackers went to strip clubs,
00:17:30.540 for instance,
00:17:31.380 which, according to Esposito and Mogahed,
00:17:33.320 proves that they weren't really religious.
00:17:35.700 Majid and I dealt with this lie in our book.
00:17:37.520 They also point out that most jihadists
00:17:39.780 aren't graduates from madrasas.
00:17:41.860 This is a point that Scott Atran makes all the time.
00:17:44.600 As though this suggests a lack of connection
00:17:46.760 between sincere religious belief in Islamic doctrine
00:17:49.700 and jihadism.
00:17:51.360 They even go so far as to intimate
00:17:52.500 that the academic backgrounds of prominent jihadists
00:17:55.520 suggest that almost anything could make one a jihadist.
00:17:59.840 So bin Laden, for instance,
00:18:01.300 was, quote,
00:18:01.720 trained in management, economics, and engineering.
00:18:04.180 It's like, who knows which of these streams of information
00:18:07.300 could have radicalized him.
00:18:09.620 This is pure obscurantism.
00:18:12.240 But this isn't the worst part of the book.
00:18:14.680 The worst part comes down to the questions that were asked,
00:18:18.080 as well as those that weren't asked,
00:18:20.060 and the way the results are discussed.
00:18:22.800 So one of the most egregious examples
00:18:24.580 can be found in the question that Esposito and Mogahed
00:18:27.680 used to differentiate what they call radicals
00:18:30.700 from, quote, moderates.
00:18:32.700 They report that only 7% of Muslims worldwide
00:18:36.300 consider the 9-11 attacks to be, quote,
00:18:38.880 completely justified.
00:18:40.700 And then they go on to say, therefore,
00:18:42.600 that 9 in 10, 93%,
00:18:45.660 quote, believe the attacks were not justified.
00:18:48.560 And they call these people moderates.
00:18:50.620 Incidentally, the press ran with this,
00:18:52.180 reporting that 93% of Muslims, or 9 in 10,
00:18:55.120 the world over, are, quote, moderate.
00:18:57.880 Well, if you know anything about anything,
00:19:00.200 you should be feeling a little queasy at this point.
00:19:03.480 I took one look at this line,
00:19:05.960 that only 7% of Muslims consider the 9-11 attacks
00:19:08.860 to be, quote, completely justified.
00:19:11.080 So 9 in 10 are moderates.
00:19:13.160 And I knew I was being lied to by sinister people,
00:19:16.520 or being misled by useful idiots.
00:19:18.900 Again, I can't claim to know which of these categories
00:19:20.900 Esposito and Mogahed fall into.
00:19:23.100 Okay, so the first thing to point out
00:19:24.420 is that even if true,
00:19:25.940 even if this most sanguine of interpretations
00:19:28.700 of this pseudo-data is true,
00:19:31.280 7% of Muslims believing that the atrocities of 9-11
00:19:34.740 were completely justified
00:19:36.080 is a problem that should not be minimized.
00:19:39.780 The authors equate this with 91 million people.
00:19:42.980 Okay, their book came out in 2007.
00:19:45.000 Today it's more like 112 million people.
00:19:47.700 Recall what we're talking about here.
00:19:49.540 We're talking about the intentional murder
00:19:51.960 of 3,000 innocent non-combatants
00:19:55.080 at a time that preceded our involvement
00:19:57.940 in Afghanistan and Iraq.
00:20:00.420 We're talking about 112 million people
00:20:02.800 who think that burning thousands of people alive
00:20:06.120 in the Twin Towers was completely justified.
00:20:09.420 That's already a huge reservoir of murderous lunacy.
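To make the arithmetic behind those figures explicit, here is a minimal sketch in Python. The world population totals are rough outside estimates (roughly 1.3 billion Muslims in 2007 and 1.6 billion at the time of this recording), not numbers given in the episode:

```python
# Back-of-the-envelope arithmetic behind the "91 million" and "112 million" figures.
# The population totals below are rough outside estimates, not from the episode.
share_completely_justified = 0.07  # the 7% reported by Esposito and Mogahed

muslims_2007 = 1.3e9  # assumed world Muslim population when the book came out
muslims_2016 = 1.6e9  # assumed world Muslim population at the time of recording

print(f"2007: {share_completely_justified * muslims_2007 / 1e6:.0f} million")  # 2007: 91 million
print(f"2016: {share_completely_justified * muslims_2016 / 1e6:.0f} million")  # 2016: 112 million
```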
00:20:14.160 Okay, but of course the real problem
00:20:15.400 is that the 7% figure is totally misleading,
00:20:17.880 and intentionally so.
00:20:20.100 We can tell from the wording,
00:20:21.300 quote, completely justified,
00:20:22.780 that Gallup used a scale in its poll
00:20:25.100 from 1 to 5 or 1 to 7,
00:20:27.780 where completely justified
00:20:28.960 and completely unjustified were at the tail ends.
00:20:32.060 There was almost certainly a choice
00:20:33.240 of somewhat justified or mostly justified
00:20:35.680 or both that many people picked.
00:20:38.160 And there was certainly a choice
00:20:39.420 of don't know or no opinion.
00:20:41.760 Think of all those people who couldn't say
00:20:43.920 that the attacks of 9-11
00:20:45.280 were completely unjustified,
00:20:48.600 but said, rather, that they were mostly justified
00:20:50.740 or somewhat justified
00:20:52.060 or said they didn't know.
00:20:54.560 Okay, these people are being described
00:20:56.640 as moderates.
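The effect of that cut line can be made concrete with a minimal sketch. The response counts below are entirely hypothetical, placed on the kind of scale described above; the book publishes the 7% headline, not its raw distribution:

```python
# Hypothetical counts on an assumed 5-point justification scale plus "don't know".
# Illustrative only: the book reports the 7% headline, not the raw distribution.
responses = {
    "completely unjustified": 55,
    "mostly unjustified": 15,
    "somewhat justified": 8,
    "mostly justified": 5,
    "completely justified": 7,
    "don't know / no opinion": 10,
}
total = sum(responses.values())  # 100 respondents in this toy example

# The book's cut line: only "completely justified" counts as radical,
# so the remaining 93 of 100 are all labeled "moderates".
radical = responses["completely justified"] / total
print(f"radical share, book's cut line: {radical:.0%}")  # 7%

# A different cut line: everyone unwilling to call the attacks unjustified.
unjustified = responses["completely unjustified"] + responses["mostly unjustified"]
print(f"share not calling 9/11 unjustified: {(total - unjustified) / total:.0%}")  # 30%
```

With the same toy data, the headline share of "radicals" moves from 7% to 30% depending purely on where the line is drawn.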
00:20:58.220 And there's another problem
00:20:59.340 with using this particular question
00:21:00.860 as the dividing line
00:21:02.320 between moderates and extremists.
00:21:04.560 As many of you know,
00:21:05.900 and as Esposito and Mogahed surely know,
00:21:08.900 vast numbers of Muslims
00:21:10.380 think that Muslims had nothing to do
00:21:13.140 with the attacks of 9-11
00:21:14.480 because there were 4,000 Jews
00:21:16.840 who didn't show up to work that day.
00:21:19.240 Millions of Muslims believe
00:21:20.640 that the Mossad and the CIA
00:21:22.280 conspired to bring down
00:21:23.720 the World Trade Center
00:21:24.600 as a pretext to invade Muslim lands.
00:21:28.440 In fact, one poll indicated
00:21:29.580 that 16% of Americans
00:21:31.280 believe something like this.
00:21:33.280 This is the whole 9-11 truth movement.
00:21:35.560 So, using this question
00:21:38.160 in this way
00:21:39.880 rigged the game.
00:21:41.980 And again, Esposito and Mogahed
00:21:43.640 almost certainly know this.
00:21:45.880 And they know that
00:21:46.820 if they had asked questions
00:21:47.860 about apostasy
00:21:48.860 or blasphemy
00:21:49.860 or the rights of women
00:21:51.000 and homosexuals
00:21:52.000 and polytheists
00:21:52.980 or whether infidels
00:21:54.580 deserve to spend
00:21:55.340 eternity in hellfire,
00:21:56.980 the results of their polling
00:21:58.480 would have been appalling.
00:22:00.460 And every poll
00:22:01.860 of Muslim public opinion
00:22:03.120 that has been run
00:22:03.820 on these questions
00:22:04.720 produces appalling results.
00:22:08.660 And again,
00:22:09.120 it literally took me
00:22:10.280 five seconds
00:22:11.200 to see the problem here.
00:22:12.800 Why didn't Fareed Zakaria
00:22:14.300 see it?
00:22:15.460 He is a journalist
00:22:16.520 who covers these issues.
00:22:18.500 He doesn't want to see it.
00:22:20.080 Even this fictional 7%
00:22:22.100 is talked about
00:22:23.260 in the book
00:22:23.660 in a way that is
00:22:24.380 obviously tendentious
00:22:25.780 and misleading.
00:22:27.320 They say, for instance,
00:22:28.160 that, quote,
00:22:28.920 if the 7%,
00:22:29.920 91 million
00:22:30.820 of the politically radicalized
00:22:32.180 continue to feel
00:22:32.980 politically dominated,
00:22:34.360 occupied,
00:22:34.960 and disrespected,
00:22:36.240 the West will have
00:22:36.860 little if any chance
00:22:37.840 of changing their minds.
00:22:39.200 End quote.
00:22:40.040 Again, the burden
00:22:40.860 is upon the West
00:22:41.780 to behave better
00:22:42.760 and show respect
00:22:44.080 to people
00:22:44.620 who think the attacks
00:22:45.800 of September 11th
00:22:46.760 were, quote,
00:22:47.240 completely justified.
00:22:49.500 And again,
00:22:50.340 remember,
00:22:51.000 these attacks
00:22:51.620 came before
00:22:52.620 we invaded
00:22:53.680 Afghanistan
00:22:54.320 and Iraq.
00:22:56.100 Justified for what?
00:22:57.820 Ask yourself that.
00:22:59.480 I should say,
00:23:00.180 I invited
00:23:00.840 Dahlia Mogahed
00:23:01.760 on the podcast
00:23:02.660 after a little sniping
00:23:04.180 on Twitter
00:23:04.980 and she declined.
00:23:07.480 But I would have been
00:23:08.180 happy to speak with her.
00:23:10.620 I've seen her promulgate
00:23:11.900 what I now call
00:23:12.600 the narrative narrative.
00:23:14.340 And basically,
00:23:14.860 everyone is doing this now
00:23:16.120 from President Obama
00:23:17.020 on down.
00:23:18.180 And it's understandable
00:23:19.200 in some ways,
00:23:20.660 but it's also scary.
00:23:22.660 So pay attention here.
00:23:24.220 The idea
00:23:25.300 I'm about to describe
00:23:26.700 is almost
00:23:28.520 unrivaled
00:23:29.820 in its strangeness.
00:23:31.800 And yet,
00:23:32.300 those hearing it
00:23:33.260 for the first time,
00:23:34.580 to say nothing
00:23:35.140 of those who espouse it,
00:23:36.940 never seem to notice
00:23:38.380 that something
00:23:39.080 out of the ordinary
00:23:40.140 is being said.
00:23:41.660 Now,
00:23:42.060 you've heard this idea
00:23:42.880 before,
00:23:43.300 and I will venture
00:23:44.200 to guess that you
00:23:44.920 did not notice
00:23:46.020 how strange
00:23:47.800 and indeed terrifying
00:23:50.220 a claim
00:23:51.580 was being made.
00:23:53.240 The idea is this.
00:23:55.040 In fighting ISIS
00:23:56.080 or in resisting
00:23:57.720 the spread
00:23:58.220 of Islamic theocracy
00:23:59.200 generally,
00:24:00.480 we must at all costs
00:24:02.120 avoid, quote,
00:24:03.300 confirming the narrative
00:24:04.640 of Islamic extremists.
00:24:07.020 The fear is that
00:24:08.000 any focus
00:24:08.800 on the religion
00:24:09.620 of Islam
00:24:10.180 or its adherents,
00:24:12.000 profiling at TSA,
00:24:13.400 intelligence gathering
00:24:14.560 at mosques,
00:24:15.660 or merely acknowledging
00:24:16.840 that we are at war
00:24:17.780 not with generic terrorism
00:24:19.520 but with Islamic terrorism
00:24:21.200 will drive many more Muslims
00:24:23.440 to support the jihadists.
00:24:25.820 Now,
00:24:26.380 think about what is actually
00:24:27.680 being alleged here.
00:24:29.620 Think about the underlying
00:24:31.020 pessimism,
00:24:32.680 if not paranoia,
00:24:34.260 of this claim.
00:24:35.620 Let's use an analogy.
00:24:36.740 Let's say you're a
00:24:37.740 bald white man
00:24:38.880 and unluckily for you,
00:24:41.800 there happens to be
00:24:42.560 a global insurgency
00:24:43.920 of neo-Nazi skinheads
00:24:45.620 terrorizing a hundred countries.
00:24:48.980 Most white men
00:24:49.860 are perfectly peaceful,
00:24:51.220 of course,
00:24:51.900 but this insurgency
00:24:53.620 has grown so captivating
00:24:55.360 to a minority of them
00:24:56.580 that no city on earth
00:24:59.100 is truly safe.
00:25:00.760 Bald white men
00:25:01.940 have blown up
00:25:02.780 planes and buses
00:25:03.880 and burned embassies
00:25:05.300 and even murdered
00:25:06.220 innocent children
00:25:07.100 by the hundreds.
00:25:08.780 And we have spent
00:25:09.540 trillions of dollars
00:25:10.880 trying to contain
00:25:11.780 the damage.
00:25:13.000 Many of these
00:25:13.660 bald white men
00:25:14.660 are seeking to acquire
00:25:16.180 nuclear materials
00:25:17.240 so that they can detonate
00:25:18.820 dirty bombs
00:25:19.600 or even atomic ones
00:25:20.900 in the capitals of Europe
00:25:22.320 and the United States.
00:25:23.640 And to make matters worse,
00:25:25.540 many of these men
00:25:26.340 are avowedly suicidal
00:25:27.860 and therefore
00:25:28.740 cannot be deterred.
00:25:31.060 Now,
00:25:31.320 imagine hearing
00:25:32.360 presidents
00:25:33.500 and prime ministers
00:25:34.640 and newspaper columnists
00:25:36.200 and even your fellow
00:25:37.300 bald white men
00:25:38.480 express the fear
00:25:40.100 that merely acknowledging
00:25:41.980 the whiteness
00:25:42.820 and baldness
00:25:43.640 of neo-Nazi skinheads
00:25:45.020 would so oppress
00:25:46.720 and alienate
00:25:47.920 other bald white men
00:25:49.220 that they too
00:25:50.860 would begin
00:25:51.840 murdering innocent people.
00:25:53.980 Imagine being told
00:25:54.920 that at all costs
00:25:55.980 we must not confirm
00:25:57.320 the narrative
00:25:57.960 of the neo-Nazis
00:25:59.540 by acknowledging
00:26:00.860 that white bald men
00:26:02.460 emblazoned with swastikas
00:26:04.180 are of greater interest
00:26:05.780 from a security point of view
00:26:07.180 than elderly Hawaiian women.
00:26:09.000 This is the situation
00:26:11.040 we're in.
00:26:12.000 You might be somewhat
00:26:12.980 confused by the
00:26:14.300 racial characteristics
00:26:15.920 of this analogy.
00:26:17.320 Obviously, Islam
00:26:18.100 is not a race.
00:26:19.480 But most people
00:26:20.480 appear to believe
00:26:21.420 that by honestly
00:26:22.920 describing the link
00:26:24.120 between the doctrine
00:26:24.980 of Islam and jihadism
00:26:26.300 and therefore admitting
00:26:27.320 that Islam
00:26:27.900 is of special concern
00:26:29.240 in a way that
00:26:29.820 Anglicanism and Mormonism
00:26:31.360 aren't,
00:26:32.160 that we will provoke
00:26:33.140 otherwise peaceful Muslims
00:26:34.960 to such a degree
00:26:36.080 that they will become
00:26:37.280 jihadists or support them.
00:26:40.280 Now, this is either
00:26:41.360 one of the most pessimistic
00:26:42.820 and uncharitable things
00:26:44.180 ever said about a community
00:26:45.540 or it's true.
00:26:48.500 And if it's the former,
00:26:49.700 we should stop saying it.
00:26:51.480 And if it's the latter,
00:26:52.920 we should be talking
00:26:53.680 about nothing else
00:26:54.900 and obliging Muslims
00:26:56.520 to talk about nothing else.
00:26:59.460 Where are these Muslims
00:27:00.920 who are just like you and me
00:27:02.560 in valuing freedom of speech
00:27:04.220 and secular tolerance
00:27:05.400 and scientific rationality,
00:27:07.220 who want their daughters
00:27:08.100 to grow up to be fully
00:27:09.260 self-actualized members
00:27:10.480 of society,
00:27:11.740 who aren't afraid of cartoons,
00:27:13.840 who think gays
00:27:14.620 should be free to marry,
00:27:16.100 but who,
00:27:17.320 if subjected to an extra glance
00:27:19.280 at the airport
00:27:19.980 or a visit from the FBI
00:27:22.380 at their mosque,
00:27:23.940 will be, quote,
00:27:24.700 radicalized
00:27:25.680 and helplessly driven
00:27:27.200 to support ISIS.
00:27:28.940 They're just like you
00:27:29.640 and me now,
00:27:30.400 but say the wrong thing
00:27:31.920 about Islam on television.
00:27:34.280 And they'll start supporting
00:27:35.240 a group that decapitates
00:27:36.500 journalists and aid workers,
00:27:38.440 rapes women
00:27:39.080 by the tens of thousands,
00:27:40.860 and throws gays
00:27:41.880 from rooftops.
00:27:43.480 That is what is being claimed.
00:27:45.840 And it is absolutely shocking.
00:27:49.420 Doubly so,
00:27:50.200 because no one is admitting
00:27:51.580 or seeming to even notice
00:27:53.680 what a shocking claim it is.
00:27:56.080 And again,
00:27:56.740 I don't know what is true here.
00:27:57.920 It could be a totally
00:27:59.160 reasonable fear.
00:28:00.160 Or it could be pure paranoia.
00:28:03.320 But I'm pretty sure
00:28:04.600 the difference matters.
00:28:06.860 So that's all I have to say
00:28:08.280 about that particular book
00:28:09.960 and contretemps on Twitter.
00:28:12.560 Again,
00:28:13.340 it is highly inconvenient
00:28:14.780 that worrying
00:28:16.380 about the spread
00:28:18.820 of Islamist ideology
00:28:20.140 and the deception
00:28:21.480 that covers its spread
00:28:23.220 immediately puts people
00:28:25.100 in mind of the Red Scare
00:28:26.740 and Joseph McCarthy
00:28:28.240 and right-wing conspiracy theories.
00:28:31.660 You have to follow
00:28:32.960 the plot here.
00:28:34.400 I am always talking
00:28:35.620 about the necessity
00:28:37.000 that freedom of speech
00:28:38.860 and freedom of thought
00:28:40.120 be safeguarded.
00:28:41.800 You should be free
00:28:42.700 to think and say
00:28:44.060 whatever you want to say.
00:28:46.040 And the people
00:28:46.780 who are trying
00:28:47.980 to write blasphemy laws,
00:28:50.080 whether actually
00:28:50.900 or effectively,
00:28:52.520 in the way they're
00:28:53.200 stigmatizing
00:28:54.100 the criticism of Islam
00:28:55.740 as tantamount to bigotry
00:28:57.580 and xenophobia
00:28:58.240 and even racism,
00:28:59.160 these people
00:29:00.640 are undermining
00:29:02.200 the freedom of speech
00:29:03.820 and the freedom of thought.
00:29:05.120 There is no analogy
00:29:06.600 to the Red Scare here.
00:29:08.620 But it is,
00:29:10.260 I must confess,
00:29:11.940 highly inconvenient
00:29:13.200 that the New York Times
00:29:15.980 doesn't do a proper analysis
00:29:18.080 of where the sympathies
00:29:19.520 of people like
00:29:20.180 Mogahed and Esposito are.
00:29:22.200 And they don't talk
00:29:23.620 about the corrupting influence
00:29:25.040 of Saudi money,
00:29:26.860 money from a regime
00:29:27.980 that is theologically
00:29:29.580 indistinguishable
00:29:30.780 from ISIS,
00:29:32.080 flooding our academic institutions
00:29:34.060 and funding mosques
00:29:36.420 worldwide
00:29:37.640 and supplying them
00:29:38.840 with literature
00:29:39.460 that demonizes infidels
00:29:41.580 and polytheists
00:29:42.880 and needless to say,
00:29:44.040 Jews.
00:29:45.260 So,
00:29:45.980 this is not
00:29:47.240 conspiracy theory time.
00:29:48.660 This is just
00:29:49.220 the nefarious work
00:29:51.080 of Islamists
00:29:52.880 that is in plain view
00:29:54.100 for anyone
00:29:54.820 who wants to see it.
00:29:57.140 But,
00:29:57.980 unfortunately,
00:29:59.540 most of the people
00:30:00.400 who want to see it
00:30:01.280 are on the right wing.
00:30:03.300 So,
00:30:03.640 you do a Google search
00:30:04.620 on someone like
00:30:05.800 Dahlia Mogahed
00:30:06.580 and you're immediately
00:30:07.640 dumped onto
00:30:08.480 Front Page magazine
00:30:10.060 and the Weekly Standard
00:30:11.520 and other conservative
00:30:12.880 publications.
00:30:14.620 That's because
00:30:15.260 the liberal publications
00:30:16.180 are not doing their job
00:30:17.320 and many of them
00:30:18.380 are doing the other job
00:30:19.440 of obscurantism.
00:30:21.080 And you have a place
00:30:21.900 like Salon
00:30:22.680 that gets this wrong
00:30:25.300 almost as a matter
00:30:26.120 of principle.
00:30:27.420 Okay,
00:30:27.820 so on to
00:30:28.600 more of your
00:30:29.760 actual questions.
00:30:32.480 What progress
00:30:33.100 have you made
00:30:33.660 toward becoming
00:30:34.260 vegetarian or vegan?
00:30:35.940 So,
00:30:36.500 this question
00:30:36.960 arises
00:30:37.620 from my podcast,
00:30:40.120 I think it was
00:30:40.540 the second podcast
00:30:41.420 I did with Paul Bloom
00:30:42.620 where we sort of
00:30:44.000 just stumbled
00:30:44.680 into an intervention
00:30:46.620 I performed
00:30:47.460 on both of us
00:30:48.820 around the topic
00:30:49.520 of the ethics
00:30:50.680 of eating meat.
00:30:52.880 One of us
00:30:53.540 asked the other
00:30:54.180 what would be
00:30:55.020 on your short list
00:30:55.860 for things
00:30:56.420 that will just
00:30:57.040 mortify our descendants
00:30:58.720 on our behalf.
00:30:59.900 You know,
00:31:00.140 the way we look back
00:31:01.260 on Thomas Jefferson
00:31:03.100 and we're just aghast
00:31:04.640 that he couldn't see
00:31:05.640 the wrongness
00:31:06.380 of slavery.
00:31:07.780 We have this
00:31:08.400 supremely ethical
00:31:09.360 and intelligent
00:31:10.100 person
00:31:10.560 who still
00:31:11.600 couldn't see
00:31:12.620 what an abomination
00:31:13.860 slavery was.
00:31:16.380 So,
00:31:16.860 what analogous
00:31:17.600 blind spots
00:31:18.260 do we have
00:31:18.900 and what will
00:31:19.980 our descendants
00:31:20.560 be scandalized
00:31:22.040 by when they
00:31:22.960 look back on us?
00:31:24.180 And,
00:31:24.720 on both of our
00:31:26.220 short lists
00:31:27.380 was the horror show
00:31:29.140 of factory farming
00:31:30.360 and neither of us
00:31:31.900 could defend it.
00:31:32.920 Both of us
00:31:33.480 participated in this
00:31:35.100 machinery of death
00:31:36.400 and we both
00:31:37.980 admitted that it's
00:31:38.560 only because it's
00:31:39.140 out of sight
00:31:39.640 and out of mind
00:31:40.360 that we were able
00:31:41.240 to do so
00:31:41.880 and neither of us
00:31:43.020 could defend
00:31:43.600 eating meat
00:31:44.420 under these
00:31:45.400 circumstances
00:31:45.980 and nor could
00:31:47.840 we defend
00:31:48.280 delegating
00:31:48.940 the acquisition
00:31:50.040 of meat
00:31:50.600 to others
00:31:51.560 in this way.
00:31:53.140 So,
00:31:53.560 we did kind of
00:31:54.580 stumble into
00:31:55.280 an intervention
00:31:57.120 of sorts.
00:31:57.740 Then I threw
00:31:58.340 Paul under the bus
00:31:59.200 by saying,
00:31:59.780 well,
00:31:59.860 I'm willing to
00:32:00.500 make a change
00:32:01.140 in my diet
00:32:01.780 and I don't know
00:32:02.800 what kind of
00:32:03.240 moral monster you are
00:32:04.440 that you aren't
00:32:05.940 but in any case
00:32:06.940 that was a fun
00:32:08.140 conversation
00:32:08.720 and at that
00:32:10.540 point I asked
00:32:11.440 vegetarian and
00:32:13.140 vegan listeners
00:32:14.180 to send me
00:32:15.460 resources
00:32:16.000 and help me
00:32:18.460 idiot-proof the
00:32:19.420 process of
00:32:20.080 getting off
00:32:20.720 of meat
00:32:21.100 and I made
00:32:22.080 that appeal
00:32:22.480 because I had
00:32:23.080 been a vegetarian
00:32:23.760 for six years
00:32:24.980 at one point
00:32:25.640 and became
00:32:26.380 anemic
00:32:26.740 and just
00:32:27.620 decided it was
00:32:28.980 not a healthy
00:32:30.380 diet for me.
00:32:31.700 So yes,
00:32:32.100 I have a little
00:32:33.060 to report
00:32:33.560 but not enough
00:32:35.300 that I want to
00:32:36.180 go into it
00:32:36.760 in any depth
00:32:37.320 but I can say
00:32:37.840 that since
00:32:38.240 that conversation
00:32:38.840 I have been
00:32:39.440 a vegetarian
00:32:40.140 and now
00:32:41.040 I think
00:32:41.780 that's about
00:32:42.080 four months
00:32:42.760 ago
00:32:43.360 and I did
00:32:45.680 some blood work
00:32:46.640 recently
00:32:47.360 and strangely
00:32:48.900 or perhaps
00:32:49.760 not so strangely
00:32:50.500 my lipid profile
00:32:52.320 my cholesterol
00:32:52.980 and triglycerides
00:32:54.020 have gotten worse
00:32:55.240 on a vegetarian
00:32:56.900 diet
00:32:57.320 and I think
00:32:57.800 that's largely
00:32:59.120 not because
00:33:00.000 I'm eating
00:33:00.480 more dairy
00:33:01.720 and eggs
00:33:02.380 and yes
00:33:03.440 I'm aware
00:33:04.300 of the ethical
00:33:04.980 concerns
00:33:05.460 around dairy
00:33:06.020 and eggs
00:33:06.400 but because
00:33:07.520 I'm probably
00:33:08.760 eating more
00:33:09.360 carbohydrates
00:33:10.420 and so
00:33:11.380 you know
00:33:11.780 there's the bread
00:33:12.500 and the pasta
00:33:13.100 and the rice
00:33:14.420 and all the rest
00:33:15.040 that's tweaking
00:33:16.040 my blood sugar
00:33:16.720 and that has
00:33:18.240 an unhappy
00:33:19.340 effect
00:33:20.040 on lipids
00:33:21.260 I'm still working
00:33:22.280 with this
00:33:22.640 I'm still a vegetarian
00:33:23.660 I am an
00:33:25.460 aspiring vegan
00:33:26.820 I'll keep trying
00:33:27.860 to find
00:33:28.880 my way
00:33:29.440 through this
00:33:30.260 science experiment
00:33:31.120 without making
00:33:32.560 food preparation
00:33:33.980 and eating
00:33:35.680 a new religion
00:33:37.080 for myself
00:33:38.160 or the center
00:33:39.100 of my life
00:33:39.700 I have to be
00:33:40.740 realistic about
00:33:41.380 what I can do here
00:33:42.460 and I don't want
00:33:43.320 the perfect to be
00:33:44.060 the enemy of the good
00:33:44.980 so at the very least
00:33:47.020 I'm convinced
00:33:48.440 about the ethical
00:33:49.080 problem of eating meat
00:33:50.160 the eggs I buy
00:33:51.500 claim to be
00:33:52.760 impeccable
00:33:53.720 ethically
00:33:54.340 every chicken
00:33:55.360 has something
00:33:56.640 like 108 square feet
00:33:58.260 of pasture
00:33:59.360 to run on
00:34:01.480 and I am aware
00:34:02.420 that that doesn't
00:34:03.620 answer the concern
00:34:04.280 of what happens
00:34:04.840 to the male chicks
00:34:05.780 born in their
00:34:07.000 hatcheries
00:34:07.460 but I'm still
00:34:08.840 not convinced
00:34:09.540 that I can be
00:34:10.660 a healthy vegan
00:34:11.680 at this point
00:34:12.260 but I'm going
00:34:13.380 to try
00:34:13.940 to be convinced
00:34:14.860 so again
00:34:15.860 this is a slow
00:34:17.060 unwinding of my
00:34:18.380 carnivore lifestyle
00:34:19.620 but in any case
00:34:20.640 I've made the
00:34:21.440 big change
00:34:22.400 which is
00:34:23.240 I no longer eat
00:34:24.380 meat, chicken, or fish
00:34:25.500 I suppose you could
00:34:26.620 even make the case
00:34:28.180 that eating fish
00:34:29.200 given our current
00:34:30.340 system
00:34:30.840 is more ethical
00:34:31.980 than continuing
00:34:33.600 to eat dairy
00:34:34.440 and eggs
00:34:35.560 I would be interested
00:34:36.640 to know
00:34:37.180 how you vegans
00:34:39.160 and vegetarians
00:34:40.300 view that
00:34:40.960 I did have one idea
00:34:42.280 for a short book
00:34:43.460 or a long blog article
00:34:45.100 where I could go
00:34:46.340 through the
00:34:47.300 comparative
00:34:48.020 neuroanatomy
00:34:48.900 of various species
00:34:50.120 as well as what
00:34:51.280 we know about
00:34:52.080 the likely basis
00:34:53.540 of consciousness
00:34:54.220 and pain
00:34:55.200 and suffering
00:34:55.760 in various
00:34:57.600 animal brains
00:34:58.500 and try to make
00:35:00.280 some intelligent
00:35:01.200 ranking
00:35:02.300 of the likely harm
00:35:03.500 done at each stage
00:35:04.920 so is it
00:35:05.480 you know
00:35:05.880 is it worse
00:35:06.640 to kill a cow
00:35:08.080 than a fish
00:35:08.980 is it worse
00:35:09.560 to kill a pig
00:35:10.200 than a cow
00:35:10.840 can you really
00:35:11.820 eat oysters
00:35:12.600 and other bivalves
00:35:14.580 without any concern
00:35:15.820 that they may be
00:35:16.600 suffering
00:35:16.960 this might be
00:35:17.920 interesting to look
00:35:18.640 at at some point
00:35:19.460 it'd take a fair amount
00:35:20.500 of time to do it
00:35:21.340 right
00:35:21.680 but in any case
00:35:23.380 that's at the back
00:35:23.980 of my mind
00:35:25.100 to look at
00:35:25.900 next question
00:35:27.040 this one's from
00:35:28.400 Faisal Saeed Al Mutar
00:35:29.780 who many of you know
00:35:30.600 is the ex-Muslim
00:35:32.180 Iraqi reformer
00:35:34.280 and voice of reason
00:35:36.160 he was recently
00:35:37.420 on Dave Rubin's show
00:35:38.740 and gave a great
00:35:40.000 interview there
00:35:40.560 and he was
00:35:41.880 one of the people
00:35:42.680 I consulted
00:35:43.260 on my book
00:35:43.980 with Majid
00:35:44.600 and he gave some
00:35:45.180 very helpful notes
00:35:45.960 he's great
00:35:46.620 anyway he asks
00:35:47.500 if the Islamic
00:35:48.680 reformation
00:35:49.520 slash modernization
00:35:50.760 movement
00:35:51.180 doesn't succeed
00:35:52.080 what do you think
00:35:53.140 should be the
00:35:53.560 alternative
00:35:54.020 and that's an
00:35:55.240 extraordinarily
00:35:56.180 difficult question
00:35:57.000 I think
00:35:57.560 I can't answer it
00:36:00.020 I don't think
00:36:00.560 there is an
00:36:00.880 alternative
00:36:01.300 if Islamic
00:36:02.440 reform
00:36:03.480 slash modernization
00:36:05.200 doesn't succeed
00:36:06.580 we will have
00:36:07.780 a continuous
00:36:08.440 source of conflict
00:36:09.820 with
00:36:10.920 free speech
00:36:12.480 and
00:36:13.020 tolerance of
00:36:14.180 diversity
00:36:14.680 and gender
00:36:15.960 equality
00:36:16.560 and a respect
00:36:17.780 for science
00:36:18.600 I think
00:36:19.420 it will work
00:36:20.360 at some point
00:36:21.420 because it will
00:36:21.840 become so painful
00:36:23.580 and untenable
00:36:24.420 that it will just
00:36:25.740 have to work
00:36:26.580 I don't know
00:36:27.880 how much blood
00:36:28.420 will be spilled
00:36:29.100 or how many
00:36:30.220 pendulum swings
00:36:31.340 toward reactionary
00:36:33.000 governments
00:36:33.500 we'll see
00:36:34.540 in Europe
00:36:35.460 and even in the
00:36:36.100 United States
00:36:36.760 I don't know
00:36:37.620 how many
00:36:37.860 Donald Trump
00:36:38.560 campaigns
00:36:39.140 we'll have to
00:36:39.920 endure
00:36:40.220 again I'm not
00:36:41.360 especially worried
00:36:42.080 that there's going
00:36:42.760 to be a Trump
00:36:43.320 presidency
00:36:43.800 but it's conceivable
00:36:45.420 and it's conceivable
00:36:46.180 because of this
00:36:47.120 but it has to succeed
00:36:49.000 or at the very least
00:36:50.620 Islam has to become
00:36:52.220 like Christianity
00:36:53.700 in the United States
00:36:55.140 that's a problem
00:36:56.320 that's big enough
00:36:57.020 for me to have
00:36:57.540 written a short
00:36:58.460 book about it
00:36:59.240 Letter to a Christian
00:37:00.380 Nation
00:37:00.700 but it's comparatively
00:37:02.320 a tolerable
00:37:03.660 problem
00:37:04.360 once we get there
00:37:05.440 once the Middle East
00:37:06.860 is like the Bible Belt
00:37:08.520 then we'll have the luxury
00:37:09.500 of trying to
00:37:10.680 fine-tune things
00:37:11.500 and wondering
00:37:12.500 what the far future
00:37:13.380 might look like
00:37:14.000 okay another question
00:37:15.560 this is a longish
00:37:17.880 question
00:37:18.580 that I got by email
00:37:19.740 and it contains
00:37:22.060 a criticism
00:37:23.340 but I thought
00:37:24.220 it was good
00:37:25.080 so I'll read
00:37:25.820 the whole thing
00:37:26.280 here's my question
00:37:27.520 with some context
00:37:28.340 and setup
00:37:29.080 do you think
00:37:30.260 your reliance
00:37:30.940 on hypotheticals
00:37:32.080 and thought experiments
00:37:33.000 has become a hindrance
00:37:34.460 to making headway
00:37:35.340 in discourse
00:37:35.860 on important issues
00:37:36.860 in particular
00:37:37.540 the threat
00:37:38.080 of Islamic terrorism
00:37:39.060 generally speaking
00:37:39.980 how big a role
00:37:40.780 should thought experiments
00:37:41.760 and hypotheticals
00:37:42.660 play in discussing
00:37:43.400 key issues
00:37:44.140 it seems that lately
00:37:45.500 you've given several
00:37:46.340 gifts to your detractors
00:37:47.660 namely the statement
00:37:48.540 about Ben Carson
00:37:49.420 while I understand
00:37:50.420 your position
00:37:50.980 in stating that
00:37:51.620 you'd support him
00:37:52.260 over Chomsky
00:37:52.900 on the point
00:37:53.360 of terrorism only
00:37:54.320 I still think
00:37:55.160 this was a disastrous
00:37:56.180 tactical error
00:37:57.180 that didn't need
00:37:57.860 to be made
00:37:58.440 your point could
00:37:59.500 have been made
00:37:59.880 in any number
00:38:00.500 of ways
00:38:00.900 that didn't involve
00:38:01.780 taking an absurd
00:38:02.740 position
00:38:03.320 voting for Carson
00:38:04.660 under any metric
00:38:05.600 on a situation
00:38:06.860 that will never
00:38:07.400 actually happen
00:38:08.340 Chomsky running
00:38:09.480 for president
00:38:10.020 imagine a person
00:38:11.080 that had not heard
00:38:11.980 of your work
00:38:12.520 until seeing
00:38:13.060 that statement
00:38:13.740 do you think
00:38:14.400 they'd be more
00:38:14.960 or less likely
00:38:15.740 to dig deeper
00:38:16.520 and fully explore
00:38:17.420 the nuance
00:38:17.980 of your views
00:38:18.600 or write you off
00:38:19.360 as a crackpot
00:38:19.980 if the goal
00:38:20.800 is to win
00:38:21.340 the war of ideas
00:38:22.220 it seems like
00:38:22.840 tactics like this
00:38:23.760 might be doing
00:38:24.340 you more harm
00:38:24.980 than good
00:38:25.440 the non-starter
00:38:26.540 with Chomsky
00:38:27.280 and the defense
00:38:27.880 of torture
00:38:28.380 in certain circumstances
00:38:29.440 are other areas
00:38:30.500 where relying
00:38:31.060 on thought experiments
00:38:32.020 and hypotheticals
00:38:33.280 did not seem
00:38:33.980 to win many supporters
00:38:35.200 the name is
00:38:36.500 Jason Teufel
00:38:37.980 thank you Jason
00:38:39.660 well I agree
00:38:41.400 I think it's probably
00:38:42.880 in the specific instances
00:38:44.880 you cite
00:38:45.600 counterproductive
00:38:46.680 and perhaps I should be
00:38:49.760 more disciplined
00:38:50.880 in how I screen
00:38:52.540 for those statements
00:38:53.340 which wind up being
00:38:55.440 counterproductive
00:38:56.600 or easily used
00:38:57.920 to mislead people
00:38:59.680 about my views
00:39:00.400 the Ben Carson thing
00:39:02.000 is very obvious
00:39:03.480 though I couched it
00:39:05.280 with so many caveats
00:39:07.260 and so much context
00:39:08.580 that one really
00:39:10.340 had to be
00:39:11.120 totally malicious
00:39:12.460 to spread the meme
00:39:14.320 that I support
00:39:15.520 Ben Carson
00:39:16.200 for president
00:39:16.820 but of course
00:39:17.900 I have critics
00:39:19.100 who are just
00:39:20.160 that malicious
00:39:20.940 you know someone
00:39:22.040 like Max Blumenthal
00:39:23.180 did just that
00:39:24.580 the issue is
00:39:25.660 when people are
00:39:26.140 that malicious
00:39:26.940 in their use
00:39:27.880 of ellipses
00:39:28.620 they can defame
00:39:30.760 you with any statement
00:39:32.000 but I take your point
00:39:33.520 they may not notice
00:39:34.640 that you were
00:39:35.380 talking about anything
00:39:36.260 until you make
00:39:37.440 a statement
00:39:37.860 of the sort
00:39:38.480 I made about Carson
00:39:39.280 and the net result
00:39:41.320 of that one
00:39:41.940 certainly was not helpful
00:39:43.840 on the question
00:39:45.600 of thought experiments
00:39:46.500 I notice now
00:39:47.820 I offered one
00:39:48.920 at the top
00:39:49.980 in talking about
00:39:50.580 the narrative narrative
00:39:51.440 they do serve
00:39:53.920 the purpose
00:39:54.600 if the analogy
00:39:55.740 one is drawing
00:39:56.500 is correct
00:39:57.240 of clarifying
00:39:58.840 people's thinking
00:39:59.680 and getting down
00:40:00.780 to first principles
00:40:01.620 using a thought experiment
00:40:03.120 or an analogy
00:40:04.280 however surprising
00:40:05.680 can break the spell
00:40:07.240 for people
00:40:08.040 in a way
00:40:08.540 that just talking
00:40:09.800 more and more
00:40:10.700 about the complex details
00:40:12.740 of events
00:40:14.560 in the world
00:40:15.120 can't
00:40:15.980 so yeah
00:40:17.040 I would be reluctant
00:40:17.740 to say that
00:40:18.320 thought experiments
00:40:19.100 and hypotheticals
00:40:20.840 shouldn't be used
00:40:22.420 but I think
00:40:23.560 more care is needed
00:40:24.920 in resorting
00:40:26.740 to them
00:40:27.080 perhaps
00:40:27.860 or at least
00:40:28.460 justifying
00:40:29.200 their use
00:40:29.900 and you know
00:40:30.400 I didn't think
00:40:31.420 that kind of care
00:40:32.160 was needed
00:40:32.740 in the case
00:40:34.360 of talking
00:40:35.140 with someone
00:40:35.460 like Chomsky
00:40:36.120 because obviously
00:40:36.980 he's a celebrated
00:40:37.860 academic
00:40:38.300 who understands
00:40:39.640 what is going on
00:40:41.140 when someone resorts
00:40:41.900 to a hypothetical
00:40:43.020 in order to get
00:40:44.100 at first principles
00:40:44.840 he's also someone
00:40:45.940 who seems inclined
00:40:47.060 to deliberately
00:40:47.640 miss the point
00:40:48.420 when he thinks
00:40:48.980 it will serve
00:40:49.480 his side
00:40:50.080 of the argument
00:40:50.560 I have to factor in the price I pay for being so on my guard, in conversations like the one I was having with Douglas Murray, that I wind up just not having useful conversations and not branching out into areas that are ethically interesting and consequential, where my views may prompt someone to think differently, in important ways, than they thought before.
00:41:16.220 You know, I find it thrilling when someone raises a point that I find I'm uncomfortable with, and I'm being led helplessly in the direction of something that I find destabilizing to my cherished opinions, and I can't see any errors being made, and yet I don't like where I'm being taken.
00:41:37.880 I find that absolutely thrilling. I find those moments some of the best moments in intellectual life, and I've been told by many of you that I've managed to do that for you.
00:41:48.300 So I would be reluctant to stop doing that. I'd be reluctant to speak more like a politician than I do. But I would be the first to admit that I may have caused myself more headaches than I should have by not being more careful than I've been.
00:42:04.080 So, I don't do a lot of censoring of what I think or how I say things, but increasingly I find that I do some, because, again, if I didn't care whether I was misrepresented, it would be a lot easier.
00:42:19.680 And getting off of Twitter would be one way not to care, because I seem to only see these things on Twitter.
00:42:27.620 But, as I said as well, I also see some useful things on Twitter, and some of the upcoming conversations I will have on this podcast are a result of what I've seen there.
00:42:37.480 So there may be no perfect solution. I'll just keep trying to find my way, as I eat nothing but vegetables.
00:42:45.940 Next question. This is a similar one, actually.
00:42:47.820 Obviously it's your, quote, controversial views that gain the most notoriety. However, the importance of an idea and how much attention it receives are only loosely related.
00:42:56.460 Some ideas, arguably, such as your stance on profiling, may turn out to be relatively unimportant regardless of how reasonable or well reasoned they are.
00:43:03.560 Worse, they could be actively unhelpful. Such topics can be so explosive, and your position may condense so poorly to tweet length, that it can easily be used by opponents to denigrate you.
00:43:14.580 In this regard, sharing such ideas could detract from more critical points.
00:43:18.600 In what way should the imagined repercussions affect what you decide to publicly share?
00:43:22.780 If you believe something to be true, would it be moral to withhold it?
00:43:26.580 In this respect, is there such a thing as a noble lie of omission?
00:43:31.220 Are there ideas you've decided against sharing? What are they? That's interesting. Share them now.
00:43:37.820 And are there views of yours that you believe don't get enough attention?
00:43:42.520 And this is from Jordan. Thank you, Jordan, again.
00:43:45.820 Yeah, I think there...
00:43:50.600 ...why I'd rather not speak about torture on my blog.
00:43:53.380 And I said there, or somewhere, that I once had an epiphany: that not everything worth saying is worth saying oneself.
00:44:01.680 And that is still true. I still think that's true.
00:44:04.720 And I notice certain people abide by this precept much better than I do, and have commensurately easy lives as a result.
00:44:14.160 I would put someone like Steve Pinker in this category...
00:44:20.600 ...gets down in the trenches in the same way that I do.
00:44:22.700 Part of this is the ideas themselves, right? So there are topics, like profiling, for instance, where I just think it's ethically both interesting and important to figure out what we think on this topic.
00:44:38.420 I mean, get this wrong in any significant way, and people will die. People by, what, the hundreds? The thousands?
00:44:46.580 It's truly important that we figure...
00:44:50.600 ...reluctant to say that I shouldn't touch those topics.
00:44:54.640 But in hindsight I can say that, yeah, some of them have been just more trouble than they're worth.
00:44:59.680 And as I think I've said on this podcast at least once, I tore up the best book contract...
00:45:08.140 If you'd like to continue listening to this conversation, you'll need to subscribe at samharris.org.
00:45:12.440 Once you do, you'll get access to all full-length episodes of the Making Sense Podcast, along with other subscriber-only content, including bonus episodes and AMAs and the conversations I've been having on the Waking Up app.
00:45:25.140 The Making Sense Podcast is ad-free and relies entirely on listener support, and you can subscribe now at samharris.org.