#187 — A Conversation with Paul Bloom
Episode Stats
Words per Minute
177.87
Summary
In this episode, I am back with Paul Bloom. We discuss the scale of the child sexual abuse material problem in tech and our apparent unwillingness to confront it, pedophilia as an unchosen sexual orientation, the ethics of purely fictional depictions of this taboo material and of child sex robots, the value and limits of loyalty to friends, family, and country, tribalism and identity politics, public criticism and shaming on Twitter, Richard Dawkins's recent tweets about eugenics, and other topics. If you like the show, please consider leaving a rating and a review on Apple Podcasts. Full-length episodes are available to subscribers at samharris.org.
Transcript
00:00:00.000
i am back with paul bloom paul thanks for joining me hey thanks for having me back this conversation
00:00:22.600
is it's always a break from what i'm normally doing on the podcast but this week it is a
00:00:27.380
a very stark break because i've been having some very gloomy conversations i just released one on
00:00:34.740
nuclear war and i just recorded one on this phenomenon that we we euphemistically call
00:00:41.740
child pornography which if if there's anything more gloomy than nuclear annihilation it is the
00:00:49.020
details of what is going on in tech around child pornography i mean it's just i haven't released
00:00:54.220
this one yet this is probably going to drop after we release this podcast but i mean the scope of the
00:01:00.480
problem and our apparent unwillingness to actually confront it is just it's impossible to understand
00:01:07.240
so anyway that's where my head has been no matter how dark you get you'll be bringing levity to my
00:01:12.720
world very few people say that to me i'm normally kind of a downer conversation wise also i got views
00:01:19.200
on child pornography but maybe i'll save that until uh your thing uh lets out we can talk about
00:01:24.280
it a bit more yeah we can talk about it next time actually the guy i interviewed gabriel dance he's
00:01:28.960
the new york times writer who's been covering this in a series of long and harrowing articles and they
00:01:35.000
just interviewed him on the daily the new york times podcast today so people want a preview of that
00:01:40.980
that's going on there i mean i think you know the daily conversation is like 25 minutes but i think
00:01:45.940
gabriel and i spent two and a half hours wading into this morass and it's astonishing that it exists
00:01:51.840
but it's just what you really can't get your mind around is our lack of motivation to deal with it
00:01:58.060
because we actually can deal with it i mean there's there technological solutions to this there there's
00:02:02.920
just there's obviously a law enforcement solution but we just i mean we're like we're paralyzed largely
00:02:08.900
around i think the fact that the details are just so dark that nobody wants to focus on it for long
00:02:15.540
enough to to actually deal with it i mean it's it's taboo to even think about it and i don't know i
00:02:22.460
mean maybe there are other examples of this kind of thing but there's just such an ick factor with the
00:02:26.580
topic that that has more or less protected these truly sinister people and and networks from much
00:02:35.900
scrutiny much less prosecution so i that sounds fascinating i realized i began this by saying i have
00:02:43.880
views on child pornography and just kind of left that hanging i think i think rather than wait a few
00:02:48.500
weeks and and let twitter you know have itself at me i decided i should really clarify good save which
00:02:54.300
is yes which is you know i i have the same views everybody else has about it's you know it's morally
00:02:59.920
monstrous to prey on children but what i would add to this is that there are people who are sexually
00:03:06.480
attracted to children and i see that as nothing but a curse yeah i would i wouldn't wish that on my
00:03:12.540
worst enemy it is a terrible thing to have and it is unchosen nobody nobody wakes up and says you know
00:03:18.860
oh i want to really work it so that i could only be attracted sexually to kids it is it's hard to
00:03:24.060
imagine a worse thing to happen to you now that doesn't excuse you morally if you act upon it
00:03:30.060
it still i think should reframe a little bit how we think about such cases
00:03:33.780
yeah that's actually a point i make at some point in that podcast because you know if you view
00:03:39.400
pedophilia as a sexual orientation albeit an illegal and and unfortunate one yes nobody decides to be a
00:03:48.380
pedophile but given that the production of child pornography is in every case the the commission of
00:03:54.740
a crime so you're essentially that's why the word pornography is a euphemism and these are just records of
00:04:00.880
you know child rapes and tortures the difference is this preserves the the point you were making
00:04:07.340
this is though being a heterosexual man is one thing one doesn't choose it and it's perfectly legal and
00:04:15.100
you know happy to be one but if you're a heterosexual man who likes to watch real women get really raped
00:04:23.060
and are participating in a network that engineers the the rape of non-consenting adults that's a
00:04:30.640
depraved addition to your sexual orientation for which you you can't be held responsible and just
00:04:37.420
by its very nature anyone who's consuming child pornography much less distributing it is part of that
00:04:44.400
second sadistic phenomenon and so it's yeah but i completely agree with you my position on free will
00:04:51.160
commits me to uh that view obviously that's right and and there's stuff to be again this is exciting
00:04:56.640
you know i i i can't resist it's just that that what you describe is plainly evil and monstrous and
00:05:02.460
should be punished the question of fantasies that hurt nobody but themselves are violent fantasies and
00:05:09.420
perhaps involving depictions of acts which would be terrible if they took place those i think sit in a
00:05:15.540
more complicated place for me and so we could talk about that at a later time i guess yeah yeah this
00:05:21.980
this is okay i promise people we will not spend too much time on this because there's a lot to cover
00:05:27.720
but i don't think i got into this with gabriel dance and any completeness what do you think of
00:05:34.560
this connects to your point about fantasy what do you think about purely fictional products of this taboo
00:05:43.120
material right so you know fictional child pornography the production of which entailed the rape or
00:05:49.700
mistreatment of no one that's obviously nearly as taboo as the real stuff and and also illegal this
00:05:56.420
is just i don't know whether this is true or not but i i believe some people suspect that if it were
00:06:04.160
legal it would to some degree satisfy the desires of pedophiles who are otherwise seeking child
00:06:11.720
pornography i don't know if that's psychologically realistic but what do you think about the ethics
00:06:16.280
there i think you're asking the right question it's plainly icky and and again i wouldn't want to be
00:06:21.720
condemned to have that taste but i think the answer to the question of what i think about that
00:06:27.060
rests on the empirical issue of what its effects are so if it turns out that these robot sex dolls or
00:06:36.280
just people depicting themselves as children but they aren't really children if it turns out that
00:06:40.520
that men who satisfy themselves over that become less likely to harm real children and it makes the
00:06:46.680
world a better place then on balance it seems like a good idea if it turns out to sort of feed their
00:06:52.720
desire and make them want more it's definitely a bad idea yeah i'd be very mindful of the consequences
00:06:57.560
of this and i don't know what the consequences are right right yeah you've uttered the phrase that
00:07:03.740
was uttered only once on this podcast before the notion of child sex robots kate darling who is a
00:07:11.500
as a robot ethicist at mit first introduced me to the the concept more or less as a fait
00:07:19.500
accompli the moment we get robots that are truly humanoid some genius will give us sex robots and the
00:07:26.220
moment that arrives some perverse person will give us child size sex robots i hope we avoid the path in
00:07:34.780
the in the multiverse that is leading toward child sex robots but i suppose if yeah if it has the
00:07:40.700
consequentialist effects you hope then it would be a good thing on balance and and it's a good
00:07:46.300
illustration of a contrast which we always get into and talk about morality which is you're considered
00:07:51.340
moral views which might lead you to an unintuitive claim that child sex robots are a good thing and
00:07:57.020
make the world a better place and our gut feelings which say you know oh that's disgusting that's
00:08:01.820
terrible someone who creates child sex robots should be strung up but i think you and i agree and talk
00:08:07.100
about this that moral progress involves turning away from our ick reactions and focusing in a more considered
00:08:13.820
and deliberative way on consequences right okay so i see i dragged you uh kicking and screaming into the land
00:08:20.300
of ick but what are you thinking about these days yeah let me let me actually this this is actually
00:08:25.500
not incredibly far from it it's another moral dilemma by the way i'm i'm paul bloom i'm from uh
00:08:31.100
the i'm a psychology professor at yale university and so i was at cornell university giving a series of talks
00:08:36.620
and i was at a seminar talking to some students some some you know terrific graduate students and
00:08:41.580
undergraduates and we ended up talking about research ethics and somebody brought up the case of
00:08:47.100
this person works in a lab and he talked about his lab mate hypothetically what if she was engaged in
00:08:52.540
scientific misconduct of some sort maybe and his example is fairly mild but it was scientific
00:08:57.020
misconduct and so you know we kind of agreed that he should encourage her to stop doing it and turn
00:09:03.740
herself in particularly if some data got compromised but then the question came up what would happen if
00:09:09.420
if she wouldn't she refused and he said very matter-of-factly well then i would turn her in
00:09:16.540
and everyone's nodding this makes sense and and something about it sat funny with me they said well
00:09:21.260
what if she was your friend what if she was a good friend and the student thought about it
00:09:26.060
and said no i'd still turn her in they said what what if this was uh you know your girlfriend your
00:09:30.700
partner what is your wife and and there was some hesitation and the conversation got a little bit
00:09:35.980
awkward and and i thought of a couple of different things here but we were talking here about
00:09:41.580
loyalty and and i had two observations from this and i kind of want to throw them at you and get
00:09:48.620
your own sense of this but one is i worried that my own intuitions were a little bit out of whack and
00:09:53.100
maybe this is a generational thing i give loyalty of that sort fairly high value you know if my lab if
00:09:59.740
my best friend was a serial killer yeah i i'd call the police but if my best friend is doing stuff
00:10:05.180
which i thought was wrong but fairly minor i don't think i would i think my loyalty would would
00:10:11.580
override my moral obligation and then this got me to think about how subversive loyalty is loyalty
00:10:18.620
pulls you together with your allies your friends and your family and sits uneasily with
00:10:25.180
broader moral goals including sort of broader utilitarian picture you tend to defend so i was
00:10:30.380
wondering what you thought about that and i was also wondering to make it a bit more personal
00:10:34.380
you would get involved a lot of controversies and debates and you're often defending your friends
00:10:39.500
on twitter and social media and elsewhere and it's really easy to defend your friends when you think
00:10:44.300
that they're right but do you ever defend your friends when you think that they're wrong
00:10:49.980
yeah this is this is a really interesting topic and i've been thinking about it lately because it's one of
00:10:55.820
the variables i see in politics that that leads to such dysfunction and it's something that that trump prizes
00:11:04.220
above everything else every one of his abominations seems to be a kind of loyalty test for those around
00:11:11.740
him the people who will pretend he's not lying or pretend he's normal are essentially passing a loyalty test
00:11:17.900
at all times and i've waxed on forever about how degrading i find that but i think loyalty is a virtue
00:11:25.900
obviously until it isn't right so it's one of these right one of these virtues that can be kind of
00:11:32.220
bivalent and i'm not sure what other examples there are but what's interesting is that so it is kind of
00:11:38.460
parasitic on the notion and experience of friendship so to say that someone is loyal to a friend or is a loyal
00:11:46.940
friend it's almost redundant because you know being a real friend entails some degree of loyalty
00:11:54.140
that's right but also family as a second right right you know we're loyal to our children we're loyal to
00:11:59.740
our parents to our siblings yeah and then derivative of that people become loyal to organizations or to
00:12:04.860
you know loyalty to your nation is you know is patriotism but i think the edge cases are interesting
00:12:11.100
and and we reach the edge when you know a friend or a family member or a member of the organization to
00:12:18.700
which we're pledged or our country does something terrible right and at that edge i think being anchored
00:12:26.940
to loyalty as though it were the the moral virtue that trumped all others i think that clearly is
00:12:34.220
pathological my country right or wrong just becomes blind nationalism if your country is doing something
00:12:41.980
obviously illegal and wrong and counterproductive you can turn up those dials as high as you want at
00:12:47.020
some point you look crazy for supporting your country at any apparent cost so to speak of groups
00:12:52.700
for a second is everything i tend to complain about with respect to tribalism and identity politics
00:12:59.660
really just looks like a perversion of loyalty to me it's just that you know if a member of your
00:13:03.820
group is behaving like a psychopath you should be able to acknowledge that and if you can't acknowledge
00:13:08.860
it because you have a different set of ethical books you're keeping for people in your group than
00:13:14.780
for people outside your group well then that is tribalism or identity politics and it's obvious
00:13:22.140
that can't be a foundation for universal ethics right right to be universal you have to be able to
00:13:28.300
triangulate on on something that's happening within your group and judge it by a standard you know
00:13:34.460
certainly the standard you would apply outside your group and that erodes loyalty this same argument
00:13:39.820
applies though for friends and for friends it's more complicated for friends i think there's more
00:13:44.300
of a pull for loyalty the bar just gets higher for the bar gets higher yeah and certainly for your
00:13:50.220
child you know i would do all sorts of things for my child would i i don't know if my child murdered
00:13:56.300
somebody would i lie to get him off so he doesn't go to prison that's a toughie you know would i and
00:14:04.300
there was a movie having this theme would i murder another child to take away that child's organs to
00:14:09.740
save my own child probably not my preference ends somewhere again it's it comes down to mitigating harm
00:14:18.060
for me so let's take it back from the the far extreme if you have someone you have a friend
00:14:23.180
who's doing something objectively wrong i you know we can use the scientific misconduct case or
00:14:28.460
it just depends on what you mean by misconduct but your loyalty to the friend should translate
00:14:34.380
into a commitment to their well-being right and so if they're if they're doing something wrong
00:14:39.020
that you think they should stop doing on some level you view it as bad for them too i mean it's
00:14:44.220
making them at minimum it's making them a worse person right or revealing them to be
00:14:49.580
worse than you you wish they were if you want to improve them in some way if you want to improve
00:14:54.380
their ethics if you want to bring them into compliance with intellectual standards you think
00:14:59.580
they should share in the in the scientific case well then you're you're urging them to stop and correct
00:15:05.260
their misdeeds based on a concern for them at least in part it seems to me right there are cases where
00:15:12.380
it could conveniently line up that way where the most loyal act is also the act that is the best
00:15:17.660
for the community and the best as a whole but i think we got to agree that there's some cases where
00:15:21.980
they really diverge yeah so then the question is what are the real motives and the real consequences of
00:15:28.620
the transgression so i mean i could imagine a murder which while illegal because it's a murder
00:15:34.780
could still be viewed as ethical or close enough to ethical or ethically gray enough such that
00:15:39.900
it's not clear that you even think they did the wrong thing right so then the question is you
00:15:46.460
know you're helping them to conceal it or you're not turning them in that becomes much easier to
00:15:52.140
think about than if you think this person who was who's a friend of yours did something completely
00:15:57.500
insane and sadistic that's right poses a further danger to society right that's right well i you know
00:16:03.660
we might get on to talk about richard dawkins a recent adventure on twitter and and and but put
00:16:10.460
aside exactly what happened i imagine i i i admire dawkins a lot but i don't know him personally i i think
00:16:15.900
you do know him personally let's say hypothetically you'll view him as a friend but suppose you thought
00:16:21.900
he was really on the wrong side of it you might i might imagine you might you know at minimum not be
00:16:27.660
vocal about that if it was somebody you didn't like you might sort of announce it and say this is
00:16:32.620
really irrational and immoral but if somebody you like you'd say ah i'm sure i'm sure he was well
00:16:38.540
intentioned everybody makes a mistake or you might just be silent and and i think that's actually the
00:16:43.420
right way to go i think that that as his friend you have some burden of you should treat him in a
00:16:50.220
different way you would treat anybody else yeah i understand that and i think by default i fall into
00:16:58.060
that pattern i do think that being more and more ethical and compassionate would certainly wouldn't
00:17:07.900
require that you wouldn't require that you treat your friends worse but it does require that you treat
00:17:14.220
strangers more and more like friends i think so you know i am increasingly suspicious of the impulse to
00:17:21.580
dunk on somebody who i yeah who i consider an enemy or at least somebody who's worked very hard to
00:17:27.980
make themselves my enemy and i do look for opportunities to do the opposite i mean so for
00:17:34.540
instance ezra klein i forget what his perch at vox is now he's one of the founders of vox and he's no
00:17:40.060
longer the editor-in-chief but i mean he's somebody who i do think has treated me terribly and never
00:17:47.500
apologized to the contrary he's actually someone who just simply can't see that he's treated me
00:17:52.860
unethically and dishonestly and actually done considerable harm to my reputation these just
00:18:01.260
strike me as objective facts i mean when i get outside of my reaction to them but recently i saw
00:18:07.900
he just released a book and there was a um an excerpt from it in i think it was the new york times it was
00:18:13.900
an op-ed there might have been the washington post and i read it and thought it was it was very useful
00:18:19.100
i mean i thought there was just some great political analysis in there and so on twitter i with the
00:18:24.220
caveat that we disagree about many things i circulated that as you know a great piece of political
00:18:29.420
insight i forget how i phrased it but basically just pure praise while just uh-huh telegraphing that
00:18:35.580
i hadn't completely lost my mind and forgotten you know how much blood there was under the bridge
00:18:39.580
for us so first of all that feels much better to me that's leading me in a much better direction
00:18:48.380
as a person psychologically than my endlessly rehearsing all the reasons why i have every
00:18:55.020
right to despise ezra klein and so that's one example where it's like i acknowledge the difference you're
00:19:01.900
describing and so if it's a friend who does something embarrassing i will i'm certainly
00:19:09.420
inclined not to add any top spin to the the bad press they're getting and if it's somebody who who
00:19:16.140
is a neutral person or somebody who i have reason already not to like you know it's certainly more
00:19:20.780
tempting to give their reputation a push toward the brink but i don't know i just feel like there's a
00:19:27.580
course correction that i'm looking for more and more in my life which is leading everything to
00:19:32.700
converge on the standard you seem to be articulating for friends right and i understand that you and i
00:19:40.780
have had this discussion many times before and it's a good discussion to have where you're always pushing
00:19:44.780
for impartiality and and being an optimist about how much of sort of pure impartial morality we should
00:19:51.820
have and you know i see some of it but i see so many cases which are kind of zero sum where where
00:19:59.100
you have to you have a and b and you have to choose between them and the the option of treating
00:20:05.020
everybody the same just isn't available to you but i i i gotta say we i agree with the general point which
00:20:11.500
is i am trying very very hard to be nicer on twitter and i am trying to recognize you know i think maybe
00:20:18.860
the exception of donald trump but that that everybody you know these are real people here
00:20:24.300
and nobody's a villain in their own heads and people have had unfortunate lives and the sort of
00:20:29.340
public shaming the the impulse which i think people everybody has it they just have different targets
00:20:36.780
is an unhealthy and corrosive impulse so i'm in favor of treating everybody nicer on twitter and elsewhere
00:20:43.500
yeah i think it it's a hard balance to strike because i think becoming completely anodyne and
00:20:51.420
just not participating in any public criticism of bad actors i don't think that is the the sweet spot
00:20:59.980
at a certain point you you have to say something about a phenomenon especially if your particular take
00:21:06.620
on it is is underrepresented and and when you're talking about somebody like trump you know the the
00:21:11.980
only real danger is boring yourself and everyone around you but i do think the ethics are pretty clear
00:21:18.780
we have to figure out how to get this guy out of office so you want to be critical and and you want
00:21:25.020
you don't want to take that away that's right but a friend of mine owen flanagan once got to ask a
00:21:29.180
question of the dalai lama you know through a translator so he asked and the question was a good question he said
00:21:34.060
if you had had a chance would you kill hitler and the dalai lama was translated and he thought about
00:21:39.260
it he smiled and he said and his answer was yeah i would kill hitler but i wouldn't be angry at him
00:21:45.260
and i would do it with ritual and grace and kindness and to some extent i don't know sure that's good
00:21:51.100
advice for killing hitler but it's pretty good advice for twitter which is if you have to correct
00:21:56.060
somebody if i said this this person's wrong this is an immoral view you shouldn't take this adolescent
00:22:02.060
glee in it you shouldn't do it out of anger you should just you know trying to help people
00:22:07.260
i totally agree with the anger part but this also it connects up to something we spoke about i think
00:22:12.380
we spoke about killing hitler last time or or the time before that and and it does raise the ethical
00:22:18.140
question of at what age is it appropriate to kill hitler because i mean if you go back and kill him as a
00:22:23.260
seven-year-old you do look like a moral monster because he's not quite hitler yet right so it's
00:22:29.260
interesting to consider when that would happen and i think someone should produce a a youtube
00:22:34.700
animation of the dalai lama going back and killing hitler with ritual and without any hatred that's
00:22:40.140
a cartoon i was thinking that you would imagine like a science paper which has a graph and the
00:22:44.540
graph is the best time to kill hitler yeah that's right hey we could we could float that as a poll
00:22:49.900
on twitter or somewhere i'm sure there would be a bell curve around the yeah the appropriate age
00:22:55.020
yeah i'll do that okay so back to uh dawkins yeah who yes i do consider a friend and i did not
00:23:02.860
react one way or the other to his tweet maybe i should remind people what the tweet was though i
00:23:07.900
went out on twitter before this recording and asked for questions and this came up as you might expect a
00:23:13.340
few times so it was a series of tweets i believe two forgive me if this is somebody else's summary but
00:23:20.540
it's it's one thing to deplore eugenics on ideological political or moral grounds it's
00:23:26.940
quite another to conclude that it wouldn't work in practice now if i mean this is kind of hilarious
00:23:32.300
because this really is i can immediately understand the spirit in which he tweeted it i'm not sure what
00:23:37.260
the proximate cause of him deciding to screw up his day and week this way was but can we agree he's
00:23:43.900
very bad at twitter but this is what's hilarious about this is just it really is you take one look
00:23:50.300
at it having been around and around the block with this kind of thing i mean this is just poised to
00:23:56.700
explode in the minds of every person on earth who's just waiting for another reason to vilify richard
00:24:03.740
yeah i don't know what got into his head around this do you know what his point was is this point that
00:24:09.100
as biological creatures our intelligence and creativity and kindness can be shaped through
00:24:16.620
breeding or what what was his point well i think his point might have been a topical and political
00:24:21.980
one i think there's somebody in in the press in the uk right now who just got nominated as an advisor
00:24:27.980
to boris johnson or something and then someone did a a little scandal archaeology in his twitter feed and
00:24:34.220
found some celebration of eugenics or something and so that could have been what richard was reacting
00:24:40.540
to but i got it yeah but anyway he's making the obvious point that eugenics is a thing i mean forget
00:24:47.740
about that it's history as a movement among scientists and pseudoscientists you know a hundred years ago as the
00:24:56.300
facts of darwinism and genetics were only starting to be absorbed it's just obvious that whatever is under
00:25:03.740
genetic control whether that's the way our physical bodies perform and look or you know the way our
00:25:10.060
minds emerge from our brains basically everything about you is genetically influenced to some degree
00:25:16.300
you should be able to breed for that or engineer towards some goal in the same way that i think in
00:25:24.140
further tweets he uses the example of cows giving more milk and all of that so it's the biology of of it
00:25:31.340
is not debatable and that's just his point as a biologist like of course this kind of thing is
00:25:37.180
possible and acknowledging its possibility is not at all a suggestion that it's desirable that we
00:25:45.740
institute any kind of program to do this so he was just separating the you know people's political and
00:25:51.420
moral reaction to the idea based on presumably some notion of what its social consequences
00:25:57.740
were originally and would be in the future and separating that from from this claim that it
00:26:04.060
wouldn't work in practice i'm not sure what which claim he was responding to there but yeah out of
00:26:09.660
context it was weird i mean i i like i i don't know him i'm a i'm a huge follower of his work and i
00:26:16.220
think he's he is you know an extraordinary scholar and and has a lot of interesting things to say i i think nobody
00:26:21.980
in their right mind would think that he's really defending eugenics it's a it's a you know a comically
00:26:27.020
unfair take on this but as somebody pointed out the very structure of what he said is it's same
00:26:32.780
structure as you know it would be wrong to burn down paul bloom's house on moral grounds on political
00:26:38.300
grounds and other ideological grounds but you know if you had enough gasoline and enough tinder yeah you
00:26:43.660
can burn it down and it's and and it it has this sort of taunting trollish claim and i i'm totally
00:26:52.060
accepting that that that it wasn't intentional and i think it probably speaks the idea twitter is the
00:26:56.780
wrong arena for these sorts of comments let me uh take the opportunity to get us into more trouble
00:27:03.180
than dawkins got into don't we practice if you'd like to continue listening to this podcast you'll
00:27:11.260
need to subscribe at samharris.org you'll get access to all full-length episodes of the making sense
00:27:16.460
podcast and to other subscriber only content including bonus episodes and amas and the
00:27:22.300
conversations i've been having on the waking up app the making sense podcast is ad free and
00:27:27.260
relies entirely on listener support and you can subscribe now at samharris.org