ManoWhisper
Making Sense - Sam Harris
April 07, 2026
#468 — More From Sam: Gratitude, Bad Conversations, Conspiracy Addiction, Waffle House Teleportation, and More
Episode Stats
Length
32 minutes
Words per Minute
194.02
Word Count
6,323
Sentence Count
212
Misogynist Sentences
2
Hate Speech Sentences
5
Summary
Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000
You're listening to Making Sense with Sam Harris.
00:00:04.280
This is the free version of the podcast, so you'll only hear the first part of today's
00:00:07.980
conversation.
00:00:09.160
If you want the full episode and every episode, you can subscribe at samharris.org.
00:00:14.240
There are no ads on this show.
00:00:16.140
It runs entirely on subscriber support.
00:00:18.640
If you enjoy what we're doing here and find it valuable, please consider subscribing today.
00:00:24.300
Welcome back to another episode of More from Sam.
00:00:26.480
Once again, we are taping this live in front of subscribers where anything goes.
00:00:31.040
We've had them submit questions in advance of the show, and I will try to get to as many
00:00:34.220
of those as possible.
00:00:35.400
And then we've asked them to provide any follow-ups using the chat feature so that Sam can address
00:00:39.780
their feedback in real time.
00:00:41.540
This worked really well last week or last episode, I should say.
00:00:44.280
And it's very helpful to have a bunch of smart people feeding me lines.
00:00:47.400
So please keep those comments coming.
00:00:49.980
All right.
00:00:50.400
Before we get to our first topic, I just want to give a quick rundown of the guests you'll
00:00:54.180
be recording with on the podcast over the next few weeks.
00:00:56.260
We have Tristan Harris, Lloyd Blankfein, Rahm Emanuel, Francis Fukuyama, Ben Shapiro,
00:01:01.980
Michael Pollan, and Siddhartha Mukherjee. And that's just April.
00:01:05.580
Yeah, a lot coming up. Yeah.
00:01:07.180
Yeah. So if anyone wanted more content from you, got a lot coming up. And I'm hopeful that
00:01:12.020
you'll have another essay for us soon. Nobody does them quite like you do. And I'm certain
00:01:16.620
the audience agrees with me. All right, let's get to our first topic. We're going to get your
00:01:21.500
updated thoughts on Iran, AI, and other concerns. But first, a lot of people feel overwhelmed
00:01:26.120
by many things these days, including the pace of change and the fear of being left behind in an
00:01:30.660
increasingly AI-driven world. Yet, even with some legitimate fears, there's still so much to be
00:01:35.420
grateful for, but it feels like no matter how much better things get, things feel worse. Maybe
00:01:39.840
you could remind us what we still have to be grateful for and how to best navigate this moment.
00:01:46.580
Well, I think it's just useful to ask yourself the question. Even if your job in some sense is
00:01:52.540
to pay attention to risk, or the downside of things, or to criticize what's bad. I'm just thinking
00:01:58.360
personally how I navigate this. I spend a lot of time thinking about what's wrong and the needless
00:02:04.760
own goals we score on ourselves as a society. All of that can be depressing, but the filter I use
00:02:11.660
to do that is to ask myself, how unhappy do I have to be in the meantime? Is my being unhappy
00:02:17.660
contributing anything useful, you know, on the side of my own motivation to do any of this work, or
00:02:23.100
my ability to communicate well about it? Or, I mean, just: is it useful? And, you know, almost always the
00:02:30.540
answer is no, right? So there really is potentially a radical disjunction between even
00:02:36.400
paying attention to scary and depressing things and being scared and depressed in one's life,
00:02:42.960
moment to moment. I mean, I just think that second piece isn't actually necessary. I'm
00:02:48.840
not saying there's never bleed-through, but there can be surprisingly little when
00:02:53.540
you reflect on just how lucky you are, moment to moment, even with all the things you might be
00:02:59.220
concerned about. And obviously there are many people whose jobs are nothing like mine, and
00:03:04.120
they can withdraw their attention from current events and from, you know, things like existential
00:03:08.840
risk, and they can do it knowing that, for the most part, they can't do anything about those
00:03:13.840
things, right? They don't have a job that requires them to be up to the minute on
00:03:18.900
whatever it is, you know, pandemic risk, or nuclear proliferation, or any other sword of
00:03:24.400
Damocles that's hanging over our heads. There is no good reason to simply become morbid in the
00:03:31.240
way you pay attention to the world, right? It's just not useful. And so I really
00:03:37.000
think only mindfulness gives you the capacity to make these choices moment to moment. I mean, if you
00:03:42.140
really... And if you don't know what I mean by mindfulness, then there's really
00:03:45.560
no foothold to grab there. I mean, you really just have to learn something about it.
00:03:50.260
But if you can notice the moment-to-moment consequences of paying attention to things,
00:03:56.320
and how you use your attention being consequential, it allows you to decide, you know,
00:04:02.440
to kind of wisely curate the contents of your own consciousness and withdraw your attention
00:04:08.300
from things when your attention on them serves no good purpose, and, I don't know,
00:04:12.940
just kind of break the addiction of being unhappy in the usual ways, right? I mean, many of us get sort
00:04:17.760
of stuck in the rut of conforming to various patterns of attention, and you can just
00:04:23.840
decide to break those habits. So yeah, obviously, you can think in the Stoical vein of
00:04:28.740
all the people in the world whose prayers would be answered if they could simply be in your exact
00:04:32.620
situation right now. I mean, think of all the bad things that haven't happened to you that if they
00:04:36.120
had, you know, what you'd pay just to get back to where you are right now. I mean, those are useful
00:04:40.240
reflections. But, you know, it is just in fact true that life is very, very good for so many of
00:04:45.460
us. And it's very easy not to be aware of that moment to moment. So how do you help people
00:04:50.200
navigate the anxiety around their jobs if they think that their job's going to be going away?
00:04:54.940
I mean, so many people are talking about right now, is it going to be months?
00:04:57.640
Is it going to be weeks?
00:04:59.200
Maybe a little bit longer before I lose my job.
00:05:01.240
And if you're looking out there looking for a job right now, if you're a young kid out
00:05:05.340
of college, it's great if you have expertise and good taste, because AI sort of plays like
00:05:10.800
this infinite boardroom for you of experts.
00:05:13.560
But if you don't have the experience and the higher skilled people aren't going to hire
00:05:18.300
you, the anxiety is real.
00:05:21.100
I mean, how does mindfulness help those people who are looking to...
00:05:23.600
Well, so specifically on that point, I mean, I think, I don't think you can, you can boycott
00:05:27.980
AI at this point.
00:05:28.920
I mean, I just, I think the right thing to do is figure out how to use it in beneficial
00:05:33.420
ways, you know, for your career and for your personal life.
00:05:36.140
I just think it's, I mean, some people can ignore it, but for the most part, certainly
00:05:40.340
if you're in any kind of job or hoping to be in a job that focuses on information, I
00:05:45.760
mean, if it's a job you can do sitting behind a desk, I think AI needs to become your friend
00:05:50.660
leaving aside all of the other issues we might have about it and the other concerns about
00:05:54.880
alignment and all of that. I think in the limit, when we start to see the real evaporation of
00:06:02.380
jobs because of how good AI is getting, we as a society are going to have to figure out how
00:06:07.940
to navigate that. And that I think is probably coming sooner than many people expect. I think
00:06:13.760
it is definitely coming. Many people expect that it's not coming: AI is just going to create a
00:06:18.200
bunch of new jobs that we don't have names for yet. And really, there will be no radical
00:06:22.040
displacement. I think that's just happy talk. But there are people who, there are smart people who
00:06:26.300
believe that by analogy to previous breakthroughs in technology. But I do think, I think we as a
00:06:31.600
society are going to have to figure out how to absorb productivity gains that don't ultimately
00:06:37.280
entail people becoming more productive, right? I mean, so that the AI starts doing work that
00:06:43.380
people are doing now, and there's job cancellation. I think that's coming. People can't solve that by
00:06:49.560
themselves, though, really. I mean, once it comes at any kind of scale, society has to solve it.
00:06:53.940
Well, I get that, but I'm talking about the delta between where the shit gets bad and before it
00:06:58.480
gets better. And so for people, like you say, make AI your friend: that's great, because you have good
00:07:03.660
taste and expertise, so you can tell AI what you want. But if you're somebody on the other
00:07:08.320
side of that, and you're somebody whose job it is to do admin or coordination or some of the other
00:07:13.440
tasks, if you're a paralegal or even a junior lawyer or any of the other examples, again,
00:07:20.300
the anxiety is real. How does mindfulness help here? How do people navigate it? Because I mean,
00:07:25.100
again, I understand what you're saying is so much of it, it's in our heads, but there is a reality
00:07:29.620
that this is different. Mindfulness helps with everything because in each moment, there's either
00:07:34.080
There's something for you to do or there isn't, right?
00:07:36.420
So if there's something for you to do, well, then you just do that thing, right?
00:07:40.540
I mean, again, this applies even in emergencies, right?
00:07:42.840
The house is on fire, and you now need to get your kids out to safety, and so you have
00:07:48.460
to escape, right?
00:07:49.260
So action's required, and you don't need to suffer over performing that action.
00:07:55.400
You just have to do the thing, right?
00:07:57.080
Now, if there's nothing you can do to change your situation or to change the risk you're
00:08:02.220
confronting, well, then your misery is adding nothing to that occasion either. Now, in either
00:08:07.620
case, your misery is extra, right? Now, this can be a high bar to clear for people who don't have
00:08:13.840
a mindfulness practice, but once you do, you can actually differentiate these components to your
00:08:19.580
engagement, you know, with your life, moment to moment. I mean, you can have a highly energized,
00:08:26.560
you know, motivated, even adrenalized experience that isn't a miserable one, right? You can be
00:08:31.820
responding to an emergency and not be miserable. You can be making decisions over a longer time
00:08:36.860
horizon that entail a lot, that are, you know, kind of scary decisions, right? It's like you
00:08:40.820
could say, okay, now I need a surgery and it's, you know, in two weeks I'm going to have a surgery
00:08:45.220
which I'm, you know, anxious about, right? But, you know, all things considered, it is just the
00:08:49.720
right decision and now I've got, now the surgery's on the calendar and now I've got this thing looming
00:08:54.020
and so, okay, so now, but now the question is, over the next 14 days, how much time are you going
00:08:59.080
to spend being miserable because you're anxious about the surgery. All of those moments are
00:09:04.640
discretionary once you know how to be mindful. Once you've decided what you have to do, there
00:09:09.700
really is no more to think about, really, right? Now, you will helplessly be knocked around by
00:09:14.900
your thoughts, but that's where mindfulness comes in. And again, if you don't have a mindfulness
00:09:19.740
practice, you are going to be the mere hostage of those thoughts, right? So you'll be as anxious
00:09:24.580
and as fearful and as worried and as sleepless as you'll be because your mind is completely out of
00:09:30.940
control for the next two weeks. But there simply is no alternative to mental training once you
00:09:38.720
get into a situation like that. I mean, the time to develop a mindfulness practice is before you
00:09:43.120
really need it, not when you're in the middle of that maelstrom. But it really, I mean, it is there
00:09:48.740
this is a capacity you can develop and it really does provide relief. I mean, every time you find
00:09:56.820
yourself suffering, you can recognize that you're thinking without noticing that you're thinking
00:10:01.920
and then wake up from that dream. And it doesn't change the fact that you still might have to have
00:10:05.880
surgery in two weeks, right? But again, all of the suffering that precedes it is unnecessary
00:10:10.880
and the same will be true afterwards, right? I mean, again, you're just going to be in the
00:10:16.220
company of your thoughts 99% of the time. Well, speaking of relief, the New York Times reports
00:10:21.080
that after decades of religious decline, people have stopped leaving churches. Now, it doesn't
00:10:25.160
point to a revival necessarily, but maybe people like the feeling of something familiar in uncertain
00:10:30.760
times. How does that sit with you? That might be, yeah. I just don't know how durable that
00:10:36.440
change is. I mean, the larger trend is of kind of a massively secularizing change in our culture
00:10:44.140
over the last, you know, quarter century. But no, I could well imagine people want community and they
00:10:50.560
want, you know, real-world experiences, and it's comforting to be inside a church. And I love
00:10:55.720
churches, right? You know, I love sitting in churches, so I get it. But yeah, I don't know what
00:11:01.800
to draw from that headline. Well, I want to play a video for you and get your thoughts on this. Can
00:11:06.400
we play the, um, Rubio clip? Speaking of religion here. We were all created, every single one of us,
00:11:14.560
before the beginning of time, by the hands of the God of the universe, an all-powerful God. Did you
00:11:19.960
see this? No. And created us for the purpose of living with him in eternity. God. This is Rubio.
00:11:25.580
The atheists don't make content anywhere near this inspiring.
00:11:29.320
Took on the form of a man and came down and lived among us. I don't find this inspiring. This is... Oh,
00:11:35.840
that music. Died like a man. All right, I think we've seen enough. Yeah. Okay. Yeah, I mean, I haven't
00:11:41.240
really tracked Rubio's level of religious fanaticism. I mean, you know, when I
00:11:46.780
think of someone like Pete Hegseth, I get quite worried, because he clearly is someone who's
00:11:51.560
making decisions based on his Bible-thumping. But... Well, I was going to ask you that, because I thought,
00:11:56.340
like if you had to pick one Republican to take over in 2028, if you had to do that, I would have
00:12:01.100
assumed it would have been someone like a Rubio. And then seeing this video, I thought, wait a
00:12:05.360
second, where did this come from? But even with this video, I'd still think. I mean, who knows
00:12:09.960
what this guy believes, right? Very few people have shown themselves to be this malleable
00:12:16.800
in the face of, you know, political imperatives. You know, he is someone who rightly identified
00:12:24.140
Trump as a con man who was destroying conservatism, and now he's secretary of state
00:12:29.640
and just an odious lickspittle. I mean, so I think the only reason why I view him differently
00:12:36.620
from someone like Pete Hegseth is that he was a sort of normal politician with a normal degree
00:12:43.660
of qualifications for his role, right? So he's not egregiously unqualified to be secretary of
00:12:50.980
state or president or any other. I mean, he's just, he's a normal, normal candidate for these
00:12:56.060
kinds of roles. Whereas, you know, Pete Hegseth is mostly a Fox News personality, though he can
00:13:02.660
bench press 300-plus pounds, which is genuinely impressive, and he's just a proper religious maniac
00:13:08.240
by all appearances. So he seems quite a bit scarier to me. But yeah, maybe this is sincere
00:13:14.200
from Rubio. I don't know. I mean, he's just a shape-shifting opportunist, as far as I can
00:13:19.520
tell. Speaking of people who sound like religious maniacs, did you see that, um, video? I want to play
00:13:23.940
this for you, of Greg Phillips from FEMA, who claimed that he had time-traveled to a Waffle House.
00:13:29.740
I forgot who it was. Yeah, I know the story. Yeah. Can we play that clip real fast? We had a teleport
00:13:36.200
incident two of them um which uh which transported me about uh 40 miles from
00:13:45.220
from where I was in near Albany, Georgia, to the ditch of a church. I ended up at a
00:13:55.240
Waffle House like 50 miles away from where I was. It was an incredibly frightening moment
00:14:01.860
to experience yourself in your car flying through the air. It was possible. It was real.
00:14:09.920
I want to be hanging out with her after that interview ended. Jesus. He's at FEMA. Yeah, we can
00:14:21.220
all sleep peacefully at night knowing that our emergencies will be responded to by
00:14:25.400
a man who's convinced of the physics of teleportation. Who among us hasn't teleported
00:14:31.400
at some point to a Waffle House? Jesus. I think 50 miles is reasonable. Yeah. If he would have claimed
00:14:36.740
500, I'd have thought he was full of shit. Yes, to be unconscious while driving 50 miles, that seems normal. I think
00:14:43.440
you've had Ambien experiences like that. I don't think it's been that far. Unbelievable.
00:14:48.440
Uh, the last time we went to the moon, you were a kid. How do you feel about this latest mission to
00:14:54.100
the moon? I'm amazed at how little bandwidth it's taken up for anybody. I mean, I haven't seen...
00:14:59.680
In a different time, we would see a lot of press coverage of this. I haven't seen
00:15:05.200
nearly as much. You know, it's sort of at the level of this guy teleporting to a Waffle House
00:15:10.160
in my algorithm. I mean, I think it's great. You know, I think it's amazing that we do
00:15:15.740
this sort of thing, but it's amazing how little bandwidth we have for nice news stories like that.
00:15:22.620
Yeah, we had a very uniting, if short-lived, moment when the U.S. hockey team won the gold medal earlier
00:15:27.780
this year. Do you think this mission will do anything for us on that level? Uh, maybe when
00:15:32.760
they have a parade when they get home or no one's going to care? No, I think we're pretty jaded at
00:15:38.920
this moment and distracted for obvious reasons. I'd be surprised if the ticker tape parade got
00:15:44.480
much coverage. There'll be 100,000 old people on CNN watching it. Right, like the Rose Parade?
00:15:49.340
Yeah. What do you think we could do as a society to bring us closer together? Is there anything
00:15:53.140
we could do right now? Get off social media. I think that would be to uncouple us from all the
00:15:59.100
maniacs, from, you know, the 10 percent of us that are trolls and lunatics and bots
00:16:03.400
and grifters, and just dial down some of that noise so that we can get a little more signal. I
00:16:09.760
think that would be good. If all the social media people pulled the plug on social
00:16:14.880
media, I think they would all deserve... I told Jack Dorsey this, you know, when he was running Twitter:
00:16:19.680
if he just pulled the plug on Twitter, saying, sorry, this just didn't work out, guys, he
00:16:24.220
would deserve the Nobel Peace Prize. And I think that's still true. It's probably even more
00:16:28.460
true now. I'm going to shift topics here. It's a great question from a Substack subscriber.
00:16:34.080
You said conversations with people you strongly disagree with can be unproductive, hard to fact
00:16:38.360
check in real time, and prone to confusion. But you've also argued that conversations are the only
00:16:42.760
alternative to violence. And your early debates on religion showed you engaging calmly and
00:16:47.560
rigorously across deep disagreement. In recent years, many of your guests seem to largely share
00:16:52.360
your views with differences mostly in emphasis. There's a real value in surfacing expertise,
00:16:57.060
reinforcing important ideas, but do you worry it comes at the cost of one of your genuine strengths,
00:17:02.000
modeling how to engage thoughtfully with opposition? Most real-world disagreements
00:17:06.900
are messy and emotionally charged. Isn't there still value in demonstrating how to navigate
00:17:12.280
those well, even if the conversation isn't perfectly clean? Yeah, up to a point. I mean,
00:17:17.640
I think some of those conversations are useful, and I keep looking for the ones that I think will
00:17:21.740
be useful. I mean, I think there's generally greater utility in when there's something to
00:17:27.980
learn about an issue or something to figure out, bringing on someone who really knows their stuff
00:17:32.820
and just, you know, helping me learn more in real time in front of my audience. So I'm thinking
00:17:38.620
about, you know, the consequences of Trumpism and having Trump for a second term, you know,
00:17:42.800
and just the way the rest of the world perceives us and, you know, the loss of American leadership
00:17:47.400
and et cetera, all of that.
00:17:49.240
If I bring on someone who I know I'm going to agree with, but who just knows much more
00:17:54.480
about certain details than I do, I bring on someone like Anne Applebaum, right, who can
00:17:58.160
just give me the view from the other side of the world with much more historical context
00:18:03.500
and who's just deep in the weeds on the way propaganda works and the way democracies unravel
00:18:10.420
and all the relevant historical analogies, right?
00:18:13.240
I'm not going to disagree with Ann about much, if anything, but that's not the point.
00:18:17.600
The point is to hear what she has to say and to learn a little bit more each time I do
00:18:22.420
that.
00:18:22.800
I think that's usually more useful than me getting some person on who stridently disagrees
00:18:30.040
with a position I already have, and I just know going into it that it's just going to
00:18:34.760
be an exercise in my attempting to showcase their errors, which I already understand to
00:18:41.120
be errors.
00:18:41.620
right? So someone who doesn't really understand jihadism at all and doesn't believe in it and
00:18:45.960
thinks it's all economics and politics and bluffing, okay, I can get that person on for
00:18:50.200
another two-hour round of, you know, brain damage. But it really is just brain damage, right? Now,
00:18:56.040
it's not to say that it's not going to be useful for some people in the audience to see me hit
00:19:01.220
those pitches again and again. But I've done it so many times, I'm not sure about the value of
00:19:07.800
continuing to do it. And it also runs the risk of being confusing to some people anyway, because
00:19:14.880
certain moves are reliably confusing, right? Well, I think people like the conversations you're having.
00:19:20.280
I think it just feels as though you've been avoiding the chance of having a bad conversation,
00:19:25.300
and they feel that many of them can learn from a bad conversation. I just don't see, like...
00:19:31.060
Who are we talking about? I mean, am I going to talk to someone like Hasan Piker? Would that be
00:19:35.440
fun? No, not really. I mean, I just don't think anyone should be listening to this
00:19:40.520
guy, right? So it just seems like... Well, that's a different point. I think what they're
00:19:44.740
saying is there are bad conversations that they're constantly having, and they can still learn how to
00:19:50.180
have a good conversation in a bad conversation, or how to navigate a bad conversation. And with you
00:19:55.580
lately just saying, fuck it, I'm not going to have bad conversations anymore, because this is not
00:19:59.520
the best use of my time, or I'm not going to spend two hours on this, I think that a lot of people
00:20:04.220
are saying, well, even from those bad conversations, I don't think they mean the
00:20:07.600
Omar Z's conversation examples. Well, I mean, so I'm going to talk to Ben Shapiro. I anticipate
00:20:12.740
that being a potentially bad conversation, at least for half of it. I mean, half of it will
00:20:18.300
agree about certain things, but part of it will be exposing a fair amount of daylight between us
00:20:24.500
around Trump and current American politics. So I'm going to go into that thinking we both might
00:20:30.680
get somewhat uptight in this conversation, at least for part of it. And I'm not avoiding it.
00:20:35.080
I think it's useful because Ben is a serious enough person with a big enough audience that
00:20:40.560
it's useful to try to kind of make some sense in his direction. Would I talk to Candace Owens? I
00:20:46.480
don't think so. I mean, she's got an even larger audience, but she's a total lunatic. And I'm not
00:20:53.140
sure, I mean, apart from, I mean, I could approach it the way I approached the conversation with
00:20:58.900
Doug Wilson, the pastor who I knew just how far out he was as a fundamentalist, but I
00:21:05.880
approached that differently than I might have.
00:21:09.020
I mean, I was not looking for conflict.
00:21:10.460
I was looking, I trust, in that case, I trusted my audience to understand what's wrong with
00:21:16.260
a pastor who will sign off on maybe one day bringing chattel slavery back because it's
00:21:21.660
in the Bible, right?
00:21:22.720
So that's obvious enough that I can, I don't have to dunk on that point.
00:21:27.520
I don't have to say, oh, well, I just hope it's clear that I'm against chattel slavery. I mean,
00:21:31.440
you can just be more of an anthropologist there, rather than, you know, somebody who's going
00:21:36.660
to dig in and really have a debate. You know, my conversation with Ross Douthat, right? Like, we
00:21:41.360
did not agree. That had the quality of a debate about religion. I think he's more of a
00:21:47.000
religious extremist than people appreciate, and I feel some of that came out in our conversation.
00:21:52.800
I mean, he's surprisingly extreme in his claims, you know, his faith claims, given that he's
00:22:01.100
also trying to function as a normal journalist at the New York Times. So those are
00:22:06.660
somewhat adversarial conversations, but they're not... To talk to someone like, I don't know,
00:22:11.900
again, I just brought up Hasan Piker at random, someone like him, I mean, he's just kind of a
00:22:15.940
performance artist. I mean, he's a deeply confused, I think, fairly amoral, fairly juvenile person in
00:22:24.200
his ethics, right? I mean, he's just a, and he's just a bit of a nutcase and dishonest. And so
00:22:29.260
it's just, it's such a mess. I mean, there's so much bad faith that you have to anticipate going
00:22:34.100
into a conversation like that. My first question is why do it? Because it can stand the chance of
00:22:40.380
being genuinely confusing, certainly to anyone who's sort of in his audience, right? Because
00:22:46.720
there's just so much, it's asymmetric warfare. It's so much easier to make a mess than to clean
00:22:51.140
it up, right? That's why I wouldn't talk to someone like RFK Jr. I mean, RFK Jr. is basically
00:22:55.560
a nutcase, right? I mean, there's something wrong with the guy. He's a liar, absolutely a liar,
00:23:01.460
but he's also a kind of a confabulator. He's a bullshit artist. And also I just think a little
00:23:06.640
crazy, right? So he's, you're dealing with all of that. It's very hard to do in real time. Again,
00:23:12.480
for the kinds of things he's crazy about, what you want is someone who's deep in the weeds on
00:23:19.040
the science of vaccines and immunology and just, you know, virology and all of it, right? So I'm
00:23:25.220
the wrong guy to have that conversation. I've always said that. It's not to say I couldn't
00:23:28.840
take a month of my life and get up to speed on some of that, but it's just not worth a month
00:23:32.500
of my life to do that. So all of it is just, at this point, you have to have a life is too short
00:23:39.280
module in your brain and consult it occasionally. And for many of these conversations that people
00:23:44.980
seem to want me to have, I have to say life is too short for many of them. It's not to say, I mean,
00:23:50.360
given the right candidate, you know, I might jump at the chance. Again, I mean, you know,
00:23:55.200
the Munk Debates, I think... They asked me to debate Tucker Carlson, and I was surprised
00:24:00.640
to find myself saying yes without any reservation, really, because Tucker has become such a fixture
00:24:07.500
in our politics. But that seems to have evaporated. So I don't know what happened there. I'm sure he
00:24:12.780
said no. But so I would do that. I would do something like that. But again, it really has
00:24:18.140
to be worth it. Well, Rogan has said he doesn't want to talk to you publicly until you've debated
00:24:22.460
Brett Weinstein. Is that something you consider doing?
00:24:25.500
Oh, well, no. I mean, for the same reasons, I wouldn't debate RFK Jr. I mean, it's just
00:24:30.640
It's very disconcerting not to know whether someone has lost their mind, right?
00:24:37.160
Because when you watch someone like Brett speak, he just seems to be the picture of
00:24:42.840
reasonableness, right?
00:24:43.980
He's not getting blown around by his emotional life, certainly not in any obvious way when
00:24:49.280
he's on a podcast with Rogan.
00:24:52.480
So he's not an Alex Jones-like character where you look at the way he's delivering the lines
00:24:58.620
and you think, okay, this is kind of a case study in, in, you know, chemical imbalance or something,
00:25:04.720
right? Like what we need is a psychopharmacologist before anything else happens here so that we can,
00:25:09.020
we can try to get this guy back to some physiological baseline. That's not happening
00:25:12.760
with Brett at all. And yet the things he says are equivalently crazy and the certitude with which he
00:25:18.940
says them, it's totally indefensible, right? So you take his recent appearance on Rogan and you
00:25:25.040
stick his claims into any LLM and what you get is just a litany of obvious errors. And I did that
00:25:32.640
for Joe. I sent him a link to my ChatGPT session. Like, Joe, just here's a sanity check. Listen to
00:25:39.340
what the robots say about what Brett was giving you on this most recent podcast. And he didn't
00:25:43.900
seem to want to do that. So he's still convinced that Brett was right about everything, though
00:25:47.500
Brett thinks that 17 million people were killed outright by the vaccines and that ivermectin is
00:25:52.280
still worth taking. I don't know how to interact with that. But what it requires, if it were
00:25:58.040
going to be done at all... I mean, you know, Brett is not someone to take
00:26:01.420
seriously on this topic, but you know, if you were forced to take him seriously because he's
00:26:05.140
made so much noise about it, what you want is someone who's deeper in the weeds on the relevant
00:26:10.540
science than either of us are and let that guy or gal have the debate. And that's what I urged
00:26:17.740
Joe to do. I mean, Joe was wrong there. I wasn't urging that I do a post-mortem on COVID about,
00:26:23.320
you know, RFK Jr. or Brett or any other lunatic he's had on his podcast. I was urging
00:26:29.160
that he have the relevant experts do it, right? I mean, it's not my wheelhouse, right? It
00:26:34.600
shouldn't be. You know, unlike many of these guys, I'm not going to pretend it's my
00:26:39.060
wheelhouse, knowing that I can be a quick study and sound like I know what I'm talking
00:26:45.060
about. I'm not an immunologist. I'm not a virologist. I'm not an epidemiologist. You want
00:26:51.340
to be all three of those things. Like everyone else on the internet? Yeah, yeah, to have this
00:26:55.220
conversation responsibly. But I know enough of what mainstream science thinks about what happened
00:27:01.340
during COVID to know that Brett doesn't make any sense on this topic. Yeah, another question while
00:27:06.000
we're on Rogan. You've been critical of Rogan's irresponsibility in the spread of misinformation,
00:27:09.980
but he's taken basically the same approach since he was bullshitting stoned in his living room with
00:27:14.360
an audience of a few thousand. It's not his fault his audience has become huge. Isn't the real problem
00:27:18.980
that the epistemic institutions have trashed their credibility and the audience is lacking
00:27:23.200
discernment? Well, it's both, but I think I sent you this clip that the algorithm served
00:27:28.440
me of Joe talking to Theo Von, and Theo was, you know, just kind of melting down around his
00:27:34.040
anxiety about everything. He seemed super worried about the war in Iran and
00:27:39.460
worried about Israel, I guess. I mean, I forget some of the specifics of the clip, but
00:27:45.560
they went on for like 10 minutes, kind of casting doubt on everything. And then they seemed to
00:27:49.820
have a lot of time for some, I think it was a CIA conspiracy theory, which was getting delivered
00:27:55.060
to them on what looked like a short YouTube clip or a TikTok video by someone they liked. And it
00:28:00.680
was probably a rehash of MKUltra or some old story about the CIA putting LSD in the water,
00:28:06.900
or something they did in the 1950s.
00:28:08.900
But, you know, it was a-
00:28:10.220
Which they need to do again now.
00:28:11.620
Yeah, but it was a completely paranoid story
00:28:13.760
about the CIA trying to make us all dumb
00:28:16.040
so we'll be more bovine and compliant.
00:28:19.940
And, I mean, the image I got
00:28:21.880
is just two kind of pyromaniacs
00:28:24.220
just lighting matches
00:28:25.680
on a landscape that they had spent years
00:28:28.640
soaking in gasoline, right?
00:28:30.820
It's like there's, I mean,
00:28:32.300
and this really is,
00:28:33.300
it's relevant how large the audience is for this. It's relevant just how much cultural damage is
00:28:39.960
being done every time these guys basically play the just asking questions routine on socially
00:28:48.220
combustible topics with tens of millions of people listening, right? It's just, it's completely
00:28:54.440
irresponsible. It is genuinely dangerous. It's genuinely corrosive of our culture. It's genuinely
00:28:59.960
misleading of their audience. And because they're not journalists, they feel no responsibility to
00:29:05.860
get their facts straight. They certainly don't correct their errors. And I mean, they don't
00:29:10.000
have the mechanism by which to correct their errors. It's just not...
00:29:12.860
They'll even admit that this might be some tinfoil...
00:29:16.860
I think Theo said that in the middle of this clip. Yeah, like this might be tinfoil hat time.
00:29:20.640
But still, they're still just flicking matches at everything, right? And the worst thing about
00:29:26.180
all of this is their addiction to a conspiratorial framing of everything. That is, if you can
00:29:33.360
extract any lesson from what's happened to our politics in the last decade and the role that
00:29:40.060
people like Rogan have played in the unraveling of everything and the way in which social media
00:29:46.400
has weaponized all this and the rise of people like Tucker and Candace and Nick Fuentes,
00:29:51.860
and the fact that we've got Trump a second time around, central to all of it is this
00:29:57.280
addiction to conspiracy thinking and contrarianism and, you know, what I've called the pornography
00:30:04.560
of doubt, right?
00:30:05.660
And Joe has been as addicted as anyone and has brought it to scale perhaps more than
00:30:12.740
anyone.
00:30:13.480
And it's totally unprincipled.
00:30:15.980
It is genuinely confusing to millions of people.
00:30:18.380
I mean, you've got young people getting raised on a diet of this bullshit.
00:30:22.560
It's divisive.
00:30:24.000
It amplifies the worst in us.
00:30:26.560
And it's undermining of our institutions. Now, yes, our institutions have done something to lose credibility,
00:30:32.940
right?
00:30:33.140
They have become politicized in ways that they shouldn't have become politicized.
00:30:36.400
Yes, all of that's true.
00:30:37.620
The remedy for that is not a torrent of bullshit from podcasts and the platforming of proper
00:30:44.600
lunatics and people who think they were denied the Nobel Prize when they have almost no scientific
00:30:49.760
reputations to protect. No, the remedy is more good science and good journalism and real
00:30:56.260
intellectual integrity and holding institutions to account in serious ways, not spreading lies
00:31:02.780
and half-truths and cockamamie conspiracy theories. And I mean, really, you'd look at that
00:31:08.820
clip of Joe and Theo, who are both good guys. I mean, the paradox here for me
00:31:15.220
ethically is that what I'm talking about is a species of evil, right? Given its consequences,
00:31:20.480
it's a species of evil, right? It is at the top of the list of what ails us in our society.
00:31:27.080
It is the thing that is preventing us from solving real problems in this world. There's no question
00:31:32.720
it is getting people killed and will continue to get people killed. It is absolutely toxic. And yet
00:31:38.020
many of the people participating in this are just good guys who are just having fun, who are just
00:31:42.460
entertaining. They think there's no stakes, right? They're just like athletes, right? They're just
00:31:48.220
having fun. They're just playing a game. Joe's just playing a game. But it's a game with real
00:31:52.580
consequences, right? It's like, how would you play tennis if you knew that every time you lost a
00:31:58.880
point, people would die, right? I mean, that's the kind of game that's being played with information
00:32:04.520
now. And so people like Joe and Elon and people who have audiences in the tens and even hundreds
00:32:11.140
of millions have a real responsibility to get their heads out of their asses. And they're not
00:32:17.600
showing any aptitude for doing that. I'm going to move to another question. I don't like that
00:32:21.940
you seem to use the term woke in the same pejorative way that those on the right do.
00:32:34.520