Sam Harris on Political Tribalism, Cultural Divisions and Finding Inner Peace | Ep. 37
Episode Stats
Length
1 hour and 38 minutes
Words per Minute
170.77
Summary
Sam Harris is a philosopher, an author, and host of the Making Sense podcast. He's a liberal, but he's not woke. And he's one of those guys who just makes you feel like, Why couldn't I put it that way? Super smart, very big, big-brained guy. But he's also pensive. I think you'll find him illuminating on how you could quiet your own mind and enrich it after you quiet it.
Transcript
00:00:00.520
Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
00:00:11.980
Hey, everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show. Today on the program,
00:00:16.000
we've got Sam Harris. This guy is a philosopher. He's an author. He is host of the Making Sense
00:00:23.580
podcast. And I would say more than anybody else, he is the name that has been suggested to me
00:00:29.140
as a potential guest on the program. And I've listened to Sam in the past, but I was fascinated
00:00:34.140
to see what is it about him that has made him so very popular with the people who listen to this
00:00:39.900
podcast. And now I get it. Now I get it. He's a liberal, but he's not woke. And the way he talks
00:00:47.260
about the woke is incredibly eloquent and thoughtful and smart. And he's one of those guys who just
00:00:52.180
makes you feel like, oh, why couldn't I put it that way? Super smart, very big-brained guy.
00:00:58.140
But he's very thoughtful. He's pensive. I think you'll find him illuminating on how,
00:01:04.980
A, you could quiet your own mind and, B, enrich it after you quiet it. It may require you being
00:01:11.800
quiet for two months in a row, but we'll talk about that later. But I think we had a good spirited
00:01:16.660
exchange on things like Trump and his fans and sort of the media and their coverage of
00:01:24.340
politicians and their, quote, lies. Anyway, we'll get to Sam in one second. But first,
00:01:30.260
I want to talk to you about Jan Marini's skincare. You guys, this is a beautiful line of products.
00:01:37.560
I myself tried them. And number one, they don't smell, which I like. I don't like a lot of odors
00:01:43.900
in my products. Number two, they idiot-proof it for you. They tell you in the regimen,
00:01:48.560
this one goes on in the morning, this one goes on in the evening, this one you can put on both.
00:01:52.080
And so, you know how, like, you're a busy person, you don't have time, so, like,
00:01:55.020
they walk you through it. So Jan Marini Skin Research is a recognized leader and an innovator
00:01:59.700
in skincare. The easy-to-use products keep your skin feeling nice and refreshed, nice and hydrated.
00:02:05.260
You don't get, like, shiny. And the Jan Marini buzz is that it's one of the fastest-growing
00:02:10.120
professional skincare brands in recent years. That's what everybody's saying about it. And it's true,
00:02:14.060
because it's got a bunch of awards. It's used by multiple movie and TV production sets. It's
00:02:20.180
being used on the set of Spider-Man, I guess. Yay. And it's a five-step daily care system.
00:02:25.040
You cleanse, you rejuvenate, you resurface, you hydrate, and you protect. Their skincare
00:02:29.060
management system has been awarded 10 consecutive years by New Beauty Magazine as Best Skin Care
00:02:33.880
System for Aging Skin. Jan Marini Skin Research has earned more beauty awards from New Beauty than
00:02:38.840
any other skincare company. The products are hydrating, they're calming, they have numerous
00:02:44.240
clinical studies conducted by leading dermatologists, and you can get them anywhere, really.
00:02:47.700
They're at med spas, aesthetic offices, spas throughout the country. So go to janmarini.com
00:02:52.340
to find locations near you or purchase directly from the website. Plus, they've got some great
00:02:56.940
holiday offerings available and always with just two-day free shipping. Transform your skin
00:03:07.500
Really excited to talk to you and also a little nervous because I've been watching a lot of your
00:03:13.180
interviews and listening to your podcasts and you seem like an intellectual giant and I'm
00:03:24.820
Well, listen, I want to put you at your ease because, first of all, I think you and I agree
00:03:31.020
about many things and I'm a fan of yours. I'm happy to talk to you about anything, and where
00:03:37.160
we disagree, I think it'll be fun. So let's just go.
00:03:41.440
Good. Okay. I feel a little better. I like to start where I can with news of the day.
00:03:45.040
And you've been so smart and easy to listen to on the topic of wokeness and the religion of
00:03:51.520
wokeness. And just today, we saw a school in Virginia, an elementary school, announcing that
00:03:57.600
it is dropping part of its name, Thomas Jefferson, quote, due to the pain his legacy can cause,
00:04:04.500
not even actually is causing, but can cause black students, despite an overwhelming majority
00:04:10.360
of the parents saying, we don't want this. We are not in favor of this at all, but it's
00:04:16.040
happening. And then here in New York, it was announced not long ago that at one of these
00:04:21.960
sort of expensive private schools, it's actually happening at more than one, parents must now
00:04:27.320
outline their commitment to anti-racism when applying. And they have to attend anti-racist
00:04:34.100
training before they can even get into the school. The anti-racist label is a rhetorical
00:04:39.840
trick. So does all of this concern you? I know you've got daughters and you're concerned about
00:04:45.780
wokeness in general, but I think in the schools, it's especially pernicious.
00:04:49.340
Yeah, well, it does. First, I think we should just bracket all the heresy we're about to download
00:04:56.360
with an acknowledgement that racism is real and it's a problem and it's been an excruciating
00:05:04.180
problem throughout our history, right? So it's not mysterious, this kind of moral panic we're seeing
00:05:11.580
around the issue of racism now. I mean, we know what the past was like. The problem is that people seem
00:05:20.040
to, one, not want to acknowledge the progress we've made, right? So there's something deranging
00:05:26.280
about acting like this is 1964, right? Given all that's happened in the last 50 years.
00:05:35.580
You know, we had a two-term black president and that counts for absolutely nothing. We have a generation
00:05:41.520
that's acting like, you know, being on the three-yard line with respect to racism is, you know, a moment
00:05:51.300
of moral emergency, right? So I'm not saying that racism is gone. I'm not saying that there are no
00:05:56.340
ways in which there may be policies that disadvantage people in various groups and we want to untangle all
00:06:07.360
that and respond to all that. But we have made enormous progress and there are just not that many racists
00:06:17.340
out there, right, who want racist outcomes, you know, starkly unequal and unfair outcomes for people.
00:06:24.880
And so to not acknowledge that is deranging. And so now we have a kind of activist cohort in our
00:06:34.480
society. It's not clear just what percentage of American society it is, but it doesn't have to be
00:06:40.720
all that big to completely derail our conversation about these issues. And so now, you know,
00:06:47.800
we're defenestrating Thomas Jefferson, or anyone else from our history whose
00:06:59.060
record on race was imperfect, to say the least. But still, you can't
00:07:07.820
discount the fact that this is one of the most important people in American history, and still
00:07:14.740
the person who made an outsized contribution to creating our country. So,
00:07:24.060
yeah, this particular activism makes no sense. And it's deeply concerning that
00:07:32.520
these are educators, people who will be teaching children, almost without
00:07:41.020
fail at this point, to view the world through the lens of race in a way that
00:07:48.640
arguably was appropriate 50 years ago, but really isn't appropriate now. It's, in fact,
00:07:54.840
totally dysfunctional now. I mean, some of the critics have said, this is, this is David Duke's
00:08:02.360
dream realized where everything really is about skin color. It's immutable. It's not overcomable.
00:08:09.980
It makes all the difference in character and how well you can do in society. You know, these,
00:08:16.400
the differences are innate and, um, should dominate anyone's perception of another or just upon
00:08:23.140
first meeting them. And that's where we're going. And I know you've said, I thought you made an
00:08:28.320
interesting statement because you're not big on false claims of victimhood. And you were saying
00:08:33.720
they, they can diminish the social stature of any group, including one that really has been victimized.
00:08:41.280
And so, as you say, you know, we're on the three-yard line with race, and
00:08:47.560
constantly pretending that we're not, that we're still all the way down the playing
00:08:51.480
field, can actually set a group back. I mean, it's setting back race relations and
00:08:59.520
black people as a group. Yeah. Because it's dishonest, right?
00:09:05.340
And it also violates the basic principle that will get us into the end zone.
00:09:12.300
I mean, let's just acknowledge what the goal is here. The goal is to wake up in a world
00:09:19.420
where these superficial differences between people, like skin color, have absolutely no moral or
00:09:27.940
political significance, right? That was certainly MLK's dream, and it should
00:09:34.280
be ours, right? And activists on this issue, and a host of other issues as well, are acting like
00:09:42.160
these differences between people are indelible, morally important, politically essential to
00:09:50.580
recognize at every turn, and that any disparity we see in our society... You know, if we inventory
00:09:58.480
our Fortune 500 companies and our various professions, we find that there's not a perfect
00:10:04.320
representation of the general population in all of those places, right? If exactly 50%
00:10:12.660
of cardiologists are not women, if exactly 13 to 14% of people in the C-suite at Apple
00:10:22.580
aren't black, the only explanation for this is bigotry, right? Now,
00:10:29.740
there are many things wrong with that, but the first is that it is almost certainly untrue, right? I mean,
00:10:35.300
at minimum you would need real evidence to make that allegation, and there are so many other
00:10:41.000
explanations that present themselves here. So the dishonesty of it is toxic,
00:10:49.660
not to mention the fact that seeing yourself as a victim perpetually, and locating
00:10:55.100
your social power, your status, in victimhood, which is really the algorithm
00:11:01.560
that is running here, is intrinsically divisive, right? I mean,
00:11:07.460
if your politics is based on the politics of identity rather than
00:11:11.720
looking for solutions that benefit everybody, looking for systems that are intrinsically
00:11:18.580
fair, right, you can't possibly converge with other people, because
00:11:26.440
all you're doing is ramifying your differences. So you're viewing everything
00:11:31.100
as zero sum in principle. And it's regressive. And again,
00:11:38.420
it doesn't acknowledge any of the progress we've made or the principles by which
00:11:44.900
we've made that progress. One of the problems we're having is the complete stifling of conversation
00:11:52.320
on this and the messaging to white people and often white men that you just can't speak.
00:12:01.100
If you're white, you shouldn't speak on the issue of race. If you're a man, you shouldn't speak on
00:12:04.860
the issue of sexism. If you're cis, meaning you identify with the biological sex you
00:12:10.820
were born with, you shouldn't speak on the issue of transgender people, and so on. And it's a really
00:12:16.140
effective way of silencing people in those groups because you're told if you even try to speak out
00:12:22.700
about it, never mind speak out critically. If you dare say anything critical of it,
00:12:28.900
you've offended just by opening your mouth. And I know this summer when we were in
00:12:35.100
the midst of the BLM protests and defund-the-police cries and all that, you were very outspoken in
00:12:42.320
a very powerful podcast that I've listened to a few times. And you made a point up front about
00:12:48.700
saying it was a conscious choice not to put on a black scholar, an intellectual, to talk to
00:12:57.040
you and to make these points with you, across from you, because you want to disabuse
00:13:04.420
people of this perception that the color of one's skin matters for a discussion on race, or that,
00:13:11.920
you know, your gender has to determine whether you can speak up on trans issues, et cetera. I thought
00:13:19.240
that was really brave. And I think more of us need to say that. Yeah. I mean, you know,
00:13:23.280
obviously my bravery is to some degree founded on the fact that I've taken prudent steps to not be
00:13:29.560
cancelable, right? So in the end, it's not all that much bravery, to be honest, because I
00:13:35.420
knew what I wanted to be able to do, and I've taken steps to ensure
00:13:43.600
that I run a fairly low risk of suffering some, you know, fatal error,
00:13:51.180
career-wise, for doing it. And I mean, as you know, you've had adventures in
00:13:56.360
cancellation, and, you know, you're an example I've actually spoken about:
00:14:00.720
someone at the absolute top of media saying one thing and being hurled from the ramparts for it.
00:14:09.780
And I would say it was a malicious framing
00:14:15.560
of what you said. I don't know if you want to talk about that,
00:14:21.940
but I think the underlying problem here is that people want to hold people to the least
00:14:38.320
charitable interpretation of what they've said or done at every turn. And they're really not concerned
00:14:42.880
to know what was actually intended, what was actually going on in their minds, or what
00:14:49.420
their real aims are, right? So if you can discover that someone has said something
00:14:57.140
that can possibly be construed as racist or sexist or transphobic or whatever it is, well then
00:15:04.580
that possible construal is the thing that you will amplify, as an activist or even as
00:15:12.720
a journalist now. And the goal is just complete obliteration of a person's reputation, with the
00:15:22.740
obvious aim to make them unhireable. And so, yeah, I saw the
00:15:28.520
writing on the wall there and have, you know, kind of created a platform for myself where I can say
00:15:34.740
more or less anything I want. But yeah, the reality is that if you are a white guy who is talking about,
00:15:42.960
in this case, police violence against, you know, everyone, but in particular young black
00:15:51.420
men, you have to be on the back foot to even think
00:15:56.900
you should be saying something, especially at a moment as fraught as the immediate aftermath of
00:16:04.160
the George Floyd killing. But the truth is the color of a person's skin has absolutely no
00:16:04.160
relevance to a conversation about the actual statistics of police violence and
00:16:11.240
crime and violence in our society. And it's bizarre and dysfunctional to think it
00:16:18.600
does have relevance there. And, you know, on that particular point,
00:16:25.340
we are suffering a kind of public hysteria around this topic. I'm not saying that there isn't
00:16:32.000
too much police violence. I'm not saying that many of our cops aren't poorly trained, and we should be
00:16:37.140
hiring better recruits and training them better. And, you know, far from defunding
00:16:41.640
the police, we should give them more funds to recruit better people
00:16:46.840
and to train them better. But the flip side of that is the cops have the hardest job in
00:16:52.680
the world, practically, right? I mean, they're thrust into situations
00:16:57.960
that the public, you know, don't understand how to interpret, right? We're in a country
00:17:02.620
where there are 400 million guns on the streets. This is not like policing in Japan, where you can
00:17:07.580
assume, when the guy turns around and races to the front seat of his car,
00:17:13.840
he's not going to pull out a handgun. In fact, you have to assume that he will in this case,
00:17:18.560
right? So cops are in all these situations where they have to make split second decisions about
00:17:24.220
what someone's doing with their hands. And, you know, yes, they kill a thousand people every
00:17:30.120
year, but they don't kill people in a way that suggests that there's an epidemic of racist
00:17:37.760
violence perpetrated by racist cops in our society. That's just not what the data show.
00:17:43.740
And yet to say that in the aftermath of George Floyd, yeah, I think it
00:17:50.340
was rightly perceived as risky, and, you know, many people would have been fired for recording a
00:17:55.160
podcast like that. But, you know, happily, I have taken pains to be unfireable.
00:18:00.060
I mean, the whole stifling of conversation around COVID first, and the riots over the
00:18:08.000
summer, Black Lives Matter, the gender stuff... This is really what made me want to get back out there,
00:18:12.780
just get in front of a microphone again. Because the more you tell me I can't say something,
00:18:17.260
the more I want to say it. And I've always been that way. You know, my executive
00:18:20.820
producer at Fox, Tom Lowell, used to say to me, MK, you like to go to the place that hurts.
00:18:27.100
And it was true. And it wasn't, you know, gratuitous. It was because when no one wants to
00:18:33.280
talk about a thing, to me, it becomes ever more important to talk about the thing. There's nothing
00:18:37.740
wrong with talking about the thing. And we've gone completely crazy on this, where just talking
00:18:45.780
is a fireable offense; just talking about something can get you fired.
00:18:52.720
It isn't right. And it scares me that so many people have, forgive the term,
00:18:59.580
bent the knee on that. They don't talk anymore. They're afraid.
00:19:04.500
Yeah. Well, a great inoculation against this
00:19:10.540
comes in certain areas of academia, although I've got to imagine those are closing down a little
00:19:17.400
bit now. But, you know, having a background in philosophy... in a philosophy seminar,
00:19:21.360
you know, just as thought experiments, you can talk about anything, right? Because
00:19:27.180
you're probing for the foundations of ethics, right? You're trying to figure out what
00:19:31.300
makes something wrong, if in fact things can truly be wrong in this world, as
00:19:37.740
I think they can. In a philosophy seminar, you might say, well,
00:19:42.160
what is the difference between abortion and infanticide? And why can't people kill
00:19:48.780
their children if they don't want them anymore, right? If they do it within 15 minutes
00:19:53.760
of birth, right, if abortion is legal. These are the kinds of sentences you
00:19:59.660
would speak in a room filled with people searching to understand ethics. And yet now one
00:20:09.100
would fear that somebody would just take that quote out of context and say, well, look at this maniac.
00:20:16.360
He doesn't understand why you can't kill babies. Right. We need to be able to speak
00:20:23.060
without being paranoid, and we need our actual intentions to matter. I mean,
00:20:31.480
it's not that hard to figure out if somebody is actually racist, racist in a way that
00:20:38.960
should matter, right? Or actually sexist, sexist in a way that should matter. And people don't
00:20:45.940
tend to conceal this stuff, right? And people should be held accountable for what
00:20:51.600
they really are trying to do to the world, right? So the fact that a bad joke,
00:20:59.460
or something that can be misunderstood, or an honest question of confusion can get
00:21:06.440
spun into a career-ending, reputation-canceling offense... I mean, maybe that
00:21:16.780
was always possible in some way, but it does seem genuinely new in the way it has achieved scale, based
00:21:23.200
on, you know, our new technology. Social media has leveraged this
00:21:30.320
into something that's deeply unhealthy for us as a society. And we have to find
00:21:37.300
some way to pull back. Well, I know you've said all we have between us and the total breakdown of
00:21:43.820
civilization is successful conversations. And we're on the brink. But it made me think, because,
00:21:51.200
you know, having been in media for a long time, I do wonder whether these conversations are working.
00:21:58.140
The ones that we were having before the complete shutdown of all of them and even the ones that
00:22:04.280
we're starting to have now, you know, it's, there've been studies that say people don't want to
00:22:09.060
hear opposing ideas, that they try to avoid listening to them. That's why we have Fox
00:22:14.300
News, and CNN and MSNBC on the other side. People go there for confirmation bias, just to feel good
00:22:20.480
about themselves. It's like hearing the sweet nothings about how right you are. And I really wonder
00:22:26.440
whether we've leaned so far into that, that conversation's over. It's canceled. Conversation's
00:22:33.380
canceled. Yeah. Well, I do think conversation is the only tool we have to ensure that 8 billion
00:22:42.500
of us can collaborate in an open-ended way. I mean, is the human project truly open-ended?
00:22:50.620
Is it possible that we're going to get through this century and into the next million years of
00:22:57.720
conscious life that is directly descended from who we are now? Whether
00:23:06.760
we will be recognizably human at that point is certainly an open question. But
00:23:13.360
is this project doomed, and is it doomed in the near term, or are we going to thrive in
00:23:21.860
some truly open-ended way? I think conversation is really the whole story that will decide
00:23:30.840
which future we land in here, because it is the only thing that allows us to
00:23:38.220
modify the behavior of perfect strangers without violence, right? We have a choice between conversation
00:23:47.400
and violence, and coercion in all its forms, right? You know, laws are also a form of violence in
00:23:53.780
the end. If we pass a law against you doing or saying certain things, well, what
00:24:00.260
happens when you break that law? People show up to your house with guns, and if you do
00:24:05.860
the wrong thing with your hands, you might get shot, right? So we have our
00:24:12.660
ability to persuade one another based on common principles of reasoning and appeal to facts
00:24:20.720
that can be mutually appreciated, or we have force, and we should be very slow to
00:24:29.600
make an appeal to force, for obvious reasons. So we have to get better at
00:24:35.860
speaking to one another about difficult issues, and we appear to be getting worse at it.
00:24:42.020
And we appear to have crafted an information ecosystem in the media and social
00:24:48.040
media now that has all of the signs of being a psychological experiment to which no one
00:24:55.780
consented, whose purpose is to see how crazy it can make us, right? I mean, we're now in
00:25:02.560
mutually canceling and irreconcilable echo chambers. And there are many of them.
00:25:11.020
And, you know, just to look at what's happening currently in our politics
00:25:14.620
around the election, and what has been happening for four years under Trump,
00:25:19.900
we're seeing a fragmentation of media and social media. And, you know,
00:25:28.640
in the case of social media, we've built platforms that have been maliciously
00:25:33.720
gamed, where the echo chamber effect can be accentuated by the very business
00:25:39.060
model. So it really is something that we have to get straight. And I think the prospect of
00:25:47.280
us just maintaining this particular course, where we're this fragmented, where it's this
00:25:53.520
difficult to talk about the most important things that face us... You know, a pandemic,
00:25:59.540
right? We can't even figure out how to talk to one another about what we should do in the face
00:26:05.540
of a pandemic, right? And trust in our institutions has eroded, right? For
00:26:13.160
obvious reasons, but for reasons that we have to figure out how to nullify. So just
00:26:18.580
to take something that will be, I would imagine, dear to the hearts of many of your listeners:
00:26:24.320
I mean, the hypocrisy of public health officials, right? Where they castigate people protesting lockdown
00:26:32.500
as being, you know, murderously irresponsible for having gone out in public without masks, or
00:26:39.800
gone out in large groups at all to express their political opinions. You know,
00:26:46.540
the way that flipped when the protests were for Black Lives Matter, and you have the same public
00:26:52.420
health officials, in many cases, I think by the thousand, signing open letters in support of
00:26:59.340
these protests, which, from an epidemiological perspective, were just as crazy.
00:27:05.420
In fact, probably far crazier than any of the protests that were happening
00:27:11.700
over lockdown. Yeah, it harms the stature of our most important organs of
00:27:23.620
information in a pandemic, right? If science is going to be that politicized, it's easy to see
00:27:31.820
how trust breaks down. More with Sam in just one second, but first let's talk BloomsyBox. Are you
00:27:37.600
still struggling with what to buy your mom or your family member, your long lost friend for the
00:27:42.560
holidays? Here is a great idea. You can send flowers from BloomsyBox. Your loved one is going to light
00:27:50.600
up when they get your flowers and they're going to show them to all their friends because they're
00:27:54.200
spectacular and they're special. These are not your average flowers. They are better blooms. You're not
00:27:59.780
going to believe the look on somebody's face when your BloomsyBox flowers arrive. It's kind of
00:28:03.400
magical. And the reason is they're sustainably grown on family farms around the world. So you
00:28:09.720
place your order, your flowers are handpicked. They're arranged at the farm just for you. It's
00:28:15.400
like sending a personal one of a kind flower gift and they are delivered farm fresh straight to your
00:28:20.660
loved one's door. I'm going to do this for my mom, so she's going to get them and they're not going to
00:28:24.240
die in two days like you get in New York City. They're going to last because they came from
00:28:28.020
the local farm. Get an incredible price, a huge selection of artisan-designed arrangements. There
00:28:33.720
are no hidden fees, no endless upsells and free shipping with your subscription. So check it out.
00:28:39.160
Whether you're going to send a single holiday arrangement or a subscription for someone
00:28:43.380
special to receive flowers every month, you've got to go with BloomsyBox. And I got you a special
00:28:47.940
discount. Go to bloomsybox.com and enter MK to get 15% off and free shipping. That's promo code MK
00:28:55.180
for 15% off at bloomsybox.com. There is a difference between the way the New York Times
00:29:08.420
tries to get its facts straight, even when it fails, and something like Breitbart or Fox News, or,
00:29:15.900
you know, any organ of what I would say is pseudo-news on the right.
00:29:24.320
And that asymmetry is something that, you know, we have to figure out how to correct,
00:29:29.640
because it's shattering our society.
00:29:35.460
I'm just curious, do you think CNN and MSNBC are in the New York Times camp?
00:29:42.900
No, I think they're worse. I think they're obviously worse, most of the time. MSNBC
00:29:48.560
is obviously worse. CNN is often worse. And, yeah, I find them both more or
00:29:57.700
less unwatchable now. I mean, I never watched MSNBC, frankly. But I think we have to be
00:30:06.680
honest about why this is. I'm sure you know my opinion of Trump. But I think
00:30:15.880
it is true to say that he is the most dishonest
00:30:25.340
and corrupt person who has appeared in public life, in our country,
00:30:32.720
in our lifetime. I think that is a fact about him. I don't think that's just my
00:30:37.920
opinion. I don't think this is really debatable. I mean, I think it's like saying
00:30:42.960
he's got a vaguely orange hue to his skin, or fairly colorful hair. These
00:30:49.820
are facts about him that you can observe, based on being able
00:30:54.920
to observe him for decades now, right? I'm not talking just about the last four years. I think
00:31:00.720
he's not a normal politician. He's not a normal person in this specific regard, and
00:31:09.380
his level of dishonesty and his level of selfishness, I mean, the malignancy of his
00:31:17.020
self-concern, right? Everything gets sucked into that kind of moral black hole around
00:31:27.200
him, right? And he attracts people into his orbit who will put up with that, right? You know,
00:31:33.500
people who, for their own reasons, are willing to, who don't view that as the moral and
00:31:43.040
political abomination that it actually is. And the media has had to figure
00:31:48.420
out how to respond to this. And, you know, I would be the first to say that they've done a
00:31:54.540
terrible job. It has been deranging for them, right? Because they see that
00:32:00.980
he functions by a very different physics, you know, reputation-wise. He's managed
00:32:06.200
to dissect out a kind of, you know, a personality cult within our society.
00:32:13.940
Now I've got to stop you. I've got a couple of thoughts. I would suggest to you that one of the reasons
00:32:18.340
you don't see Hillary Clinton as in the running for most dishonest, corrupt politician ever is
00:32:23.640
in part because of the way the media didn't cover that; they run cover for their favorite Democrats,
00:32:30.220
like we saw with the Hunter Biden story, which has now come out as confirmed. The FBI has
00:32:35.120
reportedly been looking into him, into foreign dealings, possible money laundering, a story that
00:32:40.660
Twitter and Facebook and the other media outlets wouldn't even touch. We were told it was untrue.
00:32:47.200
We were told it was a smear, an October surprise not based in fact. Well, guess what?
00:32:51.840
That's not true. And Hillary Clinton at the time, she wasn't the most beloved Democratic candidate,
00:32:56.640
but once it was her versus Trump, they were all in on protecting her. Stories about the Clinton
00:33:01.700
Foundation... she lied every two minutes, but we didn't have a lie ticker going on her because the
00:33:09.060
media wasn't interested in doing that. Sharyl Attkisson was just on talking about that, about
00:33:12.840
this reporter who was telling her, we calculated how many times Trump lies, and this is the number
00:33:17.460
of lies per minute. And she said, how many for Hillary? This is when they were running against each
00:33:21.260
other. And they said, oh, we didn't have the time or the staff to do that. Well, it might be
00:33:25.820
interesting. She doesn't have an adult relationship with the truth either. I
00:33:29.740
don't argue that Trump is truthful, not for one second. And I understand he may be in
00:33:36.460
a special category, but he may not be as far ahead of people like Hillary as you think.
00:33:42.780
No, I would definitely dispute that. Well, first I will say you're not going to find
00:33:46.380
much of a defense of Hillary Clinton or the Clintons as a couple in me.
00:33:53.800
I completely get why people were allergic to her and her candidacy. And it was a lesser-of-two-
00:34:02.700
evils choice from my point of view, but it was much lesser for a host of
00:34:09.160
reasons, not just this difference in dishonesty and personal corruption. But it's
00:34:17.880
just... the ordinary range of lying and self-dealing and
00:34:29.500
perverse incentives and all the stuff we recognize in many, if not most, politicians,
00:34:36.520
we have a sense of the general shape of that, right? And I get that if you
00:34:43.980
scratched the surface on most of the people in power, and certainly most of the people who
00:34:48.280
seek the office, right... it is a self-selecting group, and there are
00:34:54.020
people who are in this game for the wrong reasons.
00:34:59.820
Many of the incentives are perverse, and yet it's not glamorous, right? It's hard to
00:35:05.140
be idealistic about many of these people. And I would say the Clintons were especially cynical,
00:35:13.360
and yes, there's a lot of dishonesty that you can find in their backstories.
00:35:21.020
But Trump is orders of magnitude worse. I mean, he's done literally
00:35:30.180
hundreds of things, any one of which would have destroyed the political prospects of any other
00:35:37.840
normal politician. Well, that's true. But what I'm saying to you is that when you say we have a
00:35:43.240
sense of what these politicians were and how far down the dishonesty lane they
00:35:48.620
went, I'm just positing to you that you should put an asterisk there, because I think the
00:35:55.320
media has exposed itself during the Trump era. And I think they have gone nuts. They've taken
00:36:00.600
their dishonesty to a new level because they are deranged by him, but their bias has always been
00:36:05.800
there. And I do think it's worthwhile for people to stop and ask themselves how they've been
00:36:12.700
manipulated, how their perception of a politician like Hillary or Barack Obama was manipulated
00:36:19.180
by the glowing coverage they received in general from outlets like the New York Times. And to your
00:36:25.260
second point that you were making earlier when I asked you about CNN and MSNBC,
00:36:28.720
they don't get a pass for, well, they're nuts, but he drove them to it. Well,
00:36:35.560
too bad. Some of us had a very difficult time with Trump, and some of us were
00:36:41.920
not huge fans of his for personal reasons, right? But you can get past that if you're a strong person
00:36:48.600
and you're an ethical person; you understand what the job of a journalist is, which is not to make it
00:36:54.040
about yourself. And what we saw here with CNN and MSNBC was not that a couple of journalists fell,
00:37:01.000
like Jim Acosta, who used to be a straight news guy. He fell. He just went
00:37:05.040
totally partisan, got the derangement syndrome, couldn't be relied upon for truth anymore.
00:37:10.000
They all went down. All of them. It was shocking to me to see somebody like Anderson
00:37:16.780
Cooper sacrifice his credibility because of what appeared to be his inability to see Trump in
00:37:23.660
any way that approached fair. So I felt like you may have been
00:37:30.060
sort of trying to explain their surrender to their nonstop attacks on him by, well, he drove them to
00:37:36.660
it. They drove themselves to it. I would just say that the asymmetry here really is difficult to
00:37:43.260
navigate. For instance, just to take the Hillary Clinton coverage, I think the media
00:37:51.060
is rightly concerned that they got Trump elected, right? First of all,
00:38:00.160
they gave Trump some insane ratio of coverage, something like 20 times
00:38:07.340
the amount of coverage that they gave... I can't remember if that was relative to Clinton or to
00:38:12.080
Bernie Sanders, but there's some comparison between the amount of airtime Trump got
00:38:15.860
in 2016. It was the other GOP candidates in the primary. Okay. So
00:38:21.740
they did that, right? And they have good reason to worry whether that was counterproductive,
00:38:28.860
given what they wanted to happen in the election. And also, I think it's fairly
00:38:35.960
well established at this point that the Comey reactivation of the email
00:38:43.420
investigation, whatever it was, 11 days before the election in 2016, the coverage of that, and
00:38:51.520
the sort of scrupulosity of that moment, both from the FBI side and from the
00:38:57.320
press's side, like, oh yeah, we really have to talk about this now, for
00:39:01.620
yet another 24-hour news cycle when there's only 10 days left: let's talk about Hillary Clinton's
00:39:07.460
emails. They rightly think that probably cost her the election, right? Now, obviously there are
00:39:13.980
many other things that cost her the election. She was a terrible candidate and
00:39:18.100
she didn't go to Wisconsin. Yeah, you could say that her
00:39:22.160
loss was overdetermined, but the people who were tracking the polls in those last
00:39:27.980
days of the campaign saw the direct effect, or at least now claim to have seen the
00:39:33.560
direct effect, of that coverage. So given that history, what do you do with a
00:39:41.080
Hunter Biden story, right? It's a hard problem, because it's not clear.
00:39:47.840
It isn't hard. You're so much more forgiving of these people than I am. First of all, it was very
00:39:52.640
obvious that they were giving Trump too much free airtime, even to me. I was a person responsible
00:39:58.160
for putting points on the board when it came to ratings during the Trump rise, the GOP primary and
00:40:02.560
thereafter. This is in my book, but we had many a meeting, my executive producer, my team
00:40:08.060
and I, about the need to stop, to pull ourselves away from the crack cocaine. Trump
00:40:15.680
was amazing television. Every time you put him on television, the ratings soared, but it wasn't fair,
00:40:20.800
because Scott Walker was terrible television and incredibly boring to watch. I like him,
00:40:27.040
just for the record, but he is not a dynamic television personality, and it was totally unfair,
00:40:32.240
what we were doing. So even though we wanted the crack cocaine, we didn't snort it, right?
00:40:37.260
We made a conscious decision to try to get ratings the old-fashioned way, which was interesting debate
00:40:41.380
and reporting the news in an entertaining way, but not putting our thumb on the scale.
00:40:45.640
They did it intentionally. They knew what they were doing. Jeff Zucker knew what he was doing.
00:40:49.980
Noah Oppenheim knew what he was doing on the Today show, allowing Trump to phone in,
00:40:53.460
to do phoners as a presidential candidate when nobody else would have been afforded that, right?
00:40:57.180
They did it for ratings. They sold their souls out for numbers. Joe and Mika too. They kissed
00:41:03.420
Trump's ass when he was running. Why? Because their numbers would soar on a show that was not doing
00:41:08.000
well. So I don't forgive them one bit of that. In other words, I don't allow them
00:41:13.060
to use that to justify their overcompensation in trying to ruin him every moment of his presidency.
00:41:18.620
They made their bed and they needed to lie in it. And the Hunter Biden story
00:41:23.140
was very clearly legitimate. He had never denied it. There was an FBI subpoena that was
00:41:31.360
verifiable. There were third-party witnesses who had come forward to say, I participated in this,
00:41:36.980
and let me tell you about it, whose credibility was not assailable. They just didn't want to do it.
00:41:43.160
They didn't want to do anything that could hurt Biden.
00:41:46.360
Yeah, well, I understand it. I've always
00:41:51.980
made an effort to be fair in my criticism of Trump, because
00:42:00.940
I think intellectual honesty is the master value here.
00:42:06.000
If you think he's racist, well then argue, based on real evidence, that he's
00:42:11.640
racist, and not pseudo-evidence, right? And this is true for
00:42:17.040
anything else you might want to allege about him. But I think that the worst
00:42:23.640
problem with him, beyond anything else that he's done or tried to do, is the way in
00:42:32.820
which our trust in our institutions, and a whole style of speaking and thinking about
00:42:40.620
institutions, has just fully eroded. It
00:42:48.220
seems that half the country thinks we don't need institutions anymore, right? Now that we can just...
00:42:52.820
Let me ask you about that... well, let me ask you about this. I've been following this too,
00:42:56.140
and I've been experiencing it myself as a citizen, to some extent, not fully, but to some extent,
00:43:02.440
right? Take Comey and the FBI. I defended Jim Comey. I thought he was a man of honor.
00:43:08.620
The whole story of Robert Mueller and Comey having each other's backs and
00:43:12.780
protecting each other, and whatever the ethical quandaries they faced and rose above...
00:43:18.220
I defended him. And now I see him as a partisan hack. I really do. I've completely changed on him.
00:43:23.780
I defended the FBI. I know a lot of these FBI agents. I tend to defer to law enforcement.
00:43:28.840
I have law enforcement in my family. And I think a lot of women are deferential to law enforcement for
00:43:33.340
reasons having to do with nonstop crimes being played on the evening news as we were growing up,
00:43:37.820
but that's just my armchair theory. Anyway, then I saw the Peter Strzok emails and the
00:43:45.020
other emails coming out from the FBI, and how partisan they were and how determined they were to
00:43:48.280
bring down Trump. And I thought, maybe I have been too deferential. Maybe Trump kicking these tires
00:43:56.600
is not such a bad thing. Maybe we have been too trusting in these institutions. And of course,
00:44:02.600
we've discussed media and how people were trusting media in a way they shouldn't have been.
00:44:08.340
I agree it's gone too far. I don't think you throw out the baby with the
00:44:13.360
bathwater, right? I think we're now at the point where we trust no one, and we've become
00:44:17.980
conspiratorial, and things are getting weird. There's become a cult-like love for Trump,
00:44:26.080
in which he looks deified; that's concerning. And therefore his offhanded need to
00:44:33.280
attack every institution, or person, that says something bad about him or does something he doesn't like,
00:44:37.060
you know, Governor Kemp of Georgia, who's his fan... it isn't healthy. I see all of
00:44:43.900
that, but how do you deal with those revelations of dishonesty
00:44:51.900
versus maintaining a country in which we do have to trust institutions and move forward?
00:44:58.800
Well, I just think you have to put them in their proper perspective, right? Again, if you
00:45:04.620
drill down on anything and you start reading people's private emails, or emails they thought
00:45:09.300
were going to be private, you start reading their texts, yes, there's no question you're going to
00:45:14.280
find things that are embarrassing and all too human. And that's totally
00:45:24.800
understandable. These things weren't intended for public consumption, right? So with
00:45:29.680
a hack of the DNC, or subpoenaing people's text messages when you're
00:45:36.580
investigating anyone, attorneys or members of the FBI, yes, you'll
00:45:41.960
find evidence of bias and all of that, right? But no one's surprised by that. What people
00:45:46.760
should be surprised by is, to take one institution in particular,
00:45:52.700
an election and all the systems that support it: we should be
00:45:58.200
surprised that we have a sitting president who in the run-up to the election would not commit to a
00:46:06.180
peaceful transfer of power in the event that he lost. And then on election night, with millions of
00:46:11.980
votes still coming in, claimed to have won, right? And demanded that the voting be stopped,
00:46:17.620
or the ballot counting be stopped. Just that: if you
00:46:24.040
just make a little documentary about that moment and what it says about where we've come, and
00:46:32.300
the divergent reactions to that in our society, right? The fact that we had something like half of
00:46:40.300
the society that simply didn't care this was happening, or had some construal of it
00:46:45.920
where this is not only benign, this is good, right? He's disrupting
00:46:52.680
everything. Now he's disrupting our expectations about the peaceful transfer of power and
00:46:59.300
the integrity of our elections, right? He's calling fraud on an election that is in
00:47:04.580
process, where we haven't seen any evidence of fraud yet, right? He called the
00:47:09.820
election he won in 2016 fraudulent. And this should be the lens
00:47:16.380
through which we look at everything that has happened subsequently in the last month and a
00:47:20.840
half or so, right? It's not that if you don't go looking for fraud, you'll never find
00:47:27.340
it. Of course there's going to be some ambient level of election fraud, and there
00:47:32.120
will be crazy behavior. And if you put yourself in a
00:47:37.600
position to just look for crazy behavior, you will find it. But the question is, can you find it at a
00:47:43.320
scale that reveals this election to be completely fraudulent? There's no
00:47:51.360
evidence for that at the moment. No, I get it. I get it. But I have to say, I look at
00:47:55.060
the media; that's how we're learning about the substance of these election fraud claims,
00:47:59.580
right? We need to trust them to tell us: what has he filed? What is the court saying?
00:48:04.780
Does he have the proof? It's the same as those doctors who justified the BLM protests
00:48:11.960
coming out now and telling us to stay inside and not go shoulder to shoulder. We're looking at them
00:48:17.720
saying, I don't believe you anymore. You already sacrificed your credibility. I don't know who to
00:48:23.060
believe. But when confused about whom to believe, people do revert to tribalism. They go to
00:48:29.940
the person who's wearing their team jersey. And I think that's why a lot of these Republicans
00:48:33.980
who have lost all trust in media are believing Trump, even though I've said on this show many
00:48:40.080
times, I don't think any of his claims are robust. There's been a couple... the last one he
00:48:44.560
filed, in the state of Georgia, challenged the failure to match up the signatures, and how
00:48:50.080
many votes were thrown out versus in prior elections, and how that went down this year even though the
00:48:53.920
mail-in vote was five times higher, blah, blah, blah. You could make an argument on a couple
00:48:58.520
of them, right? So we're trying to give him his due without losing touch with reality. But I
00:49:03.500
completely understand why, for people, it's gone. Their trust in the information
00:49:09.920
deliverers is gone. And just for the record, the FBI thing was not just the Peter Strzok emails;
00:49:14.240
as you know, an FBI agent pleaded guilty to doctoring a subpoena. They got subpoenas in the
00:49:19.900
case against Trump from the FISA court by using the dossier, which had been
00:49:26.300
discredited, and they knew it wasn't true. There were a lot of things that were exposed.
00:49:30.860
Why? Because Trump fought back, and he made outlandish claims at the time that weren't true
00:49:36.100
in defending himself, but he wasn't guilty. That was the truth at the end of the day. And
00:49:41.160
same thing with impeachment. There have just been so many overreaches in order to destroy him
00:49:45.760
that, in essence, they've endowed him with the credibility to come out and challenge anything.
00:49:52.300
I don't excuse Trump for throwing wild claims around. I'm just... in the way you're trying
00:49:57.060
to explain the media's distorted minefield when it comes to Trump, I'm trying to explain why he now
00:50:02.340
has this ultimate credibility with all these folks: because the other side has collapsed.
00:50:07.860
The information deliverers have collapsed, and it matters.
00:50:11.720
Yeah, well, it does matter. It hasn't completely collapsed. And I think people need to be sensitive
00:50:16.560
to the difference between plausible interpretations of events and
00:50:26.580
completely unprincipled, conspiratorial, tinfoil-hat, crazy interpretations of events.
00:50:32.900
There are reasons why we have this category of conspiracy theories that
00:50:38.620
doesn't subsume all of our thinking about everything all the time, right?
00:50:44.040
There's a reason why there's a stigma associated with conspiracy thinking: because it reliably
00:50:51.680
manufactures errors, right? It rests on not acknowledging the power of incentives.
00:50:59.860
And so to take the case of the election fraud conspiracy right now,
00:51:05.220
again, I'm not saying there isn't some level of fraud, but there are many reasons to think that
00:51:08.900
whatever level of fraud there is, it almost certainly happens on both sides in the election,
00:51:14.640
right? And there's not a lot of incentive for individuals to commit fraud, and
00:51:21.080
there doesn't seem to be the apparatus to allow anyone to really commit it at scale across multiple
00:51:27.160
states. And that's a good thing. Obviously we want an election
00:51:32.640
system that we can be confident in, and that is designed in a way to truly minimize
00:51:37.960
fraud and error. This is a project we need to engage, if for no other
00:51:44.920
reason than to restore confidence in our election system. But the reality is that
00:51:51.000
all of this is happening in a context where many of the people in power, the governors,
00:51:56.820
the secretaries of state, the legislators, the election officials, the judges who have to hear
00:52:02.880
these cases, many of them are lifelong Republicans, right? Many of them surely voted for
00:52:08.960
Trump in this election. And so, to be really conspiratorial about
00:52:15.900
this, you're arguing that these people are somehow incentivized to risk going to prison in order to
00:52:23.360
help Joe Biden. Back to Sam in one second. But first, have you ever Googled yourself,
00:52:28.180
your neighbors? The majority of Americans admit to keeping an eye on their online reputation, and
00:52:32.960
why shouldn't you? But Google and Facebook are just the tip of the iceberg when it comes to finding
00:52:37.160
public records. There is an innovative new website called TruthFinder, and it is now revealing the
00:52:43.160
full scoop on millions of Americans. TruthFinder can search through hundreds of millions of public
00:52:48.720
records in a matter of minutes. TruthFinder members can begin searching in seconds for
00:52:53.780
sensitive data like criminal, traffic, and arrest records. Before you bring someone new into your
00:52:59.620
life and around the people you care so deeply for, consider trying TruthFinder. What you may find
00:53:05.220
may astound you. This might be the most important web search that you ever do. So do it. Go to
00:53:10.620
truthfinder.com slash Kelly right away to start searching and be prepared for what you find.
00:53:16.700
Again, that's truthfinder.com slash Kelly. Okay, we're going to get back to Sam in a second. But
00:53:22.480
first, we want to bring you this feature we call Sound Up, where we talk about some of the sound
00:53:26.500
bites making the news that we think are interesting and want to share with you. And today, we are going
00:53:31.720
to talk about Hunter Biden. Remember old Hunter Biden? If not, you could be excused since it was a
00:53:38.460
story that was totally buried by the mainstream media. CNN, Politico, others reported last week
00:53:44.700
that Joe Biden's son is being investigated by the FBI. His taxes, his dealings with China,
00:53:51.720
money laundering, and more. We don't know how deep this goes, but it doesn't sound good.
00:53:57.080
The story has a lot of the same hallmarks as the New York Post's reporting on Hunter Biden
00:54:02.140
in October. Remember that? It was a big scoop. It was about what was on the Hunter Biden laptop.
00:54:08.200
That story, however, which was published before the election, was suppressed, totally censored by
00:54:15.540
Twitter, Facebook, and other social outlets. Basically, no one wanted to go with this. It
00:54:19.260
wasn't just social. It was print magazines and print newspapers and television. Nobody wanted to
00:54:24.120
touch this thing. They decided it was not to be discussed. And they said it was because it was
00:54:29.200
unreliable. But the truth is more likely that it was bad for Joe Biden. Anything bad for Joe Biden has
00:54:34.520
to be suppressed, according to the media. So how did the media treat the story at that time prior to
00:54:40.200
the election? Listen to some MSNBC highlights. Watch for President Trump to go after former Vice
00:54:46.120
President Joe Biden's son, Hunter Biden, and unverified emails about his business dealings,
00:54:51.300
a story that many intelligence experts say has all the hallmarks of a foreign interference campaign.
00:54:56.760
When there is a New York Post article that is false,
00:54:59.760
it's much better for Twitter to let people read the New York Post article and sit there and laugh
00:55:07.840
at the hokey story of a computer repairman looking at a computer going, this sure does look suspicious
00:55:14.600
to me. I'm going to call Rudy Giuliani. Let that out. OK, because people read this story and then
00:55:23.380
they'll go, this is really one of the stupidest October surprises I've ever seen before. What did
00:55:29.740
he have? X-ray vision? Oh, my God. Why, when they're trying to portray a dumb person, do they
00:55:35.700
always put on a Southern accent? It's so irritating, honestly. And then they're like, me, an elite? Oh,
00:55:42.240
wait, I love being an elite. Yes, I am. I am. Screw everybody else. Anyway, so that was Morning Joe.
00:55:48.260
Here's Christiane Amanpour at CNN, who seems to forget the role of a journalist.
00:55:52.860
As you know perfectly well, I'm a journalist and a reporter and I follow the facts and there has
00:55:58.180
never been any issues in terms of corruption. Now, let me ask you this. Yesterday, the FBI-
00:56:04.100
Wait, wait, wait, wait. The FBI- How do you know that?
00:56:07.140
I'm talking about reporting and any evidence. I'm talking to you now-
00:56:12.380
OK, I would love if you guys would start doing that digging and start doing that verification.
00:56:16.420
No, we're not going to do your work for you. I want to ask you a question.
00:56:22.860
So we can't report on it. Why were there no reports, again? Oh, because it was buried by
00:56:27.140
people like you who just decided, without investigating it, that it was untrue. Well,
00:56:31.120
guess what? The FBI has been investigating it. And frankly, it looks like they've been
00:56:35.480
investigating it for a long, long time, because in order to get the subpoena for that Hunter
00:56:40.020
Biden laptop from the legally blind guy who was repairing it... remember, Hunter left his laptop there
00:56:46.080
and he never went back, and finally the FBI came in and got it... that was back in December of '19,
00:56:52.920
December of '19. So for the FBI to have gotten that, it suggests that they had an open investigation on
00:56:57.540
Hunter Biden. The FBI can't just randomly throw out subpoenas back in December of
00:57:02.580
'19, which they would have known had they bothered to look at it at all. But no, it was the New
00:57:07.940
York Post, that's a Rupert Murdoch publication, so this has got to be a lie. And Jack at
00:57:14.260
Twitter, who's got some far-left guy deciding what gets censored, spoke out and said, no, we're not
00:57:19.020
going to do it. And so the media lemmings followed. And now it turns out it's a very big story. And if
00:57:24.740
it were about a Republican, they'd be treating it as white hot. So now, fear not, they're on it.
00:57:31.380
They've got Politico. They've got CNN. They are on it now that Joe Biden appears to be headed to the White House.
00:57:43.280
I've been thinking about it when it comes to race, and we've been talking about it on the show when it
00:57:46.480
comes to gender. And the reason the gender thing is important is because the scientific world has
00:57:53.220
collapsed, too, right? Like, science is done, and you're not allowed to say that there are only two
00:57:59.420
genders and there are only two biological sexes, and you're not allowed to question whether it might
00:58:04.680
not be healthy to put a girl who suddenly, at age 14, for the first time says she might actually be a
00:58:11.260
boy on puberty blockers or cross-sex hormones, or to allow her to get, quote, top surgery before she's
00:58:18.000
even reached her 18th birthday, because that's not allowed. So you go to a scientist, you go to a
00:58:22.620
psychiatrist, to check out whether that's true for your daughter, and the standard of care is for him to
00:58:27.860
just affirm: yes, you're trans, you're trans, you're trans. These are the standards of care.
00:58:32.200
It drives me insane. And I know you're a neuroscientist. We had Debra Soh on the
00:58:36.100
program talking about how she left the field. She's now a journalist covering that field, because
00:58:40.340
she didn't think she could say what was scientifically true. I feel like people feel they need to fight
00:58:45.820
back. If Biden gets in there, we're going to see the return and the emboldened
00:58:49.320
nature of all of this. No one's going to shut them down. You've got people like you, Sam,
00:58:54.400
who speak honestly about the dicey issues. You're obviously a Democrat, a liberal,
00:59:01.240
but you speak honestly about those issues. But so few people do that people are getting hurt now
00:59:06.540
in the scientific field. They're getting hurt. What do you make of that?
00:59:09.840
Yeah, well, a few things here. One is scientists are just people, right? And not everything they do
00:59:15.440
is science. So, you know, you have scientists who express their opinions on many topics,
00:59:21.140
you know, as I have here on many topics. And it's not, you know, there's no guarantee that
00:59:28.280
what they're saying is, you know, scientifically defensible or convergent with what they would say
00:59:35.240
when they have to put their scientist hat on. And yes, I would agree with you that it is
00:59:41.920
very costly to the reputation of science and any specific institution, you know, scientific journals
00:59:50.860
when they express opinions that are at minimum, you know, highly debatable in terms of their,
01:00:02.060
you know, ethical integrity and connection to science. And it's wrapped up in the mantle of,
01:00:10.080
you know, this is now a scientific opinion, right? So that's, that's a problem. And there,
01:00:15.920
you know, there are many issues, you know, some of which you just raised where
01:00:20.120
they're fraught issues for a reason. It's hard to know what to do about
01:00:26.420
certain things. I mean, you take the transgender issue, right? I have no doubt that transgenderism
01:00:32.540
is a real phenomenon, right? It's not just made up. It's not just a product of culture.
01:00:39.500
It's not just a social contagion. But is there a degree of social contagion riding on top
01:00:47.440
of a real phenomenon that we have to worry about? I mean, specifically with the issue you
01:00:52.740
mentioned, I think you're probably referencing Abigail Shrier's book
01:00:57.540
there, about girls transitioning to being boys, or wanting to.
01:01:04.940
That has to be discussable, right? And, you know, as her efforts to get
01:01:13.020
her side of this discussion out have shown, it's very hard to discuss,
01:01:17.960
right? There are people who want to cancel her over this. And somebody like J.K. Rowling,
01:01:23.880
you know, there's at least an attempted cancellation of
01:01:29.880
her based on something absolutely benign she said about, you know, the trade-offs between
01:01:36.680
women's rights and trans rights. And there are trade-offs there. There are
01:01:40.980
moments that are hard to navigate based on appeals to, you know, the primacy of
01:01:48.560
identity around those issues. And, you know, when she was objecting to the corruption of
01:01:54.040
language where we can't talk about women anymore, we have to talk about "people who menstruate,"
01:01:58.780
you know, it is in fact true to say that if she were not this billion-
01:02:05.940
dollar colossus of a writer, she probably would have had her career ruined over the absolutely
01:02:12.020
anodyne thing she said about trans issues there. I mean, there's no evidence at
01:02:18.820
all that she's remotely bigoted against trans people. So yeah, the stuff
01:02:28.340
is hard to talk about. And the only thing we can appeal to, really, if this is going to work, is a good
01:02:35.320
faith engagement with facts and arguments, and let the best arguments and the
01:02:43.300
most searching, honest engagement with facts win. I heard John McWhorter on your show
01:02:49.160
saying he believes these wokesters are in good faith, that they think
01:02:55.300
they're doing good. They're going to help you understand, you know, how racist you are, how
01:03:00.000
transphobic you are. But he also said they're not persuadable. The only answer is to
01:03:06.720
fight them. Do you agree with that? Yeah, well, I think, and fight by which he meant, in most
01:03:13.700
cases, ignore them, you know, go around them, no longer give them any power. But, you know,
01:03:20.820
I have to think that more people are persuadable in the fullness of time. We're not dealing with
01:03:27.400
a different species of person, you know, in all of these camps, we have recognizably human people
01:03:34.780
who have been persuaded of certain dogmas or certain bad ideas, and they're not disposed to
01:03:44.340
run the newest operating system that would debug these ideas. Right. And so
01:03:53.460
we have to just keep advertising the importance of getting a
01:04:00.520
firmware upgrade here. And it's true on both sides. It's true on the far
01:04:05.760
left. It's true on the right, or whatever you want to call Trumpistan. It's not clear
01:04:11.400
how that relates to conservatism now. But, I mean, to come back to a point you made earlier,
01:04:16.620
I hold out hope for it being easier to deal with the wokeness and the hysteria on the left
01:04:25.720
under Biden than under Trump, because Trump was such a super stimulus, you know,
01:04:33.020
such a confirmation of the worst fears, or an apparent confirmation of the worst fears,
01:04:39.560
of the left. Because, again, much of what has been alleged against him on those particular
01:04:45.880
points, with respect to race in particular, I think is not true. Right.
01:04:51.780
I mean, I happen to believe Trump is a racist, but I don't think
01:04:56.040
he's a white supremacist. And I think he has been unfairly tarred with,
01:05:00.660
you know, the "good people on both sides" or "fine people on both sides" hoax, you know,
01:05:05.700
That is, you go back to that press conference, and yes, he did condemn white
01:05:09.600
supremacists and neo-Nazis clearly. My hope certainly, and, if I had to bet money on it,
01:05:16.500
I would say it's going to get easier under Biden to recognize how dysfunctional wokeness is
01:05:26.360
politically and ethically, because they won't have Trump to point to, or, you know,
01:05:31.340
they probably won't have Trump to point to if he doesn't emerge in some other...
01:05:35.700
So far it doesn't look like it's going that way. I mean, so far it does not look like it's going
01:05:39.480
that way. You've got critical race theory coming back thanks to what Biden says will be his
01:05:45.160
first executive order, the mandated sessions at the federal government amongst its workers
01:05:50.140
and its contractors. They're already saying that they're going to try to undo the restoration of
01:05:54.560
due process rights for men who get accused on college campuses of sexual assault. Um, those are
01:06:00.720
not good signs, not at all. No, no. And yeah, again, I just know what it's
01:06:08.200
like. I know what it'll be like for someone like me to criticize all that without having to bracket
01:06:15.720
everything I say with an acknowledgement of how crazy things are on the right. You know,
01:06:21.360
it'll just be very easy to do now. Maybe this wave is just
01:06:30.920
now cresting. You know, maybe there is a bizarre effect here where, when the problems
01:06:37.780
get smaller and smaller, the people who are most focused on these problems get more and more
01:06:45.960
agitated and act more and more like things have never been worse. Right. It's just,
01:06:51.700
on some level, this is the narcissism of small differences. Right. I mean, if you're
01:06:56.960
not singing from precisely their hymn book on any of these woke issues, well, then
01:07:04.780
you're a Nazi. Right. And we're talking about things like, you know, whether you can compliment
01:07:09.560
someone on their hair, right. Is that a racist microaggression? We're not
01:07:14.420
talking about people getting lynched. We're not talking about people having to
01:07:21.320
function under the rule of race-based laws. We're talking about
01:07:27.420
off-color jokes. You know, someone gets into an elevator at an academic conference,
01:07:33.920
and when they ask him, you know, what floor, he says, "Women's lingerie," right? Like that is a
01:07:38.120
life-deranging, cancelable offense because he, you know, offered up
01:07:44.420
a dad joke from the 1950s. That's where we are. And it's eminently
01:07:51.920
criticizable. Right. And there are so many people, I mean, to mention some of the black
01:07:56.820
intellectuals who I didn't invite on that podcast, there are so many great people, like John
01:08:02.200
McWhorter and Glenn Loury and Thomas Chatterton Williams and Coleman Hughes, many of
01:08:07.360
whom I know you've spoken with, and Kmele Foster and
01:08:15.240
Chloé Valdary. I mean, there are people who, you know, it shouldn't
01:08:19.740
matter that the people I just mentioned are black, but it does matter. Right. And they
01:08:26.820
really have an enormous responsibility, and they are shouldering it, to
01:08:34.920
perform an exorcism here. Right. Because the people on the far left
01:08:42.900
simply cannot hear it from people like us, right. I mean, if you're white and, you know,
01:08:52.180
obviously privileged, you have all the privilege marks that could be
01:08:57.460
ascribed to anyone in our society at this point, you by definition don't get it
01:09:06.040
and can't talk about it. But there are many people who really can talk about it,
01:09:12.440
and for whom, I mean, they really are a kind of kryptonite, and it's not an accident that
01:09:17.280
no one really wants to debate them. I mean, people are not lining up to debate Glenn Loury and
01:09:22.380
John McWhorter about race issues, or Shelby. Or Coleman Hughes, who's been out
01:09:27.040
there saying, hey, Ibram X. Kendi, let's just talk about your book. I don't believe in it.
01:09:32.160
I think you've made mistakes. You've been sloppy. Let's talk about it. And he won't debate Coleman
01:09:36.400
Hughes, a 24-year-old guy who's done his homework, because he's afraid. But to your point earlier,
01:09:43.640
right, they can't assail Glenn Loury and Coleman and certainly Thomas Sowell the way
01:09:49.680
that they could come after you or me, but they won't hear from them. Number one, they get called
01:09:54.480
Uncle Toms. Number two, they do not get invitations to appear on shows. No one's interested in putting
01:10:01.740
them out there to say how they feel. And I'll just give you one other small example. It's a stupid
01:10:09.020
story, but I am in the Bethlehem Central Hall of Fame. Perhaps you didn't know that, Sam.
01:10:16.040
No, I did not know that. Now I'm intimidated. I was inducted years ago. It's
01:10:20.600
not Stanford, but it happened. And there's a push by some kids there to get me booted
01:10:27.020
out. Why? Did I say something controversial about race? No, because I
01:10:32.740
retweeted two prominent black men who criticized the constant focus on race in this
01:10:42.120
country. One was Jason Whitlock of Outkick, right? Formerly ESPN, a journalist who's super smart.
01:10:48.600
He's coming on the show. I love Jason Whitlock. He's been brilliant on these issues and really
01:10:52.540
brave, and he's been called an Uncle Tom by everybody. And one was Leonydus Johnson, who's
01:10:57.240
got his own podcast. So now it's to the point where even retweeting these, you know, black men with
01:11:05.140
heterodox views of this race dogma is problematic, right? That's potentially cancelable. Whatever,
01:11:12.660
they'll do what they're going to do. But the point is, no one wants to hear from them.
01:11:17.720
Why isn't Coleman Hughes a household name? If he were saying the stuff the left wants to hear,
01:11:24.420
he would be. Yeah, well, it's a problem. You know, I'll grant you that it's been
01:11:31.020
a problem for a long time on some adjacent issues. I mean, this is what happened
01:11:42.440
to Ayaan Hirsi Ali, who I think you probably know. She's my friend, and she was on the show
01:11:49.220
recently. Yeah. I mean, she's a dear friend of mine. And, you know, when she was
01:11:55.820
speaking critically about the treatment of women under Islam, you'd think she would have
01:12:01.740
standing to do this, having come out of Somalia and suffered all of
01:12:09.520
the collateral damage of that experience that you might expect, and then
01:12:19.020
literally recapitulating the entire Enlightenment project in her own life and becoming a
01:12:19.020
member of parliament in Holland, and then being hunted by jihadists and
01:12:24.360
theocrats and essentially becoming the next Salman Rushdie, the left
01:12:33.280
didn't want to hear from her, right? I mean, she was a much better candidate for
01:12:39.360
taking a position in a left-wing think tank than a right-wing one, but only the AEI, the American
01:12:46.740
Enterprise Institute, would give her a perch when she really needed one. Right. And, you know,
01:12:53.120
while I don't agree with everything that comes out of that organization, I
01:12:57.860
never cease to be grateful to them for doing that. And, you know, so she had this
01:13:05.960
experience that many people have had where, when you begin making sense on one of these issues,
01:13:12.820
in her case, the problem of Islamist theocracy, it becomes
01:13:22.540
radioactive enough on the left that you have a very disorienting
01:13:31.500
social experience. What shows up in your inbox is, you know, utterly
01:13:39.320
disparaging and crazy and bad-faith attacks from the left. And on the right, you kind of get love-
01:13:49.020
bombed by a cult, right? I mean, you meet, you know, really friendly people who
01:13:54.320
don't agree with much of the rest of what you may believe. Right. And so, you know, people on
01:14:02.180
the right, when they hear me criticize Islam and its connection to jihadism
01:14:08.720
and terrorism, following very much the line that Ayaan would
01:14:15.800
follow here, people on the right are so happy to have someone, you know, left of
01:14:24.360
center making sense on this issue that it really is just a completely congenial
01:14:31.200
meeting of the minds, despite all of the other things I believe and will argue for that they
01:14:37.120
find just odious. Right. And then you're like, surprise, surprise. Yeah. Well, yeah. So there's
01:14:42.900
that. I mean, I don't know if you ever saw my interviews with Bill O'Reilly,
01:14:48.160
but they all went this way, where I would say something about Islam and it was, you know,
01:14:52.180
a perfect meeting of the minds. And then I would switch to Christianity and, you know,
01:14:56.280
it was basically the end of the segment: "So, that's all the time we have." Yeah, exactly.
01:15:02.960
But the truth is, the worst, the most dishonest, the most hostile,
01:15:12.280
the most gaslighting, the most insufferable attacks tend to come from the left.
01:15:19.780
Right. And now, there's an asymmetry here, or kind of an optical illusion perhaps, because I'm
01:15:26.880
not dealing with the far right. I'm not talking to neo-Nazis and antisemites. Right.
01:15:32.640
So obviously what they would have to say to me would be, you know, just as despicable
01:15:38.160
and dishonest, I'm sure, in the end. But the truth is, I'm not even sure, given how crazy
01:15:45.160
the far left has been. And, you know, Ayaan experienced this, and so many people have
01:15:50.960
experienced this and suffered this, I would say. You know, I've
01:15:58.060
certainly resisted this, I think successfully, but not everyone has. There's kind of a tractor-beam
01:16:03.640
effect where, when you're getting nothing but disingenuous, nausea-inducing craziness on
01:16:12.460
the left, and the right is showing itself willing to just bury the hatchet again and again
01:16:19.060
and again, and we can agree to disagree about all these other things, but, you know, we're nice guys
01:16:23.640
over here, you see people get pulled into getting captured by a new audience.
01:16:31.620
And I'm not going to name names here, but, you know, we have mutual friends
01:16:37.060
who I think just now can't really make a lot of sense when talking about Trump and the election,
01:16:44.800
say, because they have been captured by a right-wing audience that really treated them well,
01:16:51.680
you know, when the left treated them just despicably. And it's a kind
01:16:57.540
of social and psychological experiment. And, you know, I think it's something to be
01:17:03.060
on guard for, not for political reasons, but just for reasons of, you know, making sense
01:17:10.480
on these issues. I like what you're saying. I feel like this is illuminating. This is
01:17:15.380
illuminating because I know, on the subject of religion, you're not a subscriber. I know
01:17:21.340
you don't like the term atheist, because you don't need to be called that just because you
01:17:25.460
don't happen to subscribe to religiosity in any form. But it's really a rejection of dogma.
01:17:32.360
You're a rejecter of dogma. And I actually do like that. I feel
01:17:37.100
like I, too, am a rejecter of dogma, though I am, I don't think I can say, a practicing Catholic,
01:17:43.700
but I'm Catholic. I do believe in God. I don't believe in every story in the Bible. I'm trying
01:17:49.200
to raise my kids Catholic, but I don't really subscribe to the dogmatic religious thinking.
01:17:54.320
I kind of take what I want from it and use it to reinforce moral and ethical
01:18:00.180
principles I believe in. I use God to threaten them, which works brilliantly. And that's
01:18:07.300
sort of where I am. But politically, I am very reticent to sign on for anybody's dogmatic thinking.
01:18:13.960
And I think that's an advantage to me as a journalist. That's why I'm a registered
01:18:19.100
independent. Why should I sign on to some party and their platform that I'm invariably
01:18:26.620
going to have many disagreements with? Nobody out there has got exactly my ideological outlook.
01:18:31.600
And why would I just put on their team jersey? I'm talking about as a citizen now, just saying,
01:18:36.500
I'm going to support you. I'm always surprised when someone says they agree
01:18:41.220
with everything on the Republican platform or everything on the Democratic platform. It's like,
01:18:45.860
all of it? Did you look at it? Did you think about it for yourself? How did you get to that place?
01:18:50.800
And so, I will confess, I can't even deal with the far left. I'm so
01:19:00.860
over them. I really don't want anything to do with them, to the point that I have shut down
01:19:05.780
my willingness to converse with them. I don't think they're honest brokers.
01:19:09.100
I don't think they're coming at it in good faith. That's what I want to say about it.
01:19:09.100
That's the far left. I feel differently about liberals. And as for Republicans,
01:19:13.760
you know, my experiences with them have been largely positive. And so I understand what you're
01:19:18.660
saying, the temptation to put on the jersey, but I haven't, and I won't. It's just not the way
01:19:24.800
I'm built. I'm more skeptical of these groups and parties than I am loyal to them.
01:19:31.680
And it's one of the reasons why I'm a little concerned about, as I was saying before,
01:19:36.340
the deification of Trump. Like, I understand defending him and giving him a fair shake,
01:19:41.340
and I understand just thinking he's awesome, right? Like, I get those people.
01:19:45.000
But what I see happening right now at some of these rallies, where people are like, I will do
01:19:49.760
whatever my president tells me to do, I will do what Donald Trump says, he sacrificed everything
01:19:54.720
for me, I would die for him. I would die for him. People are saying that. And I don't totally
01:20:01.040
understand how they got there or where that means we're going. Yeah. Well, this is where it crosses over
01:20:09.080
into something like a political religion, or a kind of pseudo-spiritual awakening, right?
01:20:17.540
It's just kind of a mass movement, and it's happened on the left. And I would argue that what's happened
01:20:22.680
around BLM has that character as well, right? Like, it's not even trying to get in
01:20:30.340
touch with facts, right? It just feels too good to be right about this particular thing.
01:20:34.840
You've achieved escape velocity somehow from the
01:20:41.380
normal constraints of public discourse, and you're just soaring above the earth. And
01:20:48.660
that's happened in Trumpistan. It's very strange. I mean, you know, it's worth
01:20:57.880
looking at the literature on cults to understand it. I mean, it's functioning by the same dynamic.
01:21:04.760
The difference between a cult and a religion is really just in numbers of subscribers,
01:21:13.840
from my view. I mean, once you get a billion subscribers, well, then it's simply indecent
01:21:19.040
to call it a cult. I mean, that's a pejorative term; here you're talking about most of the
01:21:24.580
people in any given society. But if there are only 15 people in a house with,
01:21:30.340
you know, a lot of burning candles, and they've got pictures on the mantel that no one
01:21:34.640
can recognize, well, then that's a cult, and what the hell are you people up to,
01:21:40.120
and what are you teaching your kids, and all the rest. But if you really
01:21:44.960
want to have an honest conversation about the way the world is and how we should all live
01:21:51.520
together within it, so as to stand a chance of maximizing human wellbeing or escaping
01:21:58.600
the worst possible outcomes that are in fact possible, well, then you need to appeal
01:22:04.280
to something deeper and something universalizable, right? Something that isn't born of the mere
01:22:09.340
accidents of birth or geography or, you know, what religion your parents
01:22:15.000
happened to have, or what politics they happened to have. And we know many
01:22:20.080
people inherit their politics very much like a religion, right? You tend to just be following the line
01:22:25.620
of your parents. And yeah, it's weird. I mean, to come back to the point you made about
01:22:40.420
platforms, it's weird that if you know someone's position on gun control, you stand a good
01:22:47.280
chance of knowing their position on climate change, right? Or on a dozen other things that should be
01:22:56.120
unrelated to it. And so it's a sign that people aren't thinking these problems through
01:23:01.280
based on first principles. They're joining a team. They're joining a religion.
01:23:08.140
They're part of a social experiment on some level. It feels good to be part of a team.
01:23:08.140
Yeah. It's tribalism. I mean, we're deeply tribal, and, you know, we're
01:23:16.480
apes in that regard. And we're trying to leave the monkey behind here. And again,
01:23:23.660
all we have is conversation by which to do it. Now, I want to pick up on something you said about
01:23:29.800
going forward in life and being focused on, you know, what matters and who we are as human
01:23:34.520
beings. You're deep into meditation. You've studied it for years. You've practiced it
01:23:41.880
for years. You've read all the books, you've spoken with all the gurus. But the thing that stood
01:23:47.380
out to me in just reading up about this piece of your life was, and this is a quote
01:23:53.500
from you: "I've gone into silence for a week and meditated 18 hours a day just to see what can be
01:24:00.200
revealed through disciplined use of attention, through introspection, and to see how it can inform
01:24:07.240
the study of the mind." And then you didn't say anything more after that. And as somebody who
01:24:15.520
doesn't meditate, I was wondering, could you, like, short-form it for those of us who didn't do that?
01:24:21.800
That seems like an important thing to know. Well, yeah, so I've spoken a lot about
01:24:26.440
meditation. It appears in several of my books, but I wrote a book on the
01:24:33.540
topic called Waking Up. And I have an app by that title where I and other people
01:24:39.080
teach various techniques of meditation and talk about its connection to understanding the mind
01:24:45.260
scientifically, and just living an examined life altogether. I mean, just kind of
01:24:50.760
rebooting the ancient philosophical project of developing a philosophy of life that actually
01:24:57.980
matters, right, that actually changes one's moment-to-moment engagement with the world, and aligning
01:25:03.900
one's ethics and one's emotional life, and really trying to live a life that you don't
01:25:12.080
regret in the end, I mean, that you don't regret at the end of any given day or a given hour, and that you don't
01:25:17.700
regret at the end of your life. And what would it mean to do that? And how can we do
01:25:22.760
that? So that's really the center of gravity of my interest at this
01:25:28.400
point. But yeah, with respect to sitting silent retreats, I did a lot of that,
01:25:36.500
mostly in my twenties. I spent about two years on silent meditation retreats, and the longest
01:25:42.840
one was three months. I did a couple of those, and then, you know, several
01:25:48.800
one-month and two-month retreats. And then yes, you know, you can't talk at all.
01:25:54.080
No, well, on some retreats you have an interview with a teacher every other day for
01:26:00.320
about 10 minutes. They just need to check in on you and guide your practice and
01:26:05.920
make sure you're not losing your mind, as some small percentage of people do under those
01:26:11.120
conditions, as you might imagine. But basically it is silence,
01:26:16.320
and you're not even making eye contact with people. I mean, you really
01:26:20.220
are kind of locked down. If you have no perspective on the nature of mind prior to concepts,
01:26:31.160
prior to your thinking incessantly about everything, prior to the conversation you're
01:26:37.280
having with yourself, you are a mere hostage of that conversation. And it's an amazingly
01:26:45.640
distorted conversation. I mean, you'll tell yourself the same thing 10 times in a row
01:26:51.180
and never get bored, right? If someone walked into the room and spoke the same sentence to you over
01:26:56.540
and over again, one, you'd think they were crazy, and two, you would get out of the
01:27:02.340
room, right? You'd say, this is not worth my time. But when you look at the kinds of
01:27:08.000
things you will tell yourself, you know, every hour on the hour, every minute on the
01:27:14.880
minute, right? When you're perseverating on something, when
01:27:18.060
you're really caught by something, it is a psychotic dream, really. I mean, the difference
01:27:26.320
between you and a psychotic in that case is that you have the good sense to keep your mouth
01:27:30.600
shut, and the psychotic is verbalizing everything, you know, out on the sidewalk. But that's basically
01:27:36.340
the difference, right? If you could just imagine your thoughts broadcast on a
01:27:40.840
loudspeaker every moment of the day for all to hear, we'd all sound crazy under
01:27:46.020
those conditions. And meditation is a technique for recognizing the mechanics of all that and
01:27:53.180
relinquishing it, if only for moments at a time. And as you get better at it,
01:27:58.200
you can, you know, get off the train for longer. And what you discover when you do that is
01:28:04.120
that the mind is the basis for all the wellbeing you have ever experienced in your
01:28:11.260
life. There's an intrinsic quality to consciousness, you know, before anything
01:28:19.060
changes, in the very midst of any ordinary experience,
01:28:24.520
before the pain in your knee goes away, I mean, even in the midst of an unpleasant experience,
01:28:30.380
there is a real freedom, right? A real sense of, you know, compassion for
01:28:39.720
yourself and for others, and just a radical openness, right? And I mean, this is something
01:28:47.400
that many people experience first taking, you know, one or another psychedelic, right? And this
01:28:53.460
is what happened to me when I was 18. I took MDMA. And I had to look that up. That's
01:29:00.040
ecstasy. Yeah. Yeah. And, you know, ecstasy has since become very popularized as a club
01:29:08.040
drug, and people took it in comparatively, you know, frivolous ways. And it's not to say they
01:29:13.660
didn't have a lot of fun in the process, but originally it was, you know, designed
01:29:19.820
and taken by people very much with the intention of discovering something about the
01:29:25.460
nature of their minds. And that was really the framing I had when I took it. And
01:29:31.080
yeah, I discovered that it was possible to be much less of an asshole than I was tending to be,
01:29:38.460
right. I mean, it was possible to be deeply at peace with myself and the world, and to be
01:29:48.200
happy, right, like really, really happy down to my toes. And then, I think, the H word,
01:29:53.920
and then, you know, you lose that, and then you become interested in
01:29:59.700
why, right? Like, what is it about how I'm using my attention that reliably produces
01:30:07.700
something less than the deepest peace and satisfaction and love and connection that I've
01:30:14.960
ever experienced, right? How am I failing to actualize that day after day? And that's when
01:30:21.680
a practice like meditation becomes relevant to people. Can you get yourself there? Can you get
01:30:27.980
yourself to the ecstasy version of yourself through meditation? Yeah, yeah, definitely. Although
01:30:35.820
that isn't actually the primary goal of meditation generally, it is the primary goal of certain types of meditation.
01:30:43.320
And so, in a Buddhist context, there's a meditation called metta, which is the Pali word for loving-
01:30:49.800
kindness. And in that practice, you are trying to create a specific state of mind,
01:30:55.600
very much like what many people have experienced on ecstasy, which is, you know,
01:31:02.640
unconditional love, for lack of a better word. But I mean, it really is that. You recognize
01:31:08.800
that love is a state of being that you can fall into more and more deeply. And it really just,
01:31:18.960
it's not, it's not transactional. It's not like you, you love someone because, you know, you,
01:31:24.060
because of your history with them, because of all the good things they did for you because of,
01:31:28.540
you know, because of how much fun you have in their presence. No, it's, you can actually recognize
01:31:33.640
that you really want other people to be happy. You, you really want them to be free of suffering
01:31:40.840
and the depth of that wanting, the depth of that commitment to the wellbeing of other people,
01:31:47.720
even people you've never met, right? Even people who are your enemies, who, who are working hard to
01:31:53.160
make themselves your enemies, right? I mean, you, you, you can stand back from your kind of the,
01:31:59.160
you, your, your reactivity and your kind of the, the, the personal aspect of those collisions
01:32:06.700
and recognize that on some basic level, everyone is suffering. Everyone is going to lose everything
01:32:14.800
they love in this world. You know, everyone is, is we're all in this astonishing circumstance
01:32:21.520
together. And what you want, even for the bad people is an end to suffering. I mean, you want
01:32:31.540
people to be happy. And, and that is the, that wanting is a state of mind that you can focus on.
01:32:40.860
And so metta practice is the practice of amplifying that intention
01:32:46.720
and emotion to the point where it just obliterates everything else in your mind for the
01:32:51.960
time that you're doing that practice. So you just feel a depth of love for everyone,
01:32:58.560
for no reason other than the fact that that's what you feel for them,
01:33:04.300
right? No matter how bad they are. And this may
01:33:09.040
sound bizarre, or in its own way pathological, to people, but
01:33:15.940
just take one of the worst people who's ever lived. Right. And there are many
01:33:21.480
people on this list. I wouldn't put Trump on this list, which might surprise some
01:33:26.640
people. I was going to ask you whether you've meditated on him. Yeah. I mean,
01:33:31.420
he's harder than most people to feel compassion for, for reasons that are
01:33:36.300
interesting, but let me take someone like Hitler. My favorite for
01:33:42.180
this use case is Uday Hussein, right? One of Saddam Hussein's sons,
01:33:48.080
his worst son. He is just the prototypical evil person, right? No
01:33:54.840
question, he was a psychopath. This was a guy who, when he was driving through Baghdad with his
01:34:00.320
bodyguards and he would see a wedding in progress, they would descend on the wedding and he would rape
01:34:05.900
the bride. And in certain cases he killed the bride. He did this in more than one case.
01:34:11.080
Right. He's just the most despicable human being you can imagine. So how could
01:34:16.740
you feel love or compassion for Uday Hussein? Well, just look at his lifeline as a whole,
01:34:26.940
right? Just roll back the clock on him and think of him as the four-year-old Uday Hussein.
01:34:32.840
So how do you feel about the four-year-old? Well, he may have been a
01:34:38.360
psychotic kid too. I don't know. He may have been a scary little boy. I wouldn't doubt it,
01:34:43.580
but he was above all a really unlucky one. First of all, he had Saddam Hussein as a father,
01:34:51.380
right? How unlucky can you get? And he was the four-year-old boy who was on track to become
01:34:59.520
the very scary man who we wound up killing, happily. That's
01:35:06.400
exactly what we should have done with him, given who he was and given the fact that we
01:35:11.400
couldn't capture him. But it is appropriate to feel compassion for the four-year-old boy who
01:35:20.540
became Uday Hussein. That is an unlucky life, through no fault of his own. He didn't pick
01:35:26.040
his genes. He didn't pick his parents. He didn't decide to be born
01:35:32.220
into a war-torn, honor-based society that would amplify all of his flaws. And so
01:35:39.140
you can feel compassion for that boy. And then the question is, at what point is it illegitimate to
01:35:46.000
feel compassion for him? When he's five, when he's six, when he's seven, when he's eight?
01:35:50.020
When does he cross over into no longer being an appropriate target for your
01:35:58.800
well-intentioned wish that he'd just be happy, that he'd just overcome suffering? And you can get
01:36:05.720
there. You can really get there with the worst people. But that actually is not the
01:36:11.000
center of the bullseye, as far as I'm concerned, for meditation in general.
01:36:16.040
In general, meditation is not about producing specific states of mind like loving kindness. It's
01:36:23.740
about recognizing that ordinary consciousness, just the consciousness that is
01:36:29.960
hearing my words right now, just the consciousness that's allowing the two of us to
01:36:34.420
have a conversation, is already free of self. Sam Harris, living an
01:36:46.020
examined life. I love that. That's inspirational to me. And see where
01:36:51.820
that goes, and see how it makes you feel, and get really honest about the answers to both of those
01:36:58.940
things. That I think I can do. Today's episode was brought to you in part by Jan Marini Skin Research:
01:37:05.640
dramatic results, dermatologist recommended. Get your award-winning skincare system now
01:37:10.880
at JanMarini.com. I want to tell everybody that the next show, which is on Wednesday, is going to be
01:37:18.100
with two people who are spectacular: Andy McCarthy of National Review, who is the one lawyer who's been
01:37:24.940
super smart on all the Trump legal challenges, and really fair to the president too, unlike most,
01:37:29.780
and Salena Zito, who I've corresponded with a bunch online but have never actually met.
01:37:35.120
She's somebody whose voice I love, and I love the angles she pursues on stories. She's one of those
01:37:39.960
folks who gets flyover country and isn't disdainful of it. So I think those are two great
01:37:44.240
people to talk to. It'll be our first show after the Electoral College meets and
01:37:48.980
votes today, and we'll get their take on where we are and what's going to happen between now
01:37:54.060
and January 20th. Sound counsel and thoughts from two smart, likable people. That's the kind of show I
01:38:01.120
love. If you don't want to miss it, go over there and subscribe. Make sure you're a subscriber.
01:38:05.900
That lets me come right into your inbox in the morning, at the top of your phone, saying, hey,
01:38:10.640
don't forget me today. Come listen to me talk to Salena. And then of course you've got to
01:38:15.720
download, and you've got to rate five stars, and more than anything, send me a review. Will you?
01:38:19.900
It's been fun to read them. People make me laugh. Some people swear. Some people ramble on.
01:38:24.820
Some people write weird sexual things. Don't do that. But I do like hearing from you, and
01:38:33.420
any guest ideas are good too. A lot of new names I hadn't even heard of, and then I send my
01:38:38.140
team to go Google them, find them, and in some circumstances call them. So anyway, it's been a pleasure,
01:38:43.660
and to be continued after we have results from the Electoral College. Talk to you Wednesday.
01:38:49.640
Thanks for listening to The Megyn Kelly Show. No BS, no agenda, and no fear.
01:38:54.820
The Megyn Kelly Show is a Devil May Care Media production in collaboration with Red Seat Ventures.