Ep 41 | James A. Lindsay & Peter Boghossian | The Glenn Beck Podcast
Episode Stats
Length
1 hour and 32 minutes
Words per Minute
180
Summary
In this episode, we speak with James A. Lindsay and Peter Boghossian, two of the academics behind the Grievance Studies affair, about the state of the universities, what truth is, why speech is not violence, and how to restore critical thinking and diversity of opinion to academia and the culture.
Transcript
00:00:00.000
We constantly hear about activism on campuses. We have gotten familiar with the stories of the
00:00:05.680
far-left ideologies and ideologues and activists that claim to be professors. These days, a
00:00:11.780
professor with a conservative belief or anything right of Marxist is about as rare as a polka-dot
00:00:17.400
rhino. Academics have gotten so extreme that they're increasingly turning on fellow liberals.
00:00:23.180
So it's a big deal when you hear about some academics fighting against the system. It's even
00:00:28.320
a bigger deal when they use the left's own insane words and ideas as a postmodern Trojan horse.
00:00:35.840
Today's guests are unique. They are among the handful of academics brave enough to challenge
00:00:41.480
the corrupt system and the pernicious ideas that it forces on and force-feeds college students.
00:00:48.340
Best of all, they're doing it in a very funny and clever way. Along with their associate,
00:00:53.680
Helen Pluckrose, who couldn't make it today, they're responsible for the Grievance Studies affair.
00:00:59.140
They wrote articles that translated passages from Hitler's Mein Kampf into postmodern feminist
00:01:04.680
terminology and submitted them to peer-reviewed academic journals in fields including gender studies,
00:01:11.060
queer studies, and fat studies. That's a quote. If you're worried about the state of culture in this
00:01:17.500
country, or you're curious about the inner workings of academia, or you're fed up with increasingly
00:01:22.740
authoritarian ideology from the left or from the right, this podcast is for you. This is a very
00:01:31.320
rare conversation and an important one for the future.
00:01:47.500
I grew up, uh, I didn't go to college. I skated through school. Um, and I, I realized when I was 30
00:01:58.200
an alcoholic: I don't know anything. I don't know anything. And it's not that I wasn't smart. It was
00:02:08.520
that I didn't, everything I believed had been taught to me. You know what I mean? I believed in God because
00:02:15.620
everybody around me taught me about God. I believed in whatever my worldview was, it was shaped by other
00:02:21.900
people. And so I wasn't me. Um, uh, something that changed my life was I read a letter from
00:02:31.560
Thomas Jefferson to Peter Carr. It was his nephew. And in this letter, it said, um, he was, he was trying
00:02:40.960
to educate him and to line up all the basic things you had to do to be an educated man at the time.
00:02:46.980
The last one was religion. And this changed my life. Jefferson said above all things, when it comes
00:02:55.860
to religion, fix reason firmly in her seat and question with boldness, even the very existence of
00:03:03.540
God. For if there be a God, he must surely favor honest questioning over blindfolded fear. That gave
00:03:11.880
me permission that I had never thought of before. On every front, question with boldness. But there is
00:03:20.340
no bad question and there is no dangerous answer. Does that make sense? Oh yeah. Yeah, totally.
00:03:28.580
So I, Peter, you want to go ahead? Well, I was thinking that there are certain, well, first,
00:03:34.300
thank you for having us on. Sure. I think that there are certain questions that are better than
00:03:39.120
other questions. Daniel Dennett writes an article in which he talks about if it's not worth doing,
00:03:45.620
it's not worth doing well. And I think that the first thing you need to figure out is it's certainly
00:03:51.060
true that we're taught things in our society, in our culture, in our university systems,
00:03:55.660
and we have to be able to question those. We have to have, we have to construct institutions
00:04:02.620
and systems that allow us at the most fundamental level, not only to question, but to teach us that
00:04:08.220
questioning is a virtue. But we also need to kind of find our own way in, in terms of thinking about
00:04:14.520
what should I start to think about? If it's not worth doing, it's not worth doing well.
00:04:19.780
So I would agree with you. And that's why at 30, I went, I went back to school because I didn't want
00:04:26.260
to learn what to think. I needed to learn how to think. And our universities are not teaching that
00:04:33.800
now. Nobody's teaching critical thinking. And in fact, the exact opposite is happening. We're being
00:04:40.740
told, don't question this. So I want to, I want to start with just this, this framework first. There's
00:04:49.560
kind of a two part question. What created the world we're living in was the dark ages going into the
00:04:57.080
age of enlightenment and reason: science. Weigh it, measure it, show it, prove it out. That's all
00:05:05.520
being dismantled now. Right. So where would you put us if you looked in the history of man?
00:05:12.340
Where would you, what would you compare this time to? Where are we? Where are we headed?
00:05:18.300
I mean, it's probably fair to say that it's sliding back toward a kind of feudalism,
00:05:23.200
but not an economic feudalism. It's a feudalism of ideas of thought. So tell me, explain feudalism.
00:05:30.480
Yeah. So feudalism, you know, you have kind of each little area has its Lord who's in charge and
00:05:36.920
everybody does what the Lord says and they work on the Lord's fields and they produce crops for the
00:05:42.480
Lord and the Lord gets his share. And in return, he provides some kind of governance protection for
00:05:47.480
that group of people. And so that's kind of an economic system where you kind of have these
00:05:52.060
little fiefdoms that are, that are led by, you know, some kind of a royal figure or sub-royal figure
00:05:58.640
or whatever. And it's got this, this idea basically that everything's kind of like, okay, here's this
00:06:05.460
little one and here's this little one and this little one. And they all kind of, you know, trade
00:06:09.160
with each other and fight with each other and whatever. And it's all done by the elites at the
00:06:12.740
top of that. And so I don't think we're sliding to an economic type of system like that, but an
00:06:17.780
intellectual one where you have, you know, so-and-so's thought leader. And then there's
00:06:23.040
this other thought leader that other people follow and people are in these little groups.
00:06:26.760
They say, oh, well, you know, I kind of follow Glenn Beck or I kind of follow Judith Butler or I
00:06:31.740
kind of follow, you know, whoever it happens to be, they have their, their way of thinking, their preferred
00:06:36.060
way of, of interacting with the world. And everybody can kind of find their own little, if you want to
00:06:41.800
call it a tribe, their own little fiefdom or whatever. And we're losing the common ground,
00:06:47.780
that you mentioned that came from the Enlightenment, which would ultimately be the correspondence
00:06:50.880
theory of truth. That the truth somehow corresponds to, to an objective reality that we
00:06:57.260
can, can look at, observe, measure, weigh out, as you were saying, and have some kind of external to any
00:07:03.840
particular person standard by which we can, we can say, okay, this is true because if you do the
00:07:09.960
experiment, you get the result. If I do the experiment, I get the same result. If a robot with no mind does the
00:07:15.720
experiment, it gets the same result. If a dog could do the experiment, it would get the same result. So something
00:07:20.040
that makes the same result keep happening is fundamental here. And we're kind of erasing that. It's, you know,
00:07:24.960
it's kind of an age of, of prejudice and opinion and people get to follow whose prejudices and opinions they like
00:07:31.080
best. And so you see this kind of fracturing and, and even, you know, balkanizing where these, these little groups are
00:07:37.960
kind of at war with one another and they don't agree and they can't see eye to eye and they can't
00:07:42.360
get along and tribalism. It's, it's very tribal in a sense. Yeah. So I see that's where we're, we're
00:07:48.300
headed. If this kind of very, as they call it postmodern, you know, modern would be the
00:07:53.620
enlightenment idea that there's some science and reason, science and reason, uh, democracy and
00:07:59.740
capitalism and so on. And then we're, we're kind of heading to this place where we're fracturing it out
00:08:04.820
and making it be about your truth versus my truth. And that was, that was the, that was the
00:08:10.080
question that I wanted to ask next was define truth. What is truth? We're now living in a place
00:08:17.980
where it's, well, that's your truth, right? Yeah. Speak your truth. Right. Yeah. So, so I think people
00:08:24.740
confuse the external world and the internal world, like subjective states. So if it's a mat, so here's
00:08:30.900
the question for your audience. If two people have conflicting beliefs about the same thing,
00:08:37.600
must someone be wrong? And the answer to that is it depends if it's a matter of taste. No,
00:08:44.260
you like pepperoni on your pizza. I hate pepperoni on my pizza. You like Beethoven. I can't stand
00:08:50.660
Beethoven. I don't know if you like any of these things. Um, so there are different types of truths.
00:08:55.680
There are truths within language. Seven plus five is 12. A bachelor is an unmarried man. There are truths
00:09:00.840
about the world. Something falls at 9.8 meters per second squared. The speed of light is 186,000 miles per second.
00:09:06.260
So there are truths. And what's happening now, to go back to what you said about the academy,
00:09:11.820
I teach critical thinking for a living, is that professors are increasingly looking at the university system as
00:09:17.820
an ideology mill. And the goal is not to find the truth. It's to push an agenda. It's an
00:09:24.020
agenda-driven kind of activism in which they want people to go to their own truths. And we can talk
00:09:31.560
about this thing called standpoint epistemology, which is a big driver in this. I think I said no,
00:09:36.240
no use of the word epistemology. Oh, did you? No. Okay. Okay. It's just,
00:09:41.660
all it means is how you know what you know. It's just the process that people use to get to
00:09:47.340
knowledge. So here's the thing. So we have,
00:09:53.920
obviously you and I have different political beliefs. We have different metaphysical beliefs
00:09:58.460
about God and, and, and that's why this conversation is so important. And it's why the tragedy of this
00:10:07.360
whole thing is that we have to come here to Dallas to talk to you because this, this is not happening
00:10:11.940
in our academies, and our kids are not welcome. I'm, I'm less welcome on university
00:10:18.920
campuses than you guys are. That's an astonishing statement. Correct. Yeah. Correct. Yeah. And
00:10:24.900
there's, this is the kind of conversation that I think should be happening on college campus.
00:10:29.400
And it's not right. Right. And, and not even the most contentious conversations like, well,
00:10:34.160
we need to get Black Lives Matter together with police officers who actually teach and train people in
00:10:39.100
tactics because if you don't, more dead bodies keep piling up, right? There are direct consequences
00:10:43.800
for not having these conversations. But so, so think about it like this. You have your reasons for
00:10:51.380
belief. I have my reasons for not believing in God: I don't think there's sufficient evidence for that.
00:10:57.300
We can have that conversation, but what you're not saying is, and I'm sure you're not saying this is,
00:11:02.880
if I'm wrong, I'm sure you'll correct me. That's just true for me. Like you're not saying that this is
00:11:08.060
just some highly subjective thing that people participate in. And what we're seeing now is
00:11:14.260
this torrent of subjectivity, this demeaning the objective. So hang on just a second. Yeah. Sure.
00:11:21.200
I have faith. Yeah. Okay. There's a difference to me and I, I could be the, the one guy who reads
00:11:28.540
it this way. Okay. But I can't prove God to anything. There's no proof. Right. There's no proof.
00:11:33.320
So I have faith. What my father used to say to me all the time when I was growing up is don't talk
00:11:37.400
to me about God. Talk to me about first cause. What was first cause, the second before the big bang?
00:11:44.720
It could be a big bang. Could not be a big bang. We have no idea. Sounds like big bang is it for
00:11:50.920
now, right? Good. What happened the second before? And that's where I can't prove it. I mean,
00:11:58.460
it could have been Wile E. Coyote lighting the fuse of a giant bomb. I don't think so, but it could
00:12:03.460
have been, we don't know. So those who say they're atheist, aren't you more of an agnostic because you
00:12:10.840
don't know what first cause is. I don't know what first cause was. Right. So I kind of think that it,
00:12:17.420
it is, is, is God. It might be just a mathematical equation, but where'd that come from?
00:12:24.220
Right. So this will take our conversation down a different road. I want to, can we keep it brief?
00:12:30.060
Cause I, I'd love to have you talk about that, but I want to make sure we keep on track. Okay. So
00:12:35.360
the, the principle there, the bottom line: Victor Stenger said the universe could have always
00:12:39.980
existed. It could have been caused by a big bang. Krauss says the universe could have come from
00:12:44.400
nothing, could have been God, could have been any one of a number of things now. And I agree with all
00:12:48.500
those things. Right. So, so if we don't know what it is, that doesn't mean we should believe
00:12:54.460
something. It means we should calibrate our confidence accordingly. And so the key there
00:12:59.920
of this whole thing is the same thing you learned when you were questioning at the beginning. You
00:13:04.080
started saying this. And then you asked me what truth was. I think that the key to this whole
00:13:09.080
discussion is, are you willing to revise your beliefs? Yes. Are you willing? Okay. So, okay. So right now
00:13:15.640
this is the rule of engagement that we have. You are a sincere inquirer. I am a sincere inquirer. I
00:13:22.540
know him. He is sincere inquirer. We're all willing to sit down and have a conversation. We agree on the
00:13:27.980
rules of engagement. Now let's have that conversation. The problem is that we are acting, we are engaging
00:13:34.160
with people primarily in university systems that are not good faith actors, as you'll say, they're not
00:13:41.320
willing to revise their beliefs. Correct. They look at speech as a form of violence. They do not
00:13:45.620
want you to come on campus. What are you going to do on campus? Are you going to start
00:13:49.720
lobbing grenades? No, they don't want you to come on campus for two reasons. One, they think there's something
00:13:54.560
intrinsically dangerous about your ideology. And coupled with that is this
00:14:00.620
idea, and Jim can speak to this: they look at speech as a form of violence.
00:14:06.260
Speech is not a form of violence. And the moment you start thinking that speech and violence are
00:14:12.120
synonymous, you destroy the enlightenment project. You give up any possibility you have to leading a
00:14:18.840
better life. That's the death of hope. It is the failure to be an honest broker in conversations with
00:14:26.980
people with whom you have substantive political and moral disagreements.
00:14:31.200
Yeah. To wheel it back a bit, this subjective turn, as Pete was calling it, this is a rise of
00:14:38.140
subjectivity. What is truth? And he talked about there being the objective world, and then we have
00:14:43.000
ways to know, usually science, what's going on in the objective world. And then you have this
00:14:47.520
subjective world. So he doesn't like pepperoni on his pizza. He said that's a subjective truth.
00:14:52.340
That is his truth. Yes. It is, in fact, a real fact about the entity of Peter Boghossian.
00:14:57.700
So whenever that happens, I don't know if you really like pepperoni or not. I thought you did.
00:15:02.320
We've got a lot. He doesn't like pepperoni, and he hates Beethoven.
00:15:08.480
That's right. We've got a finger on you now. But when people speak about their subjective truths,
00:15:14.620
what they are speaking about is something that they know about themselves that they cannot possibly be wrong about.
00:15:23.540
And so at that point, there is no conversation to be had. Now, the problem is, is when you
00:15:27.660
start blurring those two worlds, when you start saying that one subjective truth, one's experience
00:15:32.900
in the world, and your own interpretation of that experience somehow trumps the ability
00:15:39.480
to do an outside measurement of that or to take another view of it. And that's where we're seeing
00:15:45.560
the breakdown in conversation. People believe that their truth is the truth.
00:15:49.640
So doesn't this come down to, like, Penn Jillette's a good friend of mine. He's an atheist. And I
00:15:55.500
really think he's a good man. Really good man. I've never tried to sell him religion or anything
00:16:04.180
else. He's happy in his life. Isn't that where it breaks down? Sometimes I feel like Jimmy Swaggart
00:16:12.560
and Bill Maher are kind of the same. They're both saying, you don't believe this. You're
00:16:18.900
bad. You're just stupid. You're just whatever. And I'm like, dude, I don't. Isn't that where
00:16:25.340
the problem is? If you don't believe this, this and this that I believe you're just a bad
00:16:31.720
person or you're just stupid. And it's like, I don't care. Or worse. It's even worse
00:16:37.380
is that if you don't believe what I believe, you're an existential threat. Yep. That's where
00:16:41.320
we're at. Yes. You're an existential threat. You are. You are the cause of the doom of society.
00:16:46.500
Right. And it's the rare person. I think you guys are in this category. And and Penn is
00:16:53.860
again, I hate to keep going back to atheism. But when he stood up at the big atheist meeting
00:17:00.980
on the mall a few years ago, he stood up and said, let's not be the people that the Christians
00:17:07.780
have always said we were, or that they were to us. Let's not be those people. That,
00:17:12.740
that is what the country needs. That's what the world needs. I agree. The part
00:17:19.920
that's particularly grotesque about this is that you ought not to be friends with
00:17:25.400
someone who holds a certain set of beliefs. And if you are, you too are a bad person.
00:17:30.460
Even if you hold the beliefs of your own tribe, there's no currency to be gained by crossing
00:17:36.000
the political aisle right now. And
00:17:40.240
there's a lot of currency to lose. And we've seen it happen. And I'm sure you've seen it
00:17:44.240
happen. You can think of specific examples when you've sided with a Democrat or you've
00:17:48.560
sided with somebody because you think it's the right thing to do. And it's not that you
00:17:52.060
have an ax to grind against something like I don't have an ax to grind about the metaphysical
00:17:56.740
world. I just I just don't think that there's sufficient evidence to warrant belief in God.
00:18:00.900
But if I were shown that evidence, I would believe. And if someone doesn't happen to have
00:18:04.980
that right belief, right? You know, when my parents died, I got the privilege
00:18:11.940
of holding both of my parents' hands when they died. And my dad's Armenian.
00:18:18.440
And so the Der Hayr, as they call the priest, came in, and the religious folks came in.
00:18:23.640
And these are just look, I don't agree with them about metaphysics or God, but these are
00:18:28.140
just fundamentally decent people. They're kind people. And when I was in Las Vegas for this
00:18:33.860
happening, I brought my daughter. I don't usually talk about my daughter. I get very emotional,
00:18:38.380
but we adopted her from China. And the whole community that he lived in, in Sun City, everybody
00:18:46.080
was a Trump supporter. The whole place was a Trump supporter. And I truly dislike Donald
00:18:52.180
Trump. I legitimately dislike this man on every level. But to say that his followers are somehow
00:18:58.840
racist. Now, that doesn't mean he doesn't have some racist followers. It's just it's a
00:19:06.360
misapplication. Who says, who says? You voted for Obama? I did. Jeez. I didn't
00:19:14.180
realize I was sitting with somebody who just approves of drone strikes on citizens. What are you
00:19:20.180
talking about? You could have voted for him, but you weren't necessarily for the drone strikes
00:19:25.020
We're not these one-dimensional people. If you think, you know, if you think that you're
00:19:32.460
when did we become the group of people that you have to buy all of it or none of it? 1994.
00:19:38.600
1994. That was a joke. You said it with conviction. I was like, wow, that's great. It's going to be a
00:19:46.600
big podcast. It's also the death of nuance, right? That's what happens when you have these tight-knit
00:19:53.440
ecosystems and you constantly chuck people out of the sphere. You're chucking people out of the
00:19:58.740
sphere and nuance dies with that. 140 characters. 140 characters. Yeah. I mean, you can't make any point
00:20:05.480
in that, right? You can't, I mean, no, you can only slam dunk on somebody. And you were just talking to me
00:20:11.020
about the social media today. Yeah. I've actually reflected pretty heavily on social media over
00:20:16.620
breakfast and the influence I had. I sat down, I was waiting for breakfast and I didn't pull out my
00:20:22.080
phone. And then I got bored and I pulled out my phone and looked at Twitter. And almost as soon as I
00:20:26.680
did, it was like, I lost the train of thought that I'd had in that moment of boredom. I had started to
00:20:31.540
become creative. I started to think, I started to, you know, be curious about people around me and
00:20:36.640
things that are happening and just my own thoughts. And the second I pulled it out, now social media is
00:20:40.680
directing my thought. It's either, you know, people speaking to me, my feed or, and then it's the
00:20:46.040
things that I've subscribed to follow. And now it's directing my thought. And I figured out that,
00:20:50.320
you know, a little moment of self-reflection, I put my phone back in my pocket and I reflected on this.
00:20:55.060
And I feel like I've been kind of sucked into a trap on this for the last couple of years
00:21:00.460
where I get bored and I work a lot. So I'm tired. And so I don't want to go do something
00:21:05.980
difficult. I want to just relax. So what do I do? I turn to social media. I don't watch TV really.
00:21:10.240
So I pull out social media and it never struck me quite as profoundly as it did. I mean, I've known
00:21:15.400
there are problems like psychic problems with, with engaging too much with social media, but it just
00:21:20.900
immediately in that instant took away all of my creative thought and directed it into whatever noise
00:21:26.700
was being thrown in front of me. And so much of that happens because of 140 characters or whatever,
00:21:32.100
it's so easy to just put that like, you know, a little slap out there or that, that slam dunk or
00:21:36.720
just whatever it happens to be that it's partisan baloney. Right. And there's currency to be gained.
00:21:42.280
There's a lot of social currency. There's only, there's, there's, there's only two in social media.
00:21:47.320
It's either tear down or build up. Yeah. And it's a lot more profitable to tear down and easier.
00:21:52.980
Yeah. It's very difficult to put out a thing to build something positive in the world because
00:21:58.320
A, it's really easy to be wrong. So you probably are on some level, even if you're doing something
00:22:03.640
really great and positive and you thought about it a lot, something's probably wrong with it. So
00:22:07.540
there's space for, B, millions of people who see it, or thousands or whatever your reach is, to start
00:22:12.780
trying to grab with their claws and pull that down and tear it apart. And it's the, there's a million
00:22:18.160
problems with everything. It's very difficult to be right. This is something I talk about a lot,
00:22:21.980
but you want to talk about science, you know, it's very difficult to, to be right. I think
00:22:25.760
Carl Sagan talked about it as prying truth from the fabric of the universe, like diamonds or
00:22:30.460
something. It's very difficult to pull out a truth. So we put out, we're smart people. We put
00:22:34.180
out ideas. Maybe we're 90% wrong. Most of the time, it's kind of Sturgeon's law, right? 90% of
00:22:39.400
everything is crap. And so, most of the time, we're smart, we put out an idea, we think,
00:22:43.860
we think we're on it, and it's 90% wrong. And so what do we do? We have to whittle that
00:22:47.480
wrong stuff away. And when that process is collaborative, rather than, you know,
00:22:52.500
you get this kind of social credit for just trashing somebody, then you have science happening.
00:22:57.700
Or if you say, I'm sorry, you know, I made a mistake. I thought this way. People will meme
00:23:01.620
it out a hundred times. Oh yeah. Put your face on it. And then, and so there's a mechanism in place,
00:23:08.580
a social mechanism. And I don't know if this dysfunction exists independent of social media
00:23:13.000
or social media just amplifies it. There's a mechanism in place to make you not want to say,
00:23:19.000
you know what? I made a mistake. Yeah. Yeah. And we need to create a culture. How do we fix this?
00:23:25.520
Well, beside the conversations, we need to create cultures where it's lauded when people say, I don't
00:23:31.220
know, where it's lauded when people say, you know, I made a mistake. We need to stop. Yeah. We need to
00:23:36.180
stop this idea that, oh, he flip flopped. Well, you know what? Maybe he had different evidence.
00:23:40.220
Well, there's a difference, but I've always said, as I've talked to politicians
00:23:43.440
for 40 years, that's why my hair is white. I don't know why your hair is white, but my hair is white
00:23:49.240
from talking to politicians. And I've always said to the audience, there's a difference between
00:23:56.500
flip flopping and growth. Right. That's right. If you're not changing your mind over 5, 10,
00:24:03.160
15, 20 years, you're not alive anymore. Right. You know, if you can tell me, you know,
00:24:09.460
Mitt Romney said I was for abortion and, or I was against abortion and now I'm for it. And I'm like,
00:24:14.720
uh-huh. If he can't tell me what room he was in, what the wallpaper or the color looked like,
00:24:21.660
he's lying to you because you don't go from life to no life or whatever it is. You don't make a
00:24:28.440
change that big without something profound. So I asked him the question and I said, tell me that
00:24:36.020
moment. And he had the moment: he was sitting in Harvard. I mean, he knew everything about
00:24:42.680
it. That's not a flip flop. That's an honest, that was the key to Jefferson. Right. Honest
00:24:49.220
questioning. Right. Honest, not gotcha, not trying to win here, but honest questioning.
00:24:54.760
And, and the interesting thing about that to me is that that's an attitude and the attitude is,
00:25:01.420
are you willing to revise your beliefs? So the, the idea then, well, how do you formulate your
00:25:06.460
beliefs? My, my guess is that you formulate beliefs on the basis of evidence.
00:25:12.140
Yeah. Right. So if you formulate your beliefs on the basis of evidence, that must mean by
00:25:17.940
definition that some piece of evidence could come in to make you question the belief you already have,
00:25:23.120
then you'd have to revise it. Correct. Now, if you're unwilling to revise your beliefs on the
00:25:27.740
basis of evidence, then you don't formulate your beliefs on the basis of evidence. You formulate
00:25:31.680
your beliefs on the basis of something else, which is fine. Then just say so. Right. Don't lie to me
00:25:36.440
and certainly don't lie to yourself. Correct. And that's the, you know, like you mentioned the God
00:25:40.900
thing again, like you can have a conversation with people when they say, you know what, you're going to
00:25:45.640
go to hell. Like, I like having conversations with these people. These are honest people. They're
00:25:49.020
honest brokers. I might disagree with them, but okay. Now we're honest. You know what I think. I know
00:25:53.400
what you think. Now we're going to have a conversation, right? The problem is that when either people make
00:25:58.740
these unbelievable subjective moves about my truth, your truth, or the thing that I see that
00:26:04.960
is so despicable right now is that they reduce you to some characteristic that you have. You're
00:26:10.660
white, you're male, you're heterosexual, you're able bodied, you're privileged. Privilege is the
00:26:15.800
original sin. This is the new religion, right? And so situated truth. So they demean any claim
00:26:21.960
that you have about the world because of something, some characteristic you possess.
00:26:27.500
Martin Luther King would not be welcome in today's academia, I don't think.
00:26:31.920
Oh, no, no. In fact, one of the universities, I think the University of Oregon a few years ago,
00:26:36.400
tried to problematize him. They had that bust of Martin Luther King, and I think they were doing
00:26:40.720
some renovation, so they had to do something. There's the plaque, and it has the section from
00:26:44.460
the I Have a Dream speech. And they were saying, ah, well, Martin Luther King wasn't sufficiently
00:26:48.300
inclusive to sexual minorities and trans people, so he's problematic. It's like, holy crap.
00:27:10.260
What truth does this collection of 350 million people hold self-evident? Can you think of one?
00:27:23.560
Freedom of religion, freedom of the press, freedom of speech. I mean, these are things that are-
00:27:27.740
I think that still is- I think those are on the ropes.
00:27:30.300
I think they are on the ropes, but I think- Oh, I took your question to mean originally.
00:27:34.320
No, no, no. I mean today. What is it that you could say? I could wake every American up in
00:27:39.880
the middle of the night and say, hey, what do you think about? And they'd be like, what?
00:27:44.480
Of course not. Or, of course. What is the self-evident truth?
00:27:50.440
Yeah, and I don't know the answer to that. I know that that's one of the reasons for the-
00:27:54.600
That's a strong conservative message that we've kind of lost something there, and we've lost-
00:27:59.480
What we've lost in not having self-evident truth is we've lost a kind of social cohesion-
00:28:04.760
Which prevents us from looking at each other like people and having compassion.
00:28:09.020
The important part of that is all men are created equal. And I mean, I stress created,
00:28:15.160
not just because of God, but created, as in you're created equal. You don't end up equal.
00:28:20.980
You know what I mean? You got to bring something to the table.
00:28:23.940
And we don't have that anymore. And we don't, you know, we separated church and state for a very good
00:28:32.760
reason. But I contend we are back with church and state. The church is just academia.
00:28:43.440
If I don't graduate from the right college or even go to college, I don't have a place at a table
00:28:49.740
at all. I have nothing. Yeah. I can create anything in my own life and I can do it without
00:28:55.880
the education. It means nothing. And so they're closing all those doors. And if you're not in
00:29:01.360
that group and then groups beyond that one, you're toast. We've got a papacy.
00:29:10.800
Yeah. We can speak to how to solve or address some of those problems. You know, legacy entitlements
00:29:16.480
and the college scandal thing. We have to do away with legacy entitlements. So if you
00:29:21.060
went to Stanford, like my wife went to Stanford, if you went to Stanford, your kids get more
00:29:24.940
points. That's anti-meritocratic, has to be done away with. So I think, so here's another
00:29:31.740
example. I think we're in broad agreement on the principles, the rules of engagement,
00:29:36.380
how to have civil conversations, why we need to open up the universities, why, you know, we
00:29:43.160
Wait, wait, wait, wait. When you say open up the universities, I, I think I agree with
00:29:47.760
that. I just wanted to define this. That may mean that more Asians are admitted.
00:29:52.060
That's not what I mean. No, I mean specifically that we need diversity of opinion in the university system.
00:29:59.240
So I actually, I'm not a Marxist. I actually think we need someone, somebody teaching Marxist
00:30:05.580
economics in a university system and I'm not a Marxist. And I think that we need a diversity
00:30:11.540
of opinion to give people, and we, we give students the critical thinking infrastructure,
00:30:16.140
we teach them how to talk about this. And Mill spoke about this. You need someone who
00:30:20.680
actually believes this stuff. So I teach an ethics, atheism class. I teach arguments for
00:30:26.880
the existence of God, but I don't believe them. So it's, and I tell people all the time,
00:30:31.180
that's why I try to have guest lecturers come in who believe this. So Robby George and
00:30:36.100
Cornel West come in all the time, and they're friends, right? And
00:30:41.100
they get along and they don't agree on anything. And Phil Vischer from VeggieTales, and myself
00:30:44.920
and other people. So when you hear diversity, what normal people on the
00:30:50.640
street think is, oh, diversity, that must mean that under that moniker is diversity
00:30:54.860
of opinion. That's just not true. Right. It means it in the most superficial way. And so the
00:30:59.280
folks inhabiting the universities right now, they're changing the meanings of ordinary words,
00:31:04.120
equality, equity, et cetera. And they're doing it from a, from a theoretical perspective that
00:31:08.960
if, if you actually look into and come to understand where these people are coming from,
00:31:13.780
it sort of makes sense. So why would you assume that, you know, two people of different demographic
00:31:19.920
qualifications automatically have different opinions? Well, that's where we go back to the
00:31:25.760
standpoint, then. Epistemology is the word we can't say. The standpoint way of knowing things.
00:31:31.280
uh, that, you know, so it turns out that, um, when you believe fundamentally as these people truly
00:31:37.020
do, and these people have taken over academia, the educational system, that to have
00:31:43.020
lived a particular experience as a particular race, sexual orientation, et cetera, confers special
00:31:48.680
knowledge that other people can't possibly have. You automatically see that diversity of
00:31:55.100
identity implies diversity of, uh, thought, but that's the wrong kind of diversity. And I think
00:32:02.080
there was a study, didn't we just see this? There's a study just the other day that came out that showed
00:32:05.820
that diversity training, for example, isn't working. They force people to do this. It's a multi-billion
00:32:12.220
dollar industry and it's not working. It doesn't actually achieve anything. So I think it's because
00:32:16.660
they're focused on the wrong thing. What should they be focused on? They should be focusing on,
00:32:20.620
on differences of opinion, differences of perspective. Um, so this is kind of an
00:32:27.320
extreme example, but you may, for example, have a philosophy department and they have a particular
00:32:31.920
problem that, you know, they're hashing out, bring a mathematician in. They have a completely
00:32:36.700
different way to look at it. Does it matter if the mathematician's white, black, Asian? No,
00:32:40.600
it doesn't matter. Bring a mathematician in. They have a different way of thinking about it.
00:32:44.460
Now you're looking at something in the political sphere. You're looking at immigration or you're
00:32:48.000
looking at, you know, anything guns, you're looking at any topic. You need somebody who's
00:32:53.220
representing the different perspectives. So what's the conservative perspective? What's
00:32:57.180
the libertarian perspective? What's a liberal perspective? What's a, what's a progressive
00:33:00.740
perspective? That's diversity. I'm a self-educated guy. I couldn't afford to go to college for very
00:33:06.840
long. Okay. So I went to the library and I read Alan Dershowitz and Adolf Hitler. I mean,
00:33:14.920
I would go for the, I'd look for the people who had the most diverse possible viewpoints on things
00:33:24.120
and read them and knew that if they intersect anywhere, if there's anything there, okay,
00:33:30.420
we know that one line is true because they both agree. And then you just kind of whittle yourself in.
00:33:36.560
Yeah. It's why we don't burn books. And yet we're burning people and burning thoughts, and
00:33:42.800
they're erasing their legacy. So it's simply, yeah, a cancel culture. It's the same
00:33:46.980
as burning. What's a cancel culture? Well, if you're too problematic, you get canceled. Like
00:33:51.700
you'd cancel a TV show, but then you make sure that all of the, you know, it's not available on
00:33:55.280
Netflix anymore. You can't get the old DVDs. Even those are gone. I've called that digital
00:34:02.060
ghettoization. Yeah. Yeah. I mean, you know, you put them behind a wall. That's today's ghetto. The Jews could
00:34:08.000
talk all they wanted behind that wall. No one's going to see you or hear you. But yeah, yeah,
00:34:12.460
sure. You're behind that wall. That's, I mean, that's what's happening. There's a lot of that.
00:34:15.720
Yeah. This is the modern book burning. That's what, this is modern book burning. When you take,
00:34:20.640
say, a body of work from somebody who you've deemed problematic, and then you erase that
00:34:25.360
body of work to where people can't access it anymore. People can't engage with it.
00:34:29.200
You de-platform them and don't let them speak. You deny them their, you know, you erase their
00:34:35.300
Patreon or something. So if they had that, now they can't make money doing what they're
00:34:39.240
doing and they have to go find something else to do. It's the equivalent of book burning. So we are
00:34:43.120
there and that needs to be stopped. I don't know any other way to say it. It just needs to be stopped.
00:34:49.000
We need to have, you know, we need to welcome a diversity of opinion. You know, one of the most
00:34:54.440
interesting thoughts that crossed my mind this year is a guy I was talking to in February.
00:34:59.200
He told me he sets aside one month a year. It's usually August, I think he said, but it's
00:35:03.580
arbitrary. He sets aside one month a year to read opinions he explicitly disagrees with.
00:35:09.240
That's all he reads. So he's a libertarian guy. So for an entire month, all he does is he digs
00:35:14.860
into the kind of either conservative or liberal or progressive or whatever thought he doesn't
00:35:19.220
agree with. And what he tries to do is tries to find, and this is the key, tries to find the most
00:35:24.940
sense he can make out of that. And then bring that back to his own worldview the next month.
00:35:34.640
Yeah. I think what's key in that is that that is an attitude. That's an attitude that that
00:35:39.380
guy has. And the more, the more I think about this stuff, the more I realize that it's all about
00:35:46.280
values. Like if people value certain things, we have to help people value diversity of opinion.
00:35:52.220
We have to help people value revising their beliefs. We have to help people value what's
00:35:57.480
true. So here's the problem. Here's the problem because you're right about that. But here's the
00:36:03.860
problem as I see it. Let me give you two examples. One, I don't think the border wall has anything
00:36:10.240
to do with Mexico. I think it has everything to do with the conservatives who have been saying,
00:36:16.060
look, we've got a problem. We have people coming across our border. We don't have any idea who
00:36:20.220
they are. We have safety issues. We have companies that are abusing these people, et cetera,
00:36:25.740
et cetera. And we got to have a secure border. We have to know our visa program. That was the
00:36:31.600
problem in the first place. We haven't done any of that stuff. And you people in Washington,
00:36:35.940
both sides keep saying you're going to do something. The reason why I think people want
00:36:40.640
the border is they don't trust Washington. It's not about Mexicans. It's about, I want a border
00:36:47.520
and I want a wall because you're going to tell me one thing while you're trying to get elected.
00:36:52.640
And then you're going to do the exact opposite. You can't tear down the wall. I don't trust you
00:36:57.080
anymore. It is a clear, tangible symbol. Correct. Now let me give you the next example.
00:37:01.640
Um, trans hysteria, uh, the female penis. You wrote a paper about that. I know you did. Okay.
00:37:09.760
So you throw all this crap at people. I mean, when were we all of a sudden expected to know
00:37:16.820
what a cisgender male was? All of a sudden it just appeared one day and everybody was like, well,
00:37:23.160
you don't know what a cisgendered male is? And I'm like, no, I've lived 48 years and I've never heard that
00:37:28.440
before. Where did it come from? People are printing up words. 2014. And I'm not joking this time. I think
00:37:34.200
I know. So Donald Trump would not be president if people in academia and people in power
00:37:45.060
hadn't all of a sudden started saying, well, you're a cisgendered male. And nobody,
00:37:51.280
nobody could stand up and say that's a bunch of crap. I keep trying to find things to disagree with you.
00:37:57.300
Yeah. And I'm failing. That's a hundred percent. That is absolutely true. We wrote a thing about that,
00:38:03.040
like almost a year before the election. Uh, Peter and I together wrote a thing saying,
00:38:08.300
yeah, for Quillette, saying, um, basically exactly the point that the left is
00:38:13.860
driving people nuts. And they don't see it. And I, I'm utterly baffled by why they don't see it.
00:38:21.080
And they'll say to me, people like, well, don't you realize the bigger threat is in the white house?
00:38:25.240
And I'm like, can I swear on your show? Yes. Go ahead. Who do you think put him there?
00:38:29.960
The far left put him there. Yeah. People call us tools for the right. And I'm like,
00:38:34.180
you want to talk about a tool for the right? An article in the Washington Post that says,
00:38:38.700
why can't we hate men? What better tool could you give the right than that? Yes. I mean,
00:38:44.060
come on. We're not the tools for the right. All this reparations talk and everything.
00:38:48.060
The whole thing is, Oh my God, just hand it over. And you know, just give them the election.
00:38:51.520
And, uh, who is, uh, the, the guy, I can't even remember his name. He's the mayor, uh, uh, of,
00:38:58.960
of like some small town in Indiana. Uh, he's running for president. He, Buttigieg. Yeah. Yeah.
00:39:05.820
Okay. So here's a guy who's gay, who says, I don't have a problem with Chick-fil-A. Right. Thank you.
00:39:13.040
Yeah. Oh my gosh. Thank you. Now this guy might be the most radical Marxist ever,
00:39:17.440
but there, I know conservatives who are like, Oh, I could talk to that guy all day. Right.
00:39:24.340
Now that will work against him, but it's just because he's normal. Right. He's not getting up
00:39:32.180
every day going, who can I destroy? Or who am I pissed about? Outrage culture. Right. Virtue signaling.
00:39:39.480
Correct. Yeah. And I think that guy, and there's another one that is also really a Chang or yeah,
00:39:45.740
yeah, yeah, yeah. Yang. Yang. Right. Andrew Yang. Yeah. Andrew. Yeah. Who also, you can,
00:39:50.540
you can listen to and go, I don't agree with, with universal basic income. However,
00:39:55.340
he's being eviscerated by the far left, by the way. Correct. Right. And, and you listen to,
00:40:00.380
I mean, I've been saying this to my audience for years. You don't know what the future is bringing
00:40:05.240
in the next 10 years. Right. We got, we have to talk. We can't just go, I'm against universal income.
00:40:11.320
Right. Wait. Well, do you know all the problems that are coming down the road? Right. This guy
00:40:17.320
is articulating, talking about problems, has intelligence behind it. He's not eviscerating
00:40:23.380
people. Okay. So I think those two guys could win. I think we need to have a conversation about that.
00:40:28.560
So I think, well, about what do you criticize? Like there are so many things to criticize about Trump,
00:40:36.140
but how he likes his steak is not one of them. Right. I mean, we have a failure to understand,
00:40:44.620
and it doesn't even, I would go so far as to say, it doesn't even matter what somebody thinks.
00:40:48.380
It's totally irrelevant. All that matters is, what are they going to do? So this whole Biden thing,
00:40:52.800
they're trying to paint him as some creepy dude or whatever.
00:40:58.280
Doesn't matter. And nobody's made a charge. That is a total distraction. Yes. It is a distraction
00:41:03.780
from what we need to be taught. What is he going to do? What are his policy positions? Does he have
00:41:09.540
sufficient evidence for those? Let's take a look at those. Anything else? We're not doing the
00:41:14.680
republic justice. We are, we are cakes and circuses right now. Totally. Absolutely. Cakes and circuses.
00:41:22.540
There's there. And this is what really concerns me. And I, I'd like to talk to you guys about,
00:41:27.780
I don't know the deep thinkers. I don't know who's out there. I don't know who even in Silicon Valley
00:41:34.160
who's, we are just taking a cell phone and then putting our thumbprint on it where we all said,
00:41:41.040
I'm not giving anybody my thumbprint. How dare you have my thumbprint? Now I remember 20 years ago,
00:41:46.500
we're like facial, facial recognition technology in the hands of the government. That's not good.
00:41:51.060
Okay. But it'll open my phone. It'll open my phone. Yeah. We're in a surveillance capitalist
00:41:57.620
system, with people who are in bed with fricking China. Right. And we're talking about
00:42:04.140
Joe Biden. Right. And creepy pictures. Right. Who are the deep thinkers? And what are the questions we
00:42:10.300
should be asking ourselves? Cause it's coming whether we like it or not, it's coming. Yeah. Problems are
00:42:17.480
coming. Yeah. But, but the technologies, like, yeah, let me ask you this. My theory is the
00:42:24.120
industrial revolution, 150 years, that kind of change is coming in the next 10. Oh, totally.
00:42:29.680
That's Kurzweil's idea. Totally. Yeah. It's based upon the, uh, Moore's law. Yeah. Yeah. Yeah.
00:42:36.780
That technology, the, the, the world in the next 50 years will be more different than the world in
00:42:43.460
the last hundred. Like it's, it's the changes coming at exponential paces. And the problem is
00:42:48.160
that we don't have a moral infrastructure to deal with that. Like abortion would be a great example
00:42:51.820
of that. Like the, um, amniocentesis tests to detect certain conditions. So we're changing our
00:43:00.600
technology, but our moral infrastructure hasn't been brought up. So now this is probably my own
00:43:05.620
hobby horse, but that's why we need to have the conversation, right? Yes. If you do not have
00:43:10.280
conversations about these things, we know for a fact, the problem will not solve itself.
00:43:15.180
So we need to talk about this, the surveillance state, we need to talk about this. And I would
00:43:19.820
argue that the place that we need to talk about this is the universities, but I've utterly given
00:43:24.240
up on that. I know like, maybe this is where our, I disagree. I think it needs to happen in the public
00:43:30.300
square. Everywhere. But, but, but the problem is when people go to college, they need to see those
00:43:35.700
behaviors modeled for them. Like this is what civil, so I, you know, it needs to return to what
00:43:41.340
it's supposed to be. And I'll give a talk and I'll have an associate professor stand up and start
00:43:46.180
screaming at me. I was with Brett, uh, Brett and Heather and Christina Hoff Sommers. And
00:43:51.320
she stands up and she starts screaming at us in the middle of the talk. I mean, that's the thing is
00:43:56.040
that they're actually teaching them to focus on cakes and circuses. They're teaching them to focus on that. And
00:43:59.820
then, so this is the key thing too, this is an important deliverable.
00:44:04.900
When you ask them, why do you do that? They'll point to the literature that they have made up
00:44:10.880
in the first place. The whole thing is bullshit. And they'll say, well, it's Judith, Judith Butler's
00:44:16.180
performative disruption. We need to disrupt this.
00:44:19.740
The key difference: we asked you, or you actually just said it. Why do you believe
00:44:23.960
in God? It's faith. Okay. So you, however you want to take that, you admit there's this,
00:44:28.540
there's no way to measure it. Yeah. Yeah. They, on the other hand, how do you know this? Well,
00:44:33.560
here's 50 years of scholarly literature that, well, we cooked up. Correct. I know how to cook it
00:44:38.500
up. It's like, I'm good at cooking it up. There's a very famous painting of, I can't remember which
00:44:44.480
battle it is in the Revolutionary War. And it is full of white guys, full of white guys. There's
00:44:49.120
one black guy in it. Oh, uh, and he was the hero of that battle. Okay. And he's standing behind
00:44:56.000
another guy and he's kind of holding him like this. And the, the, the painter who painted it
00:45:02.600
at the time said, this is who this was. He was the hero, blah, blah, blah. Now paper after paper
00:45:09.760
after paper has come out. And all of a sudden he's the slave of that guy who was holding the horse and
00:45:16.520
was shown cowering behind the white guy. No, that's not what the artist said. And the brilliant
00:45:22.920
thing that these folks have done, it's postmodernism, right,
00:45:27.980
is what our friend Brett Weinstein calls idea laundering. So they have this idea, like they
00:45:32.420
have this moral urge and they don't know how to discharge the urge. They don't know what
00:45:37.360
to do. So they get a bunch of other people who have this urge together, who have some kind
00:45:42.080
of deep moral feeling about something and they write a journal or they publish a journal
00:45:46.100
and then they idea launder. They start publishing their, their, I think they're insane ideas,
00:45:52.080
but they start publishing these ideas in journals. And then those journals inform public policy.
00:45:56.920
So when someone says, how do you know trigger warnings work? How do you know safe spaces work? How do you
00:46:00.060
know microaggressions are real? Well, they point to the journal article.
00:46:03.160
How do you know that observing or training men like you train dogs will prevent rape culture?
00:46:06.880
Well, you push it through a journal. Now it's, now it's knowledge.
00:46:10.580
Yeah. You have, you have the same thing. This is how they distorted history. We have,
00:46:15.060
for instance, George Washington. All of the stories written by the guys at the time that knew him,
00:46:21.220
that were next to him. Right. Those have all been erased and new professors come in with new studies.
00:46:27.260
It's their opinion. And they start quoting: the next book quotes that guy and the next book quotes
00:46:32.340
the two guys. And then all of a sudden it's done. And what you're probably looking at there is coming
00:46:37.360
out of what they call critical race theory and critical race theory is openly historically
00:46:41.180
revisionist. What's it supposed to do? It's supposed to show that white power has always
00:46:46.340
been trying to maintain itself. So somehow, no matter what happened, like the civil rights
00:46:50.880
successes, for example, that was white people trying to make themselves look good by giving
00:46:55.980
black people rights. It was a means for white supremacy to maintain itself. So they rewrite
00:47:00.740
history in a sense that always serves the narrative that they're trying to spin. And then if it gets
00:47:05.980
any legs behind it, once it gets published, they teach it as knowledge, and then they
00:47:09.760
assign their papers. Go ahead. So how do we, two things, how do we solve this? You're in trouble
00:47:16.340
with your university. So I'd like to talk about that. And then second, I'm raising two teenagers
00:47:22.960
now. My first two, uh, went, went to university. Um, my next two, I, I'm not comfortable with
00:47:32.680
that. I mean, if you knew what actually went on in the university, you'd really not be
00:47:37.520
comfortable with that. Right. Yeah. But you know, my wife keeps saying, if they don't have the
00:47:42.860
certificate... And she's right. I mean, we're kind of in this, this crunch period where I don't know what
00:47:49.000
to do. So let's first address, how do you change it when you guys were, were exposing academia for
00:47:57.660
what it was, and you got in trouble because you didn't alert academia that you were trying to
00:48:02.600
expose academia. Right. Right. So the question is, how do we, we, we change it? If you want to
00:48:07.880
change academia, uh, I mean, I think we've got a plan actually, and not to tout too much, but of
00:48:14.300
course people know, a lot of people know that there's a documentary film being made about the
00:48:18.400
work we did; it should come out early next year or late this year. Mike Nayna, N-A-Y-N-A, he has a
00:48:23.980
YouTube channel in which he shows, he documents this. So we're, the film is a thing of course,
00:48:29.480
but aside from that, we're working with Mike now to start going in a new direction. And we think
00:48:36.420
that there are three dimensions to what we need to do to fix this. First, we need to continue
00:48:41.560
to expose the problem, let people see it. I mean, in a sense, I feel like we've already detonated a
00:48:47.180
bomb under the dam and the cracks are there. It just hasn't broken yet.
00:48:53.980
I have tried to get people like you to sit at this table for a very long time.
00:49:05.860
No, it's not easy. It's not easy. Even the guys who I'm cheering for, I'm vocally
00:49:13.420
cheering for, and I'm taking bullets for. They won't sit down and do it. So who has, do you have the
00:49:21.980
volume of people that have the courage to break through and say, I don't care who I sit with?
00:49:29.260
It's coming. Okay. I have faith. It's coming. All right, good. I don't have the evidence.
00:49:33.220
Actually, I kind of do. I get a lot of emails of, shh, don't tell anybody, but I fully agree
00:49:37.720
with you. I think the wind is changing. At the same time, the other side is gaining strength.
00:49:44.720
I mean, when Zuckerberg says, you know what? We invite the government in to kind of help.
00:49:50.120
Oh, dear God, help us. That's not good. No, no. So hurry. Continue to expose. Second thing
00:49:58.880
is that we, and this is where our expertise has landed us because we immersed ourselves
00:50:04.420
in this. We explain exactly what we're kind of doing here. How did this happen? How did
00:50:08.120
we get here? And we just keep disseminating that message. And why does it matter? Why
00:50:12.100
does it matter? And then the third thing is we start to articulate a different vision.
00:50:16.300
Right. And so I get asked this question a lot, and I should be doing a show with somebody
00:50:21.780
soon about this, I think, a podcast. And I get this question all the time. I'm a liberal.
00:50:26.940
I care about social justice issues. I'm worried about racism, et cetera. But I think the social
00:50:32.060
justice warriors, as they're called, are nuts. What do I do? We start answering those questions.
00:50:37.280
We start articulating what the founders of the U.S. articulated in the first place about
00:50:43.120
liberalism. We go back to the liberal foundations. Are you a conservative? Okay. But are you also
00:50:48.520
a liberal in the sense of... No, I'm a classical liberal.
00:50:52.020
Do you subscribe to the, as Pete called them, rules of engagement? Okay. Let's
00:50:57.120
rearticulate those and let's talk about why they matter, what they mean, what they do.
00:51:01.140
So to be able to do that, you have to reverse what Roosevelt did to us by taking
00:51:08.000
liberal and changing the meaning of it. And blending it with progressive, yeah.
00:51:12.940
Right. You're a conservative, so they put you on this European left-and-right scale. No,
00:51:18.380
what made America different is we said no to that scale. Yeah, exactly. We're on a different scale.
00:51:24.820
We are on this freedom scale. And that is not even considered. So when-
00:51:31.820
It is. Like, I'm a conservative. I'm a constitutionalist. I would love to live next door to you and we
00:51:38.160
would be best friends. And on Sundays, I might meet you for a barbecue after I go to church and
00:51:44.320
you don't go to church, and we're going to be fine. That's real liberalism.
00:51:49.180
I'm a liberal atheist in the South. That's like all my friends.
00:51:57.500
We're losing it. And I'm not even concerned anymore, I'm actually worried at this point.
00:52:05.540
It's really interesting, because I spend a lot of time with conservatives, because I live in
00:52:10.140
the South. A lot of my friends are conservatives and libertarian conservatives, classical liberals
00:52:14.020
at heart with conservative views. And I am actually encouraged by what I'm seeing there.
00:52:20.900
Again and again, I hear the same thing and I don't know what the reasons are. Maybe it's just
00:52:24.320
cause Trump's in power and all of this. But I do know that I keep hearing again and again,
00:52:29.300
I'm tired of all the fighting. I'm tired of it being, you know, daggers against daggers. I'm
00:52:35.180
tired of it being that I can't be your friend because our politics differ. Let's go back to
00:52:39.600
what Jefferson said, where matters of religion, politics, and philosophy don't separate friends.
00:52:44.780
And I hear this so consistently from conservatives that I do have hope that there is at least a
00:52:49.880
sea change going on. There is. I do think that, because I did not experience that, living as a
00:52:54.840
liberal in the South for the last decade. It's new to see this as the main voice I'm hearing.
00:53:00.900
There's a hunger to have an adult conversation with people who have a different
00:53:05.380
view without being called a racist or a bigot or a homophobe.
00:53:08.840
Because what I see is people reaching across the table with an open hand.
00:53:11.820
And some people on the other side are going to slap it, but other people are going to take it.
00:53:15.080
And the more people who take that hand, whether it's a liberal reaching to a conservative or
00:53:19.340
there's a conservative reaching to a liberal, the more people who take that hand, the faster
00:53:23.500
And you probably find you have far more in common. One has far more in common if they're
00:53:27.360
conservative with a liberal. And part of the reason is, I think, I was telling Jim, I think
00:53:33.420
last night at dinner, it's really weird. Like, here we are, two liberal atheists. We're on your
00:53:37.740
show. We're hanging out. I'm having a good time. Your staff was fantastic to me. It's really
00:53:42.440
interesting. It's almost like there were two tribes that...
00:53:49.260
Do you remember how we felt after the fall of the Soviet Union?
00:53:53.900
Remember? And we all went, these people are just like us.
00:53:56.740
I thought they were behind the wall plotting our death.
00:53:59.840
And they thought the same thing about us. And all of a sudden it was like, oh, it was
00:54:04.440
the leadership. You know, the leadership on both sides making us feel like you were this
00:54:11.240
great enemy. No, no, no. It was the systems warring and we were pawns.
00:54:16.300
Yeah. Yeah. I think that's right. I'll finish the thought. So we're sitting here, we have fundamentally
00:54:25.040
different views about things. You're not calling the university, telling them I beat my family,
00:54:31.940
which is what people do to me. You're not calling the university, telling them that I'm
00:54:35.380
a rapist, which is what people do to me. You know, when Jerry Coyne or Dawkins
00:54:40.260
or whoever had a difference of opinion with creationists about the age of the earth or
00:54:45.120
speciation, whatever it was, they didn't call for this. Bruce Gilley
00:54:49.840
from my university wrote a piece called The Case for Colonialism. They wanted his PhD revoked.
00:54:54.660
The journal editor had to retract it because he received death threats.
00:55:01.200
They wanted to take Gilley's PhD from Princeton. So think about this.
00:55:05.900
So you're sitting there, I'm sitting there. There was an intense tribalism up until
00:55:11.940
about three years ago. And then all of a sudden the aliens came down. The crazy, crazy aliens
00:55:18.640
came down, the intersectional maniacs on the left. And they're just randomly torturing everybody.
00:55:23.820
They're imprisoning people. They're not engaging in the rules of engagement. They're not sitting,
00:55:28.920
they don't value civility. They don't value discourse. They don't value dialogue. I don't know.
00:55:33.740
They're out. It is. It's scaring my tribe. It is scaring my tribe. I mean, my listeners have been
00:55:43.300
with me for a very long time, you know, a lot of them since 9-11 and, and we've gone through an
00:55:49.400
awful lot of stuff, and we've been wrong on things and right on some things,
00:55:53.580
but we've seen this coming for a long time. And it wasn't Obama. I mean, when I was on Fox,
00:56:00.420
I was like: Democrats, please don't do this, because the pendulum is going to swing
00:56:07.520
just as hard. And now we have people throwing that pendulum the other way. The next guy,
00:56:13.180
I'm worried about the next guy. So, right. So we've kind of been through this, but I'm having,
00:56:19.700
for the very first time, people come up to me and say, Glenn, I am terrified because this is happening.
00:56:29.080
It's like you said, aliens. I said it at Fox: at some point, they want to tell you they're
00:56:35.580
Marxist. They want to tell you you're wrong. And at some point they'll take the masks off and
00:56:41.820
say, yeah, I do believe I should be in charge, and we've got this. I think that's what's happening.
00:56:49.020
Some of these people, it's like a culture of death. It just is not anything the average American
00:56:56.220
recognizes. It's the absolute denigration of truth. How are we going to solve our problems?
00:57:00.620
So one commonality among these extremists is biology denialism.
00:57:06.240
All right. I totally believe in trans rights, a hundred percent, but that doesn't mean that I need
00:57:10.840
to deny biology. That doesn't mean that I need to make up my own canon of literature and basically
00:57:17.720
make stuff up and say that everybody who disagrees is morally defunct. And people are good. I can't
00:57:24.040
speak for other people around the world, but I think we're all the same. I'm worried
00:57:32.300
Right. And I don't know of a person who says, oh, he's wearing a dress, I think they should...
00:57:43.980
Yeah. And just as an aside, parenthetically, these people are utterly obsessed with other
00:57:49.660
people's sex lives, utterly obsessed. That's why our paper about sex toys going in the butt
00:57:54.280
to change people's political views was a shoo-in. Thank you for that. He at least asked,
00:57:59.720
and, I swear, sex toys in the butt? Why not? Just throw it out.
00:58:03.540
No swearing. Remediate your transphobia. That's right. Tell me the difference
00:58:13.100
between postmodernism, you know, Marxism, socialism. Well, this is a great time to have
00:58:22.400
Helen. I was going to say, this is Helen's area. So socialism, as far as I understand it, is an
00:58:26.240
economic system that was born out of Marxist philosophy. So socialism is ultimately an economic
00:58:31.060
policy where the means of production are owned by the government, ultimately, as contrasted
00:58:39.320
with communism, where it should be ideally owned by everybody equally. Right. By the
00:58:44.400
people, the commune. Right. Right. But somebody has to manage the commune. Correct.
00:58:48.440
So stuff breaks down. Marxism is a philosophy that was ultimately looking at the winners
00:58:55.060
and losers of capitalist society and saying that the fact that it generates winners and
00:58:58.420
losers, in particular losers, is not fair. And therefore, it needs to be overthrown.
00:59:03.960
And I mean, in its kind of simplest brass tacks, that's really what Marx was getting at. And
00:59:10.080
that it was unstable and would eventually stimulate its own revolution is what he was pointing
00:59:14.860
at. What you have with postmodernism is something completely different. Postmodernism, in the general
00:59:21.320
sense, was a rejection of the idea that these grand, sweeping explanations, like Marxism, like science, like
00:59:28.960
Christianity, that any such big story could tell the truth. And in fact, this
00:59:37.120
got more and more what they call deconstructive. Take apart the big story. See where it fails. See
00:59:41.820
where it's problematic, where it doesn't work. Take it apart. Break it down until there's nothing
00:59:46.580
left. And also then use it and put a new set of... That came later. Okay, that came later.
00:59:53.640
That came later. And so postmodernism, in its first place, was just this skepticism that these
00:59:58.360
big stories we told ourselves through the modern era and the pre-modern era, which would be
01:00:02.440
religion, you know, kind of the middle ages. Grand narratives. Yeah, these grand narratives.
01:00:06.160
We should just be totally, and they say skeptical, but cynically skeptical of them to the point
01:00:11.320
where we just break them down entirely. Then in the 80s and 90s, people who were steeped in
01:00:16.340
postmodern scholarship started to realize you can't really achieve anything if all you're doing
01:00:21.920
is breaking it down. Helen calls that period the high deconstructive phase. And then in the late
01:00:26.960
80s going into the 90s, something new happened. We now call that applied postmodernism or even
01:00:32.160
grievance studies. And those are kind of synonyms that we've used. Grievance studies sticks better.
01:00:37.800
People kind of get a feel for it. So applied postmodernism took the view that we can't
01:00:43.580
deconstruct the idea of truth and be purely subjective because if we say that nothing is
01:00:48.920
true, we can't do anything. So they decided that two things, precisely two things are true.
01:00:53.940
And one of those things is that there is oppression that's based on power dynamics
01:00:59.040
rooted in society. And the other is that it is intrinsically tied to identity. So your
01:01:06.100
identity. Not like Marx's, you know, bourgeoisie versus the proletariat, rich versus
01:01:12.140
poor, if you will, or owners versus workers. Now it's people with privilege have the power and the
01:01:19.460
privilege that run society and everybody else is a loser. And so you see this parallel that came up
01:01:25.540
in the 80s and 90s with what we've called applied postmodernism, that language and identity and
01:01:32.720
representation all modulate what we can know about society. But the one fundamental thing that's true
01:01:38.440
is that oppression based on identity exists and is a problem and must be overthrown very much in line
01:01:45.480
with the same kind of thinking that Marx was doing. And of course, these thinkers were informed by
01:01:50.400
Marxism. But being that they're also informed by postmodernism, they're very skeptical of Marxism. And as we've
01:01:55.440
had many Marxists reach out to us thinking we're great, I think it's been confirmed that Marxism
01:02:01.240
actually sees this social justice stuff as an attempt by the bourgeois left to steal the
01:02:09.340
left away from the working class and create a new elite and a new bourgeoisie that's
01:02:14.140
separate. So would it be better to say a Leninist, as opposed to a Marxist, is more of a danger to
01:02:24.340
freedom, to free thought? I don't know. I don't know enough about it to speak to it. Yeah, not sure
01:02:30.280
exactly what the specifics are on that. Helen is our expert on that. We each have our
01:02:35.000
expertise. Ultimately, the most important thing here is that you do have this idea that certain people
01:02:41.100
are oppressed. And by virtue of their oppression, they have special knowledge, right? And they also
01:02:46.540
have a right to try to overthrow whatever is oppressing them. So meanwhile, the people who have
01:02:52.900
power and privilege, and this is key, this is the most key point, always, whether intentionally or not,
01:02:59.340
always work to maintain their power and privilege, in everything they do. Privilege-preserving epistemic pushback is
01:03:04.860
one of the things it's called. That's what our founders knew. That's why they created all the systems that
01:03:10.960
they did, that are now being either ignored or dismantled, to minimize the impact of that. And so
01:03:17.840
that's the difference between equality and equity. I get all these emails from the Portland public schools
01:03:23.600
where they always use the word equity. Have you heard the word equity? It's thrown around everywhere now, not just
01:03:28.100
in the realm of finance. And it doesn't mean what you think it means. Well, I don't know. Maybe you
01:03:32.680
know what it means. But people make terms; they either smuggle in changes to the
01:03:39.620
meaning of words like racism, or they smuggle in new words that have other meanings. And
01:03:45.620
if you just ask someone on the street, well, you know, do you want to be equitable to people?
01:03:49.260
Well, sure. It sounds pretty good to me. Equitable is a positive word. I want to be,
01:03:53.200
who doesn't want to be equitable? Who doesn't want diversity? I don't want equity. But the problem is
01:03:57.580
that it means that you have to address past injustices. And by definition, that can't be
01:04:04.960
equal. Yeah. Equity means adjusting shares. It's equality of outcome, including, as a
01:04:12.040
mathematician I would say, integrated over history. So that's why, when they say, oh, you
01:04:17.540
know, women don't have it fair. And then you point out some statistic where it's like, oh, well, women
01:04:22.200
actually have 70% representation there. They're actually dominant in that sense now,
01:04:26.320
demographically. They'll say, oh, but historically they weren't. Right. So equity means make up that
01:04:31.960
injustice, right? Yeah. So it's adjusting shares in order to make things equal. So this is one
01:04:39.500
of the big blocks in our way. Right. Because, I mean, Abraham Lincoln, great guy:
01:04:51.680
halfway through the war we're losing every battle. And he's like, okay, what do I do? And he called the
01:05:00.180
country to a day of humiliation, you know, hey, let's recognize what we've done here. And he,
01:05:08.780
at that point said, this is about slavery. It's not about the union. It is about slavery. And if we
01:05:15.540
need to heap all of the treasure up and we lose it all, we lose it all. But this is right.
01:05:21.060
And that's when we started to win the war, for the North. He was shot right after.
01:05:29.880
You know, we win the war and he was shot right after he says, with malice toward none, with charity for all.
01:05:39.840
We have the same thing happen. It just festers, then gets worse again. We have Martin Luther King
01:05:46.300
shot, Malcolm X shot, RFK shot. And the response both times from Americans was the same. We took
01:05:56.020
care of that. We spent enough blood. We took care of that. Okay. We freed the slaves,
01:06:00.860
so we don't need to deal with it. The next time, we went through the sixties and those guys
01:06:06.480
were shot and we have the civil rights movement. So we never sat down and just bled, you know what I
01:06:12.820
mean? And that's a very human thing. We have to bleed as a nation, but nobody's willing to bleed
01:06:19.200
because you're going to take stuff. You're going to now make me pay for things I didn't do.
01:06:27.380
So we stopped talking to each other because we have all these roadblocks where we have to talk to
01:06:35.480
each other. We have to. So how do we get there? Yeah. I was thinking about what you said. If you wake up
01:06:41.740
Americans, you know, in the middle of the night, what would they say? And I was thinking when you
01:06:46.860
were telling that story of a few things about how important it is to be across from a sincere,
01:06:53.620
earnest person and how, how much of a difference that makes in the quality of dialogue and honest
01:07:01.700
questioning over blindfolded fear and social reward for people who change their mind, social reward,
01:07:11.300
wow, he changed. He said he didn't know. Wow. Fantastic. We laud that. We don't go on social
01:07:16.520
media and call the guy a moron for the next 10 months and meme him out. How do we get back to
01:07:22.820
the idea that we need spaces where we broker honest conversation
01:07:30.860
among people who have substantive disagreements and why do we need to do that? And what are our
01:07:35.640
common values? I think the thing we got to do is drop blame. Blame's cheap. Blame is easy.
01:07:41.160
You guys, you conservatives, you dah, dah, dah, you know, whatever. It's easy. I don't care.
01:07:45.860
The thing is, I care. We've got to be able to look at it and say, you know, it's like,
01:07:50.300
you don't make a decision when an event is happening. That's the worst time to make a
01:07:59.460
decision. I took my kids to Auschwitz. I believe, you know, that what I say is coming. And I know that
01:08:05.660
persecution of somebody is very possible. Six or seven years ago,
01:08:11.820
I took my kids to Auschwitz and I said, this is the day we decide who we are. There's no problems.
01:08:17.160
Totally. Today is the day we decide who we are. And I get hammered for bringing up the seeds.
01:08:24.280
These were the seeds that the National Socialists were planting. These were the seeds that were
01:08:28.740
planted 30 years before Hitler. Who hammered you? People... who hammered you? The left, and Jewish
01:08:36.720
organizations, very left Jewish organizations. And they hammered you because... why? You're bringing
01:08:41.980
up Nazism. How dare you bring up Nazism? And my point was, what does "never forget" mean if you
01:08:49.780
can't talk about it? I'm not saying you're a Nazi. I'm saying this kind of thinking is planting the seed,
01:08:56.880
and in the wrong soil, that seed will grow. You know what I mean? So you get the wrong...
01:09:06.700
So I'll throw something out. What do you think about this? Do you think that, extending
01:09:11.880
that metaphor onward, the wrong soil means not being able to talk about our problems? Yeah.
01:09:18.920
And enmity. Yeah. Hatred. Hatred. You are my enemy and
01:09:29.020
I'm going to win. Right. That soil. Yeah. You start planting certain seeds in there and you're growing
01:09:35.640
blame. I'll throw one out: blame. Yeah. Blame. These people did this to us. Correct. These people are
01:09:41.120
doing this. And it was the Jews that did this. It was those bankers that did this. Now you can
01:09:45.960
round up anybody you want. Yep. I'll throw out something to you
01:09:50.780
that I think is right. I'm not sure, but I think part of the problem
01:09:56.780
was we're so polarized. I don't think debates are doing us any good. I remember years ago,
01:10:02.480
I saw Jon Stewart on Crossfire and I didn't get what he was talking about, but I think, slowly
01:10:06.800
over time, we need to move towards the conversational model. Everybody is so interested
01:10:12.400
in winning, but what are you winning? Yeah. How do you win a conversation?
01:10:15.640
Yeah. No, no. You think you're winning, but you're actually losing, right?
01:10:20.840
This is honestly why I started here with nothing in this giant room, nothing,
01:10:28.060
just a table and a conversation. And I want to have thinking conversations where somebody
01:10:33.740
goes, no, I don't know if that's quite right. I mean, you know what I mean? Where you're allowed
01:10:39.400
to think, where you're allowed to say something that maybe you haven't really formulated entirely,
01:10:45.620
where you're being a little risky. Yeah. The thing is,
01:10:50.820
that's an attitude, right? Yeah. That's an attitude. And how do
01:10:56.280
we get people to adopt that attitude? I don't know if you'd market it as cool or I don't know
01:11:00.480
how you do it. It is marketed as cool. We all saw Merchants of Cool and how MTV manufactured...
01:11:08.160
So I remember reading Das Kapital and thinking, and I'm sorry, I'm not the brains you guys are,
01:11:15.800
but I read that and I went, that's nuts. I just can't make heads or tails of most
01:11:21.100
of that. And I thought, yeah, this is cool, but Jefferson is not. Why? Yeah. Because it
01:11:32.040
was underground. Yeah. And I remember thinking 20 years ago, someday Jefferson's going to be
01:11:38.780
underground and it will just become cool because it's the forbidden thought. That was 2015.
01:11:46.620
Since I'm tracking dates for us and kind of making them up as I go. It's true though. I mean,
01:11:51.640
they started tearing down statues of him. So it was around 2015. Jefferson's gone underground.
01:11:56.800
Everybody should go read some biographies and read some of his letters. They're good.
01:12:21.480
Let me go to the future. Facebook, Google,
01:12:28.800
they terrify me because I know China 2025 and China 2020. They terrify you. Why?
01:12:36.500
They terrify me. They both excite me and terrify me. I am both. I think the future
01:12:43.460
is the brightest or the darkest in all of mankind's history. And we're just, hey, let's go. We're not thinking
01:12:51.940
this through yet, between AI, AGI, ASI. Yeah, we wrote a paper about that too. I love you guys.
01:13:01.200
We said that if we were to keep on the path we're on, we're making AI, AGI in particular, into
01:13:09.020
a masculine force. And that's why it'll destroy the world. And the solution to that is to make it
01:13:14.000
an irrational feminist, and that will solve all of our problems. It was just... we got busted by
01:13:19.380
the Wall Street Journal before the paper came out, but it was a shoo-in. It was a shoo-in. They gave
01:13:23.100
us no editorial remarks on it. They thought it was a great idea. Not to interrupt. Okay. Anyway,
01:13:29.720
but go ahead. So we are playing with things that people may not even hear about, but
01:13:38.800
around the world they're just like, they want to be first, and they're willing to play with things that
01:13:45.300
are an alien life force. Oh yeah. Just think: fakes, facial recognition, and
01:13:51.380
all of that's in tracking. Yeah. With social scores. We're doing social scores right
01:13:57.540
now. We're doing them here. You know what I mean? Just in a different way,
01:14:02.100
but we're doing the same thing. We're Brave New World. They're 1984. And that's a good way to put
01:14:08.420
it. And no one is thinking this way. We have politicians who are acting like it's 1955.
01:14:17.020
I mean, I talked to them about AI and they're like, well, maybe we should look at
01:14:22.100
some laws. Are you kidding me? By the time you guys do anything, it's way beyond that. Um, so
01:14:29.000
they're acting like it's 1955 and they're talking about, we're going to bring jobs back. No, you're
01:14:33.620
not. No, you're not because you're trying to get the unemployment rate to zero and Silicon Valley
01:14:40.400
is trying to get it to 100%. One of you guys is going to win and I don't think it's you.
01:14:46.400
So at some point, Washington realizes they either have to turn on Silicon Valley and blame all this
01:14:54.160
job loss on those evil guys, with the Frankenstein kind of stuff, or Silicon Valley, which I would
01:15:02.580
count on being smart enough to say, if we get in bed with them, we can partner, because then they can
01:15:11.020
control people, we'll have control of markets, et cetera, et cetera. It is a nightmare waiting to
01:15:19.320
happen. So what are you proposing? Oh, I'm not necessarily proposing anything. I don't know if
01:15:24.560
I'm smart enough. It's a problem out of many people's depth. Yeah. I mean,
01:15:31.200
I am a libertarian. I don't like the idea of breaking up companies. I'm very
01:15:40.540
much into AGI and ASI and I feel like we need a Manhattan project, but I don't trust it with the
01:15:48.240
government and I don't trust it with Google. I don't trust it with really anybody. And
01:15:54.020
I think DARPA has the right idea, if this is what they're really doing, which is: let's be
01:16:01.320
second, but let's take our time to get, you know, AGI that likes man. Now I don't
01:16:10.160
know how you can. I mean, we'll be a fly to it, you know; just our day will drive an AI or ASI insane.
01:16:18.100
An earthworm. Yeah. Right. So I don't know what to do. I just think we should be having a
01:16:26.300
conversation, and a realistic conversation, not where you take Stephen Hawking's words, I think,
01:16:32.480
out of context. He wasn't saying that there won't be any humans on the planet. He was
01:16:39.580
saying you're going to merge with machines. So Homo sapiens, as you know it, won't be around.
01:16:46.600
That's what he was saying. We should have those kinds of conversations. So the people in the middle
01:16:53.360
of the country and around the world start to get an idea. Maybe we should stop talking about Joe Biden's
01:17:02.340
pictures and start talking about this because this is coming in the next term or two or three terms.
01:17:11.360
Yeah. Great. So what do we do? That's way beyond my area of expertise. I have not even a
01:17:18.780
remote clue about what we would do about this. You know Ray Kurzweil? No, I've read all his stuff.
01:17:27.640
I don't know him. I know him. I've interviewed him a couple of times. He is both exhilarating
01:17:33.640
and terrifying, because he believes that man is just a collection of
01:17:44.240
thoughts and patterns. Right. So once you can duplicate that pattern, when you take the ghost
01:17:49.680
out of the machine, all that's left is the machine. Right. So he believes by 2030,
01:17:56.660
you know, whatever the exact time, but he believes by 2030, I can copy you. And so I don't
01:18:02.940
have to worry about, you don't have to worry about death. Right. He told me one time, this
01:18:06.260
is 2004 maybe. He said, Glenn, you just have to stay alive until 2030. Yeah. He's
01:18:12.640
overestimated those timelines. A few years ago, Moore's law fell off the rails, so
01:18:17.660
it's no longer a law; it was just operative for a period of time. But the principle
01:18:22.040
is that for the singularity, it just extends that range out. So instead of 2030, it's 2040.
01:18:28.400
Yeah. So, I mean, the more time we have, the better. Right. But we're
01:18:34.960
in a society right now where it used to be we were arguing abortion in the womb.
01:18:43.580
Mm-hmm. Now we have a baby. Now we're into Peter Singer territory. Right. Now we have
01:18:48.440
a baby. Mm-hmm. Do I let that live or die? We've seen that play out before. That's exactly
01:18:53.860
what Peter Singer's argument is. And that has been considered crazy for 30 years. Yeah. And
01:18:59.220
also, they don't like him on college campuses either. What? So, what's the saying,
01:19:06.000
if it's good for the goose, it's good for the gander. Like, either we're letting everybody on,
01:19:09.820
or we just say, you know what, new policy, no external speakers. But the point is that
01:19:14.500
it's not just people like you and me that they don't want on campus. Right. So
01:19:20.300
we're now entering a territory where a doctor can look at a baby outside of the womb and
01:19:28.580
say, eh, well, mom didn't want it, so I don't have to give it anything. You know,
01:19:34.940
it could just be neglected and die. So there are a few things
01:19:40.400
operative here. One is what to do about these encroaching technologies and, you know, there's
01:19:45.780
no constitutional right to privacy. And how do we navigate that technologically and politically?
01:19:51.740
I'm the wrong guest for that. I don't think you're the right guest for that either. I have
01:19:55.660
absolutely no idea, I don't. Can you? But you can speak to the ethics. Yeah. You see the point I'm making
01:20:01.840
from "I can copy you." Yeah. So if you have cancer and it's really expensive, you know what? We're
01:20:07.300
just going to put you down, but we're going to download you because you're going to live forever
01:20:10.500
right here. The body is too expensive. That's okay. To that point: when we don't
01:20:19.360
have some sort of sanctity of life at this point, with the coming technology, shouldn't we be
01:20:28.160
having these ethical questions right now? Yeah, we should be having those conversations.
01:20:33.780
We need to develop a moral infrastructure. We need to get the diversity of voices again,
01:20:38.640
and we need to figure out what the best arguments are against the position. So I'll bring it back
01:20:46.380
around to something that I've been thinking about. Part of that is, and I think you share my belief
01:20:52.780
on this, that we should be able to rationally derive our values. And we have a whole bunch of
01:20:58.620
folks in the academy thinking that there is no rational derivation of values.
01:21:04.680
There are these immutable starting points that have nothing to do with the power dynamics and
01:21:09.240
race and oppression variables that Jim was mentioning. So I think the larger picture of this
01:21:17.500
is if we don't like, for example, you know, Jordan Peterson's pronoun thing and the Lindsay Shepherd
01:21:23.540
case where she wasn't even allowed to present the other side, we need to teach our kids that they
01:21:29.820
need to hear the best arguments from people on the other side. So we make them, as Jonathan Haidt and
01:21:36.240
the Heterodox Academy and Greg Lukianoff have talked about, resilient, right? We make them
01:21:41.800
resilient to these ideas. And so they don't crumble. So right now we really do have a type of
01:21:47.300
epistemological fragility. People are completely fragile and they fall apart once you start looking
01:21:52.740
at their epistemology. How do you know that? Why do you know that? Again, as you said, and you're
01:21:58.240
absolutely correct, there were no questions allowed in this framework, right? And we're lazy. We don't
01:22:03.200
ask questions. And I think we don't ask deep questions because we're afraid of the answer.
01:22:07.060
And so we need to create values; we need to create systems in which people value these things.
01:22:14.100
They value intellectual engagement. They value emotional resilience. They value talking to
01:22:20.400
someone across the aisle. They value a friendship. And if all of your friends believe the same things,
01:22:25.680
man, you need to get a new set of friends. Yes. Yeah. Yes. So, I mean, obviously
01:22:32.040
we can talk like this and we can try to reach people in the culture and hopefully something will
01:22:35.780
happen. But I mean, from what we've seen in our work, it's utterly critical that we do something
01:22:42.800
to deal with the problem that this ideology has taken over education and is doing so at every
01:22:48.560
level. How can you best foster these kinds of values and attitudes? I mean, in the past,
01:22:56.720
that has been the proper role of the best of religion. Not to say that religion, I mean,
01:23:03.680
religion goes off, like everything goes way off the rails. Sure. But in its proper role where it's not
01:23:10.400
bigoted, it's not hateful, it is teaching you to love one another. It's teaching you to be charitable
01:23:16.080
and decent, and don't lie and don't steal, and blah, blah, blah. That is your house of values.
01:23:23.580
That's gone. So where do you see those values coming from? Yeah. So the only reason people believe
01:23:31.520
in the new religion is because they stopped believing in the old, right? It's Nietzsche.
01:23:35.520
It's, kind of, I was thinking about Game of Thrones, but yeah,
01:23:41.200
now you have the new gods and the old gods. And so Jim and I have written extensively about how
01:23:47.640
there's a new religion, and this new religion... I don't know if it's a religion or a
01:23:52.160
worldview or a cult. It's almost a religion. It is a faith tradition, or not a tradition, really.
01:23:56.760
It is a faith system, for certain. Social justice is a faith system at this point,
01:24:02.440
but they won't admit it because they don't have to, because their canon looks like knowledge.
01:24:06.740
They aren't pointing to scripture. They're pointing to... but it is their scripture.
01:24:12.180
We have the parallels that you can speak to: privilege being original sin,
01:24:18.380
Yeah. Privilege also being depravity. It corrupts you totally and makes it so that you can't do
01:24:23.740
anything. In depravity in the religious or Calvinist sense, you are depraved in
01:24:29.800
the sense that you seek to sin, and here it's your privilege: you seek to maintain your
01:24:34.280
privilege. It's a perfect parallel concept, and they go all the way down. Wokeness is being born
01:24:40.400
again and you can just go down the list. Have you written this yet? Yeah. Yeah. I wrote it just
01:24:44.840
before Christmas. Areo magazine. Helen Pluckrose is our third contributor. It's in Areo magazine.
01:24:49.500
It's 15,000 words. So enjoy your time. Oh, I will. It's a solid hour. And I think so. Part of this is,
01:24:55.620
you know, we hosted the James Damore event at Portland State University, and it was going to be
01:24:59.280
James Damore and myself. And we invited the women's studies department on stage. They said no.
01:25:03.420
Two days later, James and Helen Pluckrose and I did an event at Portland State, and we invited the
01:25:08.760
women's studies department again; nothing. We have consistently invited people to have conversations with us.
01:25:14.720
And it's incredibly difficult to get anyone to have a conversation with us. That's why it's
01:25:20.160
so interesting to me that the people on the right have been so welcoming to us.
01:25:25.620
Right. I mean, I've never lied to anybody. You know, I'm an atheist. You know, I'm a liberal. I've
01:25:29.320
never lied to anybody, but you've never lied to anybody about what you believe. And I've been
01:25:34.620
totally taken aback by how welcoming people are. Because perhaps you have bought into
01:25:42.580
the narrative that the right, and some on the right are this way, is a progressive right,
01:25:51.420
a big-government, big-control right, you know, that there are those Christians who are like,
01:25:56.700
my way or the highway, good, we can get everybody baptized, or whatever. There is
01:26:01.460
that sliver, but the right generally, the strength still is this constitutionalism: I don't hate my
01:26:09.800
neighbor, I don't mind, I want to work together. We're here because we see this vision
01:26:15.460
that people can do something great with their life. That's different than mine. Right. Right.
01:26:21.880
That's... it's a small group of people, but I think it's actually getting bigger, because
01:26:29.040
it's in the American DNA. I think it's in the American DNA. Yeah. And that sliver is the same
01:26:36.120
as the social justice sliver. Yeah. So they're the same. Yes. In the sense that they feel like they
01:26:41.120
have some special access to truth that everybody has to get on board with. Yes. And this has all
01:26:45.680
been described in the literature about authoritarianism. You get to a certain point
01:26:48.880
of conviction and certainty in your views. And then it's called conventionalism. That's the name of
01:26:54.580
the phenomenon, where you believe that your views are conventional for you and must be for
01:26:59.420
everyone else. And so you start to try to impose those on other people and claim special knowledge
01:27:03.340
to do it. And we've already eaten up 90 minutes, and I could talk to you guys for a long time.
01:27:13.040
We really appreciate the fact that, and I say this with total sincerity, that you are
01:27:20.140
a sincere broker of conversation. I appreciate you having us on, even though, you know, we
01:27:26.340
have differences of opinion and that's fantastic. Well, I mean, you don't understand how, I mean,
01:27:31.420
you're making a big deal out of this. Is this not happening anywhere?
01:27:34.760
Because nobody... not in our lives, not in our lives. I mean, since we've come out, people
01:27:43.380
on, quote, whatever our side should be, aren't inviting us on their shows. They're not talking
01:27:48.320
to us. Two left-wing outlets since October. They're heaping derision on us.
01:27:48.320
Like, you know, so when we did the atheist thing, everyone was like, Oh, you know, you guys are just
01:27:52.540
liberals or whatever. Well, they were right. But now that we've done this and that we've attacked
01:27:57.660
kind of our own tribe or our own side. And the reason is, even though I share a lot of those impulses,
01:28:03.460
that doesn't mean you get to make stuff up. That doesn't mean you get to pretend that something
01:28:08.180
is knowledge. Like we really need to have something we can count on, something we can go
01:28:14.880
to, something we can point to, and then we can squabble over public policy. But we need
01:28:19.540
to have things that we can point to and say, Hey, you know what? We know this. This has been,
01:28:23.840
we've come about this. The integrity of this process is intact. You don't have to worry about it.
01:28:29.460
The process needs to be defended. Yeah. The process needs to be defended. And that's the
01:28:33.240
other thing that we've lost. So I really do appreciate you inviting us on, you having a
01:28:39.060
sincere and honest conversation with us. And that's exactly what we need. And we're not having
01:28:44.060
it. Yep. So can I tell you something? Yeah. I feel exactly the same way. Thank you for coming
01:28:49.900
on. I've wanted to have a conversation with Bernie Sanders forever. And the reason
01:28:56.740
why: he's honest about what he is. Yeah. Generally speaking, you know, for a politician. He's been, like,
01:29:02.480
forever: yeah, I'm a socialist. Yeah, I honeymooned in the Soviet Union. I can have
01:29:07.920
a conversation with a guy like that. You're going to go, same thing. You're going to go
01:29:10.980
to hell. Same kind of a thing. Like people are honest about what they believe. They're forthright
01:29:15.320
in their speech. The Greeks called it parrhesia, speaking truth in the face of danger. If you said
01:29:20.600
something... I don't think there should be danger when you're exploring
01:29:26.380
truth. I agree. If you're honest brokers, totally agree. And you know, when you said,
01:29:32.620
you mentioned Dershowitz and Hitler, the first thing I thought was, oh God,
01:29:35.700
you know, 5,000 people are going to say, you know, Glenn loves Hitler now. So,
01:29:39.320
but there is a danger. And the danger is that, you know, I'll be around Portland State
01:29:43.980
University and I'll walk around and I'll see a picture of me with this huge grotesque villain nose
01:29:49.600
with little thought bubbles saying, you know, I'm pro-life, Republican,
01:29:55.600
love Trump. None of those things are actually true. And so the danger is that they
01:30:01.940
attack our motivations. They attack me for things that I don't even believe. There's something that
01:30:07.460
is dangerous also that we let them get away with something we should all be standing up and saying
01:30:11.980
no to, which is let's suppose even that they're a hundred percent morally right. Okay. Let's just
01:30:16.760
pretend that their views are a hundred percent morally right. If they can't actually articulate
01:30:20.480
that, if they have to force it upon us, then they're still wrong. They should be able to
01:30:25.880
articulate it. They must be able to articulate it in a way that's convincing. That's the rules of
01:30:29.920
engagement. Right. After Trump was elected, I said, can we stop now for a second? Because
01:30:36.320
half the country does not like him and they're not going along. And when it flips, the other half
01:30:44.280
will not go along. So we either have to change people's hearts or we just might as well start
01:30:49.520
building gas chambers because you're going to have to liquidate. There will be bad things coming down
01:30:55.600
the pike. Yeah. And none of us want that. And one way to solve that is through dialogue, conversation,
01:31:01.660
reach across the table. Yep. Keep reaching across the table.
01:31:10.080
Just a reminder, I'd love you to rate and subscribe to the podcast and pass this on to a friend.
01:31:31.660
I'll see you next time.