ManoWhisper
Valuetainment - November 12, 2021
Former Portland Professor Details How Woke & Politically Correct Universities Have Become
Episode Stats
Length: 58 minutes
Words per Minute: 179.8
Word Count: 10,502
Sentence Count: 694
Misogynist Sentences: 5
Hate Speech Sentences: 30
Summary
Summaries are generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript is generated with Whisper (turbo).
Misogyny classification is done with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classification is done with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000
My guest today is Dr. Peter Boghossian. We happen to have the same last name. My mother's last name
00:00:05.080
is also Boghossian. This is a very interesting person we're going to be talking to today.
00:00:09.340
He decides to troll academia by writing some hoax papers, which maybe we'll get into. Also
00:00:14.680
wrote a couple books. The book we'll talk about today is called How to Have Impossible Conversations,
00:00:20.140
a very practical guide, necessary skill to be able to learn how to do it today.
00:00:23.480
And September 2021, he resigned from his position at Portland State University, citing harassment
00:00:29.900
and a lack of intellectual freedom. Dr. Peter Boghossian, it's great to have you on.
00:00:36.580
Thank you. And I'm loving the fact that you pronounced my last name correctly.
00:00:53.480
I listened to a couple of the interviews. They said Boghossian, Boghossian. It's, yeah,
00:01:10.500
but the gh, gh, gh is an Armenian pronunciation and it's a little tough to do.
00:01:15.520
Yeah, I tell everybody it rhymes with explosion just to make it easy. Explosion, Boghossian.
00:01:19.620
There you go. There you go. So, Peter, I mean, listen, you're one entertaining, you're witty,
00:01:27.640
you're one of those genius scientist types that knows how to play the games to, you know,
00:01:35.400
mess with the other guy. But before we get into the book and before we get into some of the things
00:01:39.820
that's going on, maybe, maybe why don't we talk about why you recently resigned from your job at
00:01:43.900
Portland State University? Yeah, I was going to say, keep expectations low for the interview.
00:01:48.480
I resigned from my position at Portland State University because I was hired to teach critical
00:01:54.300
thinking and ethics and the university became so woke and so utterly preoccupied with issues of
00:02:01.560
gender and race and sexual orientation that it was, not only was it infused in everything,
00:02:06.840
but unless you toed the party line, there were consequences to that. And so I just,
00:02:12.560
I couldn't maintain my integrity and teach there anymore.
00:02:15.840
And what does that mean? What is toeing the party line? What does that mean in your world?
00:02:20.500
In my world, it means that there are certain conclusions that you have to have about,
00:02:25.600
and by the way, I actually share those conclusions, but certain conclusions that people have to have
00:02:30.180
about, you know, trans bathrooms or, I should say I share many of those conclusions, not all of them,
00:02:36.860
or Donald Trump or whatever the orthodoxy is. They have a moral orthodoxy of things that, that,
00:02:43.880
that they believe diversity, equity, inclusion, microaggressions. I mean, it's a, it's all bundled
00:02:51.780
up together and the university became and is an indoctrination center. And I don't think kids go
00:03:01.680
there and they get any kind of an education that you and I would be familiar with. They go in there
00:03:06.620
and, and they're expected to, I think the, the wording I used was mimic the moral certainty of
00:03:13.020
ideologues. So these people are ideologues. They have moral beliefs that they come into the classroom
00:03:17.700
with. And not only do they teach those, but they test those, those kids on the beliefs and they
00:03:22.620
want it back. So it's really a terrible situation, even if you agree with it. So the thing about it
00:03:30.640
like this, I think the part of the problem is that people get too caught up into right thinking and
00:03:35.080
left thinking and, oh, you know, academia is 90, actually Portland State University, the National
00:03:41.220
Scholars Association found that 99% of the faculty and administrators donated to one political
00:03:48.880
party. Get out of here. Yeah, it's true. 99%? Yeah, you can look it up, the National Association
00:03:54.840
of Scholars and then there's the Oregon Association of Scholars. You, you know what that party is,
00:03:58.580
right? Of course, they're all Trump guys. They're all MAGA guys, right?
00:04:00.960
Yeah. So, so the, so I don't necessarily think that's a problem, but because, you know, people
00:04:10.360
can believe anything they want, but remember, this is a public institution, right? So this
00:04:14.680
is a public institution. Kids are going there. They're never hearing from people who believe
00:04:20.380
other sides of the argument. And it's the whole thing. It's just so disheartening.
00:04:24.620
So for somebody that's not in that world, maybe give us a visual of what that world looks
00:04:29.560
like both for students who are going there with maybe opposing ideas, as well as a professor
00:04:35.700
that's in that world where 99% gives to one party and you're just kind of trying to, and
00:04:42.280
by the way, for, for the audience to know this, you were not supportive of anyone in 2016 on
00:04:48.060
the right as candidates, right? I don't, I don't think you were even, so it's not like you're
00:04:51.320
one that's a MAGA team or anything like that. You're just a professor that's trying to teach
00:04:55.420
there, but maybe give us, maybe give us a visual of what it's like to be in that world
00:04:59.000
today. Sure. But not just 2016. I've never voted for a Republican in my life. So I'm
00:05:04.300
not, that's very important for the audience to know. That's very important for the audience
00:05:07.000
to know that. Yeah. I'm not, not only am I not a right wing maniac. I'm not even a right
00:05:11.780
wing guy, period. And so, you know, I, I believe I'm an atheist as well. Uh, and I, I want to
00:05:17.380
talk about that in the context of your question. So what does it look like? It looks like kids
00:05:23.360
go into the classroom and they never hear the other side of the argument. They never
00:05:28.760
hear an opposing view. And they certainly, if they do, which I doubt, but they never hear
00:05:33.700
it from someone who believes it. Why is that a bad thing? Well, well, well, if, if you don't
00:05:39.620
hear an opposing view, it's an indoctrination mill. You're, you're going there to parrot back
00:05:44.060
or regurgitate. You're not teaching people how to think and then making decisions on their
00:05:48.920
own. Let me, let me give you an example. So I teach a, I taught a class. It feels freeing to
00:05:53.260
say that I taught a class in atheism and I'm, I'm a very out and very outspoken atheist. And I would
00:05:59.120
have people come into my class. So I would tell people exactly what I believe. And then I would have
00:06:04.600
believing Christians who are qualified, come into my class and speak to the kids. And I'll give you
00:06:10.720
an example. Somebody wrote after my resignation letter, somebody wrote, Dr. Phil Smith wrote a
00:06:16.180
letter to the Oregonian. He's a conservative Christian. He teaches at a conservative Christian
00:06:21.360
university here in Portland, Oregon. And I gave him, uh, the whole class provided that he had a Q and
00:06:29.240
A. So he couldn't just give a lecture. So he gave a one hour lecture and answered a one hour Q and A
00:06:34.120
about his best arguments for the existence of God. I didn't say a thing and I let the kids decide.
00:06:40.140
So the students themselves make their decisions. And I viewed my job as not only to be honest,
00:06:46.180
with them, but to bring in the best representatives of the other side. And so he wrote, and I had no
00:06:50.860
idea he was going to do that. He wrote a lovely op-ed to the Oregonian, which is the local paper
00:06:54.900
here. And so what does it look like? It looks like only one view is forwarded. It looks like
00:07:00.960
you can't question certain things that are morally fashionable, like equity. And most people,
00:07:06.620
by the way, they don't even have the slightest clue what equity means or diversity or inclusion
00:07:11.240
or any of these things that have been, that have hijacked. Uh, I mean, these things are now infused
00:07:18.080
in our institutions and most people have no idea what they mean. So, so, you know, let's start off
00:07:23.080
with the basic part. Woke. You hear woke every everywhere. I mean, obviously you and I know what
00:07:27.020
woke means, but how would you define what woke means? All right. It's funny you ask that. I have a
00:07:31.920
video series coming out that translates woke-ish into plain language. And it's a series,
00:07:36.920
they're 60 second videos. Basically woke means to be awakened to the injustice in the world.
00:07:43.400
And the more woke you are, the more you understand that you can never be woke enough
00:07:47.300
because reality is just infused with injustice and oppression everywhere. And so the part of the
00:07:54.840
communication problem, and that's my book, how to have impossible conversations, is that people are
00:07:59.300
speaking past each other, particularly in this context, woke people are using certain words that
00:08:04.820
they don't traffic in the normal meanings of words. When we look them up in the dictionary,
00:08:08.360
like equity would be another, a perfect word. That's totally misunderstood. My kids go to public
00:08:14.860
school here in Oregon or my, my, my daughter, my, my son graduated every email, equity, equity,
00:08:20.700
equity, equity, equity, equity, equity, equity. Do you know what equity means?
00:08:25.040
You're asking me, ask somebody in your, in, in your film crew over there.
00:08:29.380
David, what does equity mean? This is a seven time, but what does equity mean, David?
00:08:33.920
Uh, to have a piece of something. Yeah. I like to, no, no, not equity of a company equity as an
00:08:39.140
equity, uh, uh, uh, in, in, uh, in, in, in society equity. Oh, I mean, that's, I don't know.
00:08:47.860
You wouldn't know. Equity in society to be. Well, that's what they're talking about. It's not
00:08:51.580
equity like you're a piece of the company. All right. So this is wonderful. This illustrates the
00:08:56.060
point. The vast majority of people have no idea what the word equity means. And you know,
00:09:02.360
the guy that you're going to fire after this episode, he's, he's absolutely, he's absolutely,
00:09:07.580
he's absolutely correct. They've changed the meanings of the word. So equity means to make
00:09:16.040
up for, this is Ibram Kendi's definition, uh, or a part of his worldview that feeds into the
00:09:21.640
definition, to make up for past injustices, we need future and present
00:09:28.040
injustices. So if people have been just systematically discriminated against because
00:09:32.300
they're, you know, gay or trans or black, we need to, um, fix that by discriminating against
00:09:41.760
people who don't have those characteristics. The other thing to think about with equity is
00:09:45.720
you want equality of outcome, not equality of opportunity. That's why the governor of New
00:09:50.400
York, for example, has, uh, are AP classes. Yeah. Well, they're, they're getting rid of the
00:09:56.500
talented and gifted programs for kids, because the idea is it's an equity based system. It's not,
00:10:02.660
it's the opposite of equality. But the only reason I mentioned this at all is because one of the
00:10:08.980
reasons that woke has made such inroads is that in, in, you know, when you ask David, most people
00:10:14.860
don't know what these words mean. And so we're basing policies, educational policies for our kids
00:10:20.500
and K to 12 systems for our judiciary, for our media, we're basing these for the ACLU, the Southern
00:10:26.840
Poverty Law Center. All of these things are now a value that suddenly sprung into existence that
00:10:32.700
nobody heard of five years ago. And if they did, it was in the context of finance. Now who's driving
00:10:38.280
this though? Who's, who's driving these initiatives? Who's the mastermind behind this today? Because it
00:10:43.780
seems like it's sudden that this happened the last five, 10 years. Yeah, that's, that's correct.
00:10:48.340
It started in the university system and it leaked out of the university system. I want to say two
00:10:53.300
things. Remember how shocked you were at the 99% of people? Very much so. Yes. Okay. So, so I think
00:11:02.620
the problem is that people get way too caught up in, oh, this is a right left thing or conservatives
00:11:07.340
or Republicans. Forget about that. Forget about that. Let's say, let's say that, that, that the people
00:11:12.280
in there were Mormons. Let's say that 99% of the people at a university were Mormons.
00:11:18.340
Do you think that, that, and it was a liberal arts university as opposed to, you know, STEM or
00:11:25.680
civil engineering or math. Do you think that those kids would get an, as good of an education
00:11:32.200
from that as they would, if there were intellectual diversity, where there are some Mormons and some
00:11:36.960
Muslims and some atheists and some Christians? Of course, you're going to have more, uh, you know,
00:11:42.860
if you have diversity, if you have a, uh, sure. And so part of the, that's part of the problem. Part of
00:11:49.340
the problem is that you're correct. Intellectual diversity benefits our kids. It benefits our
00:11:54.280
democracy. It benefits our society. But when you hear the word diversity, which is a very, very common
00:12:00.840
word now, what they mean is two things. They mean intellectual homogeneity. They mean an environment
00:12:07.840
in which everybody thinks the same and has a certain set of beliefs, but has superficial
00:12:12.460
characteristics. Yeah. But, but I, I guess what I want to know is, do you know who drove these
00:12:18.440
things? Like who is driving these initiatives? Like, and then, and then next, what's the master
00:12:23.300
plan? Because you're not going to be able to push this for too far. There's going to be pushback
00:12:27.500
by people on the opposing side that they're just not going to take it eventually. So who's driving
00:12:32.300
this and why? Okay. So that's a separate question. So we've identified the problem. We've seen how
00:12:38.720
it, that here's what happens. So kids go into the university system. They're taught by people
00:12:45.180
who believe this. These people are true believers, not all of them because they've created a culture
00:12:51.320
of fear. So anybody who speaks up against it is a bigot, is a homophobe, is a racist, is a misogynist,
00:12:57.700
is a Nazi in some cases. And so, so they've, they have jobs for life. It's called tenure.
00:13:04.500
They teach people, you know, basically what are moral conclusions. They test them on it.
00:13:10.980
Three, four, five, six years later, these kids get out, they go to, they get out of the university.
00:13:16.900
They go into the workforce because they have degrees. Then they occupy positions of administration,
00:13:23.020
management, et cetera. And they bring these ideas with them. The pronoun ideas, the safe spaces,
00:13:28.700
the trigger warnings, the microaggressions, the nucleation point where this all starts is the
00:13:34.700
university system. All of it.
00:13:36.500
I get that. But so why, what do you, so let's just say, uh, let's just say, for example, uh, uh,
00:13:43.600
the first Titanic that was shot, the director was a guy that was a Nazi and the producer was Hitler
00:13:49.920
and the hero in the first Titanic was a Nazi, right? And by the way, that's a true story.
00:13:54.520
So when that first Titanic that came out, the messaging for Hitler was to get people to say,
00:13:59.520
what a great hero Nazi is, what a great community they are. They're good people. And then our guard comes
00:14:05.200
down and we're more, you know, receptive. We're more willing to receive information or influence
00:14:09.860
from those guys who's behind this. And what's the outcome to do what? Like, I understand if a
00:14:15.680
person is saying, I want to get everybody to eliminate this. So I become the person they
00:14:21.180
listen to because one day we're going to do X, Y, Z. What's the motive? Okay. So the motive is the
00:14:26.800
idea. And you see this, and this is the other thing we should probably talk about colleges of
00:14:31.280
education. Um, bracket that. We'll come back to that later. The motive is that there is oppression
00:14:38.760
everywhere. There is systematic oppression and the evidence for this, they don't really use evidence
00:14:45.760
or really talk about evidence, but a way for sane, rational people to think about it is, well,
00:14:50.100
why would they believe this? Well, they believe this for a few reasons. First, they are 100% correct
00:14:56.340
in that, that there has been systemic racism up until fairly recently. It has been embedded in
00:15:02.920
systems. African-Americans have been treated horrifically. And, you know, even, uh, uh,
00:15:09.820
you know, there were miscegenation laws in this country where people couldn't marry other people
00:15:14.920
and they were specifically set up for black men and white women. And those, those have gone away.
00:15:20.120
And, and, and even in my lifetime, and I'm 55. So the, the main driver for this is the historical truth
00:15:26.960
that there were, uh, real genuine systemic oppressions and that those oppressions still
00:15:34.080
exist. And because they still exist, we need to overthrow the institutions that allow the perpetuation
00:15:41.840
of systemic injustice. That's why the university is constantly talking about systemic injustice.
00:15:47.560
And that's also, by the way, I don't know if you want to get down this rabbit hole,
00:15:51.460
why they look at the police, they want to defund the police because the police are the things that
00:15:56.960
they're standing in the way of the current society that we have. They're upholding the institutions
00:16:02.180
and the structures that we have. And if you can defund the police, then you can, um, weaken those
00:16:07.860
institutions by allowing people to revolt. And I totally get all of that. Everything you said,
00:16:12.740
I totally get, I'm aware of everything you just said right now. What I'm asking is who's the leader
00:16:17.300
for this and what's the outcome to do what to bring down America? Is it to, you know, eliminate,
00:16:23.660
uh, Westernized thinking? Is it to eliminate capitalism? Is it to what, who is the voice?
00:16:28.280
Who's the person behind it? And what's the outcome? Okay. There is no singular voice. Okay. So you can
00:16:35.020
think about this, like being, uh, you know, in a weird way, like being a Protestant as opposed to being
00:16:39.900
a Catholic. There is no Pope of this stuff. Now there, there may be bishops or there may be very
00:16:47.480
influential voices behind this, but there is no leader that's governing, governing things.
00:16:52.040
The outcome is, so even the fact that you ask what the outcome is means you've thought about it more
00:16:58.480
than people who have lived in this space. The, one of the outcomes is you see these, these zones
00:17:04.160
popping up, like these CHAZes popping up in Seattle and Portland. I really don't think that they've
00:17:10.000
seen, they've thought through what the outcome is when you overthrow Western values and Western
00:17:16.740
civilization and freedom of speech. Freedom of speech is a big one that you have to overthrow
00:17:20.540
freedom of assembly due process. And again, they're looking at the United States as fundamentally racist
00:17:27.100
and oppressive and they want to overthrow it. Now, when you asked me what the outcome is,
00:17:31.220
I can't say, I can give you my own opinion of what the outcome is. You'll, you'll see a new world
00:17:36.320
hegemon. The United States is an empire in decline. That's completely obvious right now,
00:17:41.580
especially after that, which I don't want to talk about, but, but the, uh, the fiasco,
00:17:46.920
the utter catastrophe that was Afghanistan, uh, our, our alliances are weakening. Our economy is
00:17:52.900
struggling. If you look at the trade between other, other countries, the United States in the last 20
00:17:58.420
years, it's shifted primarily to China. So if they think the United States is a bad boogeyman
00:18:02.580
and you should look into the One Belt, One Road policy in China, wait until they see the new,
00:18:07.220
the new hegemon. They're really not going to like that. Who, who's, who's the, uh, who wants
00:18:12.640
to see this happen to America? Who wants to, because you know, a lot of times when you think about this
00:18:16.980
stuff, you, you think about proxy wars, you think about how some of these countries are sitting there
00:18:23.240
saying, listen, the way I'm going to get to America is by pinning X, Y, Z against them. So they're
00:18:28.020
playing those games, but who would like to see America's way of living capitalism, all of that
00:18:34.420
fall? Who would love to see that happen? You're, you're a good interviewer. You're really, these are
00:18:38.360
great questions. Um, my friend Faisal Al Mutar, who heads Ideas Beyond Borders, he's an Iraqi
00:18:44.360
refugee who's come to this country. He's an amazing human being. Um, he was telling me, spending time
00:18:50.540
with him is like dog years. It's like seven to one. Um, he was telling me that the, the, that many
00:18:56.900
places in the Middle East, primarily funded by China and Russia are, have stations dedicated to
00:19:03.980
BLM, dedicated to the divisive madness that's currently overtaking. So basically the enemies
00:19:09.860
of the United States want to see this succeed. They want to see these rebellions succeed. Meanwhile,
00:19:14.540
there are people who, there are countries who are literally putting their own citizens in
00:19:19.840
concentration camps and you don't hear a peep about it from these folks when, you know, you'll
00:19:24.240
hear about a gender imbalance at a conference, but when ISIS takes literal slaves, and we know that
00:19:29.880
we have the videos from the slave bazaars when they take Yazidi women, uh, there's not a peep.
00:19:35.340
There's not a protest. There's, there's nothing. You don't, you don't hear anything about it. So
00:19:38.620
to answer your question directly, there are countries like Russia and China and Iran to a certain
00:19:44.260
extent who are directly funding the BLM and, and the, the accoutrements are the kind of the, um,
00:19:51.420
conceptual drivers for this. And then you have the people on the far left, the woke far left in
00:19:56.440
particular, um, who really want to see the end to what they consider to be an oppressive patriarchal
00:20:02.500
regime. Yeah. It's going to be interesting to see what happens here. By the way, going back to your book,
00:20:07.900
how to have impossible conversations. So when you're dealing in an environment,
00:20:11.140
in a climate where 99% is giving to one party and 1% is given to the opposing party,
00:20:17.020
how do you have those difficult, those impossible conversations with people from, uh, the opposite
00:20:23.020
aisle? I mean, how do you have those conversations? Yeah, that's a, another really good question.
00:20:27.400
It's actually really easy to have those questions that once somebody, when I taught in prisons,
00:20:32.480
I did my dissertation and I taught prison inmates how to think through moral questions. And I pulled
00:20:39.640
from the history of Western philosophy, you know, questions like, what does it mean to be,
00:20:44.640
what is justice? And she has so many questions, you know, can you be unjust towards yourself? And
00:20:51.380
what does it mean to be a good man? When people will talk to you already, even, even if you think
00:20:58.100
the, the, the, the gulf or the divide is so great, the moment someone's talking to you,
00:21:03.340
the, that's, that's great. Um, the, those conversations are more, far more possible than
00:21:10.080
you think, far more possible. The problem is when they won't talk to you. And that's the situation
00:21:16.600
which we have now. These, the folks who participate in this ideology, they don't, um, buy into the norms
00:21:24.820
of civil society. They don't buy into reason, discourse, evidence. And this is really important
00:21:32.460
for your audience to understand. So if, if, if, let's say that you and I want to figure out a
00:21:38.600
problem, let's say we want to figure out, you know, we've heard from Armenians, for example, and,
00:21:44.060
and Armenians are claiming that, uh, they're pulled over by the police at radically disproportionate
00:21:50.820
rates than non-Armenians. You and I would sit down and say, all right, man, we, we got to figure this
00:21:57.820
out. How are we going to do this? And we'd say, okay, well, we're going to look at the, the, the
00:22:02.400
body cams of people. We're going to, every time someone calls in, we're going to see if they have
00:22:07.160
an -ian or a -yan, which, anybody who doesn't know, it's an Armenian last name. And we're going to, and then
00:22:12.780
we're going to, we're going to kind of study this somehow. We're going to figure out, you know,
00:22:16.040
we're going to do a survey data, whatever, however we're going to do it. Okay. That's how
00:22:20.380
sane, rational people go about figuring stuff out. These folks don't buy the traditional
00:22:28.820
tools that we would use to solve problems, reason, evidence, epistemic adequacy, which
00:22:35.160
basically means, you know, knowing what you're talking about. Uh, what they would do is, so
00:22:40.080
they believe this is this, uh, Audre Lorde's, the master's tools cannot disable the master's
00:22:45.640
house. The master's house is the current system we have, and it's patriarchy,
00:22:50.360
racism, oppression. So what built the, this is their thinking, like what built the system? Well,
00:22:57.520
evidence, reason, you know, science to a certain extent, all of these things. So you can't disable
00:23:04.000
the patriarchy. You can't disassemble it through evidence and reason. You have to use something
00:23:09.620
else. Like you have to, you know, rip down statues. You have to tear stuff down. You have to,
00:23:15.920
uh, whatever, whatever the, the, the particular brand of, of lunacy is. You have to defund the
00:23:22.400
police. Um, you, you have to, you have to make it so that the system in place is just, is disabled.
00:23:30.200
Was that clear? You have to do it. So the system in place is disabled. Uh, I, I, I, again, I fully
00:23:37.800
get that. Uh, for me, it, it, uh, keeps going back to the outcome of what you're trying to do.
00:23:44.040
Like, you know, if you look at different empires on how they fell, okay, Iran, how many from the
00:23:48.660
outside was sending tapes in. Okay. Those tapes eventually caught, you know, they, they started
00:23:53.620
creating some momentum. People were, you know, dubbing the tapes, passing it down to other
00:23:57.880
people. And the messaging was the Shah is too rich. Look at the celebration. He put out the 2,500 years.
00:24:04.040
That's your money. If I was running Iran, I would give that back to you. This is not fair.
00:24:09.140
Look how bad the conditions are. Let's revolt. It's worth it. We can take them out.
00:24:13.860
You know, look what Savak is doing to innocent people, et cetera, et cetera. Boom. They take them out.
00:24:18.360
But the outcome was get rid of Shah because they painted Shah to be the puppet to the West.
00:24:24.200
And, you know, you know, nothing's really going to happen for 40 some years later, it's still here.
00:24:28.460
So that's one I'm trying to, I'm trying to get a little bit deeper to see what you have there.
00:24:31.680
Let me see if I can, again, this is complicated. Let me see if I can give you the, the goal,
00:24:37.140
what they want is a utopia. Now they want, so, so this is, okay. So this is, this is the next level
00:24:45.140
of this. So I'm going to try to explain it. It's very complicated. If it's unclear, you tell me
00:24:51.360
it's not, it's my explanation, not your understanding. Part of the assumption going
00:24:58.020
into this whole thing is that they have certain assumptions. They don't like what they call grand
00:25:04.860
narratives. Grand narratives are sweeping explanations to explain things like Christianity
00:25:11.300
is a grand narrative. Biology is a grand narrative. Communism is even a grand narrative. And so
00:25:19.520
at root of this is a biology denialism. That's why you see so many of these folks deny evolution,
00:25:28.460
for example. So when you, if, if you're trying to, okay, this is so complicated. Okay.
00:25:39.880
Every, this is the way they think, every disparity of outcome is racist. The system is responsible for
00:25:48.360
that. And the system is responsible for that because it's inherently racist. That's again,
00:25:52.760
why they want to get rid of the talented and gifted program. So every disparity in outcome is due to a
00:25:59.380
racist system. So it couldn't possibly be to any cultural upbringing, or maybe it is, but it's
00:26:05.560
certainly not due to biology. So if you can change the systems, you can engineer an outcome. You can
00:26:15.420
engineer the outcome that you want to engineer, which in one word to answer your question is utopia,
00:26:22.040
but you can't get to the utopia. If you have the existing systems. And part of the, and part of the
00:26:29.420
reason for that is because they're biology denialists. They deny basic rudiments of biology.
00:26:37.640
Unpack that. Unpack that.
00:26:41.640
So, um, okay, now I'm going to get in trouble. This is where I get in trouble. Um,
00:26:48.300
so there are certain things that we can't talk about in society. Let me throw out, let me throw
00:26:53.780
out something. What, what, what is the, what is one of the commonalities among Nobel prize laureates?
00:27:00.740
Um, who have a disproportionate number of winners been left and academia? Uh, no, they've been
00:27:08.520
scientists. Oh, I see what you're going. Okay. Jews, but not only have they been Jews, I was just
00:27:15.080
talking to Brian Keating about this. Um, he's the guy who, uh, he's, uh, the physicist, Into the
00:27:20.560
Impossible. Um, it's much easier to talk about this with people who actually share those identity
00:27:26.000
markers than those who don't like you and you and I have a lot of identities in common, I'm sure.
00:27:31.660
Um, but it's not just Jews. It's, it's not Sephardic Jews. It's Ashkenazi Jews. And it's not
00:27:39.840
even just Ashkenazi Jews. It's Ashkenazi Jews that are from specific regions in like Poland, then Germany,
00:27:46.880
Breslau region. So why would that be? And how do we know that? Well, um, this is complicated.
00:27:55.600
Steven Pinker, the psychologist from Harvard, writes about this. One of the ways we learn about
00:28:00.180
this is we look at identical twins separated at birth. And then again, this is a whole, I don't
00:28:06.640
know, this whole thing is complicated. So the mean IQ is a hundred. The average IQ is a
00:28:11.500
hundred. If you look at identical twins separated at birth. So if you just pluck two people out of
00:28:18.140
the population, the difference in IQ would be eight, but identical twins separated at birth would have a
00:28:24.140
difference in IQ of four. That tells you that there's something biological to IQ. But if
00:28:32.680
your starting assumption is that biology doesn't determine cognition, that
00:28:40.120
it doesn't determine IQ, et cetera, et cetera, then you have to come up with a reason for that.
00:28:44.780
And one reason you could give is, well, uh, that's racist, or the tests are faulty, or the tests don't
00:28:50.680
test what you think they test, or whatever reason you want to come up with. By the way,
00:28:56.360
parenthetically, this is extraordinarily interesting. Okay. So what happens is
00:29:03.740
that idea itself is morally fashionable because people teaching it are basically on the woke left
00:29:12.740
and they have been for years. And so what happens as a consequence of that is that we now
00:29:18.580
have a generation of people who don't think that IQ measures what psychologists call g, or general
00:29:24.520
intelligence, that it doesn't measure what people think that it measures, right? It doesn't, maybe it
00:29:29.460
doesn't measure anything at all. So we have all these people now thinking the IQ is bogus.
00:29:34.020
Can I tell you a cool story? Go for it. I don't think I've ever told anybody this before. So, um,
00:29:40.880
my mentor told me this story. This is absolutely fascinating to me.
00:29:47.500
So my mentor, um, his name was Frank Wesley. He was interned in Buchenwald. He was picked up by
00:30:04.000
the Nazis on Kristallnacht. And he became a psychology professor. A
00:30:10.340
fascinating man, fascinating history. He told me, and he was a behaviorist, um, and a behaviorist is
00:30:10.340
someone who believes that you can control behavior by looking at it in terms of stimulus and response
00:30:15.760
oversimplified, but basically. So he told me this unbelievable story. So there was a kid,
00:30:22.860
and I think if memory serves me correctly, this kid was in Washington and he kept punching his face
00:30:28.580
like this and they tried everything, but they couldn't get this kid to stop punching his face.
00:30:35.600
And so what they did was they put electrodes on his arms. And when he brought his hand up there,
00:30:42.040
they zapped him. It only took two zaps, um, two zaps, for this kid to stop punching himself in
00:30:51.940
the head. Now, Frank had this on, um, on VCR. I don't know if any of your audience is maybe
00:31:02.040
older than you, but VCRs are these analog tapes. They're not digital. And so I don't know,
00:31:07.960
maybe I should explain it. I don't know if people even know what it is. They'll know what a VCR is.
00:31:11.040
Yeah. Okay. Okay. So, um, the tape had a glitch in it. So he sent the tape down to
00:31:20.160
the AV or IT place, whatever it was, and they fixed it. And the place reported the tape for having
00:31:28.920
content, which, um, displayed human cruelty. It basically violated a rule. And so they literally
00:31:36.940
cut the piece of the tape out. So you don't see the guy, uh, punching him, but you can see
00:31:42.200
the electrodes. And they also cut out the section that even explained that. So that knowledge
00:31:51.440
is then lost to the rest of humanity because it's viewed as cruel. Now, whether or not you think it
00:31:57.460
should be lost, that's, that's another story. Yeah. But the idea is that you can see how our
00:32:04.780
institutions based upon values that people have determine the outcome of what people think is
00:32:11.200
true. There was just something that came out recently about dog training. I can't remember
00:32:16.160
the name of it, but you can put it on your screen. It said that, you know, there shouldn't
00:32:21.180
be any punishment of the dog. I can't remember the name of it, but it said, you know,
00:32:27.380
dog training shouldn't have any punishment at all. It should be all reward. Okay. So that's
00:32:31.880
ideological. That's not evidence-based. So what happens now is that, for example,
00:32:37.420
with the behaviorists and the kid punching himself in the head, or woke ideology, or microaggressions,
00:32:42.380
or any of this stuff, all of that is then fabricated. It serves, it lives in
00:32:49.600
service to an ideology and not in service to the truth. And then people take this information and
00:32:56.380
they, they believe it because they think it's true. And then they get out and they, they bring it
00:33:01.220
to the workplace. They talk about it to their friends. So the whole society is harboring some
00:33:05.740
delusions about things. Does that make sense? Yes, it does. Yes, it does. So let me ask you,
00:33:10.520
does this, the story that you told, is that selective? That's not selective hearing. It's
00:33:16.300
selective teaching. Is it selective influence? Is it controlling part of the information to help you
00:33:23.580
come up with a different conclusion? And how often is that done, you know, in academia?
00:33:29.220
Well, it's done. The reason it's done is because people's moral minds override their rational
00:33:36.180
minds. Now, look, I'm not saying that, I mean, there's a reasonable conversation to
00:33:44.380
be had and I'm not really sure which side I fall on, you know, about Dr. Mengele's experiments on Jews
00:33:50.460
and should those, you know, when he broke bones over and over again, should we look at that
00:33:55.720
and use that to benefit humanity? Or was it just so utterly monstrous that it shouldn't even
00:34:03.200
be considered. We can have that conversation. But the point is that we're, we're all doing our
00:34:09.920
best to try to create systems here that are fair and just. And although they're not trying to create
00:34:17.680
systems that are fair, they're trying to create systems that are equitable, but we're trying to do
00:34:20.860
the right thing. And when you try to do the right thing, often you're, you're living in service to
00:34:27.380
an ideology. Your moral mind takes over; truth is no longer your North Star, right? So the purpose of the
00:34:33.800
institution is no longer to find what's true. It's to replicate the dominant moral orthodoxy.
00:34:39.700
Does this strategy work today? Meaning, like, the direction they're going with wokeism,
00:34:45.500
with equity, with gaslighting, with confusing the hell out of all of us and pitting us against each
00:34:49.680
other. Is this a proven strategy long-term? Because, you know, you're seeing some, what you're
00:34:55.300
starting to see as well is how people on opposite sides are sitting there saying, I disagree with
00:35:01.340
my party, what they're doing here. I also disagree with my party. Maybe we got more things in common
00:35:05.240
than before. I'm not sure this approach, maybe it worked 200 years ago. Maybe it worked 100 years
00:35:11.820
ago. Do you think an approach like this is going to work today in America?
00:35:14.840
Well, it's, it's, I'm shocked by the question. It's already worked. It's astonishing.
00:35:19.520
Long-term. You think long-term this is sustainable?
00:35:22.200
Yeah. Long-term.
00:35:22.880
No, I don't think so.
00:35:24.380
I agree. Yeah. In fact, it is utterly impossible for it to sustain itself long-term.
00:35:31.300
You would need a kind of tyranny. You know, you would need to upend, totally upend free speech. But
00:35:37.360
it's interesting if you look at surveys, for example, from the Foundation for Individual Rights
00:35:41.800
and Education, it's Greg Lukianoff's organization, one in four college students believe that violence
00:35:47.580
is acceptable against guest lecturers
00:35:53.460
who come in. That's astonishing. And an astonishing number of people, I think 40%, don't feel
00:35:57.700
they can ask questions, or ask difficult questions, if it's on a moral topic. I mean, it's across
00:36:04.560
the board. We're seeing this effect be incredibly successful in
00:36:10.400
the short term, but not successful in the long term.
00:36:12.500
I don't think so. Because if you think about what happened in Canada, in Toronto, when
00:36:17.080
the University of Toronto, I think it was, came out with the trans policy, and the controversy with
00:36:20.980
Jordan Peterson happened. Let's take that out. Let's say
00:36:25.240
that event doesn't take place. Do you think Jordan Peterson is as famous as he is today?
00:36:28.720
Think about it. Take that event out. Who is Jordan Peterson? So, so I think what they're
00:36:34.180
also doing is they're giving birth to Jordan Peterson, to Gad Saad, to, you know, people like
00:36:40.440
yourself, to the Rogans of the world. They're getting a Russell
00:36:44.800
Brand of the world, a Bill Maher of the world, some people that maybe they would have never
00:36:49.020
given birth to, to say, wait a minute, this shit just doesn't make a lot of sense to
00:36:51.660
me. I'm backing off because I no longer agree with you because we're not on the same
00:36:55.840
page. So I just don't think long-term this approach of bullying is sustainable because
00:37:00.940
what, what is the power of a free thinker? If you think about a free thinker, what, what's
00:37:05.560
their DNA? They question things. They're curious and they typically don't stop at no. They don't
00:37:12.780
stop at mind your own business. So they don't do well with bullies. They typically stand up
00:37:18.500
to bullies. And eventually, bullies can bully the regular guys, who are like, okay, I don't
00:37:22.760
want to have any conflict. But when they face a free thinker, they're like, okay, I don't
00:37:25.820
like messing with this guy. He's different. So you're going to see the birth of a lot of free
00:37:30.000
thinkers like Jordan Peterson over the next decade, because I just don't think a lot of
00:37:34.160
the free thinkers are going to just stand there and say, you know,
00:37:37.640
we're going to take all the bullying from you. I see that not taking place.
00:37:40.760
No, no, no, no, that's right. And Jordan has done an amazing job of that. My, my friend Gad
00:37:46.080
has done an amazing job of that. So the question is, how do we empower other people to stand up
00:37:52.560
and fight back against this ideology? That's one, one question. Not only is this not sustainable,
00:37:58.280
listen, nobody likes living like this. Nobody likes being petrified of what they can say,
00:38:05.280
but people have been canceled because they were bullies in grade school. Are you freaking
00:38:10.540
kidding me? They were bullies when they were six. And now you're denying them
00:38:16.040
employment or you're complaining to their advertisers that they're, you know, white supremacists,
00:38:21.180
which doesn't even have anything to do with it. And so not only do people not like living like this,
00:38:27.260
but the question is, what damage is this going to do to our institutions in the short term? I mean,
00:38:33.900
it's already done a tremendous amount of damage. By the way, you were talking about Nobel Prize earlier.
00:38:39.260
So I went and looked it up while you were talking, uh, to see who's won over the years. Uh,
00:38:45.140
Bengali, four people. Armenian, one, apparently. Chinese, 12 people. Hokkien, one. Jewish, I'll get
00:38:53.980
to Jewish last. Pashtun is one, Punjabi is two, Tamil is three, Tibetan is one. And how many
00:39:02.120
do you think are Jewish, based on what you said? What do you think the number is? Well, the Jews aren't
00:39:07.060
even two percent of the population. So, uh, significantly. Uh, 198 Nobel Prize
00:39:16.560
laureates by ethnicity, 198. I mean, the numbers are staggering. And another article just came out
00:39:23.040
yesterday that, uh, they had Göran Hansson, head of the Royal Swedish Academy of Sciences, saying
00:39:30.640
they want people to win because they made the most important discovery. We will not have gender or
00:39:35.760
ethnicity quotas, says the top scientist. Interesting. Good for them. Yeah. Well, I mean, that's a,
00:39:41.140
that's a merit-based system. And again, so the question is, two questions. One, drill down into the data
00:39:47.620
and you'll find that they're overwhelmingly Ashkenazi Jews, and they're overwhelmingly from a certain
00:39:53.380
region of the planet. But the question is, well, why is that the case? That gets us a little far
00:39:58.580
afield in our topic. But I brought that up because, um,
00:40:04.020
if you're on the woke left, you can't say that biology has anything to do with it. And it's
00:40:08.780
interesting. So since we're having this conversation, let's just, let's have this
00:40:12.280
conversation. Let's be honest about this. So somebody was teaching, I'm not going to name the
00:40:16.600
name of the institution. You can probably guess what it is, but somebody was teaching a philosophy
00:40:21.020
of race class at that institution. And, um, it was fascinating to me. I remember I, I, I was sitting
00:40:30.380
across from this person and, um, I just wanted to say, well, what about this? What about this? What
00:40:36.500
about this? Like, and this is way outside my area, but I didn't do that because if I did that,
00:40:42.020
I would have been accused of racism or bias. That's not saying I believe there are valid
00:41:46.740
biological races; that's not saying anything. It's just saying that there's a kind of, there's
00:40:51.660
even something called the bias response team that you can report people to. If you think that they,
00:40:56.660
they have any bias in them whatsoever. And, uh, I think well over 200 institutions,
00:41:03.660
academic institutions have bias response teams, but it would seem to me like, you know,
00:41:07.540
well, why do certain people get Tay-Sachs and other people don't? Why do certain people get sickle
00:41:11.660
cell and other people don't? But you would think that the, the purpose of that class would be to,
00:41:16.640
to really take a look at what race is, what it means, et cetera, how it's come to be as an idea.
00:41:22.760
But one would also think to be really honest about that, you'd have to at least talk a little
00:41:28.760
bit about, uh, you know, race realism, or I'm even hesitant to talk about these things, but
00:41:34.560
you would think that you would have to give students, um, the best voices from all sides of
00:41:50.840
the issue. And if you're not comfortable doing that, then don't offer the class, right?
00:41:50.840
Don't offer the class. But the other thing that's interesting about that is you would think that
00:41:56.040
the qualifications for that person, they should have some biological qualifications,
00:42:00.780
master's degree in biology, PhD in philosophy. I don't know what it would be,
00:42:05.120
but it would seem like that that's important. I'm going to tell you, may I tell you one,
00:42:08.840
one more quick story?
00:42:09.580
Go for it. Yeah.
00:42:10.220
So I was at a, um, a talk, and I'm not going to name the university, the university
00:42:17.100
where I used to teach. And somebody said that he wanted to offer a Native American
00:42:24.720
class, a class in Native American philosophy. Somebody in the audience said, you know, uh, I'm not,
00:42:33.440
I'm not really, I think this is a wonderful idea. I love your presentation, but I'm not really
00:42:39.600
comfortable questioning, as a colonizer, I'm not really comfortable asking questions about
00:42:46.720
things I disagree with in the philosophy. And the presenter said, yeah, you know, I agree.
00:42:53.260
I'm not comfortable either. And the purpose would just be to teach the ideas and to sit with them
00:42:59.200
and to learn from them. Listen, the tools of the trade from Socrates on
00:43:06.060
are counterexample, question, challenge. Nobody gets a free pass. If you want to put
00:43:14.800
Native American philosophy into the philosophy department, great. I think that's a wonderful,
00:43:19.260
wonderful idea. But we don't give it special treatment. We give it the same
00:43:26.060
treatment that we give the French, the Germans, the Africans, everybody gets the same treatment.
00:43:30.540
We can generate counter examples. We can find flaws or what we think are problems in the argument
00:43:36.280
and we move on. But here's the other problem. Am I really going to be the guy who says,
00:43:42.760
you know what, since we're not going to do that, we shouldn't offer it? No, I'm not going to be the
00:43:46.080
guy, because that's easily spun as, well, that's racist. No, it has nothing to do with me hating
00:43:51.580
Native Americans. It's: if you want to offer a course, it has to play by the same rules
00:44:00.180
as every other class. I keep going back to the outcome. I keep going back to trying to find out
00:44:07.840
what the outcome is. But, you know, if you don't mind, for some of the audience that doesn't know
00:44:12.040
what you and your partner did with the hoax papers, do you mind sharing a couple of those
00:44:17.200
stories on what you guys did with the hoax papers? Yeah. So let's put a pin in the
00:44:21.800
outcome. The outcome is to destroy the system, to create a new utopia, because it's not biology
00:44:27.880
that's standing in the way at all. It's the patriarchy that's standing in the way
00:44:32.520
and systemic racism. And if we can only destroy the system, we can create some kind of a utopia.
00:44:36.700
History doesn't favor those guys. History just doesn't favor them. They've done quite a bit of damage. Fantastic.
00:44:42.820
You know, they've done a, they've done a great amount of damage to society. No question about it.
00:44:47.820
Yeah. Um, okay. So the hoax papers. So, being in the belly of the
00:44:55.540
beast, I noticed this madness. I used to follow on Twitter, uh, a Twitter feed
00:45:04.720
called New Real Peer Review, and they would tweet out actual articles from peer-reviewed,
00:45:10.600
scholarly publications. And I would think, like, these are just, can I swear on your show?
00:45:16.500
Sure. Go for it.
00:45:17.880
Like, these are fucking insane.
00:45:19.560
Yeah.
00:45:19.740
These are just batshit crazy. And Alan Sokal, who subsequently became a friend of mine,
00:45:26.160
published a fake paper in the late nineties. And that paper was in a postmodern journal. And he
00:45:34.640
wanted to expose the kind of quote-unquote scholarship in the journal as just bogus. And so
00:45:41.160
he used gibberish to do so. And he talked about meaningless things. So I thought, well, let's
00:45:48.660
do a Sokal-style hoax. So, uh, my buddy and I wrote a paper,
00:45:56.000
and you can link to it in the description here: The Conceptual Penis as a Social Construct.
00:46:00.880
And we argued, among other things in that paper, that, you know, penises were responsible
00:46:07.520
for climate change. And it was gibberish, and it was bogus. And it's a really funny
00:46:12.900
paper, man. You should read it. It's really funny. Uh, uh, but I'll let, I'll let you and your audience
00:46:18.220
decide if it's funny or not. Um, and so, um, people went crazy. They, they, they lost their minds.
00:46:25.680
And the point of this was to show that, look, you know, these journals are publishing stuff.
00:46:30.020
They're publishing, you know, if we said the same thing about vaginas, it never would have gotten
00:46:35.100
in, et cetera. So, you know, a lot of people had some very legitimate criticisms of, of the journal.
00:46:40.880
And they said, you know, this paper does not do what you think it does. If
00:46:47.040
you want to show that gender studies in particular, but anything with the word studies in it,
00:46:53.020
is just publishing dangerous nonsense, you have to do this with better journals. You have to do
00:46:58.720
this with more journals. They gave us a roadmap, you know? And so I said to my buddy, okay,
00:47:04.540
well, dude, let's just do this. They've told us exactly what we need to do. Let's do it. He said,
00:47:09.700
all right. So, uh, over the course of a year, the three of us wrote 20 papers. The Wall Street
00:47:16.500
Journal caught us, which ironically was because of the paper about dog rape. We claimed that, um,
00:47:23.320
dog parks are petri dishes for canine rape culture, and we need to leash men like we
00:47:28.660
leash dogs. And we looked at it from black feminist criminology. That paper won an award.
00:47:34.020
Get out of here.
00:47:35.400
Well, they took it totally seriously. It won an award. But anyway, the Wall Street Journal busted us,
00:47:44.580
and otherwise, for sure, we would have gotten more than seven papers published. And these papers were just
00:47:44.580
insane. They were just like, you know, fat bodybuilding: that fat people should go into professional
00:47:50.120
bodybuilding spaces and display their fat in non-competitive ways. So they should be
00:47:55.660
given equal time. So we, we did a whole bunch of-
00:47:58.260
Full on troll. I mean, this is a full on troll job.
00:48:01.280
Well, it's even, well, the, the point, I mean, we translated Hitler's Mein Kampf and,
00:48:05.480
and, uh, I mean, you know, we wrote about remediating, uh, transphobia. We asked
00:48:12.520
the question, why don't men like things, uh, shoved up their asses? Uh, you said I could
00:48:17.120
swear, uh, and we came up with this whole thing about, you know, their being transphobic.
00:48:22.520
Okay. But anyway, the point of this is that we were trying
00:48:29.640
to make the point that, um, there are bodies of literature producing nonsense. This
00:48:35.400
shit is untethered to reality. They're teaching this to the kids as knowledge. Um, and
00:48:42.760
you want to know, frankly, what's responsible for this is that people think that they know
00:48:48.160
things, right? And they think they know things because it's in peer reviewed journals that people
00:48:52.900
with PhDs teach them. No, these are the musings of ideologues. That's what that is. These are the
00:48:59.720
moral impulses that people discharge in journals and teach kids and then have the audacity
00:49:05.280
to test them on it. That's why I told you that story about my mentor and having that thing, because
00:49:10.780
what happens is anything that doesn't fit the narrative is just removed from the curricula,
00:49:16.380
right? So we have everybody thinking the same thing. Oh, you know, IQ is bogus. It's
00:49:22.480
just, you know, we have everybody thinking the same thing about sexual orientation and race.
00:49:26.580
And look, here's the rub to this. We need to study these issues. We need to, but we need to do that
00:49:34.900
rigorously, and we need to do it honestly. And we need to try to falsify
00:49:42.420
these claims, not to try to prove them. And if we want to make any steps forward as a
00:49:48.480
society, if we want to build better institutions, if we want to treat people more kindly and more
00:49:53.200
compassionately and more humanely, that has to be not because we just started making shit up,
00:49:59.800
but because we tried our best, we forwarded hypotheses, and we used the tools of science
00:50:07.080
to see if those hypotheses stood up to scrutiny. That's how we make a better society.
00:50:12.520
So I got, I got a call the other day from a guy in Hollywood who says some of the guys from CNN and
00:50:17.940
Fox are wanting to leave and start a media company, and they'd like to talk to you. Anyways,
00:50:24.120
we're having a conversation together right now. Follow-up: they want the company to be 50% owned by
00:50:30.880
people on the right, 50% owned by people on the left, right? That's what they want to do.
00:50:35.040
And let's go and see, you know, how this is going to look when we run the company.
00:50:39.480
So everything gets debated nonstop. How should these papers be judged where somebody from both
00:50:46.560
sides can sit there and, you know, not just trash it, but be almost like the devil's
00:50:53.520
advocate to, you know, show some, uh, you know, leaks in the argument. And if we don't have
00:50:59.880
that today, what is the current process of getting a paper published? Well, there are
00:51:04.260
two things. One, that's what the reviewers are supposed to do. The reviewers are supposed to
00:51:08.940
find flaws in the argument. Who are the reviewers? Well, the reviewers are literally experts in their
00:51:14.940
field. People who have been deemed experts in their field, review the papers. That's how the
00:51:19.920
peer review process works. You have, I don't know if they're the world's leading scholars, but
00:51:25.060
bona fide scholars who have published in scholarly journals, read over your piece and then make
00:51:31.100
recommendations. And in most cases, they made recommendations that made our papers more crazy.
00:51:37.700
So we took those recommendations and we wrote them right into the paper. The second thing about the
00:51:43.040
media enterprise is we are in desperate need of that. My, my unsolicited advice to your friend is
00:51:48.840
I would definitely not do it 50-50. The problem with that is that 50-50 will leave out other
00:51:55.180
voices. There are many voices. Like, Andrew Yang wants to make the Forward Party. I'm actually
00:51:59.180
speaking there on Friday, but, um, you know, you supported him. You supported him
00:52:04.060
in 2016, I think, right? That's correct. That's correct. There are, you know, Greens and
00:52:09.320
libertarians, et cetera. There are many alternatives. Uh, and my, my fear is that if you just had
00:52:15.660
a 50-50 split, you'd just be replicating, in a sense, the same thing over and over, you know,
00:52:21.300
the famous Jon Stewart, when he went on with Tucker Carlson, you know,
00:52:25.940
you don't want to be that guy who replicates a kind of division. My thinking would be,
00:52:30.840
you want genuine intellectual diversity. You want to present the best arguments on both sides,
00:52:36.160
but you want to do so in a certain way. Um, I just think it needs to be more
00:52:46.220
thoughtful than 50-50. Although I'm very sympathetic with that, um, that impulse.
00:52:51.500
Yeah. I think, I think as long as we go in the direction where we can see opposing sides sincerely
00:52:56.980
having a fair platform to argue each other and the audience makes the decision for themselves,
00:53:02.120
I think we win. You know, like, when you're saying 99% is one side, 1% the other side,
00:53:07.820
who the hell is winning? Nobody's winning there. And that's what we don't have right now, right? We,
00:53:12.380
we don't, kids aren't seeing that. They're not seeing those conversations modeled for them.
00:53:17.000
They feel uncomfortable asking questions. They don't even
00:53:21.820
have anybody who disagrees with the other side. And if anybody is wondering about that,
00:53:28.160
here's a litmus test question. If you know a gender studies student,
00:53:31.240
ask them this: do they know what Martha Nussbaum's criticism of Judith Butler is?
00:53:36.800
I won't get into it, it's very inside baseball, but the idea is that they will not know it, and they
00:53:41.640
will not know it because they're not taught other sides of the issue. These are advocacy
00:53:45.720
institutions. They're activist institutions that push certain points of view.
00:53:50.640
So what's the long-term solution? Let's wrap it up with that. What is the long-term solution?
00:53:55.260
What can the average guy do? And if not the average guy, somebody who is an influencer and is
00:54:01.220
worried about coming out, what can he or she do, uh, short-term and long-term?
00:54:06.780
Okay. So a few things, the first order of business, I'm coming out with a, uh, a series of videos.
00:54:13.600
I now have a Substack. My last name is Boghossian, B-O-G-H-O-S-S-I-A-N, Boghossian.
00:54:18.660
Uh, I'm on Twitter at Peter Boghossian. And so my goal is to give people a front row seat
00:54:27.340
in the culture war to re-anchor us in reason and evidence and civility and argument and basically
00:54:33.340
not being dicks to each other. And how do we do that? So one of the things that the
00:54:37.920
average person can do as they move forward is they can listen. They need to figure out what people mean
00:54:44.020
by certain words. If you don't have that, that's not academic. That's just having a conversation
00:54:48.160
with someone. Like, what do you mean by equity? Again, most people have no idea what it means.
00:54:52.940
Minoritized, houseless. People don't know what those words mean, and we hear them
00:54:57.740
increasingly. Um, I would suggest that people who want to do something about this listen and
00:55:05.460
learn. You can read the book Cynical Theories by Helen Pluckrose and James Lindsay. That's like a
00:55:11.600
master-level course in all of this stuff. It explains this in detail. Um, the other thing
00:55:17.780
you can do is you can document. You can go to meetings, you can record, look at whatever
00:55:23.200
the laws of your state are. Um, but, you know, take pictures of material, and then you can do a
00:55:29.140
Jody shot and you can just go to YouTube and post those videos, because we need to let
00:55:36.100
people know it's happening. The other thing, if you want to, you can be more involved. You
00:55:42.000
can actually, you know, like, I have projects I need help in, like, you know, not financially
00:55:46.040
involved, you know, like we need a lawyer right now, and we just need people to
00:55:51.380
help us. And there are other organizations, Astronomani is doing wonderful work, so you can
00:55:57.760
get involved and get involved in a way that, that makes you feel comfortable. But the most important
00:56:02.800
thing in all of this is you have to be forthright in your speech and you have to be honest when you
00:56:07.920
talk to people and you have to know that one of the consequences of you being forthright and honest
00:56:13.420
in your speech is that you will lose quote-unquote friends, but those people were never your friends
00:56:18.460
to begin with. If your friendships aren't based on virtue, they're just,
00:56:22.680
they're, I wouldn't say they're bullshit, but they're not what you think they are. Now, to answer
00:56:28.940
your last question, we can end on this. What do we do in terms of the context of the university
00:56:34.480
system? We have to make truth the primary goal and value of the institution. The moment the truth
00:56:43.900
is no longer part of the institution, it becomes a kind of ideology factory. Whatever the ideology
00:56:50.840
is, today it's wokeism, who the hell knows what it's going to be tomorrow? Nobody knows what it's
00:56:55.020
going to be, but it has to be truth and it has to have intellectual diversity. And for example,
00:57:00.780
if you're donating to your alma mater, I would stop, and to basically every university, because these
00:57:06.420
aren't the same institutions that you went to, there are new institutions emerging. One is in Austin.
00:57:12.380
Yep. One is in Austin. I'll be a part of that. Uh, and those are based on genuine free
00:57:17.340
speech, open inquiry, and people actually having conversations with each other, but truth is the
00:57:23.940
goal. So as long as truth is the North star, you're good to go. Well, brother, it's been great
00:57:29.260
having you on. I really enjoyed listening to you. We're going to put the links to, uh, your book,
00:57:34.440
How to Have Impossible Conversations, below, as well as the paper you recommended. But I really enjoyed this,
00:57:40.360
man. Thanks for coming on. Thanks man. Where do you live? Let's have a few drinks. Come on down
00:57:44.340
to Fort Lauderdale. Come on down here, South Florida. We'll have a good time together. I'd love that. I
00:57:48.840
totally love that, man. I'm looking forward to it. All right. Thanks bro. Take care, buddy.
00:57:52.740
Honestly, I lost count of how many topics we covered, but it was a lot of stuff that we went
00:57:56.980
through. Curious to know what stuck with you. Do you agree with him? If you do give it a thumbs up
00:58:01.840
and subscribe to the channel. And if you enjoyed this interview, two other interviews, I think
00:58:05.000
you'll enjoy. One is with Gad Saad, which was not only, uh, informative, it was entertaining. You're
00:58:10.600
going to laugh. Or with anarchist Michael Malice, a completely different angle, but it makes sense if you
00:58:17.420
listen to him as well. So click if you want to watch this one; if not, watch Gad Saad. Take care, everybody. Bye-bye.
00:58:22.740
Bye-bye.