The Michael Knowles Show - May 16, 2021


Why Nothing Satisfies The Woke Culture | Wokal Distance


Episode Stats

Length

1 hour and 22 minutes

Words per Minute

181.26

Word Count

14,938

Sentence Count

998

Misogynist Sentences

36

Hate Speech Sentences

28


Summary

It's tough to make sense of our world today. We're living in a hyper-reality where everything is inauthentic, and we're trying to figure out how to get our heads around it. How do we do it?


Transcript

00:00:00.000 With 20 years reporting on the markets, I know that some industries are built to last,
00:00:03.980 but others are built to lead. John Oelichman here. If you want exposure to what's really
00:00:08.280 shaping our world, think beyond trends. Think defense, healthcare, telecom, real estate,
00:00:13.920 gold, crypto. They're not just headlines, they're foundations. And with GlobalX,
00:00:18.400 one of Canada's largest ETF providers, you can invest in them intelligently. With a range of
00:00:23.660 ETFs designed for long-term growth and steady income opportunities. GlobalX,
00:00:28.120 where innovation meets investing. Brought to you by GlobalX Investments Canada, Inc.
00:00:32.440 For key risk information, please refer to the ETF's prospectus, available at globalx.ca.
00:00:44.340 It's pretty tough to make sense of our world today. We've got the former decathlete, the guy on the
00:00:51.000 Wheaties box, may very well become the first female governor of California. We are told that we need
00:00:56.980 to get race entirely out of our politics, but we're told this by people who view politics primarily as
00:01:02.960 a matter of race. We are told that we need to be very scientific by people who deny objective truth.
00:01:10.580 I don't know how to make heads or tails. I guess they've already succeeded. One guy who really,
00:01:16.300 I think, is so perceptive about these sorts of things, you may have heard of him, you may well not
00:01:21.060 have heard of him though, is Wokal Distance. A great Twitter follow. You can follow him at
00:01:26.380 Wokal, W-O-K-A-L underscore distance. You can probably spell that one out. You can find him on
00:01:34.020 YouTube, but more importantly, you can find him right here today. Wokal, first of all, thank you for
00:01:39.760 coming on. Thank you for having me. Second of all, I'm trying to reconcile the myriad contradictions
00:01:46.820 that, that are being foisted upon me by the dominant culture. And I'm not talking about
00:01:52.260 some radicals in the streets, though I guess they're part of it. I'm talking about by everything,
00:01:55.800 by the corporations, by the administrative government, by entertainment, by the whole
00:02:00.340 kit and caboodle. How am I to, to make sense of this sort of thing? Well, I think to put it simply,
00:02:07.600 we are living in the postmodern era. We have moved through modernism and we are now living in
00:02:12.820 postmodernism. And a nice little way to think about this is, well, let's make two points
00:02:22.420 about this. First off, I don't think postmodernism is good. I think we should get through it as quickly
00:02:26.900 as we possibly can. And then I think the second point is to give us a little bit of an understanding
00:02:30.700 of what postmodernism is. And so I think a good place to start with that is there was a postmodern
00:02:35.380 philosopher named Jean Baudrillard. And Jean Baudrillard said that we are living in a, in a simulation,
00:02:43.180 in the world of simulacra and simulation. He wrote a book, and I believe it was 1980,
00:02:47.800 called Simulacra and Simulation. It's a very difficult read. So it can, it can be tough to get
00:02:54.400 your head around. He said we're living in a hyper reality where everything is inauthentic. And I think
00:02:59.460 here's a good way to get your head around it. It's pretend that you and I are living in ancient Rome
00:03:04.040 and you and I are walking along the road and we find some wild strawberries and we go and we pick
00:03:08.480 them and we eat them. Those strawberries are real. Now fast forward to the 1960s and you and I are
00:03:16.380 still alive because we've discovered the fountain of youth. Yeah. We're like Dr. Fauci, you know,
00:03:21.080 that the ages come and go, but we seem to remain. Yeah. Yeah. We, we are, we are immortal and we're
00:03:26.960 walking along and we go to the supermarket and we buy strawberries. Now those strawberries are grown in a
00:03:32.140 factory setting and only the biggest, most beautiful, juiciest, ripest strawberries get
00:03:37.660 picked and given to us. And they're, they only plant the nicest strawberries, the nicest seeds. So
00:03:43.780 they're, they're real strawberries still, but they're selectively grown strawberries.
00:03:48.620 They're the pinnacle of what a strawberry could be. Yeah.
00:03:53.100 So then you and I sit around and think, and we have all of our knowledge that we had from ancient
00:03:56.740 Rome from until 1960 and we say, Hey, let's make a strawberry candy. We're going to distill that
00:04:03.760 flavor of the strawberry and we're going to put it into a little candy. So we make the strawberry
00:04:08.340 candy. Fast forward to the 1970s, and somebody, not you and I, thinks, hey, maybe we could sell more of
00:04:14.100 these candies if we make them a little sweeter. So we distill the flavor and make it even more powerful
00:04:18.440 than a strawberry is. We make, we make that 10 times as powerful as a strawberry and we add sugar.
00:04:23.980 Well, fast forward to the 1990s and the Jolly Rancher company comes along and says, you know,
00:04:29.700 using real strawberries and sugar is expensive. We're going to use a synthetic strawberry flavor
00:04:35.140 and high fructose corn syrup and they make the Jolly Rancher. But then the soft drink company comes
00:04:40.580 along and says, we're going to make a soda pop of the strawberry flavored Jolly Rancher.
00:04:46.580 And then 7-Eleven comes along, 7-Eleven or some other company comes along, and says, we're going to
00:04:50.860 make a Slurpee that is flavored like the soda that is flavored like the Jolly Rancher, which is
00:04:58.260 flavored like the original candy, which is flavored like the
00:05:01.600 genetically modified strawberry, which is flavored like the wild strawberry. So we're starting with
00:05:06.240 the wild strawberries and then we move to the, the, the perfectly selectively grown strawberry.
00:05:12.040 Then we go to the candy. Then we go to the Jolly Rancher. Then we move to the soda pop.
00:05:16.940 Then we move to the Slurpee. And by the time we get to the Slurpee, we're dealing with something that is vaguely like a strawberry that tastes maybe a little like a strawberry, but isn't really anything like a strawberry. So my son could go along and he he picks up the Slurpee and he starts drinking the strawberry Slurpee.
00:05:34.160 And as he's walking along the road, he sees these funny kind of red shaped things in the background, kind of over there. And he goes and he picks one and he goes, this kind of looks like the little logo on my drink here. And he takes it and goes, it kind of maybe tastes vaguely like this Slurpee thing, but it's not nearly as good. I'll drink the Slurpee.
00:05:53.420 And that's an example of a simulacrum of a thing that is a copy of a copy of a copy of a copy of a copy. And what Baudrillard thought was that everything is kind of like that. We're dealing with copies of copies of copies of copies of copies of copies of copies of things.
00:06:12.720 Now, Baudrillard pushes this idea and he thinks that, for example, we could talk about, I don't know, women. Back in Rome, there's women and they're just walking around. Fast forward to the 1960s and there's women, but we've invented makeup.
00:06:27.400 So fast forward to the 1970s and we've invented makeup and we've invented birth control. And then fast forward to the 80s when we have makeup and we have birth control and we have breast implants.
00:06:36.560 And then you go forward to the 1990s and then you add the Photoshop. And then you fast forward to Instagram and you have a woman who is wearing makeup and has a wig on and she's on birth control and she has breast implants and she's got 14 layers of makeup on and she's been Photoshopped.
00:06:54.500 And on top of that, she has a wonderful little filter on. And all of a sudden, you're so far away from what a regular person looks like that it's more real than real.
00:07:04.220 It's a woman that nobody could ever aspire to look like. You could never have the skin tone of a filter. That doesn't exist. It's not real, but it's pointed through to you.
00:07:15.160 And Baudrillard thinks we're stuck here and he thinks we can't ever get back to reality. And I would like to say that we can.
00:07:21.080 I think I see where you're going with this. I was trying to figure out how does the Jolly Rancher relate to Caitlyn Jenner?
00:07:27.720 But you've just explained how the Jolly Rancher relates to Caitlyn Jenner. Caitlyn Jenner is a hyper real woman. She's more woman than a woman.
00:07:36.700 He is now this appearance of this woman in such an exaggerated and caricatured way.
00:07:45.000 But whereas I might think that this is a very strange turn of events that Bruce Jenner now looks like this kind of exaggerated woman.
00:07:54.600 What you're suggesting, perhaps, is that we're living in this hyper real world. So, of course, we're going to get a Caitlyn Jenner.
00:08:01.280 Yeah, I would say it kind of works like this. The transgenderist movement, or the transgender movement, the gender ideological movement, whatever you want to call it,
00:08:12.900 has basically latched on to the symbols of femininity, the symbols of womanhood, and has abstracted those away from the women who actually wore them and has turned that into the thing that is a woman.
00:08:27.540 So they would say that being a woman is a purely social role. It's entirely a socially constructed thing, kind of like the presidency or the job of being a lawyer or being a mailman.
00:08:38.140 And so talking to them about a biological woman would be a little bit like trying to talk about a biological mailman.
00:08:44.120 They would they would say no. They would say a woman isn't this this biological entity.
00:08:49.600 A woman is a social role of a person who follows particular norms in a particular society.
00:08:54.820 So they would say that they've taken all the things that women have added to their arsenal of social signification.
00:09:05.840 They've taken all the things that women have brought on to themselves, makeup, wearing dresses, using the color pink, having eyeliner, wearing heels.
00:09:14.120 And they've said that the thing that constitutes the woman is all the symbols and the social role that they play.
00:09:20.680 It's got nothing to do with the underlying biological reality.
00:09:23.840 OK, I'd like to mix metaphors for a second about metaphor.
00:09:27.160 I'd like so what you're saying is, in a sense, we've put the cart before the horse.
00:09:31.380 And to be a little more precise here, we've confused the symbol for the symbolized for the thing that the symbol is referring to.
00:09:39.260 Yes. OK.
00:09:40.340 Yes. Baudrillard explicitly states that we are living in a world where the only thing that does anything is the symbols.
00:09:47.740 Yes. The symbols are the whole world.
00:09:49.500 That's it. So the underlying reality, he says, rots away.
00:09:54.240 He says you can't ever get to the reality because the only thing that you're operating on is the level of symbols.
00:09:59.600 Every time you look at something, when you when I look at you, Michael Knowles, I don't see a man or a person.
00:10:06.720 I see the wedding ring and that's a symbol of being married.
00:10:09.540 And I see the jacket. That's a symbol of being at work.
00:10:12.100 And I see the leftist tears. That's a symbol of intelligence.
00:10:14.540 And I see the Michael Knowles show. And that's the symbol of the show. The underlying person is is completely removed.
00:10:20.980 I can't get to that. The only thing I have access to is the symbols.
00:10:24.160 Yes. Right. OK. This makes sense. I've wrapped my head around this.
00:10:28.260 I agree with your observation. I would say with Baudrillard's observation, but you should take credit for it.
00:10:33.340 I take credit for all sorts of ideas that I just read in some book somewhere.
00:10:36.620 Now, we agree that this is what's going on. OK, fine.
00:10:39.620 I want to know, first, how did we get here? And then second and more importantly, how do we get out of it?
00:10:49.460 OK, so there's a couple of things that have that have happened.
00:10:55.120 In 1993, there was a book written by a man named Stanley Grenz called A Primer on Postmodernism.
00:11:00.820 And he argues in that book that we have moved into the postmodern epoch or the postmodern age.
00:11:05.860 He says it's as significant a shift as moving from, say, the Middle Ages to the Enlightenment.
00:11:11.880 He thinks we've now moved from the Enlightenment into the postmodern age.
00:11:16.240 And so Baudrillard in 1980 is writing about this.
00:11:21.520 Now, you have to think he's writing about this in 1980.
00:11:23.800 And in the 80s, when we have landline phones and network TV, he thinks you are awash in symbols and information to the point where you can't see the real world.
00:11:33.940 Well, you'd have to wonder what you think looking around now.
00:11:37.080 Yeah.
00:11:37.540 Right.
00:11:39.180 He'd look around now and be like, what?
00:11:41.500 Right.
00:11:41.740 So the way that the postmodern person views the world is to say that the thing that we have access to is the signs and the symbols.
00:11:51.780 Right.
00:11:52.040 That's what we're walking around in, that everything is socially constructed, including your methods of knowing.
00:11:57.340 Yeah.
00:11:57.440 Right.
00:11:58.360 Science, to them, is a particular
00:12:00.980 way of knowing.
00:12:03.240 Right.
00:12:03.560 That's culturally validated.
00:12:05.460 Right.
00:12:05.800 And that there could be other culturally adequate ways of knowing.
00:12:09.300 And all that you have is ways of knowing and constructing ideas and constructing knowledge.
00:12:14.140 Right.
00:12:14.360 So they would say that what we call, for example, science is a sort of, here's a technical term which I'll unpack, legitimation by paralogy.
00:12:27.440 Okay.
00:12:27.740 And what that would mean is that things are true because there's enough social momentum in society for the society to consider those things to be true.
00:12:36.480 Right.
00:12:36.760 But if, pardon my interruption.
00:12:39.660 So, but let's say that there's enough momentum in society for society to consider something to be true.
00:12:46.680 Last time I checked, truth is not based on consensus.
00:12:49.720 Right.
00:12:49.900 I mean, truth is based on objective reality.
00:12:52.700 Correct.
00:12:53.500 So they have a different conception of truth and a good way to get at that if you want to really undercut them.
00:12:58.000 Here's a nice way to think about it.
00:13:00.660 It was, I believe, in the 80s or 90s, there was a building built on the campus of Ohio State University called the Wexner Center for the Arts.
00:13:10.460 And it was a postmodern building.
00:13:13.020 And so it runs at different angles to itself, so it's not a clean grid.
00:13:18.260 And you can see bits of the foundation and bits of the infrastructure and the scaffolding poking out at various parts that have doors that lead to nowhere.
00:13:25.880 And the idea was that all of the architectural norms and patterns and things that we use to understand architecture are random and arbitrary.
00:13:35.160 So why can't we just break all those?
00:13:36.900 And the nice way to get at how to undercut kind of that's their view, right?
00:13:41.720 That if it's all arbitrary and socially constructed, we can do that.
00:13:44.460 And so the question that you might want to ask those people is, okay, so you made all the lines and the windows and the doors, you put them in arbitrary places, used arbitrary shapes and arbitrary directions.
00:13:55.140 Okay, did you do that with the foundation?
00:13:56.340 And the answer is going to be no.
00:14:00.940 Because much like your ideas about the world, the foundation is the place where the building connects to the world.
00:14:07.600 And if the place where you connect to the world isn't set on a good foundation, the whole thing is going to collapse.
00:14:13.540 Yeah.
00:14:14.740 Right?
00:14:15.180 So that's kind of where we're at right now: we are in a postmodern world where people are getting away from the reality, where they're in isolated bubbles of information, where the social pressure is dictating the belief set, with nothing tying it to the real world.
00:14:37.720 And that's the situation we're living in.
00:14:39.780 This reminds me – speaking of the grievance studies sorts of experiments that we were talking about a bit earlier – Peter Boghossian, who is one of the people behind that, once explained to me that intersectionality posits that the only thing that I can know with certainty is my own suffering.
00:15:00.200 And I thought this was an interesting key into intersectionality.
00:15:03.060 And I see there, okay, you need this, some connection to the world.
00:15:07.920 You need some connection to reality here.
00:15:10.220 And so for these woke people in the grievance studies, it can be my resentment or my suffering or my whatever.
00:15:17.260 But there has to be some link.
00:15:19.020 Okay.
00:15:19.500 So then, right, there's some link here.
00:15:22.060 You're right.
00:15:22.360 They won't mess with the foundation of the building, but they're destroying everything else.
00:15:27.620 People saw this coming in the 1980s.
00:15:29.060 In the 1980s, very few people, but at least this one guy, saw it coming.
00:15:33.600 It had been building for a while before that.
00:15:36.460 But how did you even get to that point in the 80s?
00:15:38.220 How did you even get to the point where people could see this happening?
00:15:41.460 Well, I think there's a few things that were going on.
00:15:43.800 I think Baudrillard was very pessimistic about this.
00:15:47.480 And as he moves on through his career, he actually, he wrote a book in 1991 called The Gulf War Did Not Take Place.
00:15:53.140 And his whole thing is he thinks, look, the thing that you saw on TV didn't happen.
00:15:57.880 Something happened, but we don't know what it is, right?
00:16:00.140 He thinks that it was a show that was put on, right?
00:16:03.640 He says the United States wants to play the role of the good guy.
00:16:06.920 And so they're casting Saddam Hussein in the role of the bad guy.
00:16:10.160 And here we go.
00:16:11.160 We're going to have a fake war.
00:16:12.560 That's kind of how Baudrillard sees it.
00:16:14.260 Now, the thing with Baudrillard that we want to avoid at all costs is Baudrillard accepts the premise that reality is gone, that we can't get to it.
00:16:23.620 And what we would want to do is we want to look at Baudrillard and say, okay, Baudrillard, you've made some interesting observations about how social abstractions can get away from reality.
00:16:31.700 But that doesn't mean the reality is gone.
00:16:34.220 It's still here.
00:16:35.640 Not even Baudrillard is going to step in front of a bus, right?
00:16:38.100 Right.
00:16:38.220 So if you rewind a little bit, there's some philosophy that's going on in the 60s, 70s, and 80s that really kind of pushes this.
00:16:47.820 And it comes from a couple of guys, Jacques Derrida and Michel Foucault.
00:16:51.960 And I can't explain those guys in just a few minutes.
00:16:54.820 But suffice to say that, how do I put this, as complicated as their ideas were, what was taken out from their ideas and from their philosophy, what was abstracted out was the idea that there is no central or correct perspective, that there is no inherent, fixed, and stable meaning in language.
00:17:19.100 And there is no stable, correct, proper, appropriate categories.
00:17:25.620 All those things are arbitrary.
00:17:27.100 You know, I love the distinction you've made here because you say you can't sum up these radical theorists like Foucault or Derrida in three minutes.
00:17:36.940 Frankly, I don't think we could sum them up in three years or 30 years.
00:17:40.260 And frankly, I don't even know if they could sum themselves up in certain things that they said.
00:17:44.120 But regardless, who cares?
00:17:46.220 I frankly, I don't even care.
00:17:48.140 I mean, I'm interested as an intellectual matter what Foucault or Derrida thought.
00:17:51.560 But what I care more about is the effect that they had, what was taken out of their philosophy.
00:17:56.600 There's a line that people love to quote from Derrida, which is variously translated as there is nothing outside of the text.
00:18:04.080 But then Derrida's defenders come in and they say, no, actually, he said, there is no outer text.
00:18:09.900 And then they are trying to find all these semantic distinctions.
00:18:11.760 Well, pardon my bluntness, who cares?
00:18:16.060 If the effect of it is that people interpret the man to mean there's nothing outside the text, that really everything is just constructed, everything is a matter of language and we can control language and control the world, then that's what matters to me because that is what engineered my politics.
00:18:31.480 Yeah. So what happened is these theorists had some kind of interesting ideas, but their ideas were harvested by activists and were turned against themselves.
00:18:43.040 When Derrida said there is no outside text, one of the problems that Derrida runs into, and this was pointed out by John Searle in 1983 when he was doing a review of a book called On Deconstruction.
00:18:53.720 Yeah.
00:18:54.600 And Searle, he says, look, one of Derrida's things is to say that there is nothing that is central.
00:19:03.680 There's no essence to anything.
00:19:05.240 There's no inherence.
00:19:06.460 There is no – he attacks logocentrism.
00:19:09.040 The idea that there's a logos, that's gone.
00:19:12.000 A logos meaning the sort of divine logic of the universe.
00:19:15.420 In the beginning was the word and the word was with God, right?
00:19:17.820 The word there is in the beginning was the logos, right?
00:19:20.440 Yeah, that's right.
00:19:21.540 And so Derrida says all of these things that we've been talking about, Plato with the forms, essence, God, all of that, he says, is this logocentrism, which he wants to deconstruct.
00:19:35.440 There's a nice term for it and do away with it.
00:19:38.120 And he says, and Derrida comes to the conclusion that because there is no central point, all interpretations are endlessly open for reinterpreting and reprocessing.
00:19:51.300 And Searle, the analytic philosopher John Searle came along and clocked him and said, look, when you get rid of metaphysics, Derrida, you're buying into the same thing that you're attacking because you want to get rid of metaphysics because you think metaphysics is necessary for us to have truth, for us to have reality.
00:20:08.180 And what Searle says, and we could nitpick him about this if we wanted to keep some metaphysics around, but he said the classical metaphysics of guys like Plato, of Aristotle, he said, that's not necessary for us to talk about truth.
00:20:22.740 Because I hereby declare metaphysics is gone.
00:20:27.300 Building didn't fall down.
00:20:29.440 And I still see things around you and I can still talk.
00:20:33.340 Reality is the thing that grants us the objectivity, right?
00:20:36.840 And while I am a subject, I am causally connected to the world.
00:20:41.180 So reality is the thing that grants us objectivity.
00:20:43.640 I don't – Derrida can attack metaphysics all he wants, but his – what he thinks that follows from that and what he was taken as meaning, that there's nothing outside the text, that there's nothing outside of interpretation, that there's nothing outside of context, that everything is purely contextual and defined by its context.
00:21:03.700 Searle is going to say that's wrong, and I would toss out also – Derrida, if you think things only differ because of context, they differ in virtue of what?
00:21:14.700 Of the properties that they have?
00:21:17.340 Of the thing that actually makes them up?
00:21:20.040 Of the reality, Derrida?
00:21:22.040 Is that what you mean?
00:21:23.320 So Searle points out that Derrida's whole philosophy is rickety.
00:21:29.260 It's on stilts.
00:21:30.020 His blinding insight isn't actually all that insightful.
00:21:32.520 And what he was taken to mean, that everything is merely interpretation, that turns out to be entirely wrong.
00:21:40.260 Besides which, we could ask a very simple question, which is if everything is entirely interpretable, why should we accept the interpretation that everything is entirely interpretable?
00:21:46.460 Right, right, right.
00:21:47.940 Why should I accept your interpretation?
00:21:49.880 This reminds me of every freshman philosopher who rips the bong a little too hard and says, you know, man, there is no truth.
00:22:00.820 And you say, okay, well, with what authority are you convincing me that your statement is objectively true, namely that there is no truth?
00:22:09.220 You remind me too of this line from C.S. Lewis, which I love where he says, I'm paraphrasing, he writes better than I talk.
00:22:17.160 But he says that the atheist can no more blot out God than the lunatic can blot out the sun by writing the word darkness on the walls of his padded cell.
00:22:27.180 And I love that image that, you know, so Derrida says, no, the objective reality, it's gone.
00:22:33.220 And you say, well, I don't know, sun's still shining, everything building still standing, it looks real to me.
00:22:38.600 Yeah, so they've taken Derrida, Derrida wasn't enough to do it, though.
00:22:42.260 They needed somebody else, and they needed Michel Foucault.
00:22:44.200 And as much as people have tried to nuance him and say he wasn't really saying the things that they said he was saying, I kind of think he was.
00:22:55.160 And what Foucault was arguing was that what is considered to be truth – truth is a status that we bestow on ideas socially.
00:23:06.000 It's a social status, like saying, you know, president is something that we bestow on Joe Biden by virtue of an election, or prime minister is something we bestow on Justin Trudeau by virtue of an election.
00:23:18.000 Truth is something that we bestow on ideas by virtue of our social institutions, right?
00:23:24.220 And so he's – this is what – that idea that truth is a product of discourses and power, that truth is a social entity that is created using power via discourse, the way we talk and discuss things, that idea took hold.
00:23:44.320 And so all of a sudden you have these people on the one hand who say everything is just context, and we can reinterpret things endlessly.
00:23:52.380 And then you have other people who come along and say all of the ideas that we have are really just a product of discourse and of power.
00:24:00.180 And now you can do two things.
00:24:01.800 One is you can – every time someone says, you know, I think that – I don't know – lower taxes are good.
00:24:09.820 They're going to come along and say that's a power move.
00:24:12.640 You're saying that because lower taxes will help your side gain power.
00:24:18.100 Yeah, yeah.
00:24:18.520 And everything is viewed through the lens of power.
00:24:20.280 But not only is everything viewed through the lens of power, it's also endlessly reinterpretable, right?
00:24:26.560 So when everything can be endlessly reinterpreted and everything is just – is seen as a mask for power, all of a sudden you can see what the problem – the problems that crop up.
00:24:39.240 And you can see this tying it back to the Caitlyn Jenner thing.
00:24:42.280 When someone says, look, I don't think the trans women who are biologically male should be allowed to compete in women's sports.
00:24:49.400 What they'll tell you is the Foucauldian answer is you don't really care about women's sports.
00:24:53.880 You don't watch the NBA.
00:24:54.800 This is about controlling women's bodies.
00:24:56.900 So that's the power thing.
00:24:58.000 And then they'll say – now here is the reinterpretation of the symbols – look, with the idea of what a woman is, we can pick what words mean.
00:25:08.680 We can decide what things mean.
00:25:10.380 And we can reinterpret the idea of a woman to refer to the social role, right?
00:25:14.860 You do both of those things at once and all of a sudden women went from – women's sports goes from people with a particular biology being able to compete with each other on a fair playing field to people who are in a particular social role playing against each other regardless of their biology.
00:25:31.420 And you can see this.
00:25:32.180 Sarah Silverman did an interesting little thing about it just recently.
00:25:35.780 She said – what about the category?
00:25:38.620 She says, well, what about tall women?
00:25:40.500 Yeah.
00:25:40.920 What about some women are bigger than other women?
00:25:42.580 And what she's doing is she's calling on the type of deconstruction that went on in the 70s where they would say it's all just context, right?
00:25:52.580 That's why they say that – who was it recently that said that testosterone – Samantha Bee.
00:25:57.800 Yeah.
00:25:58.080 On Full Frontal, Samantha Bee said testosterone doesn't – isn't the thing that makes someone a better athlete.
00:26:03.860 And what she's doing is she's saying, look, by itself, if I just inject a bunch of testosterone into me, I won't become a great athlete.
00:26:10.460 Therefore, testosterone isn't the thing that does it.
00:26:13.460 This is the univariate fallacy, right?
00:26:15.360 Where you say, because I changed that one – because the one thing by itself isn't enough, that means that it's – you can't make any judgments based upon that.
00:26:26.160 So saying that testosterone by itself isn't the deciding factor is a little bit like saying bullets have never killed anybody.
00:26:32.020 They haven't.
00:26:33.400 They haven't.
00:26:33.900 You just see them sitting there on the table.
00:26:36.000 You need some metal that's formed into a gun.
00:26:39.660 You need a barrel.
00:26:40.560 You need a handle.
00:26:41.500 You need aim.
00:26:43.000 You need oxygen.
00:26:44.700 You need energy.
00:26:46.080 Why are you going around spreading this fallacy that bullets are killing people?
00:26:49.700 You need organs to be ripped up.
00:26:51.580 I mean, you know, though, I have to say to give Sarah Silverman and Michel Foucault some credit here,
00:26:56.520 we're imagining Foucault to say this, but we actually heard Sarah Silverman say this verbatim.
00:27:02.000 She says, this isn't about women's sports.
00:27:03.900 You don't care about women's sports.
00:27:05.400 The fact is, I don't.
00:27:07.380 I don't care about women's sports.
00:27:08.740 I've never watched women's sports.
00:27:13.660 I guess the gymnastics at the Olympics or curling or something.
00:27:13.660 I don't know.
00:27:14.240 Occasionally if it happens to be on.
00:27:15.440 Otherwise, I just don't follow it and I don't – it doesn't interest me.
00:27:18.740 Regular sports, you know, like real sports with guys, that doesn't really interest me either.
00:27:22.320 Okay, but what does interest me is reality.
00:27:27.140 What does interest me is the conception of truth as eternal and grounded in reality
00:27:34.900 and the conception of truth as just something socially constructed.
00:27:38.660 And to your point, you've just made me think of it.
00:27:42.480 The traditional understanding of truth in our culture, in Christendom, is that Christ is the truth
00:27:49.100 and he is standing there before Pontius Pilate saying, I come to give testimony to the truth.
00:27:56.080 And Pilate, who's not the good guy in the story, right?
00:27:58.960 You've got the bad guy there.
00:28:00.420 He cynically asks, what is truth?
00:28:03.380 As though he were Derrida or Foucault or any of these postmodernists that we see.
00:28:08.440 And the idea that truth is just this sort of label that we use for popular ideas that the society has come to embrace.
00:28:17.860 Well, what happens to Christ in the passion narrative?
00:28:21.060 Christ is scorned, he's abandoned, he is scourged, and he's crucified.
00:28:26.800 Doesn't sound very popular to me, does it?
00:28:28.720 Even by his closest friends.
00:28:30.420 So you've got that conception of truth, which is the one that I go to.
00:28:34.220 And then now you've got this postmodern idea that you've described that it's just, you know,
00:28:39.680 whatever we all think is popular and it's about power dynamics.
00:28:43.720 So I totally see your point that we have transformed, we have gotten here.
00:28:48.960 There were particular thinkers going back at least to the 60s that have brought us here.
00:28:53.780 But it didn't just start in the 60s either.
00:28:55.820 I know we could be here all day if I keep asking you to go further and further back.
00:28:59.060 But it didn't just start in the 60s.
00:29:00.580 There were writers in the 30s and 40s and the 1920s and the 1890s.
00:29:07.560 And, you know, I hate to ask you to trace, you know, centuries of intellectual history in about five minutes.
00:29:13.960 But how did we even get to Derrida?
00:29:17.640 Well, Derrida was picking up on Ferdinand de Saussure's idea of structuralism, right?
00:29:22.580 That's where he comes from, where Ferdinand de Saussure was the first one to say words get their meaning from other words.
00:27:27.420 And those words get their meaning from other words, and words are just endlessly defined by other words.
00:29:32.120 And the meaning is found in the structure.
00:29:33.640 And Derrida comes along and says, OK, what structure?
00:29:35.960 Right. There's no structure.
00:29:37.020 It's just words endlessly referring to other words.
00:29:39.940 And so, because we could go back forever, I mean, even Nietzsche, in his passage on the death of God, asks, who has wiped away the horizon, right?
00:29:48.180 And he's talking about the demarcations of reality, right?
00:29:50.060 I think the thing that we need to do right now is we need to get back to the idea of truth as corresponding to reality, that it's not endlessly interpretable and deconstructable.
00:30:04.700 Even with what you did with Sarah Silverman, there's something there: you said, well, I don't care about women's sports.
00:30:08.140 No, but you do care about fairness.
00:30:09.900 Right. Well, exactly.
00:30:11.140 I care about fairness and justice and truth and reality.
00:30:13.960 Yeah.
00:30:14.060 And what is she doing when she does that?
00:30:16.060 What she's doing is she's saying, you don't care about women's sports.
00:30:18.780 That's meant to be interpreted as: your care here isn't an intrinsic concern for the WNBA.
00:30:25.660 Your concern is about power.
00:30:26.880 No, my concern is about fairness, because once you bring fairness into the WNBA, what's to stop you from bringing it into the NBA?
00:30:33.880 If people can use particular chemicals and hormones and whatever and have whatever testosterone levels they want and play in the WNBA.
00:30:40.260 Well, what's to stop a guy from the NBA saying, I'm going to have whatever testosterone levels I want, and now he's on steroids.
00:30:48.320 Right.
00:30:48.580 There's no end point for it.
00:30:51.000 Right.
00:30:51.360 So I think we have to get back to the idea of truth that it corresponds to the world and corresponds to reality.
00:30:56.460 Right.
00:30:56.880 We'll get there first.
00:30:58.300 Right.
00:30:58.580 That's the first thing we've got to do is bring back the objectivity.
00:31:01.780 Remember, if your foundation doesn't have any objectivity to it, when your building meets the world, it's going to collapse.
00:31:07.140 Well, if your way of thinking has no objectivity in its foundation, when your ideas meet the world, they'll collapse, too.
00:31:13.380 OK, but I think that's the first thing to do.
00:31:15.720 I totally agree.
00:31:16.900 But the reason I've asked about this further-back intellectual history of where this all comes from is because the problem seems so deeply embedded. Just to give a very broad sketch:
00:31:28.700 You've got the new left rising in the 60s.
00:31:31.700 Right.
00:31:31.860 Guys like Herbert Marcuse, the father of the new left.
00:31:34.100 Where does he come from?
00:31:35.040 He comes from the Frankfurt School and critical theory, which we're now seeing a lot of the emanations of critical theory.
00:31:40.360 Where does the critical theory come from?
00:31:41.580 Well, it ultimately goes back to Marx.
00:31:43.580 What does Marx say he wants to engage in?
00:31:45.500 The ruthless criticism of all that exists, tearing down everything.
00:31:49.560 An early discussion of deconstruction, to use the popular word.
00:31:53.920 And frankly, I think it goes back further than Marx.
00:31:58.120 So now you're saying we need to ground our sense of the world in reality again.
00:32:04.260 Well, how do we do that after centuries of this gradual poisoning of our worldview?
00:32:12.080 So if you want to go back, we could talk about Hegel if we have to go back there.
00:32:15.160 Although I don't want to talk about Hegel.
00:32:17.040 I'm not a Hegel fan.
00:32:18.700 So the way I like to think about this, because people point at what we call critical social justice, which is the term I use, the woke movement, if you will.
00:32:31.400 They point at it and say, this is updated Marxism.
00:32:33.400 And I go, that's kind of not the correct way to think about it.
00:32:36.520 The way to think about it is: Marx kicked Marxism off rolling down a hill.
00:32:41.200 And as the snowball went, it picked up all kinds of things.
00:32:44.280 What we're seeing right now is a mutated stew of Marxist conflict theory, the critical theory of the 1930s, the new left that kicked off with Marcuse in the 60s, Derrida's and Foucault's postmodern analysis.
00:33:06.660 The way that that postmodern analysis was laundered through queer theory in the 90s via Judith Butler and through post-colonial theory, Edward Said, Homi Bhabha, Gayatri Spivak, all the way up.
00:33:23.360 And now that thing got toxified and then memefied, right?
00:33:30.040 And it trickled down through Tumblr.
00:33:32.880 So if you think about it like this, here's another nice way to think about it.
00:33:37.160 Pretend I have a bright light, and the further I get away from the bright light, the dimmer the light gets.
00:33:41.760 Okay?
00:33:42.680 I think of it as: if the light is the truth, as we get further away it gets dimmer, but the truth still shines bright.
00:33:48.040 Okay?
00:33:49.040 The critical theorists, the postmodernism, is like a rotting chicken.
00:33:58.200 And the further it goes, it just rots and decomposes.
00:34:03.340 That's all it does.
00:34:04.500 There was no better thing.
00:34:07.180 Well, someone will nitpick my analogy and say, well, what about the real chicken?
00:34:10.480 No.
00:34:10.840 Give me a break.
00:34:12.080 Imagine you started with a rotting chicken, and then you just like let it ferment and keep fermenting and keep fermenting.
00:34:17.260 It's toxic and toxic and toxic, right?
00:34:19.540 It's not like the sweet grapes of truth that are fermented into the wine of the modern world.
00:34:26.100 It's like the rotting chicken of Foucault and Derrida that is just degrading into Tumblr-level activism, right?
00:34:32.520 This is what we're dealing with here, right?
00:34:35.620 So those ideas got all picked up and were used explicitly for political reasons, and it gets laundered through and dumbed down as it goes.
00:34:44.240 So there's a sense in which some of these people try to hide, and they'll say, well, you know, that's not real critical theory.
00:34:52.120 Yeah.
00:34:52.640 No.
00:34:53.160 But the toxicity of critical theory, when you go back and you read Marcuse's essays on, say, repressive tolerance, it's just as bad then.
00:35:00.980 Right.
00:35:01.200 It's just more academic, and it's steel-manned by a really powerful intellect, so it looks like it's a lot better, right?
00:35:07.780 But it's still rotten right to its core and right to its bones.
00:35:11.060 So, the thing is, that stuff has been allowed to sit and ferment in various areas of the humanities.
00:35:21.820 Postmodernism and postmodern philosophy really didn't get a hold in—I mean, it's not really even popular in France where it got kicked off.
00:35:31.960 It really got a hold in English literature departments because when you can reinterpret anything endlessly, boy, that's useful in art, right?
00:35:42.560 All of a sudden, I can decontextualize one piece of art with another piece of art, and I can put them beside each other and juxtapose them and create a new piece of art.
00:35:51.420 Reinterpreting things endlessly in the artistic world has some utility and some use.
00:35:55.240 It allows us to create new meaning.
00:35:56.800 I want people to do that with art.
00:35:58.540 I don't want them to do that with the cure for cancer.
00:36:00.820 Right.
00:36:01.160 I don't want some postmodern person to pull up the ingredient list for HIV drugs and say,
00:36:08.980 we're going to reinterpret this and then market it.
00:36:11.640 That's bad.
00:36:12.520 Don't do that.
00:36:13.400 That's not a good idea.
00:36:14.360 In the realm of science you saw during the BLM riots and the COVID lockdowns, there was a letter signed by 1,200 public health experts.
00:36:23.040 They had very fancy degrees.
00:36:24.700 They had wonderful, really impressive training.
00:36:26.700 And they said, it's bad when conservatives go to church or when they go protest their civil liberties being taken away.
00:36:34.020 But it is good when BLM goes out and steals Nike sneakers and riots because white supremacy is a lethal public health threat that predates and exacerbates COVID-19.
00:36:45.580 So you see this language seeping into the science, but it doesn't sound very scientific to me.
00:36:51.920 They're taking the credibility from their work in, say, immunology.
00:37:00.200 Yeah.
00:37:00.400 And they're using that to launder through the BLM activist vision of the world, right?
00:37:07.320 Yeah.
00:37:07.460 And all of the political ideas that are contained within the BLM organization, such as the deconstruction of the family, the abolishing of the police, and buying Patrisse Cullors' five houses, right?
00:37:21.040 These are all the things that are part of that.
00:37:24.940 And, I mean, we have a little fun with it.
00:37:27.020 The point is that these people are, in very real ways, doing deep damage to the credibility of science as an institution because they're using it to bolster their credibility in other areas.
00:37:42.560 And this is not appropriate.
00:37:43.600 This was never what this was designed to do.
00:37:46.280 What I wrote back then was medical experts said COVID-19 meant we have to close businesses, cancel weddings, cancel church and funerals, and stay at home.
00:37:53.060 Most of us, through tears and broken hearts, listened.
00:37:54.900 And I have and do and did advocate for lockdowns where they were appropriate and where they were necessary.
00:38:02.880 I think that they were needed in some places.
00:38:05.720 I think that lockdowns, particularly in the early part of the pandemic when we didn't know what we were dealing with, were entirely justified.
00:38:12.820 But in that early part of the pandemic, when they were doing that, downtown Los Angeles filled up with 35,000 people saying, you know, Black Lives Matter.
00:38:26.240 And I believe that the life of every Black person is important.
00:38:30.040 I believe that we need to be fair and equally treating everyone.
00:38:34.360 What a controversial opinion.
00:38:35.540 I can't say such an unpopular opinion.
00:38:38.620 But when you have a Black Lives Matter rally and they say, well, this is OK.
00:38:45.340 People are looking around saying, well, wait a second.
00:38:48.360 You're deciding what spreads and what doesn't based upon what suits your political ideology.
00:38:54.680 You've politicized your field, and now we no longer trust you.
00:39:00.620 And the cynical sort of attitude of the postmodern theorists has been, yeah, but it was always already political.
00:39:12.180 That's what they think.
00:39:13.220 They think everything was political.
00:39:14.260 So my thing is, I have been very vocally pro-lockdown and very vocally pro-vaccine.
00:39:22.660 And when that letter came out, I just looked at this and I said, you people have sold your inheritance of credibility and trust for the mess of pottage of a single rally in L.A.
00:39:34.480 And I cannot believe that you did it.
00:39:36.480 You sold your credibility for nothing.
00:39:39.940 Well, on the point of the lockdowns and the vaccine, it's funny now to look at it because I think that the state absolutely has a right to certain emergency police powers during a pandemic or something like that.
00:39:55.480 I think that vaccines have a long history and there are various ways that the state has coerced vaccines and that has perfect standing in American politics.
00:40:05.540 But the issue is that these people who are signing these letters and the Dr. Fauci's of the world who go on TV and lie to you and say the masks don't work.
00:40:15.340 And then five seconds later, they say, you need to wear the masks.
00:40:18.020 I only lied so that my nurses got more masks, and I'm only slightly paraphrasing him there. When they squander that credibility, then I say that every time they open their mouths, I become 15% less likely to abide by their lockdown orders or to get the vaccine or use the passport or whatever.
00:40:35.540 Yeah, the professor of epidemiology, Seth Prins, S underscore J underscore Prins, P-R-I-N-S, said that public health officials needed to, quote, pick a side.
00:40:48.960 Can you imagine telling people that we've picked a side?
00:40:54.960 Yeah.
00:40:55.780 All of a sudden, people who are like, well, I'm on the other side and I need health care.
00:40:59.680 Why would I trust you?
00:41:01.020 Right.
00:41:01.620 Why on earth?
00:41:03.400 You know, why would any police officer ever take advice from a public health official again?
00:41:07.940 Yeah.
00:41:08.200 These people have used their platforms to pick a side and say it should be the goal of public health to abolish the police.
00:41:13.420 That's from EpiEllie.
00:41:15.580 She quoted a graphic that she had originally made for protesters and for police, and she changed it and said, as one of the original creators of this revised graphic, disarm, defund, and abolish.
00:41:28.000 That's what she said.
00:41:28.820 All of a sudden, if you're a police officer and the professor of immunology says, I no longer believe that police should be allowed out and I want to defund, disarm, and abolish them.
00:41:37.580 If you're a police officer or a police chief in a major area and you need information for how to keep your police officers safe, are you going to go to that professor of immunology?
00:41:49.480 Is that what you're going to do?
00:41:50.580 No. So these people, by deciding to use the platform that they got for healthcare to launder through their political ideas, are breaking the credibility of everybody.
00:42:04.940 And the point here is I think it can't be overstated.
00:42:08.540 The politicization of everything, because they think everything is already political, if that's what you buy into, if you think that everything is always already political, then you have no problem dragging politics in.
00:42:23.840 But when you think everything is always already political, then anytime someone who disagrees with you says something, you're now analyzing what they said through the lens of, okay, what is their political goal here?
00:42:32.620 But I'd like to stand up for it.
00:42:34.920 Everyone does that?
00:42:36.020 Sure.
00:42:36.460 No, I see that point and I think I agree broadly.
00:42:40.140 But to stand up for these insane leftists for a second, or I guess it was expressed most concisely by the second wave feminists when they say that the personal is the political, right?
00:42:50.840 This is where this idea kind of breaks out into the mainstream.
00:42:52.660 When they say that, they make a real point, which is that my personal life, the way that I interact with my husband as a housewife of the 1970s, the way I interact with my society, the way that I interact with schooling or go to work, that has a political basis to it.
00:43:09.960 That it rests on certain political premises, rests on a very, you know, political arrangement of society.
00:43:18.180 It rests on a religious foundation too.
00:43:21.440 And I, as a feminist, I'm going to question all of that.
00:43:24.920 And I'm going to say a woman needs a man like a fish needs a bicycle.
00:43:27.780 And I'm going to say, as Simone de Beauvoir said, you know, Jean-Paul Sartre's strumpet and a very famous feminist herself.
00:43:33.620 She said, women should not be allowed to stay at home with the children.
00:43:37.880 This is a political matter.
00:43:39.240 If we let women stay at home, the politics will never change.
00:43:41.940 So we've got to change their personal decisions, coerce their personal decisions, and then we'll have a better polity than we currently have.
00:43:51.280 I think just in the most modest way, aren't they making an important point that our private decisions exist in a political context because we're all living in society together?
00:44:02.600 So I think there are two ways in which to handle that sentence.
00:44:13.000 First off, people say, you know, a woman needs a man like a fish needs a bicycle.
00:44:18.800 Imagine if we went and, like, I don't know, found an endangered species like the panda, and someone said, we need more pandas.
00:44:30.560 And I said, the female pandas need a male panda the way the fish needs a bicycle.
00:44:34.640 Right.
00:44:34.740 This is how the human species repopulates and keeps going.
00:44:38.500 So I think it's absurd. Taken at its best, it could mean, well, socially, or in terms of my value, or whatever else.
00:44:46.820 But it really just reduces to a sort of atomistic view of life where I am complete unto myself, disconnected from anything.
00:44:55.240 And so what they have, the critical social justice, has taken a high-low coalition where at the level of the individual, my choices should be absolutely unencumbered.
00:45:03.840 But at the level of society, society has to be adjusted entirely so that my –
00:45:08.240 To accommodate me.
00:45:09.520 Well, they would say, to be fair, to accommodate everybody living the life which they have constructed for themselves according to their way of knowing.
00:45:20.980 Right.
00:45:21.280 Yeah.
00:45:21.460 They would say this is liberatory.
00:45:23.420 Yeah.
00:45:23.600 Right.
00:45:23.820 They think that they are liberating themselves from the clutches of Enlightenment liberalism and its demands for objective truth, which is really just a prison.
00:45:32.080 Right.
00:45:32.580 Foucault was once quoted as saying – someone said, do you want to go to heaven?
00:45:35.540 And he said, I don't know.
00:45:36.200 Am I allowed to leave?
00:45:38.180 Right.
00:45:38.600 Because if I can't leave, it's just a prison.
00:45:40.000 Right.
00:45:40.260 Right.
00:45:40.460 Right.
00:45:40.800 That's it: the self, a gentle automaton unchecked by anything, is what matters.
00:45:46.820 Right.
00:45:47.780 So they would say that for everyone.
00:45:49.000 So they would say they want to liberate us from having responsibilities.
00:45:53.140 This is why they hate Jordan Peterson so much, because they're saying, we want to liberate ourselves from the responsibilities that everyone else encumbers us with.
00:46:01.180 And Jordan Peterson comes along and says, take on as much responsibility as you can.
00:46:04.960 Right.
00:46:05.260 Right?
00:46:05.440 It's the opposite idea in a way.
00:46:06.900 But you mentioned here Enlightenment liberalism as sort of the – that objective standard that they're rebelling against.
00:46:13.140 But could one not also point to Enlightenment liberalism and say, ah, therein lies the problem.
00:46:19.820 Because out of Enlightenment liberalism, we get a whole lot of treatises about tolerance.
00:46:24.140 We get a whole lot of tracts about pluralism.
00:46:26.660 We get a sort of cracking up, perhaps, of the certainty of the truth.
00:46:33.040 I mean, this is – this is what Hamlet is about, right?
00:46:35.100 Hamlet is basically about Martin Luther, as far as I can tell.
00:46:38.060 It's about the crack up of this – this monopoly on truth in the West.
00:46:42.300 And so could one, and I'm not saying that I'm saying this, but I suppose I am saying this, look back at this moment and say,
00:46:49.120 rather than the vindication of objective truth, there began this descent into you do you.
00:46:56.400 I would say two things about that.
00:46:58.640 As someone who has defended Enlightenment liberalism fairly robustly, I say the version of Enlightenment liberalism
00:47:05.940 which I'm defending is one which suffices as a conflict resolution strategy.
00:47:14.640 Enlightenment liberalism is going to say that when there's a conflict in society, we have particular methods of resolving it.
00:47:23.420 Democracy, the rule of law, fundamental human rights applied equally, and truth, right?
00:47:31.620 Enlightenment liberalism cannot provide your life with the meaning that undergirds it.
00:47:36.820 It was never meant to do that.
00:47:37.940 That's not its goal.
00:47:39.440 Enlightenment liberalism says that there are Muslims and Jews and Christians and atheists and feminists,
00:47:43.760 and these people all need to be able to somehow get along and work with each other.
00:47:48.640 And the postmodernist comes along and says, aha, why should we follow Enlightenment liberalism?
00:47:52.800 We can deconstruct all of that.
00:47:55.040 And then once they've deconstructed everything, all of those conflict resolution strategies are now gone.
00:48:00.520 I think this is a very good point, but it does require one answer here, which is John Locke, father of liberalism,
00:48:09.860 who writes the great Letter Concerning Toleration.
00:48:13.320 And he says, we got to tolerate everybody, you know, because that's how we're all going to,
00:48:16.980 that's how we're going to resolve conflicts and get along in society.
00:48:19.440 And we're going to tolerate everybody except for atheists.
00:48:23.220 Atheists have no claim on toleration.
00:48:25.540 They actually, they cannot be tolerated or the society is going to fall apart.
00:48:29.620 Now today, of course, nobody, nobody quotes that part of John Locke, but it seems what he's getting
00:48:35.700 at here is we need to have some common grounding.
00:48:39.260 We need to be standing on some common ground here with regard to morality,
00:48:44.240 ethics, the way we view ourselves in the universe, because if we don't,
00:48:49.120 you couldn't even have that conflict resolution.
00:48:51.700 Right.
00:48:52.160 And this is part of the problem of where we're at with postmodernism: the
00:48:55.400 postmodernists are standing on an entirely different foundation.
00:48:57.680 I think part of the problem is that postmodernism only ever spits back
00:49:02.760 redefined and bastardized versions of the concepts that we use, which is why when
00:49:07.100 they say racism, sexism, and homophobia, we're against those things.
00:49:10.060 And we say, well, I'm against those things too.
00:49:12.680 They have an entirely different set of definitions for what those things mean.
00:49:17.060 Right.
00:49:17.660 So when you say I'm against, you know, I would say I'm against transphobia.
00:49:21.840 I don't think that Caitlyn Jenner should be beaten up on the way to buy eggs and cheese and
00:49:25.700 get groceries and do other things that Caitlyn Jenner wants to do.
00:49:28.220 I think that's not a problem.
00:49:29.300 By the way, good luck beating up Caitlyn Jenner.
00:49:31.200 Caitlyn Jenner is the greatest athlete that's ever lived.
00:49:33.780 You know what I mean?
00:49:34.240 Good luck.
00:49:34.820 You're not going to beat that guy up, you know?
00:49:36.440 But they would say that transphobia extends even to your thoughts.
00:49:41.220 Yeah.
00:49:41.820 There was a paper that was written by, I believe it was Robin.
00:49:44.440 I want to say Robin Dembroff from Yale who argued that if to avoid using a trans person's
00:49:51.560 chosen pronouns, you use their name.
00:49:54.040 So if I say Caitlyn Jenner instead of she, that's still a subtle form of transphobia,
00:49:58.660 because I've used the opportunity where I would
00:50:04.740 have normally used a pronoun to sub it out.
00:50:06.580 And in doing so, I have failed to affirm Caitlyn Jenner's female
00:50:12.300 identity, right?
00:50:13.180 The Enlightenment liberal would tell you to toss off with that, but the postmodernist
00:50:18.320 would say no.
00:50:19.120 And so you're right.
00:50:20.420 The postmodernist, the core assumptions of the postmodernist, are an inherently radical
00:50:27.380 social constructivism of absolutely everything, right?
00:50:31.660 And so that is really the core of the problem that we're seeing because that's the thing
00:50:36.740 that filters down and uploads so easily into a world of Photoshop, right?
00:50:40.840 That everything is just socially constructed, just uploads just so wonderfully and simply
00:50:44.820 onto the internet.
00:50:47.180 That kind of nitpicky 2016-era Tumblr activism uploads really easily when you just
00:50:55.100 want to reinterpret everything and say, well, this really means this and this really means
00:50:58.560 this.
00:50:58.900 That kind of stuff uploads so simply into that world, right?
00:51:05.740 Yeah.
00:51:06.680 And that's where it kind of can take off in a way that slow, careful rigor doesn't really
00:51:13.720 take off.
00:51:14.560 Yeah.
00:51:14.760 Right?
00:51:14.980 Like, so somebody said, oh, I can't remember who it was, but if you
00:51:22.000 know what a diminutive is, like when I say to my cat, who's a good boy?
00:51:25.400 That's a diminutive.
00:51:26.340 Yeah, yeah.
00:51:26.700 Megaluccio.
00:51:27.500 That's my Italian diminutive.
00:51:29.080 Hey, Megaluccio.
00:51:30.040 It's a little like, hey, cute little Megalo.
00:51:31.820 So the postmodernist person comes along and says, I deconstruct the gender binary all the
00:51:39.220 time, but I always talk to my cat, good boy, good boy, good boy, because gender is just
00:51:42.580 a diminutive linguistic construct.
00:51:44.900 That's it to them.
00:51:46.260 And they've reinterpreted the whole thing.
00:51:48.100 And you go look at that and you say, I see how you did that.
00:51:51.280 It's complete nonsense, but I see how that works.
00:51:53.200 I see what you're doing there.
00:51:54.660 It's just endless, endless reinterpretation of the signs and symbols because that's the
00:51:59.700 entire world.
00:52:00.340 The discourse in which they live, right?
00:52:03.360 So everything is just discourse.
00:52:06.500 So when the Enlightenment liberal says, I believe that there are facts about the world that are
00:52:10.980 indubitable, that are true, that are correct, that there's something that we can contact
00:52:16.820 with, immediately you're already stepping away because what the postmodernist is going to
00:52:21.720 say is, no, no, no, no, no.
00:52:23.260 You're filled with so much bias and you're filled with so much longing for power and nothing
00:52:29.940 that you say can be taken at face value and you're badly motivated.
00:52:33.800 And so we have to deconstruct your motives and you've been socialized in.
00:52:37.500 And so by the time they're done, there's no stable, correct, true or absolute sentences
00:52:43.500 because every sentence that you say could have been reinterpreted anyway.
00:52:46.300 The world is constructed in systems of power, right?
00:52:50.720 Science is just a system of power.
00:52:52.820 Now, you might want to tip a hat to Foucault a little bit and say, some of the people who
00:53:00.920 are urging for some of the COVID legislation to stick around permanently, Foucault might
00:53:13.060 have a look at that and say, see, I told you so.
00:53:15.220 Yeah, right.
00:53:15.740 The problem is that Foucault thinks that's the whole world, right?
00:53:19.340 Yeah.
00:53:19.780 These guys, I mean, there is a way in which Derrida has a point about some stuff in which
00:53:23.580 you could say, yeah, things can be reinterpreted.
00:53:25.760 We can take something old and use it for something new.
00:53:32.560 I can juxtapose these two different paintings to create a new piece of art.
00:53:36.300 That's okay.
00:53:37.140 And there's something about Foucault where you could say, yeah, you know what?
00:53:39.780 There are certain times when someone's credibility in one particular area of the academic world
00:53:48.760 becomes, to use a term from the economic world, fungible, and then extends to other areas,
00:53:56.080 where they use their credibility in, say, I don't know, mathematics to opine on all kinds
00:54:01.320 of different things, which is why we might ask Michael Jordan or LeBron James or Tom Cruise
00:54:07.500 about moral issues, right, because it extends their credibility and their social credibility
00:54:12.880 extends out.
00:54:13.780 And we might say, Foucault has a point about some of this stuff.
00:54:16.480 The problem is not that, sure, sometimes power gets abused by people.
00:54:21.100 And sure, sometimes people take this stuff out and go in wrong directions with it.
00:54:25.120 And sure, sometimes institutions who have credibility and we trust to do the rigorous work
00:54:31.860 of providing us truth use that trust to launder through political ideology.
00:54:36.180 That can happen.
00:54:38.300 But to say that that's the whole show and that's everything always everywhere is just wrong.
00:54:44.880 And so what I would want to do is say that the ideology that's been
00:54:47.860 constructed out of their ideas needs to be, I mean, I would say that you and I are acting
00:54:55.140 somewhat deconstructively on a metanarrative here.
00:54:58.760 Go on.
00:55:00.200 Deconstruct that.
00:55:00.940 Unpack that, please.
00:55:01.880 You might be acting in a deconstructive way on some of their metanarratives.
00:55:06.680 And we might be wanting to point out to them that part of their tactics rely entirely on
00:55:13.960 creating social asymmetries where they're allowed to deconstruct you, but you can't
00:55:17.920 deconstruct them back.
00:55:19.140 Right, right.
00:55:19.900 And that's the point about intersectionality, by the way.
00:55:23.180 That's the thing that Crenshaw thinks that she can put her linchpin on.
00:55:26.360 The universal solvent of postmodernism sits in the jar of personal experience, which can't
00:55:34.360 be deconstructed, right?
00:55:35.800 Right, right.
00:55:36.440 That's what she thinks.
00:55:37.240 You can't deconstruct my experience of living as a woman, as a black woman, and she couldn't
00:55:43.480 deconstruct my experience as living as a Jewish white male who's converted to Christianity
00:55:48.980 in Canada, or deconstruct your experience of being a...
00:55:55.540 Swarthiness in podcasting in Tennessee.
00:55:59.720 All this hair.
00:56:00.420 Yeah.
00:56:01.720 You know, she would say you can't deconstruct those things, right?
00:56:06.060 And then they have the standpoint epistemology, which says that the person at the...
00:56:11.680 That everyone can see the dominant view because it's dominant.
00:56:14.380 And this bottom view under here, right, that's the view that nobody sees or understands.
00:56:22.120 And one might want to ask them, given that you share the same views as Coca-Cola, who's
00:56:28.020 the dominant view in society at the moment?
00:56:30.060 Coca-Cola and the administrative state and big tech.
00:56:32.800 So now, we've obviously run way over time in the interview, but I don't care.
00:56:37.460 My producers want to go to lunch?
00:56:38.960 Too bad.
00:56:39.920 In our remaining minutes here, I need to ask this.
00:56:42.960 I can't...
00:56:43.920 I'm too depressed by everything you've told me.
00:56:47.200 I need to know the answer.
00:56:49.180 People have proposed many different answers to this problem of the wokeness and the kind
00:56:53.720 of postmodern gel, the jelly that we're all living in, where nothing makes any sense and
00:56:59.260 we can't really seem to move.
00:57:00.920 Some people have said we ought to go back to enlightenment liberalism.
00:57:04.180 Just restore the...
00:57:04.940 Go back in time to the values of enlightenment liberalism.
00:57:07.080 That'll be great.
00:57:07.820 Some people have said we need to go back a little further still.
00:57:10.700 Maybe those Catholics had a point.
00:57:12.320 Maybe we need to be a little firmer in our declaration of truth.
00:57:16.600 Some people have said that we need to...
00:57:19.400 You know, I think of the identitarian types, that we need to basically...
00:57:23.440 If the left is going to be racist, we need to be racist.
00:57:26.020 You know, we need to have a hyper-focus on race.
00:57:28.340 If there are going to be atheists, you know, we're going to be atheists too, but we're going
00:57:32.600 to choose to perform certain religious rituals just because, you know, they're good or something.
00:57:37.680 But it doesn't matter if they're true, but they're, you know, anyway, they're conducive
00:57:41.320 to society and people have all these different ideas.
00:57:45.460 What's your idea?
00:57:46.300 I want to see nothing less than the return of the Logos to prominence in Western civilization.
00:57:52.840 Sounds great.
00:57:53.460 I could think of quite a few philosophers who might want to have that with me, and they
00:58:00.520 might construe it differently.
00:58:02.020 They might construe...
00:58:02.700 Some would construe the Logos as Christ.
00:58:04.220 Other people are going to construe the Logos as truth.
00:58:07.700 Other people are going to construe the Logos as wisdom and understanding reality.
00:58:10.880 But the Logos needs to come back.
00:58:14.800 Searle, in his view, says that we could take Derrida's deconstruction and we
00:58:22.020 can just deconstruct it right back and show him to be a classical metaphysician, right?
00:58:26.760 The acid of postmodernism eats everything, including itself.
00:58:29.860 It doesn't leave anything left, right?
00:58:32.060 So once you do that...
00:58:32.920 I want to...
00:58:33.740 Before you go on, I want to make sure nobody missed that point, that you can use Derrida
00:58:39.120 to undermine Derrida.
00:58:40.640 You can use the sort of postmodernism to undermine the postmodernism and turn it into whatever
00:58:45.300 you want.
00:58:45.860 We can turn Derrida into Pope Gregory the Great if we want to, right?
00:58:50.140 It's just words, words, words.
00:58:51.900 Well, I would...
00:58:52.940 To be fair with Derrida, he would say it's endlessly open to interpretation.
00:58:56.980 It's always endlessly open.
00:58:58.060 He wouldn't say you can interpret it just any way you want, but I would turn that around
00:59:01.060 and say, well, Derrida, I guess what you're saying is open.
00:59:04.780 And since you buy the death of the author and that the author doesn't get to decide what
00:59:08.980 the work means after it leaves his pen, I guess you don't get to decide what your work means
00:59:13.280 after it leaves yours.
00:59:15.480 The way...
00:59:16.280 I mean, this goes back to the foundation.
00:59:17.780 Would you do that with the foundation, right?
00:59:19.440 All of the postmodern stuff is operating as our friend...
00:59:24.000 I wish he was my friend.
00:59:24.980 Our colloquial friend John Searle points out that all of their therefores and as-a-results
00:59:33.340 and in-light-ofs and all the inferences that they claim from their claims are based on the
00:59:39.000 very rationality that they're trying to get rid of, right?
00:59:42.360 They're using rationality to try and get rid of rationality.
00:59:45.940 The point is that if you ever succeeded in doing that, you'd have undermined rationality,
00:59:49.240 which means that your whole critique would have been undermined from the get-go.
00:59:51.740 So what I would say is, to put it simply, when someone says to you, there is no truth,
00:59:58.780 the response of, is that true, should tie them up in knots, right?
01:00:04.500 And typically the way that they've done it is to kind of live with a sort of detached irony,
01:00:09.620 right?
01:00:09.980 Because nothing matters.
01:00:12.400 So the way that we go out of this is that we can't go back.
01:00:15.860 We have to go through.
01:00:19.040 That's the only way out of this is through.
01:00:21.740 We can't undo it.
01:00:22.920 There's no unweaving the rainbow, to steal a phrase from Richard Dawkins, right?
01:00:29.320 We can't undo it.
01:00:31.240 There's no way back.
01:00:32.500 There's no stepping out.
01:00:33.800 We have to go through it.
01:00:35.140 So I think the first thing that we want to do or something that we need to make sure that
01:00:37.940 we're doing when we're attacking this stuff is to get clear about the constellation of
01:00:41.320 concepts that populate the motivational economy of the people that are pushing this stuff.
01:00:45.420 We want to get really clear about how all of these things relate to each other so that
01:00:50.040 we can see the endless circularity that they use, right, to try and make their stuff go.
01:00:55.680 We want to understand their power moves and social pressure tactics, right, that they use
01:01:00.600 and why they, when we get into an argument about truth, they instantly switch to moral
01:01:04.920 shaming because they're not interested in having a debate about what the facts are.
01:01:08.360 They're interested in gaining social power.
01:01:10.360 We need to understand that.
01:01:11.600 We need to move through.
01:01:12.780 And then I think we return, we don't return, we bring through with us.
01:01:17.640 Before you go on, I just have to clarify here.
01:01:19.680 When you say, you know, they're using these ad hominems and they're just moral shaming
01:01:23.900 and whatever, they're attacking you.
01:01:25.940 And you say, we have to not try to avoid that, but go through, what do you mean go through
01:01:30.260 it?
01:01:31.340 Like just take it?
01:01:33.100 Oh, oh, no.
01:01:34.680 I mean that having the wherewithal, for example, white fragility.
01:01:38.940 When someone says you're racist, you say, I'm not a racist.
01:01:41.220 They say, you have white fragility.
01:01:42.940 Yeah.
01:01:43.100 You have to immediately expose that as a Kafka trap.
01:01:45.680 Okay.
01:01:46.080 You have to immediately oppose that and say, look, what you're saying is that
01:01:50.400 if I say I'm not racist, I have white fragility, and that white fragility is proof
01:01:56.160 that I'm a racist, and you know that I'm a racist because I have white fragility, which
01:01:59.820 proves that I'm a racist.
01:02:00.900 Like you immediately have to call the bluff on it.
01:02:03.980 Right.
01:02:04.600 The, the, their whole game is to attack the assumptions that you make.
01:02:09.440 Yeah.
01:02:09.760 So the response is to say all the games that you're playing
01:02:16.280 are relying on my assumptions too.
01:02:19.580 A nice way to do this.
01:02:20.820 I mean, if you're going back to the Caitlyn Jenner thing, when they say, um, this, this
01:02:26.700 person is wearing a dress, this person's wearing makeup, this person has got a wig on, this
01:02:31.040 person identifies as female.
01:02:32.500 Well, why, why is that not enough?
01:02:35.200 And you say, can men wear dresses?
01:02:37.120 Well, yeah.
01:02:37.820 Okay.
01:02:38.200 Can men wear makeup?
01:02:39.300 Yeah.
01:02:39.580 Okay.
01:02:39.820 So then what does that have to do with anything with being a woman?
01:02:41.820 Right.
01:02:42.320 Right.
01:02:42.720 You have to, you have to demonstrate the absurdity of it.
01:02:45.940 Right.
01:02:46.100 You have to show that by their own assumptions, the whole thing, to be trans,
01:02:53.620 to go from male to female, relies on the existence of a gender binary.
01:02:57.200 Yeah.
01:02:57.480 You couldn't go from male to female unless there was a male and a female to go from and
01:03:01.000 to go to.
01:03:01.640 Of course.
01:03:01.900 Yeah.
01:03:02.060 Right.
01:03:02.280 So I have no, if someone has gender dysphoria and says, this is the only way that I can
01:03:07.200 live, we can have a discussion about that.
01:03:09.660 And I would probably be okay with that.
01:03:11.080 If someone says, but you have to actually believe now that I am this thing that I claim
01:03:15.160 to be, I can say, no, there's a biological reality here that undergirds this.
01:03:20.800 Right.
01:03:21.240 So when someone moves on you with an ad hominem, to immediately expose the tactic for what it
01:03:27.060 is, is of absolutely paramount importance.
01:03:29.740 Expose the Kafka trap, expose the circular reasoning, expose the argument via insinuation,
01:03:37.140 expose the unjustified inferences at all times, everywhere, always.
01:03:42.280 When they say you're misinterpreting me and they want to say, well, you have to read this
01:03:46.000 4,000 page book.
01:03:47.040 You immediately say, if you're demanding charitability from me reading you, then you must give me the
01:03:51.460 same charitability in the way you read me.
01:03:53.100 And if you don't, then I'm, I'm justified.
01:03:55.560 All of these social and argumentative asymmetries need to be called out immediately.
01:04:00.560 You go through them.
01:04:01.880 You can't go around them.
01:04:03.040 You can't avoid them.
01:04:03.860 You have to rip them apart.
01:04:05.040 That's a great point.
01:04:05.780 This is a very Dantean point.
01:04:07.620 I like the only way out is through that Dante, you know, to give us the vision of heaven,
01:04:11.820 he doesn't get to just jump right up and go to heaven.
01:04:14.120 He has to go down through the very bowels of hell and crawl across Satan up to purgatory
01:04:19.560 and up to heaven.
01:04:20.360 Yeah.
01:04:21.380 So then the second thing that you do is you have to take the things about what social
01:04:25.220 justice is pointing at, which give it its plausibility.
01:04:30.060 You have to be able to show how those don't justify the philosophy and the social ontology,
01:04:35.400 their social philosophy that they've created out of this, their, their critical social justice
01:04:39.740 doesn't hold together.
01:04:40.860 You have to drive that apart.
01:04:42.260 Uh, then I think that you have to, there is a very strong tradition.
01:04:48.000 I think most people hold to it.
01:04:49.380 They're kind of ducking and covering right now because all of the cultural megaphones
01:04:53.900 are owned by woke people.
01:04:55.620 Yeah.
01:04:56.000 Um, and we need people, I mean, John McWhorter went, uh, I think yesterday or the day before
01:05:01.100 on Bill Maher's show and just lit wokeness on fire and it was absolutely beautiful.
01:05:04.160 We need more of that.
01:05:05.800 Um, so I think going through, it means that we have to come through on the other side with
01:05:09.120 the version of enlightenment liberalism which says
01:05:12.720 this is a conflict resolution strategy built around democracy, human rights, property rights,
01:05:21.240 freedom, um, the power, utility, and strength of science, um, the usefulness of markets, and
01:05:27.940 then say, we're going to keep that, but we're going to adopt an understanding
01:05:36.740 of it which allows us to filter out the postmodern view of the world,
01:05:43.340 that inoculates us from that, right?
01:05:45.840 That's what we want to do.
01:05:46.960 I mean, the Wuhan lab of wokeness has unleashed the
01:05:54.500 virus and we need the vaccine now, right?
01:05:56.820 So what we're going to do is inoculate ourselves from it by understanding how these Kafka traps
01:06:01.040 work.
01:06:01.540 We're going to, we're going to bring back the logos.
01:06:03.480 We're going to bring back reason and objectivity.
01:06:05.600 We're going to bring back an understanding of the, well, though I myself may not be able
01:06:09.860 to be fully objective, we can create universities and systems of peer review, which rather than
01:06:15.340 functioning as idea-laundering factories for woke activism can actually, um, suffice to
01:06:20.560 be places where people can criticize.
01:06:23.320 And within that criticism, we can hammer out and we can, we can polish off all of this subjectivity
01:06:30.080 and get, and get to truth.
01:06:32.580 And then I think the Enlightenment liberals need to understand that human beings crave
01:06:38.160 meaning and that the Enlightenment liberal framework, the Enlightenment liberal framework
01:06:44.900 is to create conflict resolution within society, within science, within truth and within politics.
01:06:51.500 But it does not create meaning.
01:06:54.160 Well, so how do we create meaning?
01:06:55.760 That's all I'm interested in.
01:06:57.260 That's what, this is why I kind of am being somewhat derisive of the Enlightenment liberal
01:07:00.900 thing.
01:07:01.340 I want the meaning, darn it.
01:07:03.160 Give me a system with meaning.
01:07:05.020 I mean, if we would just want to talk about meaning, that's a five, that's a quick five
01:07:07.620 minute thing.
01:07:08.640 The meaning, the meaning, the meaning of, of life is, is found in the drinking of woke tears.
01:07:14.820 No, I actually think that a lot of people who've been swept up in wokeness have been swept up
01:07:22.400 because they've been drinking it out of the culture.
01:07:24.340 I'll make an even more controversial claim, and you can edit this out if you have to.
01:07:27.880 Bring it on, please.
01:07:29.220 I want it.
01:07:29.740 Donald Trump is the first postmodern president.
01:07:32.240 Yeah, I agree.
01:07:33.240 I agree.
01:07:33.680 Well, my, my, the only way I'll disagree or push back even slightly is Obama may have
01:07:39.320 anticipated that.
01:07:40.460 Obama, Obama may, I, I see your point on, on Trump.
01:07:43.980 And I'd like you to go, go into it a little bit more, but I think many of the eccentricities
01:07:50.660 of Trump, I think you see prefigured a little bit in, in Obama, but I'd like to hear more
01:07:55.500 about your point on Trump.
01:07:56.120 I think that Obama would have been, there was a wonderful essay written by James Lindsay
01:08:01.360 talking about this exact thing, about different eras.
01:08:04.760 So you have the initial era, then you have the reconstruction era.
01:08:08.260 Then you have, for example, okay, the liberal era in the, or sorry, FDR, that era.
01:08:13.580 Then you have the neoconservative era starting with say Ronald Reagan and moving all the
01:08:17.480 way up.
01:08:17.940 And now Obama could have been a new liberal era, but that movement, Obama, I think was
01:08:24.260 a liberal.
01:08:26.040 Okay.
01:08:26.920 But Obama started getting attacked and criticized as not going far enough by the up and
01:08:32.640 coming millennial class of journalists, creators, and all the rest of it.
01:08:38.100 And that, um, the postmodernism was coming.
01:08:46.980 I think Obama may have seen it and was trying to operate within it and figure out what to
01:08:50.640 do with it.
01:08:51.180 Yeah.
01:08:51.760 Right.
01:08:52.120 Especially after getting shellacked in 2010.
01:08:54.020 Yeah.
01:08:54.260 Yeah.
01:08:54.380 Um, I think he tried to figure it out.
01:08:55.940 I think, I think Trump just intuited it.
01:08:58.380 There's a, there's a little video where, where Trump says, uh, I could be presidential.
01:09:03.000 I could be presidential.
01:09:04.240 Right.
01:09:04.460 Do you remember this video?
01:09:05.520 He said, but that would be boring.
01:09:06.860 There's another video where he's talking on camera and then he kind of stops and
01:09:09.700 says, excuse me, can you step aside, out of the shot, and then goes right back
01:09:12.380 to it.
01:09:13.220 I think Trump intuited and understood.
01:09:16.800 I don't think Trump's ever read Judith Butler.
01:09:19.120 I don't think he's read Gender Trouble.
01:09:21.820 Um, I don't think that he's ever read, um, Orientalism by Edward Said or
01:09:27.300 Of Grammatology by Jacques Derrida.
01:09:29.240 I think that lucky guy.
01:09:30.580 I think, I think he sat for 11 years doing The Apprentice and saw how the media ecosystem
01:09:35.640 worked.
01:09:36.040 Yeah.
01:09:36.620 Yeah.
01:09:36.860 And then I'll make another claim.
01:09:38.880 So we're going to do this real brief.
01:09:41.300 If we're talking about Q, QAnon is postmodernism on the right.
01:09:45.340 Yeah.
01:09:46.200 Yeah.
01:09:46.720 QAnon is postmodernism.
01:09:47.900 And it's also, you could describe it as an oral tradition, right?
01:09:50.180 Because it always gets banned everywhere.
01:09:51.920 So it's just in the retelling of the story that it recreates itself.
01:09:54.660 And the old story of it only exists in the ruins of dead forms that are ossifying and
01:09:59.280 decaying away in, in internet archives in various places.
01:10:02.420 Right.
01:10:03.800 But 1980, Ed Rollins puts everything on the table and fights hard to get Ronald Reagan
01:10:11.360 elected.
01:10:11.900 Yeah.
01:10:12.180 Yeah.
01:10:12.300 Yeah.
01:10:14.560 Then who, who, who did Ed Rollins bring on?
01:10:17.420 And he brought in who?
01:10:18.040 Lee Atwater.
01:10:19.880 And Lee Atwater is the one who ran the Willie Horton ad to get George H.W.
01:10:25.300 Bush elected, right?
01:10:27.520 So you have, we're getting Reagan elected and from Reagan, we degrade a little bit to
01:10:32.420 Bush, right?
01:10:33.400 But we're still, maybe the most qualified man to ever be president is George H.W.
01:10:37.960 Bush.
01:10:38.160 He's got a great resume, that guy.
01:10:39.560 Yeah.
01:10:39.760 And, and a good man.
01:10:41.760 But the operative around them is Lee Atwater.
01:10:44.020 Now you, you fast forward to, to, um, the Bush era and it's Karl Rove and who was Lee
01:10:56.140 Atwater's chief disciple?
01:10:58.060 It's Rick Wilson.
01:10:59.860 Really?
01:11:00.620 I actually didn't know that.
01:11:02.240 Rick Wilson was taught by Lee Atwater.
01:11:05.340 Rick Wilson ran the ad against Max Cleland in 2002 to get Saxby Chambliss elected.
01:11:17.560 So Ed Rollins sold his soul to get Ronald Reagan elected.
01:11:21.380 Rick Wilson sold his soul for Saxby Chambliss, right?
01:11:25.400 But hold on, go a little further.
01:11:27.380 That ad has, Rick Wilson actually defended this.
01:11:30.500 I will afterwards, if you want, I can send you the screenshots of this because Rick left
01:11:34.000 them on the timeline and I can dig.
01:11:37.020 And he defends that ad and says, I didn't say anything false.
01:11:39.700 No, what you did is you had a picture of Max Cleland juxtaposed or turning into Osama
01:11:45.180 bin Laden.
01:11:45.780 I can't remember whether they juxtaposed or had him turned into and said that he didn't
01:11:49.260 have the courage to lead.
01:11:52.040 Max Cleland is a triple amputee from Vietnam who opposed what George H.W.
01:11:56.760 Bush was, what George W. Bush was doing in 2002 with the Iraq war on the basis of the
01:12:02.140 costs to the future and not wanting to give him a blank check.
01:12:05.200 And Rick Wilson basically came out and said, you are Osama bin Laden.
01:12:10.660 Okay.
01:12:12.000 You want to know why Trump gets off attacking a gold star mom, Rick?
01:12:15.660 You set the paradigm.
01:12:17.260 Right.
01:12:17.540 Wow.
01:12:17.880 What an amazing tidbit.
01:12:19.840 I actually had no idea about Rick Wilson's role in that.
01:12:23.400 I'll do you even one better than that.
01:12:25.520 Now, you step forward.
01:12:27.340 People are saying, well, the populism of the GOP, the racism, the racist populism of the
01:12:31.420 racist populist.
01:12:32.220 The race, yeah.
01:12:32.760 It goes back to Sarah Palin.
01:12:34.300 Who picked Sarah Palin?
01:12:35.980 Bill Kristol.
01:12:37.600 Steve Schmidt was the one who was in the campaign.
01:12:41.160 Who was in the campaign.
01:12:42.180 Steve Schmidt and John Weaver.
01:12:42.940 You know, it wasn't just Schmidt, though.
01:12:45.420 Bill Kristol and I think Charlie Sykes, too, but certainly Bill discovered her, was advocating
01:12:50.900 for her.
01:12:51.380 And then, yeah, you get Steve Schmidt and guys like Weaver, and somehow you get Sarah Palin.
01:12:56.860 The entire Lincoln project has on its hands the fingerprints of all the bad parts of the
01:13:02.900 Trump presidency.
01:13:06.040 Right?
01:13:06.480 All of those guys are absolutely responsible for all of this.
01:13:11.800 If we look at the men who the people picked, Reagan.
01:13:15.280 Yeah, yeah.
01:13:16.320 How many women was George H.W. Bush married to?
01:13:18.420 One.
01:13:21.940 How many women was George Bush married to?
01:13:23.820 One.
01:13:24.580 You know, those are the men that we're bringing in.
01:13:27.460 Right?
01:13:28.340 Mitt Romney, the straightest shooter you could possibly imagine was picked by the people.
01:13:33.060 Who are the consultants?
01:13:35.580 It's the Lee Atwaters, the Rick Wilsons, the Steve Schmidts, the John Weavers.
01:13:42.580 The rot in the GOP can be entirely traced back to the people who for years, for absolutely
01:13:50.440 for years, in the name of a certain kind of economic program and in a certain form of
01:13:57.960 foreign policy, ran ads blatantly attacking people like Max Cleland to the point where
01:14:04.440 even John McCain said that he refused to campaign with Saxby Chambliss because of that
01:14:11.360 ad.
01:14:11.760 Yeah, yeah.
01:14:12.060 And now, when Donald Trump attacks a gold star mom, which was bad, it was terrible, these
01:14:17.280 guys are, I am incandescent with rage.
01:14:20.480 I'm sorry, Max Cleland would like to have a word, sir.
01:14:23.500 Yeah.
01:14:24.080 These people have been salting the ground for this for years.
01:14:29.220 I should point out for the fact checkers, I should point out that, you know, Reagan was
01:14:34.740 remarried, but in his defense, his wife left him.
01:14:37.600 He did not want to get remarried.
01:14:38.940 He was actually very, very against it.
01:14:41.580 Yeah.
01:14:41.920 But yes, certainly he lived a very upright life.
01:14:44.920 Bush, you know, of course, Bush, that's one of the great love stories ever in politics,
01:14:49.780 Bush Sr.
01:14:50.240 And yeah, you do see a sort of decay.
01:14:54.560 And then ironically, these hypocrite GOP consultants who created the modern GOP, now they turn on
01:15:01.160 it as though their hands are totally clean.
01:15:03.460 It's totally ridiculous.
01:15:04.820 And it's all the bad stuff, too.
01:15:06.700 Yeah.
01:15:07.220 Like you trace everything bad that Trump did.
01:15:10.440 Is there any attack ad that Trump ever ran that's anywhere close to what Rick Wilson did
01:15:15.180 to Max Cleland?
01:15:16.260 Yeah.
01:15:16.400 And he defended it by saying, what in the ad did I say was false?
01:15:20.240 Um, that Max Cleland lacks courage.
01:15:22.700 I mean, this is a triple amputee and they cropped the photo so you could only see his
01:15:26.680 head and shoulders.
01:15:27.420 So that part gets left out.
01:15:29.300 Oh, we're juxtaposing different things.
01:15:32.200 Where do we see the ripping out of context and the juxtaposition?
01:15:36.200 That sounds like, hmm, sounds like postmodernism.
01:15:39.880 Sounds like there is a kind of postmodern conservatism.
01:15:42.920 In fact, we could say that we're living in, I don't know, hugely tremendous times, you might
01:15:47.220 say, because, right.
01:15:50.240 And these guys are all upset about it.
01:15:52.520 But like, let's look at Charlie, the history of Charlie Sykes.
01:15:55.180 Who did Charlie Sykes bring on board?
01:15:56.700 Who did Bill Kristol bring on board?
01:15:58.120 All of these guys mainlined the very people.
01:16:01.700 And we're left now with a group of people in the, in the GOP.
01:16:04.700 They, I'm thinking of Kinzinger and Cheney and some of these other people who, who, who
01:16:10.120 have mistaken Bush-era platitudes and talking points, um, for eternal truth.
01:16:17.560 Yeah, that's right.
01:16:18.900 That's right.
01:16:19.240 They thought, they thought that the ossified talking points of the Bush administration
01:16:22.080 were the bones of eternal truth.
01:16:23.320 And they're not.
01:16:23.940 Yeah.
01:16:24.240 Right.
01:16:24.580 This is not correct.
01:16:25.760 So this is, this is absurd.
01:16:27.860 And so to me, I look at the Lincoln project and I think if you want to know grift, don't
01:16:33.540 come at me.
01:16:34.840 Don't come at you or Ben.
01:16:36.340 Yeah.
01:16:36.620 Let's have a conversation about Rick Wilson, Steve Schmidt, John, John Weaver.
01:16:40.480 Um, Stuart Stevens, all of these guys who wanted to go blood and guts, you know, attacking
01:16:51.040 war veterans, bashing people's patriotism.
01:16:54.860 If you're not with us, you're with the terrorists, all of that stuff for years.
01:16:59.880 Then Trump comes along and says, you are fake news.
01:17:01.980 And they're just like, well, this is the enemy of the people.
01:17:03.620 We need to start the Lincoln project.
01:17:05.060 You can donate here.
01:17:06.720 Yeah.
01:17:07.080 Right.
01:17:07.480 And they'll run ads on, on morning Joe in Washington, DC.
01:17:11.160 They'll run their ads in places that, that would never have any effect on the election,
01:17:14.700 but that will milk Democrat donors out of a lot of dollars.
01:17:17.940 Right.
01:17:18.520 I mean, the utter insincerity of this.
01:17:20.460 And so there is a, a point.
01:17:23.640 And I mean, there's some good and there's some bad.
01:17:25.640 The good people who are still within, I would say, a Bush-ish kind of conservative paradigm.
01:17:32.260 I mean, people I might respect.
01:17:33.720 I think I still, I would still respect David French.
01:17:36.100 I think David French is honest, but unlike, you know, some of those other guys, David
01:17:41.140 French went to war and is a veteran.
01:17:44.620 Yeah.
01:17:44.800 You know, I really like David personally.
01:17:47.340 He and I have always had a great personal relationship.
01:17:49.820 Some of the things he has written recently, I find it difficult to take a charitable read
01:17:54.880 on, but, uh, but yeah, he's a nice guy.
01:17:58.480 I, I, you know, but you, you, you can, you can read David French and say, I disagree, but
01:18:03.740 I don't think that David French is running an entirely cynical operation, but I think
01:18:07.360 that some of the other guys are being entirely cynical.
01:18:10.520 I mean, yeah, yeah, that's fair.
01:18:12.720 You know, I, I was, I was getting made fun of by Jon Stewart when I wrote Liberal Fascism
01:18:17.540 years ago.
01:18:18.140 What have you done?
01:18:19.020 And it's like, I'm sorry.
01:18:21.180 That was the interview where I first realized Mr. Goldberg, Mr. Jonah Goldberg, I'll say
01:18:24.940 your name because you deserve it.
01:18:26.540 Um, that was the first interview where I realized that there are conservative charlatans because
01:18:31.900 Jon Stewart exposed you for what you were back in what, 2009.
01:18:35.820 It's a joke.
01:18:36.520 It's absurd.
01:18:37.520 And so when they say that the conservative movement has been intellectually hollowed out,
01:18:41.440 we might want to ask who did that.
01:18:43.580 Who was in charge for the last 30 years?
01:18:45.360 Well, you know, I, I love this point too.
01:18:47.400 I guess this sort of ties it in together, uh, which I'm sure my producers will be happy
01:18:51.360 about so they can go to lunch. You know, what you're saying is so important for
01:18:57.740 understanding, you know, the broader culture
01:19:04.080 and we can knock the left and how the left brought us here, but also understanding what
01:19:08.000 that means for our side, how our side has indulged this, how we represent this now.
01:19:11.940 And, and for those who say the awful Trump era people, they have, they have betrayed
01:19:17.640 true conservatism, capital T, capital C, trademark over the E of, you know, 2006 or
01:19:23.600 whatever, uh, you think, well, how did we get here?
01:19:26.860 It didn't, this didn't just spring out of the, the, the air one day, you know, alone in
01:19:31.440 a forest.
01:19:32.200 This is a consequence of, of other ideas and other actions and, and, uh, uh, uh, a certain
01:19:38.300 ideology, you know, that perhaps has reached the end of the line.
01:19:40.520 And I think there's a group of conservative people and you touched on this a little bit
01:19:44.060 who've said, Hey, if the left is doing this and it's working, we should indulge it too.
01:19:47.180 Yeah.
01:19:48.300 And this is going to be part of the book I'm writing, which I am writing on, on the tactics,
01:19:52.660 which I explain how this works.
01:19:54.680 And that's not going to work.
01:19:56.340 Cause if you step onto the postmodern ground with the people who are postmodern, they're
01:19:59.200 native to it, right?
01:20:00.280 This is like having a, uh, a battle of wordplay with a native speaker
01:20:05.860 of a language
01:20:06.640 that you just learned yesterday, right?
01:20:08.340 Like if I learned Spanish tomorrow and I tried to get into a poetry contest with a, with a
01:20:13.120 native speaking Spaniard, that's going to not go real well unless we're doing free verse.
01:20:17.500 But, you know, this isn't going to go well for me.
01:20:20.500 Maybe if we're doing a haiku and I only need eight words or something, but I'm going to lose.
01:20:24.260 If I learned how to play hockey yesterday and I step on the ice against Connor
01:20:29.740 McDavid, this is not going to go well for me.
01:20:32.300 So for us to adopt those tactics would be bad. The thing that we ought to be doing is adopting the tactics
01:20:39.480 of the enlightenment liberal.
01:20:41.880 And when I say the tactics of the enlightenment, sorry, the principles of the enlightenment liberals.
01:20:46.900 I prefer, yeah, I prefer the tactics of the Spanish inquisition, but I agree.
01:20:51.020 So some of the principles of the enlightenment are fine.
01:20:53.500 We need to bring the enlightenment principles in, and then we need to have tactics
01:20:57.740 that pull them onto that ground.
01:20:59.040 Because once you pull them onto the ground of objectivity, their houses crumble.
01:21:02.760 Yeah, that's true.
01:21:03.480 And I suppose then we can pull them back even further to the glories of Europe,
01:21:07.920 Christian unity, the high church, but we'll have to save that topic for another time.
01:21:12.520 Wokal Distance, everyone go follow Wokal.
01:21:15.340 I'm sure they've already followed you after your very illuminating thoughts, but you can follow them
01:21:19.600 at W-O-K-A-L underscore distance, at Wokal Distance.
01:21:25.020 And stay tuned for that book with words.
01:21:28.360 And, having written a book with words and a book without words, it is a much tougher
01:21:31.620 undertaking to do one with words.
01:21:33.040 So I commend you for that.
01:21:34.900 Wokal, where else can people find you?
01:21:37.240 So right now it's just the Twitter, Wokal underscore distance.
01:21:41.060 That's my Twitter handle.
01:21:42.400 That's where you'll find me.
01:21:43.460 There it is.
01:21:43.940 I occasionally do interviews on Benjamin Boyce's podcast on YouTube.
01:21:47.580 So if you search for Benjamin Boyce on YouTube, he's got a lot of wonderful interviews.
01:21:51.660 He interviews some really great people.
01:21:54.300 Yeah, that's where you can find me.
01:21:55.700 I appreciate you having me on.
01:21:58.440 Absolutely.
01:21:59.000 We're going to have you back.
01:21:59.840 It was fabulous.
01:22:00.720 I'm so glad you came on.
01:22:02.260 You're extremely intelligent, and, you know, I try not to ever
01:22:08.140 compliment Canadians.
01:22:09.620 It just goes against my nature to say nice things about America's hat.
01:22:13.260 But I will in your case.
01:22:15.960 Well, as a Canadian, being polite, it would be untoward of me to accept it.
01:22:20.540 Fair point.
01:22:21.280 Wokal,
01:22:21.660 thank you so much for coming on.
01:22:23.240 Thank you.
01:22:24.000 I appreciate it.