The Jordan B. Peterson Podcast - March 21, 2024


433. Streaming, Politics, & Philosophy | Destiny (Steven Bonnell II)


Episode Stats

Length: 2 hours and 3 minutes
Words per Minute: 199.7
Word Count: 24,631
Sentence Count: 1,581
Misogynist Sentences: 5
Hate Speech Sentences: 23


Summary

In this episode of The Jordan B. Peterson Podcast, I talk to American Twitch streamer, debater, and political commentator Steven "Destiny" Bonnell II. We discuss how he rose to prominence as a voice on the left, the differences between left- and right-wing worldviews, the dangers of political ideology, and how streaming and video games shaped his approach to political debate. This episode is sponsored by Daily Wire Plus. Dr. Jordan B. Peterson's new series on depression and anxiety could be a lifeline for those battling these conditions. With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way and provides a roadmap toward healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward. If you're suffering, please know you are not alone; there's hope, and there's a path to feeling better. Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson's new series on depression and anxiety, and take the first step toward the brighter future you deserve.


Transcript

00:00:00.960 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.780 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:57.420 Hello, everyone. I'm here today talking to Steven Bonnell, known professionally and online as Destiny.
00:01:11.220 He's an American streamer, debater, and political commentator.
00:01:15.280 He really came to my attention, I would say, as a consequence of the discussion he had with Ben Shapiro and Lex Fridman,
00:01:22.080 and I decided to talk to him, not least because it's not that easy to bring people who are identified,
00:01:29.100 at least to some degree, with the political beliefs on the left into a studio where I can actually have a conversation with them.
00:01:35.680 I've tried that more often than you might think, and it happens now and then, but not very often.
00:01:39.980 So today we talk a lot about, well, the differences between the left and the right,
00:01:45.720 and the dangers of political ideology per se, and the use of power as opposed to invitation,
00:01:50.920 and all sorts of other heated, often heated and contentious issues.
00:01:58.360 And so you're welcome to join us, and I was happy to have the opportunity to do this.
00:02:03.920 So I guess we might as well start by letting the people who don't know who you are get to know who you are with a little bit more precision.
00:02:13.220 So why have you become known, and how has that developed?
00:02:18.440 It's a pretty broad question.
00:02:20.760 I think I started streaming around 15 years ago when it wasn't really a thing yet.
00:02:25.580 There were a few people that did it.
00:02:27.300 I started early on.
00:02:28.800 I was a, well, I guess back then you weren't a professional gamer yet because the game had just started to come out.
00:02:32.820 But there was a game called StarCraft II, and I streamed myself playing that game.
00:02:36.680 I was a pretty good player.
00:02:37.700 It was pretty entertaining to watch.
00:02:39.200 And then I kind of grew over, I guess, maybe the next seven years just streaming that people would watch.
00:02:46.720 Streaming on YouTube?
00:02:48.080 Well, back then I started on a website called Livestream.
00:02:51.060 Then I switched to Ustream.
00:02:52.240 Then I switched to a site called JustinTV.
00:02:54.080 And then that turned into Twitch.TV.
00:02:57.220 So after streaming there for like seven or eight years, I was a semi-professional StarCraft II gamer.
00:03:01.640 That game kind of came and went.
00:03:03.320 But I had a lot of other interests.
00:03:05.000 Around 2016, I started to get more involved in the world of politics,
00:03:08.500 as kind of a left-leaning figure.
00:03:10.120 Because of my background in like e-sports and internet gaming and internet trash talk, I had more of a kind of like a combative attitude.
00:03:17.920 And that was kind of rare for left-leaning people at the time.
00:03:20.180 So that's basically where my early political popularity came from.
00:03:23.280 I think from like 2016 to 2018 was debating right-wing people.
00:03:26.560 So was there a game-like element to the debating, do you think?
00:03:29.640 And is that part of why that morphing made sense?
00:03:34.960 No, I wouldn't say so.
00:03:36.460 I mean, if you get really reductionist, everything in life is kind of a game.
00:03:40.760 But that's not very satisfying.
00:03:42.520 I think I grew up like very argumentative.
00:03:45.020 My mom is from Cuba.
00:03:46.480 So my family was like very conservative.
00:03:48.220 And then I grew up like listening to the news all day, listening to my mom's political opinions all day.
00:03:52.340 And then I argued with kids in high school and everything.
00:03:54.020 And I've always been kind of like an argumentative, type A, aggressive personality.
00:03:57.200 So I think that probably lent itself well to the political stuff in 2016.
00:04:01.520 Was that useful in gaming?
00:04:04.600 That personality?
00:04:06.440 In some ways, yeah.
00:04:07.740 In some ways, no.
00:04:09.060 I don't know directly for the games itself.
00:04:11.140 I don't know how much it necessarily mattered.
00:04:13.120 But for all the peripheral stuff, in some ways it was really beneficial.
00:04:16.140 I could kind of like cut out my own path.
00:04:17.620 And I could be very unique.
00:04:18.440 And I could kind of be on my own.
00:04:19.900 In some ways it was very detrimental.
00:04:21.100 I'm very, I can be very difficult to get along with.
00:04:24.500 And I'm very much kind of like, I want to do this thing.
00:04:26.580 And if you try to tell me what to do, I don't want to have like a sponsor or a team or anybody kind of with a leash on me.
00:04:31.480 So, yeah.
00:04:32.360 I guess it worked out.
00:04:32.920 It's interesting because that, the temperamental proclivity that you're describing, that's associated with low agreeableness.
00:04:40.280 And generally, well, and that's more combative.
00:04:42.800 It's more stubborn.
00:04:43.760 It's more implacable.
00:04:45.080 It's more competitive.
00:04:48.240 The downside is that it's more skeptical.
00:04:51.060 It can be more cynical.
00:04:52.560 It can be less cooperative.
00:04:54.140 But generally, a temperament like that is not associated with political belief on the left.
00:05:00.240 Because the leftists tend to be characterized by higher levels of compassion.
00:05:06.880 And that's high agreeableness.
00:05:09.480 So, you know, that element of your temperament, at least, is quite masculine.
00:05:12.720 And a lot of the ideology that characterizes the modern left has a much more temperamentally feminine nature.
00:05:21.780 So, all right.
00:05:23.380 So, why do you think the shift from your popularity to political commentary worked?
00:05:29.200 And you said that started about 2016.
00:05:31.820 And why do you think that shift happened for you, like, in terms of your interest?
00:05:36.640 I think I've always been interested in a lot of things.
00:05:38.860 Like, I grew up with a very strong political bend.
00:05:41.100 And it was conservative until I got into my streaming years.
00:05:44.140 Probably five or six years of streaming.
00:05:45.560 I slowly kind of started to shift to the left.
00:05:48.720 I would say that, I guess, in around 2016, when I saw all of the conversations going on with the election and with all the issues being talked about,
00:05:58.460 I felt like the conversations were very low quality.
00:06:01.080 And in my naivety, I thought that maybe I could come in and boost the quality, at least in, like, my little corner of the internet to have better conversations about what was going on.
00:06:09.080 And so, that was my, basically, my injection point into all of that was, yeah, fighting about those political issues and then arguing with people about them, doing research and reading and all of that, yeah.
00:06:18.800 And so, did you do that by video to begin with as well?
00:06:22.140 Yeah, it was all streaming, yeah.
00:06:23.140 It was all streaming.
00:06:24.020 And so, I presume you built an audience among the people who were following you as a gamer first and then that started to expand?
00:06:30.480 Is that correct?
00:06:31.100 Basically, yeah.
00:06:32.760 Without getting too much into, like, the business or streaming side of things, basically, actually, this probably carries over to, basically, to all media, I would imagine, is you've got people that will watch you for special events.
00:06:43.920 So, maybe you're, like, a commentator of the Super Bowl or maybe you're hosting, like, a really huge event.
00:06:48.540 Then you've got people who will watch you every time you're participating in your area of expertise.
00:06:52.840 So, for me, that's, like, a particular game I might be playing.
00:06:55.900 It might be when you're on, like, a particular show or something that people watch you for.
00:06:59.960 And then the fundamental fan, like, the best fan that you're converting to the lowest and most loyal viewer, I guess, is somebody that's watching you basically no matter what you're doing.
00:07:08.940 And these are the people that will follow you from area to area.
00:07:11.040 And I think because of the way I did gaming and I talked about a lot of other stuff, whether it was politics, science, current events, whatever, I had a lot of loyal fans that kind of followed me wherever I went.
00:07:19.480 So, quite a few of them stuck with me from StarCraft.
00:07:20.900 So, you've established a reputation.
00:07:21.820 Yeah.
00:07:22.140 So, how would you characterize your reach now?
00:07:24.460 How would you quantify it?
00:07:26.980 I think my—well, can you be more precise?
00:07:29.380 How many people are watching a typical video that you might produce?
00:07:33.800 And what are you doing for subscribers, say, on YouTube and total—any idea about total reach?
00:07:39.760 Yeah.
00:07:40.120 Well, I mean, I guess my subscribers on YouTube, I think that's around 770,000 on my main channel.
00:07:46.340 I think I probably do, between all three channels, I think around 15 to 20 million views a month.
00:07:51.880 And then I live stream to anywhere from 5,000 to 15,000 concurrent viewers a day for hopefully around eight hours a day.
00:07:59.620 Yeah.
00:08:00.600 Okay.
00:08:01.140 Okay.
00:08:01.420 So, you have quite a substantial reach.
00:08:03.640 And so, you said that initially you were more conservative-leaning, but that changed—okay.
00:08:10.860 What did it mean that you were more conservative-leaning, and how did that—how and why did that change?
00:08:15.620 When I said I was conservative-leaning and I was writing articles for my school newspaper defending George Bush and the Iraq War, I was, like, very much like—I think it's, like, an insult now when people say, like, neocon.
00:08:27.540 But I was, like, very much like a conservative, a Bush-era conservative, so supported big business, supported traditional—all of the conservative, I guess, like, foreign policy, you know, hawkish foreign policy, for whatever that meant as, like, a 14, 15-year-old.
00:08:41.400 Right, right, right.
00:08:42.300 There was the whole Elian Gonzalez incident that was very big for Cuban Americans, where there was a Cuban boy that tried to come to the United States with several other people and his mother, and their raft, I guess, crashed or something happened.
00:08:54.360 I think his mom died, and some other people died, and there was a huge debate on whether or not to send him back to Cuba, and Clinton ended up sending him back to Cuba, and I know that my mom was super irritated and all that, to say the least.
00:09:05.140 And then once I hit college, I think I supported Ron Paul in 2000—would have been 2008.
00:09:10.680 I was a big Ron Paul libertarian guy in high school when I went from—I went to a Catholic Jesuit high school, and I kind of became atheist in that process.
00:09:19.060 I started reading Ayn Rand, so I was very, very, very, very, very conservative.
00:09:24.360 But on the libertarian end, it sounds like.
00:09:28.060 Yeah, I would say so, yeah.
00:09:29.060 Yeah.
00:09:29.260 Initially on the, like—
00:09:30.760 That makes more sense in relationship to your temperament.
00:09:33.560 Sure, maybe, yeah.
00:09:34.260 Libertarian, yeah, yeah.
00:09:35.260 Initially it was, like, Christian conservative, and then it became, like, libertarian conservative.
00:09:38.180 Without—my life kind of took, like, a wacky path, and then as I started working, I kind of had to drop out of school, I was working, and then I got into streaming.
00:09:48.140 And once I started streaming—I had a son—basically around the first year I started streaming—as I started to go through life, and I went from kind of being in this, like, working poor position to making a lot of money, especially through the lens of my child.
00:10:01.840 I saw how different life was when I had more money versus less, and I guess, like, the differences between what was available to me and then my child, as I made more money, while I was really wealthy versus not as wealthy, it kind of started to change the way that I—
00:10:17.840 So you got more attuned to the consequences of inequality?
00:10:20.840 Basically, I would say, yeah, yeah, yeah, yeah.
00:10:23.840 Okay, and so that—okay, how did that lead you to develop more sympathy for left-leaning ideas, particularly?
00:10:31.840 I guess my core beliefs have never really changed, but I think the way that those become applied kind of change.
00:10:39.840 So much the same way that you might think that everybody deserves a shot to go to school and have an education, that might be, like, a core belief where, as a libertarian or conservative, I might think that as long as a school is available, everybody's got the opportunity to go and study.
00:10:54.840 But maybe now as, like, a liberal or progressive or whatever you'd call me, I might say, okay, well, we need to make sure that there is enough, you know, maybe, like, food in the household or some kind of funding program to make sure the kid can actually go to school and study, basically.
00:11:06.840 So, like, the core drive is the same, but I think the applied principle ends up changing a bit based on what you think—
00:11:13.840 Right, so is your concern essentially something like the observation that if people are bereft enough of substance, let's say, that it's difficult for them to take advantage of equal opportunities even if they are presented to them, let's say?
00:11:31.840 Yeah, essentially, yeah.
00:11:33.840 And you have some belief, and correct me if I'm wrong, you have some belief that there is room for state intervention at the level of basic provision to make those opportunities more manifest.
00:11:49.180 Yeah, to varying degrees, yeah.
00:11:50.840 Okay, okay.
00:11:51.580 How—okay, so let's start talking more broadly then on the political side.
00:11:56.580 So, how would you characterize the difference, in your opinion, between the left and the conservative political viewpoints?
00:12:06.580 Oof, on a very, very, very broad level, if there's some—I would say if there's some, like, good world that we're all aiming for, I think people on the left seem to think that a collection of taxes from a large population that goes into a government that's able to precisely kind of dole out where that tax money goes, you're basically able to take the problems of society.
00:12:35.580 You're able to scrape off, hopefully, a not-too-significant amount of money from people that can afford to give a lot of money.
00:12:42.380 And then through government programs and redistribution, you target that—that—those taxes, essentially, to people that kind of need whatever bare minimum to take advantage of opportunities in society, yeah.
00:12:52.080 Okay, okay.
00:12:52.620 And then for—on the conservative end, I guess a conservative would generally think that, why would the government take my money?
00:12:59.580 I think from a community point of view, through churches, through community action, through families, we can better allocate our own dollars to our own friends and family to help them and give them the things that they need so that they can better participate in a thriving society, basically.
00:13:10.620 Okay, so one of the things that I've always found a mystery—I mean, I think there's an equal mystery on the left and on the right in this regard—is that the more conservative types tend to be very skeptical of big government.
00:13:24.360 And the leftist types tend to be more skeptical of big corporations, right?
00:13:31.500 Well, you—okay, so following through the logic that you just laid out, you made the suggestion that one of the things that characterizes people on the left is the belief that government can act as an agent of—can and should act as an agent of distribution.
00:13:46.000 Okay, okay, a potential problem for that is the gigantism of the government that does that.
00:13:51.540 Now, the conservatives are skeptical of that gigantism, and likewise, the liberals, the progressives in particular—we'll call them progressives—are skeptical of the reach of gigantic corporations.
00:14:04.660 And I've always seen a commonality in those two, in that both of them are skeptical of gigantism, and so one of the things that I'm concerned about, generally speaking, with regard to the potential for the rise of tyranny is the emergence of giants.
00:14:21.560 And one potential problem with the view that the government can and should act as an agent of redistribution is that there is an incentive put in place—two kinds of incentives.
00:14:33.820 Number one, a major league incentive towards gigantism and tyranny.
00:14:38.080 And number two, an incentive for psychopaths who use compassion to justify their grip on power to take money and to claim that they're doing good.
00:14:48.000 And I see that happening everywhere now in the name of—particularly in the name of compassion.
00:14:53.500 And it's one of the things that's made me very skeptical, in particular, about the left, and at least about the progressive edge of the left.
00:15:01.520 So I'm curious about what you think about those two.
00:15:04.340 First of all, it's a paradox to me that the conservatives and the leftists face off each other with regard to their concern about different forms of gigantism and don't seem to notice that the thing that unites them is some antipathy.
00:15:19.020 This is especially true for the libertarians, some antipathy towards gigantic structures per se.
00:15:25.080 And so then I would say, with regards to your antithesis between liberalism and conservatives, the conservatives are pointing to the fact that there are intermediary forms of distribution that can be utilized to solve the social problems that you're describing that don't bring with them the associated problem of gigantism.
00:15:43.860 And, like, this is a—it's been shocking to me to watch the left, especially in the last six years, ally itself, for example, with pharmaceutical companies, which was something I'd never saw—never thought I would see in my lifetime.
00:15:58.140 I mean, for decades, the only gigantic corporations the left was more skeptical of than the fossil fuel companies were the pharmaceutical companies.
00:16:10.080 And that all seemed to vanish overnight around the COVID time.
00:16:13.540 So I know the story.
00:16:14.800 That's a lot of things to throw at you.
00:16:16.580 But it sort of outlines the territory that we could probably investigate productively.
00:16:22.280 Yeah, so a couple things.
00:16:24.840 I would say that the current political landscape we have, I think, is less—I understand the concept of conservatives supporting corporations and liberals supporting, like, large government.
00:16:35.040 I think today the divide we're starting to see more and more is more of, like, a populist, anti-populist rise or even, like, an institutional or anti-institutional rise.
00:16:43.880 So, for instance, I think conservatives today in the United States are largely characterized with—I would say with populism in that they're supporting, like, certain figures, namely right now Donald Trump, who they think alone can kind of, like, lead them against the corrupt institutions, be them corporate or government.
00:16:59.860 I feel like most conservatives today are not as trustful of big corporations as they were back in, like, the Bush era where we would, you know, conservatives would champion, you know, big corporations.
00:17:08.980 Yeah, I think that's right.
00:17:09.480 That's a strange thing because it makes the modern conservatives a lot more like the 60s leftists.
00:17:16.860 Potentially, yeah.
00:17:18.580 I mean, that brings us into the issue, too, of whether the left-right divide is actually a reasonable way of construing the current political landscape at all, and I'm not sure it is, but—
00:17:27.800 Going online without ExpressVPN is like not paying attention to the safety demonstration on a flight.
00:17:32.880 Most of the time, you'll probably be fine, but what if one day that weird yellow mask drops down from overhead and you have no idea what to do?
00:17:41.120 In our hyper-connected world, your digital privacy isn't just a luxury.
00:17:44.920 It's a fundamental right.
00:17:46.220 Every time you connect to an unsecured network in a cafe, hotel, or airport, you're essentially broadcasting your personal information to anyone with the technical know-how to intercept it.
00:17:55.440 And let's be clear, it doesn't take a genius hacker to do this. With some off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords, bank logins, and credit card details.
00:18:06.120 Now, you might think, what's the big deal? Who'd want my data anyway?
00:18:09.780 Well, on the dark web, your personal information could fetch up to $1,000.
00:18:14.200 That's right, there's a whole underground economy built on stolen identities.
00:18:18.480 Enter ExpressVPN. It's like a digital fortress, creating an encrypted tunnel between your device and the internet.
00:18:24.500 Their encryption is so robust that it would take a hacker with a supercomputer over a billion years to crack it.
00:18:30.580 But don't let its power fool you. ExpressVPN is incredibly user-friendly.
00:18:34.740 With just one click, you're protected across all your devices.
00:18:37.760 Phones, laptops, tablets, you name it.
00:18:39.940 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop.
00:18:44.080 It gives me peace of mind knowing that my research, communications, and personal data are shielded from prying eyes.
00:18:49.800 Secure your online data today by visiting expressvpn.com slash jordan.
00:18:54.800 That's E-X-P-R-E-S-S-V-P-N dot com slash jordan, and you can get an extra three months free.
00:19:01.260 ExpressVPN dot com slash jordan.
00:19:03.060 Right now, it kind of is, but only because so many conservatives are following Trump.
00:19:10.400 So, like, your populist, anti-populist thing kind of maps on kind of cleanly to the left and right.
00:19:15.500 It doesn't work with progressives, though, or the far left, because they're also anti-large everything.
00:19:20.480 So, in a surprising way, on very, very far left people, you might find them having a bit more in common with kind of like a MAGA Trump supporter than like a center left liberal.
00:19:29.800 So, for instance, like both of these groups of people on the very far left will be very dovish on foreign policy, probably a little bit more isolationist.
00:19:36.360 They're not a big fan of like a ton of immigration or a ton of trade with other countries.
00:19:40.600 They might think that there's a lot of institutional capture of both government and corporations.
00:19:44.920 So, both the MAGA supporters and the far, far left might think that corporations don't have our best interest at heart and that the government is corrupt and captured by lobbyists.
00:19:53.080 Like, yeah, you'll see a lot of overlap there.
00:19:55.060 Right.
00:19:56.200 I think that sometimes there's a couple things.
00:19:58.820 One, this is something I feel like I've discovered.
00:20:00.860 People have no principles.
00:20:02.120 I think that people are largely guided by whatever is kind of satisfying them or making them feel good at the time.
00:20:07.920 I think that's a really important thing to understand because people's beliefs will seem to change at random.
00:20:12.760 If you're trying to imagine that a belief is coming from some underlying principle or is governed by some internal, you know, like moral or reasonable code or whatever, I think generally there are large social groups and people kind of follow them along from thing to thing, which is why you end up in strange worlds sometimes where, you know, like the position on vaccines and being an anti-vaxxer might have been seen as something, you know, 10 years ago as kind of like a hippie leftist.
00:20:37.380 And now maybe it's more like a conservative or it's associated more with like MAGA Trump supporters or whatever.
00:20:42.760 I think as a result of how the social groups move around.
00:20:46.060 When it comes to the, you mentioned this like gigantism thing.
00:20:49.660 That's another thing where I'm not sure if people actually care about gigantism or if they're using it as a proxy for other things that they don't like.
00:20:56.380 Like I could totally imagine.
00:20:57.640 Well, I care about it.
00:20:58.780 Sure.
00:20:59.040 Yeah, you might.
00:20:59.560 Yeah, sorry.
00:21:00.100 Just in general.
00:21:00.940 That's okay.
00:21:01.340 Because like I could imagine somebody saying that like they don't trust like a large government.
00:21:06.240 They think there's too much, you know, prone to tyranny or something like that, but also be supportive of an institution like the Catholic Church, which is literally, you know, one guy who is a direct line to God.
00:21:13.860 Right, but they can't tax.
00:21:15.600 Well, I mean there's…
00:21:16.200 And they don't have a military.
00:21:17.920 That is…
00:21:18.240 And they can't conscript you.
00:21:19.860 True, yeah.
00:21:20.560 And they can't throw you in jail.
00:21:22.200 That is true, yeah.
00:21:23.080 I mean…
00:21:23.600 Well, those are major.
00:21:24.800 Those are major and significant.
00:21:26.120 I mean, I get the overlap.
00:21:27.620 Don't get me wrong.
00:21:28.320 Sure, but I'm saying like even if you had a local government, like a local, like if you had a state government or a tribe, usually they've got some form of enacting punishment.
00:21:34.320 It'll be sometimes more brutal, but they can throw you in jail.
00:21:37.280 Conscription hasn't existed in the U.S. since the Vietnam War.
00:21:40.500 Yet.
00:21:40.860 I mean, yet.
00:21:41.820 Yeah, true, yes.
00:21:42.660 Yeah, true.
00:21:44.560 So, yeah, I think that I guess when I look at…
00:21:47.680 So, this is…
00:21:48.280 Oh, yeah, yeah, go ahead.
00:21:49.100 Well, let's go back to the redistribution issue.
00:21:51.700 I mean, we pay 65% of our income at, say, upper middle class, middle class to upper middle class level in Canada.
00:22:04.480 It isn't obvious to me at all that that money is well used.
00:22:07.720 In fact, quite the contrary.
00:22:09.640 In my country now, our citizens make 60% of…
00:22:14.300 They produce 60% of what you produce in the U.S.
00:22:16.820 That's plummeted over the last 20 years as state intervention has increased.
00:22:21.400 I'm not convinced that the claim that the interests of people who lack opportunity are best served by state intervention.
00:22:34.320 And there's a couple of reasons for that.
00:22:36.460 I mean, first of all, I'm aware of the relationship between inequality and social problem.
00:22:42.480 So, there's a very well-developed literature on that, and it essentially shows that the more arbitrary…
00:22:49.320 The broader the reach of inequality in a political institution of any given size, the more social unrest.
00:22:57.580 So, where all people are poor, there isn't much social unrest.
00:23:01.160 And where all people are rich, there isn't much social unrest.
00:23:03.700 But when there's a big gap between the two, there's plenty.
00:23:06.400 And that's mostly driven by disaffected young men who aren't very happy that they can't climb the hierarchy.
00:23:12.940 There are barriers in their way.
00:23:15.140 And so, there is reason to ameliorate relative poverty.
00:23:18.600 The problem with that, to some degree, is that most attempts to ameliorate relative poverty tend to increase absolute poverty.
00:23:25.800 And they do it dramatically.
00:23:27.620 And the only solution that we've ever been able to develop to that is something approximating a free market system.
00:23:32.500 I wouldn't call it a capitalist system, because I think that's capture of the terminology by the radical leftists.
00:23:39.580 It's a free exchange system.
00:23:41.020 And the price you pay for a free exchange system is you still have inequality.
00:23:45.500 But the advantage you gain is that the absolute levels of privation plummet.
00:23:49.980 And I think the data on that are…
00:23:51.500 I think they're absolutely conclusive.
00:23:54.520 Especially…
00:23:55.080 And that's been especially demonstrated in the radical decrease in rates of poverty since the collapse of the Soviet Union in 1989.
00:24:02.160 Because we've lifted more people out of poverty in the last four decades than we had in the entire course of human history up to that date.
00:24:08.520 And that's not least because the statist interventionist types who argued for a radical state-sponsored redistribution lost the Cold War.
00:24:18.540 Right?
00:24:19.000 And that freed up Africa to some degree, and certainly the Southeast Asian countries, to pursue something like a free trade economy.
00:24:26.740 And that instantly made them rich.
00:24:30.280 Even China.
00:24:32.040 So, well, so that's an argument, let's say, on the side of free exchange.
00:24:37.720 But it's also an argument, a two-fold argument, pointing out how we ameliorate absolute poverty, which should be a concern for leftists, but doesn't seem to be anymore, by the way.
00:24:47.380 And also an argument for the maintenance of a necessary inequality.
00:24:52.640 Like, I'm not sure that inequality can be decreased beyond a certain degree without that decrease causing other serious problems.
00:25:00.100 And we can talk about that, but it's a complicated problem.
00:25:03.900 Yeah.
00:25:04.260 For one point of clarification, when you say leftist, what do you mean by that?
00:25:08.000 Well, I was going with your definition.
00:25:11.280 Like, essentially, the core idea being something like the central problem being one of relative inequality and distribution of resources, and the central solution to that being something like state-sponsored economic intervention.
00:25:28.000 I mean, there's other ways we could define left and right, and we can do that.
00:25:31.520 But I'll stick with the one that you brought forward to begin with.
00:25:34.840 Gotcha, gotcha.
00:25:35.460 Okay.
00:25:35.680 I only want to be clear on that because people get mad if I call myself a leftist.
00:25:41.440 Oftentimes, online or in—especially in Europe or worldwide, leftists will refer exclusively to, like, socialists or communists.
00:25:49.820 And anybody to the right of that would be considered, like, a liberal.
00:25:52.740 No, usually a fascist.
00:25:54.540 Well, depending on who you're talking about.
00:25:56.200 Very rapidly.
00:25:57.080 Yeah, yeah.
00:25:57.760 I just wanted to be clear on that.
00:25:59.240 So, I'm absolutely a pro-capitalist, pro-free market guy.
00:26:02.940 I'm never going to—
00:26:03.740 Okay, okay.
00:26:04.420 Yeah, yeah.
00:26:05.080 Okay.
00:26:05.620 Okay.
00:26:05.920 Well, that's good.
00:26:06.700 It's good to get that clear.
00:26:07.660 Why?
00:26:07.940 Yeah.
00:26:09.360 Because I would argue that when you look at, like, the fall of the Soviet Union or you look
00:26:13.560 at the failure of, like, socialist or communist regimes, I don't know if the issue there was
00:26:18.360 so much redistribution.
00:26:19.560 I think the problem—
00:26:20.240 That was one of many issues.
00:26:22.140 I don't think it was an issue at all, actually, I would say.
00:26:23.760 I think the issue was command of communists.
00:26:25.800 Wait a minute.
00:26:26.180 Wait a minute.
00:26:26.300 Yeah, go ahead.
00:26:26.820 Wait a minute.
00:26:27.360 What do you mean redistribution wasn't an issue?
00:26:29.480 What the hell do you think they did to the kulaks?
00:26:31.360 That was forced redistribution.
00:26:33.640 It resulted in the death of six million people.
00:26:36.160 So maybe I'm not understanding what you mean, but that was redistribution at its, like,
00:26:41.620 pinnacle.
00:26:42.160 Sure.
00:26:42.340 And forced redistribution.
00:26:43.880 It was brutal.
00:26:44.820 When I think of the strengths of capitalism, the ability for markets to dynamically respond
00:26:51.840 to shifting consumer demand is, like, the reason why capitalism and free market economies
00:26:56.260 dominate the world.
00:26:57.180 When you've got socialist or communist systems command economies where a government is trying
00:27:01.900 to say, this is how much this is going to cost, this is how much you're going to
00:27:04.500 produce and make, this is a failed way of managing a state economy.
00:27:08.460 Even in places where they still do it, there are always shadow economies and stuff.
00:27:11.580 There were in the Soviet Union that prop up where people try to basically ameliorate the
00:27:16.060 conditions that are resulting from said horrible command economy practices.
00:27:20.020 So I guess in a way you could argue a command economy is kind of like redistribution.
00:27:23.920 It's a form of it.
00:27:24.880 No, it's a worse problem.
00:27:26.180 If you're pointing to the fact that that's a worse problem, I'm with you 100%.
00:27:30.120 I would say that's definitely the reason why these places failed, because they just weren't
00:27:34.160 able to respond to changing conditions.
00:27:35.480 Okay, so what's the difference between a state that attempts to redistribute to foster
00:27:43.380 equality of opportunity and a command economy?
00:27:46.600 Is it a difference of a degree?
00:27:48.780 Like, are you looking at models, let's say, like the Scandinavian countries?
00:27:52.580 I wouldn't use Canada, by the way, because Canada is now, what would you call, predicted
00:27:58.880 by economic analysts to have the worst-performing economy for the next four
00:28:04.700 decades of all the developed world.
00:28:07.000 So maybe we'll just leave the example of Canada off the table.
00:28:10.460 Scandinavian countries are often the polities that are pointed to by, I would say, by people
00:28:16.340 who, at least in part, are putting forward a view of redistribution for purposes of equality
00:28:22.100 of opportunity, like you are.
00:28:23.480 But they're a strange analogy, because they're very small countries, and up till now, they
00:28:28.240 were very ethnically homogenous.
00:28:30.760 Exactly.
00:28:31.200 And that makes a big difference when you're trying to flatten out the redistribution.
00:28:35.560 Plus, they are also incredibly wealthy, which makes redistribution, let's say, a lot
00:28:39.960 easier.
00:28:40.340 So why doesn't a government that's bent on redistribution fall prey to the pitfalls of
00:28:50.440 command economy, and forced redistribution, for that matter?
00:28:54.240 How do you protect against that?
00:28:56.260 I think what you have to do, which is very, very, very difficult, is people get very ideologically
00:29:00.340 captured by both ends, and they feel very, I guess, like committed, or they feel very
00:29:04.560 allegiant to pushing certain forms of economic organization.
00:29:07.840 And I think sometimes it blinds them to some of the benefits of what exists when you incorporate
00:29:12.880 kind of multiple models, or, I mean, you'd call them mixed economies, which is really
00:29:16.220 what every capitalist economy today is.
00:29:18.040 It's some form of free market capitalism, combined with some form of, like, government
00:29:21.860 intervention to control for negative externalities.
00:29:24.180 These are the ways that all economies, even in Scandinavia and the world, work.
00:29:27.800 And I think that recognizing the benefits of both systems are the best way to make things
00:29:32.300 work, yeah.
00:29:32.700 Fair enough.
00:29:33.400 And the Scandinavian countries seem to have done a pretty good job of that.
00:29:36.860 But, like I said, they have a simpler problem to solve, let's say, than the Americans have.
00:29:41.540 Negative externalities.
00:29:43.120 That's a, you know, that's an interesting rabbit hole to wander down, because the problem I have
00:29:48.120 with negative externalities, you made a case already that, and again, correct me if I've
00:29:53.480 got this wrong, but I think that I understood what you said.
00:29:58.480 A free market, free exchange economy is a gigantic distributed computational device.
00:30:05.320 Basically, yeah.
00:30:05.900 Right, exactly.
00:30:06.860 Which, funnily enough, one of the big problems for command economies is called the computation
00:30:09.780 problem, because no central body can actually compute, you know, the incidence of...
00:30:13.520 Right, exactly.
00:30:14.460 Right.
00:30:15.060 That's not, yeah, that's a fatal problem, right?
00:30:18.700 Because it doesn't have the computational power.
00:30:20.700 It certainly doesn't have the speed of data recognition.
00:30:24.080 It doesn't have the on-the-ground agents, if all of the perception and decision-making
00:30:29.240 is centralized, right?
00:30:30.340 It's way too low resolution.
00:30:31.660 It's going to crash.
00:30:32.480 Okay, so, and I think that that's comprehensible technically, as well as ideologically.
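To make the "distributed computation" framing above concrete, here is a minimal toy sketch (not from the episode; the demand and supply curves and all numbers are invented for illustration). Each round, the price adjusts from locally observed excess demand alone, so no central body ever needs to know the underlying curves, while a centrally fixed price leaves a persistent gap:

```python
# Toy illustration of the "computation problem" (illustrative only, not from
# the episode): decentralized price adjustment vs. one centrally fixed price.

def demand(p):   # quantity buyers want at price p (assumed, downward-sloping)
    return max(0.0, 100.0 - 2.0 * p)

def supply(p):   # quantity sellers offer at price p (assumed, upward-sloping)
    return 3.0 * p

def market_price(p0=1.0, lr=0.05, steps=200):
    """Each round, price moves with locally observed excess demand --
    nobody needs to know the demand/supply functions, only the gap."""
    p = p0
    for _ in range(steps):
        p += lr * (demand(p) - supply(p))
    return p

p_star = market_price()
print(f"decentralized price ~ {p_star:.2f}, "
      f"shortage ~ {demand(p_star) - supply(p_star):.2f}")

p_fixed = 10.0  # a planner's guess, held fixed regardless of conditions
print(f"centrally fixed price = {p_fixed:.2f}, "
      f"shortage = {demand(p_fixed) - supply(p_fixed):.2f}")
```

With these made-up curves, the decentralized rule settles near the market-clearing price of 20, while the planner's fixed guess of 10 leaves a shortage of 50 units per round.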
00:30:37.740 All right, so, but having said that, with regards to externalities, all the externalities
00:30:45.880 that a market economy can't compute are so complex that they can't be determined centrally
00:30:54.280 by the same argument, and so...
00:30:56.900 There are ways to account for them, though.
00:30:58.880 Really?
00:30:59.360 That work with...
00:30:59.920 Tell me how.
00:31:01.080 Because I can't see that, because I can't see how that they can be accounted for without
00:31:06.960 this same computational problem immediately arising.
00:31:10.720 Yeah, and I understand that, and I think that's a problem sometimes of people very far on
00:31:14.820 the left when they want to deal with certain problems.
00:31:17.100 I think that they want to bring, like, heavy-handed, you know, like, things like price controls
00:31:20.760 in to say, well, we need less of this, so let's just make this cost this particular thing,
00:31:24.720 which, ironically enough, introduces a whole other set of externalities that will happen
00:31:28.300 when you get a lot of friction between where your price floor or ceiling is set compared
00:31:31.300 to what a market was set at.
00:31:32.620 But, ideally, if you're a reasonable person and you view economies as mixed economies, what
00:31:37.040 you try to do is you try to take these externalities, meaning things that aren't accounted
00:31:40.560 for with your primary system.
00:31:41.960 So, in a capitalist system, an externality might be something that causes a negative
00:31:44.940 effect, but it doesn't cost you any money.
00:31:46.720 Pollution would be a good example of that.
00:31:48.580 And rather than saying, like, well, no company can pollute this much, or, you know, if you're
00:31:52.480 a company, you have to use these things because the other things are making too much pollution,
00:31:56.400 all you do is you say, okay, well, if we've determined that, say, carbon is bad for the
00:31:59.740 atmosphere, well, we're just going to attach a little price to that.
00:32:02.660 The government is going to say that, yeah, if you pollute this much, here's the price,
00:32:05.320 and then if you want to pay for it, you can.
00:32:06.520 But that type of intervention in the economy basically allows the free market to hopefully
00:32:11.540 do its job because the government is tacked on a little bit of a price and then it tries
00:32:14.420 to account for the cost of that externality, yeah.
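As a rough sketch of the mechanism Destiny describes here, a Pigouvian tax: the government attaches a per-unit price to the externality and lets firms minimize total cost. Everything below (the options, costs, emissions figures, and tax rates) is invented for illustration, not data from the episode:

```python
# Minimal sketch of a Pigouvian tax (illustrative numbers only): instead of
# banning high-emission options outright, the government prices each ton of
# carbon, and firms pick whichever option is cheapest all-in.

# (name, production cost per unit, tons of CO2 emitted per unit) -- assumed
options = [
    ("coal",  40.0, 1.00),
    ("gas",   50.0, 0.45),
    ("solar", 70.0, 0.00),
]

def cheapest(carbon_price):
    """Return the option with the lowest cost once emissions are priced in."""
    return min(options, key=lambda o: o[1] + carbon_price * o[2])

for tax in (0, 25, 50, 100):   # dollars per ton of CO2
    name, cost, tons = cheapest(tax)
    print(f"carbon price ${tax:>3}/ton -> firms choose {name} "
          f"(all-in cost {cost + tax * tons:.2f}/unit)")
```

With these invented numbers, raising the carbon price shifts the cost-minimizing choice from coal to gas to solar without any direct mandate, which is the "let the market do its job" point being made above.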
00:32:16.280 Great.
00:32:16.760 That's a great example.
00:32:17.940 We can go right down that rabbit hole.
00:32:19.940 Carbon.
00:32:21.000 Okay.
00:32:21.400 So, first of all, one of the things I've seen, you tell me what you think about this,
00:32:26.500 something that I've seen that actually shocks me that I was interested in watching over the
00:32:31.600 last five or six years, I wondered what would happen when the left, the progressives, ran
00:32:37.100 into a conundrum.
00:32:38.840 And the conundrum is quite straightforward.
00:32:41.180 If you pursue carbon pricing and you make energy more expensive, then you hurt the poor.
00:32:46.520 And I don't think you just hurt them.
00:32:48.460 In fact, I know you don't, you just don't hurt them.
00:32:50.500 I heard a man two days ago who's fed 350 million people in the course of his life, heading the
00:32:58.260 UN's largest relief agency, make the claim quite straightforwardly that misappropriation
00:33:05.880 on the part of interventionist governments increased the rate of absolute privation
00:33:11.260 dramatically in the world over the last four or five years.
00:33:14.840 And that has happened not least because of carbon pricing, not just carbon pricing, but
00:33:20.100 the insistence that carbon per se is an externality that we should control.
00:33:25.020 Now, Germany's paid a radical price for that, for example.
00:33:27.620 So their power is now about five times as expensive as it could be.
00:33:31.380 And they pollute more per unit of power than they did 10 years ago before they introduced
00:33:35.920 these policies that were hypothetically there to account for externality.
00:33:40.340 And the externality was carbon dioxide.
00:33:42.500 I don't think that's a computable externality.
00:33:44.760 And I don't think there's any evidence whatsoever that it's actually an externality that we should
00:33:49.560 be warping the economic system to ameliorate if the cost of that, and it will be, will be
00:33:56.060 an increase in absolute privation among the world's poor.
00:33:58.820 So, and here's an additional argument on that front with regards to externalities.
00:34:04.360 You get that wrong.
00:34:05.420 And here's something you could get right instead.
00:34:07.880 If you ameliorate absolute poverty among the world's one billion poorest, they take a longer
00:34:13.940 view of the future.
00:34:15.360 And that means they become environmentally aware.
00:34:17.820 And so the fastest route to a sustainable planet could well be the remediation of absolute
00:34:24.100 poverty.
00:34:24.680 And the best route to that is cheap energy.
00:34:27.080 And we're interfering with the development of cheap energy by meddling with the hypothetically
00:34:33.460 detrimental externality of carbon dioxide.
00:34:35.860 And so it's, I think this is a complete bloody travesty, by the way, we are putting the lives
00:34:43.220 of hundreds of millions of people directly at risk right now to hypothetically save people
00:34:50.320 in the future, depending on the accuracy of our projections.
00:34:53.560 A hundred years out in these, these interventionists, these people who are remediating externalities,
00:34:59.680 they actually believe that they can calculate an economic projection one century out.
00:35:05.880 That's utterly delusional.
00:35:08.100 So, okay.
00:35:09.100 So just as a, to be clear on the first thing, I was just giving an example of how you can
00:35:12.560 use like a government intervention to make a free market track something, which, which
00:35:16.460 is what cap and trade or like carbon taxes would do.
00:35:19.640 I wasn't necessarily speaking to the strength of that individual thing, but.
00:35:22.500 Yeah, but that's a good thing to focus on.
00:35:24.360 Yeah, we can focus on that as well.
00:35:25.520 That's a major externality.
00:35:26.380 We can focus on that as well.
00:35:27.160 So, um, the first thing, uh, this is going to sound mean, uh, but I'm, you know, I'm
00:35:31.980 very realistic.
00:35:32.960 Uh, there needs to be a better argument than just it disproportionately impacts the poor.
00:35:37.360 That's not always.
00:35:38.340 That's a classic leftist argument.
00:35:39.220 It might be, but.
00:35:40.060 Right, but it's the same argument you made to justify your swing to the left at the beginning
00:35:44.000 of our discussion.
00:35:45.080 You said that you were looking at economic inequalities that disproportionately affected the poor.
00:35:50.220 So I can't see why, and I'm not trying to be mean about this either.
00:35:54.760 I can't see why you could base your argument that it was moral, it was morally appropriate
00:36:00.260 for you to swing to the left from your previous position because you saw disproportionate
00:36:04.460 effects on the poor.
00:36:05.420 And I can't use that argument in the situation that I'm presenting it right now.
00:36:10.260 Starting a business can be tough, but thanks to Shopify, running your online storefront is
00:36:14.860 easier than ever.
00:36:16.280 Shopify is the global commerce platform that helps you sell at every stage of your business
00:36:20.380 from the launch your online shop stage, all the way to the, did we just hit a million
00:36:24.240 orders stage?
00:36:25.460 Shopify is here to help you grow.
00:36:27.740 Our marketing team uses Shopify every day to sell our merchandise, and we love how easy
00:36:31.800 it is to add more items, ship products, and track conversions.
00:36:35.620 With Shopify, customize your online store to your style with flexible templates and powerful
00:36:40.180 tools, alongside an endless list of integrations and third-party apps like on-demand printing,
00:36:45.420 accounting, and chatbots.
00:36:46.660 Shopify helps you turn browsers into buyers with the internet's best converting checkout,
00:36:51.100 up to 36% better compared to other leading e-commerce platforms.
00:36:55.520 No matter how big you want to grow, Shopify gives you everything you need to take control
00:36:59.340 and take your business to the next level.
00:37:01.900 Sign up for a $1 per month trial period at shopify.com slash jbp, all lowercase.
00:37:07.880 Go to shopify.com slash jbp now to grow your business no matter what stage you're in.
00:37:13.200 That's shopify.com slash jbp.
00:37:16.660 Well, because it depends on if we think it's a condition that ought to be remedied or not.
00:37:21.340 For instance, if I walk around and I see homeless people, and I'm like, man, this is
00:37:24.840 really sad.
00:37:25.460 We ought to spend more money on homeless people because it seems like they're disproportionately
00:37:29.040 affected by their living conditions.
00:37:30.400 And then somebody says, oh, well, do you think we should still lock up rapists and murderers?
00:37:34.240 Aren't they disproportionately poor?
00:37:36.100 I'd probably say, well, yeah, we probably should.
00:37:37.440 And I go, well, isn't that hypocritical?
00:37:38.440 Well, no.
00:37:38.780 I think that rapists and murderers should probably be in jail, but we can also help the
00:37:42.600 homeless at the same time.
00:37:43.520 I think that just helping the poor isn't an argument like a blank check to do every
00:37:49.300 possible thing to satisfy poorer people.
00:37:52.160 Right.
00:37:52.420 I agree.
00:37:52.980 And it's going to depend on issue to issue.
00:37:53.680 Yeah, that's fine.
00:37:54.560 So, like, for instance, I think—
00:37:55.300 But that's because the poor—everyone who's poor is not a victim.
00:37:59.120 Some people who are poor are psychopathic perpetrators.
00:38:01.920 Sure.
00:38:02.160 And it's very useful to distinguish them.
00:38:03.820 But I was making a much more specific argument.
00:38:06.520 My argument was that the fastest way out of absolute privation for the world's bottom
00:38:11.040 billion people is through cheap energy.
00:38:13.520 Yeah, I understand what you're saying, though.
00:38:14.740 So I just worked my way towards that.
00:38:16.260 Yeah.
00:38:16.640 Yeah, I just want to say that just because something targets the poor is not necessarily
00:38:19.060 an argument against it.
00:38:20.440 It depends on how hard it targets them, and it depends on whether mass starvation is the
00:38:24.480 outcome.
00:38:25.020 The outcome is important.
00:38:26.200 That I agree with.
00:38:26.820 So, for instance, like a sin tax.
00:38:28.320 The outcome will be mass starvation in this situation.
00:38:31.420 Yeah, I'm getting to it, okay?
00:38:32.900 Sin taxes on, like, cigarettes and alcohol are always going to disproportionately impact
00:38:36.440 the poor.
00:38:37.040 Or even sugar, we might say, right?
00:38:38.560 But just because that disproportionately impacts the poor, is that a good thing or a bad thing?
00:38:41.800 These are probably the people that suffer the most from those particular afflictions.
00:38:45.540 Right, right.
00:38:46.080 So, a tax that targets—
00:38:46.920 And that is an immediate versus delayed issue, too, right?
00:38:50.080 Because the reason—
00:38:50.940 Well, is it immediate?
00:38:51.720 I mean, obesity is an immediate.
00:38:52.960 I don't think alcoholism is—
00:38:53.360 I mean, the reason for the tax is to stop people from pursuing a certain form of short-term
00:38:59.880 gratification at the cost of their longer-term well-being.
00:39:03.280 Correct.
00:39:03.700 And that exact same idea, if you believe climate models, or if you believe that we're heading
00:39:08.720 in a certain direction in terms of climate, the overall warming of the planet, would be
00:39:12.740 the same argument you would make for climate change.
00:39:15.360 Only if you believe that you could model economic development 100 years into the future.
00:39:19.620 Well, we're not trying to model—we're more concerned with modeling climate development,
00:39:22.480 economic development.
00:39:23.180 Yes, absolutely.
00:39:23.900 We are equally—
00:39:24.420 No, well, okay, tell me how I'm wrong.
00:39:26.640 I don't believe that, because what I see happening is two things.
00:39:30.120 We have climate models that purport to explain what's going to happen over a century on the
00:39:34.980 climate side, but we have economic models layered right on top of those that claim that
00:39:39.600 there's going to be various forms of disaster for human beings economically as a consequence
00:39:44.600 of that climate change.
00:39:46.220 And so, that's like two towers of Babel stacked on top of one another.
00:39:49.500 And so, because if people were just saying, oh, the climate's going to change, there'd
00:39:54.040 be no moral impetus in that.
00:39:55.620 It's the climate's going to change, and that's going to be disastrous for the biosphere and
00:40:00.120 for humanity.
00:40:01.060 But that's an economic argument as well as a climate-based argument.
00:40:04.900 It's both.
00:40:06.060 But the worst projections of what would happen if the climate took a disastrous turn are
00:40:11.380 worse than the worst projections of what is our planet going to look like economically
00:40:14.540 if we hardcore police fossil fuels.
00:40:16.760 Why would you—okay, but I don't understand the distinction between the models.
00:40:21.460 Well, the argument would be that whatever pain and suffering poor people might endure
00:40:25.780 right now because of a move towards green energy, that pain and suffering is going to
00:40:29.320 be short-term and far less than the long-term pain and suffering that comes along with—
00:40:31.720 Right, but that's dependent on the integrity of the economic models.
00:40:34.780 And the climate models as well, right?
00:40:37.260 Exactly, but in exactly the stacked manner that I described.
00:40:41.120 Like, there's nobody in 1890 who could have predicted what was going to happen in 1990
00:40:45.760 economically.
00:40:46.680 Not a bit.
00:40:48.440 Not a bit.
00:40:49.700 And if we think we can predict, like, 50 years out now with the current rate of technology
00:40:55.120 and calculate the potential impact of climate change on economic flourishing for human beings,
00:41:00.780 we're deluded.
00:41:01.880 No one can do that.
00:41:02.920 And then—and so—and it's worse—so imagine that as you do that and you project
00:41:07.740 outward, your margin of error increases.
00:41:10.080 That's absolutely, definitely the case.
00:41:13.360 And at some point, you're—certainly on the climate side, the margin of error gets rapidly
00:41:17.560 to the point where it subsumes any estimate of the degree to which the climate is going
00:41:21.560 to transform.
00:41:22.300 And that happens even more rapidly on the economic side.
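A toy way to see the "stacked models" argument being made here (purely illustrative; the drift and noise parameters below are invented, not real climate or economic figures): if each yearly step of a projection carries a small random error, the spread of outcomes widens with the horizon, and feeding one uncertain projection into a second model widens it again:

```python
import random

# Toy illustration (invented parameters): uncertainty compounds with horizon.
# Stage 1: a "climate" index drifts with noisy yearly steps.
# Stage 2: an "economic" index reacts to the climate index with its own noise.

def project(years, runs=10_000, climate_noise=0.05, econ_noise=0.05):
    outcomes = []
    for _ in range(runs):
        climate, economy = 0.0, 1.0
        for _ in range(years):
            climate += 0.02 + random.gauss(0.0, climate_noise)
            economy *= 1.0 + random.gauss(-0.01 * climate, econ_noise)
        outcomes.append(economy)
    outcomes.sort()
    # central ~90% band of simulated outcomes
    lo, hi = outcomes[len(outcomes) // 20], outcomes[-(len(outcomes) // 20)]
    return lo, hi

for horizon in (10, 50, 100):
    lo, hi = project(horizon)
    print(f"{horizon:>3}-year projection: 90% of runs fall in [{lo:.2f}, {hi:.2f}]")
```

The band of plausible outcomes widens sharply as the horizon grows, which is the margin-of-error claim Peterson is making; whether real climate and economic models degrade this badly is exactly what the two go on to dispute.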
00:41:25.020 Potentially.
00:41:25.560 But right now, I think right now, this is a disagreement on the fact of the matter, though,
00:41:28.400 not the philosophy of what we're talking about in terms of controlling externalities.
00:41:31.420 If we think—so I'm curious.
00:41:33.520 Let's say that we think we can accurately predict the climate and the economic impact,
00:41:37.000 and we think that the climate impact would be far worse if we don't account for that,
00:41:40.380 both in terms of human conditions and—
00:41:42.540 I don't believe any of those presumptions.
00:41:44.180 I think they're both false.
00:41:44.780 But then if you don't—but I mean, like, obviously, if I agreed with that factual analysis,
00:41:48.920 I would probably agree with you on the prescription here, too, right?
00:41:52.340 And if I thought, like, none of the climate models were accurate or couldn't accurately
00:41:54.600 predict anything, then I'd also say, why make any—
00:41:56.200 Well, they're not sufficiently accurate.
00:41:58.820 That's the first thing.
00:41:59.840 And because they have a margin of error, and it's a large margin of error,
00:42:03.260 they don't even model cloud coverage well.
00:42:05.520 That's a big problem.
00:42:06.480 They don't have the resolution—they don't have nearly the resolution
00:42:09.720 to produce the accuracy that's claimed by the climate apocalypse mongers.
00:42:14.080 People keep saying that, but we just got another one of the hottest years on record.
00:42:19.040 How many times are we going to have another hottest year on record?
00:42:22.520 How many times are we going to have an increase of carbon dioxide concentration in the atmosphere
00:42:25.140 before we're finally like, okay—
00:42:26.360 I don't know. And the reason I don't know is because it depends—the scientific answer
00:42:31.600 to that question depends precisely on the time frame over which you evaluate the climate
00:42:36.160 fluctuation. And that's actually an intractable scientific problem. So you might say, well,
00:42:41.260 if you take the last hundred years, this variation looks pretty dismal. And I'd say, well, what if you took
00:42:46.860 the last 150,000 years, or the last 10,000, or the last 10 million? You can't specify the damn
00:42:53.520 time frame of analysis.
00:42:54.100 No, no, no. The time frame is incredibly important. That would be like saying, look at your—you know,
00:42:58.580 let's say somebody developed cancer, and they didn't realize it, and the person has lost,
00:43:01.940 you know, 40 or 50 pounds in the past six months. And I'm just like, you look very sickly. And you're
00:43:06.820 like, okay, well, look at my weight fluctuation over the past 10 years. You say, well, that doesn't
00:43:10.000 really matter. What matters—
00:43:10.660 I'm not saying the time frame isn't important. I'm saying that it is important. I'm just saying
00:43:15.180 I don't know how to specify it.
00:43:16.780 Well, you would probably specify it with the beginning of the industrial age, right?
00:43:19.380 Why?
00:43:19.640 Because that's when carbon dioxide, which is a gas that's trapping more heat on the planet—
00:43:23.740 Why is that relevant to the time over which you compute the variability?
00:43:27.620 Because it seems like as carbon dioxide has increased in the atmosphere, the surface
00:43:31.300 temperatures have risen at a rate that is a departure from what we'd expect over 150,000
00:43:35.260 year cycles of temperature variations on the planet.
00:43:37.840 No, not with that time frame. That's just not the case.
00:43:40.340 That's absolutely the case.
00:43:41.300 No, what do you mean? You just flipped to 150,000-year time span.
00:43:45.620 What I'm saying is that if we'd expect to see temperature do this over a 150,000-year time span,
00:43:50.920 seeing it do this in a 100-year time span, that's very worrying.
00:43:54.000 You mean like Michael Mann's hockey stick, the one that's under attack right now in court
00:43:57.920 by a major statistician who claimed that he falsified his data. You mean that spike?
00:44:03.220 I'm talking about the record temperatures that have been declared for the past five years
00:44:09.060 that have also increased with the concentration of parts per million of carbon dioxide in
00:44:12.880 the atmosphere. I mean, I'm not going to tell you that every model is perfect.
00:44:17.320 They're not perfect.
00:44:18.320 Sure, but right now we're like standing in traffic with our eyes closed saying the car
00:44:22.220 hasn't hit me yet, so I don't think there are any coming. I think it's pretty undeniable
00:44:25.400 at this point that there is an impact on climate across the planet.
00:44:28.560 I think that's highly deniable. We have no idea what the impact is from. We don't know where the
00:44:33.520 carbon dioxide is from. We can't measure the warming of the oceans. We have terrible temperature
00:44:38.180 records going back 100 years. Almost all the terrestrial temperature detection sites were
00:44:46.360 first put outside urban areas.
00:44:48.600 And then you have to correct for the movement of the urban areas. And then you introduce an error
00:44:55.700 parameter that's larger than the purported increase in temperature that you're planning
00:45:00.120 to measure. This isn't data. This is guesswork. And there's something weird underneath it.
00:45:05.560 There's something weird that isn't oriented well towards human beings underneath it.
00:45:09.880 It has this guise of compassion. Oh, we're going to save the poor in the future. It's like,
00:45:14.200 that's what the bloody communists said. And they killed a lot of people doing it. And we're walking
00:45:18.820 down that same road now with this insistence that, you know, we're so compassionate that we care
00:45:24.040 about the poor a hundred years from now. And if we have to wipe out several hundred million of them
00:45:29.160 now, well, that's a small price to pay for the future utopia. And we've heard that sort of thing
00:45:34.500 before. And the alternative to that is to stop having global-level elites plot out a utopian
00:45:41.780 future or even an anti-dystopian future. And that's exactly what's happening now with
00:45:47.760 organizations like the WEF. And if this wasn't immediately impacting the poor in a devastating
00:45:54.320 manner, I wouldn't care about it that much, but it is. You know, I watched over the course of the
00:45:59.620 last five years, the estimates of the number of people who were in serious danger of food privation
00:46:04.600 rise from about a hundred million to about 350 million. That's a major price to pay for a little
00:46:10.860 bit of, what would you say, progress on the climate front that's so narrow it can't even
00:46:16.880 be measured. I don't think the increase in hungry people on the planet is because of climate
00:46:21.340 policies. Why not? Because I don't think that countries in Africa are being pushed away
00:46:26.460 from fossil fuels. I mean, most developing nations- Of course they are. They can't
00:46:29.600 even get loans from the World Bank to pursue fossil fuel development. And there's plenty of African
00:46:35.680 leaders who are screeching at the top of their lungs about that because the elites in the West have
00:46:40.380 decided that, well, it was okay for us to use fossil fuels so that we wouldn't have to starve to death.
00:46:45.360 And our children had some opportunities, but maybe the starving masses that are too large a load for
00:46:50.880 the world anyways, shouldn't have that opportunity. And that's, that's direct policy from the UN
00:46:56.320 fostered by organizations like the WEF. They're going to have to turn to renewables. Yeah. Well,
00:47:03.100 good luck with that because renewables have no energy density. Besides that, they're not renewable
00:47:08.140 and they're not environmentally friendly. And then one more thing, there's one more weird thing
00:47:12.820 underneath all of this. Okay. Well, let's say if carbon dioxide was actually your bugbear and it
00:47:17.800 was genuine, well, then why wouldn't the Greens, for example, in Africa, the progressives be agitating
00:47:23.500 to expand the use of nuclear energy, especially because Germany has to import it anyways, especially
00:47:30.320 because France has demonstrated that it's possible. We could drive down the cost of energy with low cost
00:47:35.560 nuclear and there'd be no carbon production. And then the poor people would have something to eat
00:47:39.780 because they'd have enough energy and that isn't what's happening. And that's one of the things
00:47:43.660 that makes me extremely skeptical of the entire narrative. It's like two things. The left will
00:47:50.080 sacrifice the poor to save the planet and the left will de-industrialize even at the nuclear level,
00:47:56.100 despite the fact that it devastates the poor. And that's even worse because if you devastate the
00:48:02.320 poor and you force them into a short-term orientation in any given country where starvation beckons,
00:48:08.840 for example, they will cut down all the trees and they will kill off all the animals and they
00:48:13.380 will destroy the ecosphere. And so even by the standards of the people who are pushing the carbon
00:48:19.760 dioxide externality control, all the consequences of that doctrine appear to me to be devastating,
00:48:26.580 even by their own measurement principles. We're trying to fix the environment. Well, boys and girls,
00:48:33.780 doesn't look like it's working. All you've managed to do is make energy five times as expensive and
00:48:38.820 more polluting. You were wrong. That didn't work. And I can't understand. You can help me.
00:48:45.620 That's why you're here today talking to me. I can't understand how the left can support this.
00:48:50.060 Just one quick thing. Let's say that everything you've said is true. What do you think is the
00:48:53.720 plan then? What is the goal? What is the drive? Like why push, why push obviously horrible ideas for
00:48:59.320 the planet and the poor? That's a good question. That's a good question. Well, because you're
00:49:03.820 positing it, right? So what do you think is the drive or goal? Well, I listened to what people say.
00:49:09.560 Here's the most terrible thing they say. There are too many people on the planet.
00:49:15.440 Okay. So who says that? I've heard people say that for 30 years, perfectly ordinary,
00:49:21.100 compassionate people. Well, there's too many people on the planet. And I think, well, for me,
00:49:25.520 that's like hearing Satan himself take possession of their spine and move their mouth. It's like,
00:49:31.760 okay, who are these excess people that you're so concerned about? And exactly who has to go? And
00:49:38.300 when? And why? And how? And who's going to make that decision? And even if you don't, even if you're
00:49:44.300 not consciously aiming at that, you are the one who uttered the words. You're the one who muttered the
00:49:50.380 phrase. What makes you think that the thing that possessed you to utter those words
00:49:55.300 isn't aiming at exactly what you just declared? And so that's, you know, that's a terrible vision.
00:50:01.500 But when you look at what happens in genocidal societies, and they emerge with fair
00:50:07.080 regularity, and usually with a utopian vision at hand, the consequence is the mass destruction of
00:50:13.380 millions of people. So why should I assume that something horrible isn't lurking like that right
00:50:18.660 now? Especially given that we have pushed a few hundred million people back into absolute
00:50:24.920 poverty when we were doing a pretty damn good job of getting rid of that. I just don't understand
00:50:30.500 what's happening in Germany or in the UK. Like, it's insane. Like, look, man, if they would have
00:50:37.340 got rid of the nuclear plants and made energy five times as expensive, and the consequence would have
00:50:43.400 been they weren't burning lignite coal as a backup, and their pollution per
00:50:50.040 unit of energy had plummeted, you could say, well, look, you know, we hurt a lot of poor people,
00:50:54.920 but at least the air is cleaner. It's like, no, air's worse, and everyone's poorer. So like,
00:51:01.420 explain to me how the hell the left can be anti-nuclear. I don't understand it at all.
00:51:07.360 Gotcha. All right. This is something that I brought up earlier that is concerning to me.
00:51:12.960 I feel like when people get political beliefs, I feel like what happens is, what we think happens,
00:51:18.720 what we hope happens, is you have some moral or philosophical underpinning, and then from there,
00:51:24.020 you combine this with some epistemic understanding of the world, and then you combine these two things,
00:51:28.420 you engage in some form of analysis, and through your moral view...
00:51:31.260 Yeah, it'd be nice if that was true.
00:51:32.280 Yeah, you start to apply, like, prescriptions. So maybe I'm religious, maybe I analyze society,
00:51:38.480 and I see that particular TV shows lead to premarital sex, so my societal prescription
00:51:43.420 is we should ban these TV shows, right? Ideally, this is how you would imagine this process works.
00:51:47.720 What I find happens, unfortunately, all too often is what people do is they join social groups,
00:51:52.200 and then with those social groups, they inherit something that I call, like, a constellation of
00:51:55.860 beliefs. And this constellation of beliefs, instead of rationally building on each of these,
00:52:00.560 you basically get this, like, Jenga tower that is, like, floating over a table, and every block is,
00:52:07.160 like, supporting itself, and no real part of the tower can be addressed, because you pull out one
00:52:11.780 piece, it all falls apart. So people become, like, very stuck in all of this combined constellation
00:52:17.500 stuff, and none of it is really given, like, any analysis, and you can't really push anybody from
00:52:22.860 one way or another, in terms of, like, re-evaluating any of the beliefs that are part of this
00:52:27.200 constellation. I wish I would have.
00:52:34.380 That's good. That's fine. That's right. Well, you know, there are models now of cognitive
00:52:40.380 processing, belief system processing, that make the technical claim that what a belief system does
00:52:49.060 is constrain entropy. Sure. That's not surprising at all to me. Yeah. Okay. So now, the signal
00:52:54.680 for released entropy, which would be a consequence of, say, violated fundamental beliefs,
00:53:01.100 is a radical increase in anxiety, right? And a decrease in the possibility of positive emotion.
00:53:07.000 And so people will struggle very hard against that, which is exactly the phenomena that you're
00:53:11.440 describing. Yeah. Okay. I agree with what you said. So here's—yeah, I'm not sure how
00:53:17.120 it's relevant to the issue I was pursuing. I'm getting to it. I'm getting to it. Okay, fine. Yeah.
00:53:20.540 Here's my issue. Okay. So when I'm trying to evaluate a situation, I like to think that I
00:53:26.680 have some insulation from the effects of what liberals think or what conservatives
00:53:31.320 think, because on my platform, I don't necessarily have an allegiance to a particular political
00:53:35.400 ideology. Like right now I'm like center left to progressive, but I break really hard from
00:53:39.400 progressive on certain issues. I think Kyle Rittenhouse is in the right. I think basically everything
00:53:42.680 you guys are doing with indigenous people is insane. Uh, including the complete mass grave
00:53:47.360 hoax. Uh, I think that I'm a big supporter of the second amendment. Uh, I have beliefs where I can
00:53:51.600 break from my side, you know, pretty hardcore because I am not, like, allegiant to a certain political
00:53:56.660 ideology. One thing that worries me with this constellation beliefs thing is that sometimes
00:54:00.260 when it comes to evaluating a particular policy or a particular problem, I feel like it's part of the
00:54:04.540 constellation. And sometimes it inhibits people from like taking a step back and reasonably thinking
00:54:08.720 about the issue. So when we're talking about climate change, you mentioned the WEF sacrificing
00:54:13.660 tons of people, the UN global elites, uh, five times energy costs in Germany, uh, genocidal people.
00:54:21.420 I feel like this is part of like a whole thing where it's like, okay, well, let's take a quick step back
00:54:26.160 and let's just like think rationally about this particular issue for one moment. Okay.
00:54:29.800 Well, you asked me what the motivation for anti-poor policies might be. So that's why I was trying to push that out.
00:54:34.140 Well, I did, but I got all of those things before I even asked that question. Um, because I
00:54:38.600 think it's totally possible that somebody might say, okay, well, when you put carbon dioxide in
00:54:42.580 the atmosphere, it seems to cause an increase in surface temperatures. This has been happening from
00:54:47.140 about the 1800s. And as we started to track surface temperatures, whether the thermometer is on top of
00:54:51.440 the Empire State Building or in the middle of a field, it seems like there's an average rise in
00:54:55.160 temperatures and people all around the world are observing this in some places more than others. If you live
00:54:59.300 in Seattle and your apartment building wasn't built with air conditioning units 20 years ago,
00:55:02.540 you feel that now. If you live someplace in London and you've never had an air conditioner before,
00:55:06.260 now that's not acceptable. I think that people on the ground can see that there are changes.
00:55:09.460 And I think that scientists, when they look in labs, can see changes. It might be that some models
00:55:12.980 aren't precise enough. And it might be that for reasons we don't even understand.
00:55:16.220 Well, the economic models certainly aren't precise enough.
00:55:18.780 Sure. Maybe, maybe that might be true.
00:55:20.300 Not maybe. They can't even use them to predict the price of a single stock for six months.
00:55:24.920 The economic models are not sufficiently accurate to calculate out the consequences of climate
00:55:29.540 change over a century. Not in the least. I like the comparison because economic models
00:55:34.180 can't predict individual stocks, but they do predict the rough rise of the market.
00:55:38.100 If you invest in the S&P 500, you get about- Yeah, except for the
00:55:39.700 odd cataclysmic collapse. Nope. Even with the cataclysmic collapse accounted for,
00:55:43.860 you're going to see about 7% returns on average with inflation over long periods of time.
00:55:48.260 I wouldn't call an average a very sophisticated model analogous to a climate-change model.
00:55:53.140 That's the difference between climate and weather though, right? It's that climate isn't going to tell you
00:55:56.740 what the temperature is on a given day, but it might tell you the average surface temperature
00:55:59.780 over a period of one year or 10 years. And then that's the difference between climate and weather.
00:56:03.860 Well, that's the hypothetical difference.
00:56:05.940 It is a hypothetical, but again, we're seeing more and more and more data every single year
00:56:10.660 that things are getting hotter and hotter.
00:56:12.580 Let's jump out of our cloud of presuppositions for a minute.
00:56:15.620 Sure.
00:56:16.100 Now, one of the things that-
00:56:17.620 Oh, no, wait. I'm going to avoid it, actually, because I don't want to say.
00:56:20.180 Yeah, that's fine.
00:56:20.980 There are some things that we've gotten as a result of investing in green energy that have been good.
00:56:25.780 So, for instance, the price of solar energy has dropped dramatically in the United States,
00:56:30.020 faster than anybody thought possible, such that
00:56:34.100 solar energy is, like, competitive or beating fossil fuels in certain areas.
00:56:38.020 As long as you can set the solar panels up, you're literally beating fossil fuels.
00:56:40.500 Yeah, and as long as the sun is shining.
00:56:42.580 Well, I mean, it still is, but we're not in nuclear winter yet.
00:56:44.820 No, no, but it isn't when it's cloudy, and it isn't in the winter.
00:56:46.820 That's why I said depending on where you live.
00:56:48.180 There are places—equatorial places. If you're trying to set up a solar panel in Seattle,
00:56:52.340 you might not have as much luck, or New York City might not have as much luck.
00:56:55.460 Or in Germany, true.
00:56:56.660 Or in Europe, or in Canada.
00:56:58.900 There are also other issues that are coming up that I think are obfuscating our ability to
00:57:03.780 evaluate what's being caused by green energy versus not.
00:57:06.100 When we look at energy price increases in Germany,
00:57:08.500 I think there's a similar constellation around nuclear energy, for instance.
00:57:11.300 People don't want nuclear energy because they think of nukes, and they think of nuclear meltdowns,
00:57:15.460 and they think of Chernobyl, and they think of Fukushima, and they think of atomic bombs,
00:57:19.140 and that's it, and that's stupid. And I agree with you, but nuclear energy is a totally viable
00:57:23.780 alternative to other forms of fossil fuel.
00:57:25.380 Then why does the radical left oppose it? You think it's just this—
00:57:29.140 For the same reason, the right opposes vaccines, because it sounds scary,
00:57:32.500 and it's a big thing, and they don't trust it.
00:57:34.020 Well, the right has a reason to distrust vaccines in the aftermath of the COVID debacle.
00:57:39.620 Because they were imposed by force, and that was a very bad idea.
00:57:42.580 You get to choose if you have a nuclear power plant? That's imposed by force too, no?
00:57:45.940 You don't get to choose where your energy comes from if you live in a country. You just—you
00:57:48.580 turn the light switch, and hopefully you don't have a Chernobyl that melts down in your particular
00:57:52.180 town, right? Well, you get to choose it because you can buy it or not.
00:57:56.900 That's the choice. But the negative— Nobody had a choice with the vaccines.
00:58:00.820 Nobody had a choice whether or not they lived near Chernobyl or not.
00:58:03.460 Nobody has a choice if there's a nuclear power plant— Sure, they can move away.
00:58:05.700 Well, you realize that's a move of, like, 500 miles. That's like telling conservatives when
00:58:10.500 Biden tried to do the OSHA mandate for vaccines, like, well, you can just get a different job, right?
00:58:13.620 I don't want to debate about whether or not large nuclear power plants are frightening. They are.
00:58:17.620 Sure, okay.
00:58:18.020 And there are technologies now where that's not a problem. And I think that's a
00:58:23.300 counterproductive place for our discussion to go, because I also understand why people are afraid
00:58:28.100 of it. But what I don't understand, for example, is why the Germans shut down their nuclear power
00:58:33.940 plants and the Californians are thinking of doing the same thing when they have to import power
00:58:39.060 from France anyways. Like, it's completely— Or burn coal, which is a million times worse.
00:58:43.140 Well, not just coal. Lignite. Yeah.
00:58:45.300 Right. And then with regards to these renewable power sources, they have a very—they have a number
00:58:50.020 of problems. One is they're not energy dense. They require tremendous infrastructure to produce.
00:58:57.540 They might be renewable at the energy level, but they're not renewable at the raw materials level,
00:59:02.660 so that's a complete bloody lie. They're insanely variable in their power production. And because of
00:59:07.780 that, you have to have a backup system. And the backup system has to be reliable without variability.
00:59:12.980 And that means if you have a renewable grid, you have to have a parallel fossil fuel or coal grid to
00:59:19.140 back it up when the sun doesn't shine and the wind doesn't blow, which is unfortunately very,
00:59:23.700 very frequently. And so, again—and so I'm not going to say there's no place for renewable energy like
00:59:30.420 solar and wind, because maybe there are specific niche locales where those are useful. But the logical,
00:59:37.940 what would you say, antidote to the problem of reliability, if we're concerned about carbon,
00:59:42.580 but we're really not, would be to use nuclear. And the Greens haven't been, like, flying their
00:59:48.340 bloody flags for 30 years saying, well, we could use fossil fuels for fertilizer and feed people,
00:59:54.980 and we could use nuclear power to drive energy costs down in a carbon dioxide-free manner. That seems
01:00:00.980 pretty bloody self-evident to me. And so then it brings up this other mystery that we were talking
01:00:05.460 about earlier. You know, what's the impetus behind all this? Because the cover story is, oh, we care
01:00:11.460 about carbon dioxide, which I don't think they do, especially given the willingness to sacrifice the
01:00:17.620 poor. It makes no sense to me. And I think it's relevant to the issue you brought up, which is
01:00:23.540 that people have these constellations of ideas, and there's a driving force in the midst of them,
01:00:28.900 so to speak. They're not necessarily aware of what that driving force is.
01:00:32.420 Don't we—isn't it more likely that people are either misinformed or misguided than people
01:00:37.620 are legitimately trying to depopulate the planet? I'm—look, misinformed and ignorant, that's
01:00:43.540 plenty relevant and worth considering. And stupidity is always a better explanation than malevolence.
01:00:49.380 But malevolence is also an explanation. And no, I don't think it's a better explanation because—
01:00:54.180 Why would we waste so much money sending food aid, having Bush do, you know, programs to Africa for
01:00:59.780 AIDS, having other billionaires like Bill Gates invest so much money in anti-malarial stuff? Like,
01:01:05.220 why would all the global elites be so invested in helping and killing the people here at the same time?
01:01:10.100 Well, some of it's confusion. And some of it's the fact that many things can be happening
01:01:16.020 simultaneously with a fair bit of internal paradox because people just don't know which way is up
01:01:21.300 often. But the problem with the argument—okay, so you tell me what you think about this. So,
01:01:28.420 you know, Hitler's cover story was that he wanted to make the glorious Third Reich and elevate the
01:01:33.060 Germans to the highest possible status for the longest possible period of time. Okay, but that wasn't
01:01:39.220 the outcome. The outcome was that Hitler shot himself through the head after he married his wife,
01:01:44.420 who died from poison the same day, in a bunker underneath Berlin while Europe was in flames.
01:01:51.460 Well, he was insisting that the Germans deserved exactly what they got because they weren't the
01:01:55.380 noble people he thought they were. And then you might say, well, Hitler's plans collapsed in flames,
01:02:01.060 and wasn't that a catastrophe? Or you could say that was exactly what he was aiming for from the
01:02:05.700 beginning, because he was brutally resentful and miserable, right from the time he was, you know,
01:02:10.820 a rejected artist at the age of 16. And so he was working, or something was working within him,
01:02:16.420 and something that might well be regarded as demonic, whose end goal was precisely what it
01:02:21.300 attained, which was the devastation of hundreds of millions of people, and Europe left in a smoking
01:02:26.420 ruin. And the cover story was the Grand Third Reich. And so there's no reason at all to assume that we're
01:02:32.100 not in exactly the same situation right now. I think there's a great reason to assume. I think
01:02:35.780 that Hitler's motives and everything that he was trying to do wasn't a secret. Like,
01:02:39.380 I don't think that anybody had to guess that he was incredibly anti-Semitic, that his Aryan
01:02:42.900 supremacy was going to lead to the destruction and the murder of, like, so many different people
01:02:46.420 in concentration camps. Like, none of this was a secret. It's not like he was hiding it.
01:02:49.380 He hadn't hidden it. I mean, like, he tried to maybe hide the death camps. But nobody in Germany was
01:02:54.020 wondering, like, wow, crazy that pogroms are happening against Jewish people. That's so crazy. Or, wow,
01:02:58.260 they're all being shipped—mainly the Jews—to camps to work. Like, that's kind of interesting.
01:03:02.340 Or, wow, he talks about this a lot in Mein Kampf, but maybe it's just a coincidence.
01:03:05.860 I don't think you can compare, like, Hitler to people that are worried about climate change.
01:03:09.140 The worry that I have here is— Why not?
01:03:10.500 Because if we're applying this— People thought—people in Germany thought Hitler was
01:03:14.100 perfectly motivated by the highest of benevolent needs. If I were to take this standard of evidence
01:03:19.060 and apply this lens of analysis, couldn't I say the exact same thing about the conservative
01:03:22.660 constellation of belief? They don't want to intervene anywhere in the world because they don't care
01:03:25.860 about the problems there. They're anti-immigration because they hate brown people. Trump wanted to ban
01:03:29.780 Muslims to come to the United States because he's xenophobic. Conservatives don't want to have
01:03:33.780 taxes to help the poor because they want homeless people to starve and die in the winter.
01:03:38.660 Some of that's true. And yes, you can adopt that criticism. I think the difference
01:03:43.300 with regards especially to the libertarian side of the conservative enterprise, but also to some
01:03:47.700 degree to the conservative enterprises, they're not building a central gigantic organization to put
01:03:53.940 forward this particular utopian claim. And so even if the conservatives are as morally addled as the
01:03:59.220 leftists, and to some degree that might be true, they're not organized with the same gigantism in mind.
01:04:05.300 And so they're not as dangerous at the moment. Now, they could well be, and they have been in the past,
01:04:09.860 but at the moment, they're not. And so, of course, you can be skeptical about people's motivations when
01:04:15.700 they're brandishing the moral flag. How can we say, how would we, why would we say that they're not as
01:04:19.860 concerned about the gigantism? I feel like everybody is when it's a particular thing that they care about.
01:04:25.140 You mean if, whether they would be inclined in that direction? For sure. Conservatives wield the
01:04:29.940 power of the government whenever they feel they need to, just as liberals do, right? Conservatives were very
01:04:34.100 happy to see, for instance, abortion brought back as a state-regulated thing. Look, that's a good,
01:04:37.940 that's a good objection. I think that you're correct in your assumption that once people
01:04:44.100 identify a core area of concern, they're going to be motivated to seek power to implement that concern.
01:04:51.380 I think cancel culture is a good example too. I think conservatives, prior to the 2000s,
01:04:55.780 if they could censor everything related to either LGBT stuff, or weird musical stuff,
01:05:00.340 or something they didn't want their kids to watch, conservatives would do it. But now that you see
01:05:03.300 that like liberals and progressives are kind of wielding that corporate hammer, now conservatives are very much
01:05:07.380 well, hold on, we need freedom of speech, we need to platform everybody, and now progressives are like,
01:05:10.580 well, hold on, maybe we shouldn't platform people. I've got no disagreement with those
01:05:14.500 things that you said, and I have no disagreement about your proposition that people will seek power
01:05:18.900 to impose their central doctrine. Okay, so then you might say, and so we can have a very serious
01:05:26.180 conversation about that, what do we have that ameliorates that tendency? In the United States,
01:05:32.820 we've got, hopefully, a form of decentralized government. I can't speak to Canada as much, but...
01:05:37.380 Yes, well, yes, that's true. So that's one of the institutional protections against that,
01:05:41.620 because what that does is put various forms of power striving in conflict with one another,
01:05:46.820 right? And so that's a very intelligent solution. But then there are psychological and philosophical
01:05:53.060 solutions as well. And one of them might be that you abjure the use of power, right, as a principle.
01:05:59.620 And so that, and this is one of the things that was done very badly during the COVID era,
01:06:04.820 let's say, because the rule should be something like, you don't get to impose your solution on
01:06:10.980 people using compulsion and force. There's a doctrine there, which is any policy that requires
01:06:17.620 compulsion and force is to be looked upon with extreme skepticism. Now, it's tricky, because now and
01:06:23.300 then you have to deal with psychopaths, and they tend not to respond to anything but force. And so there's
01:06:29.060 an exception there that always has to be made, and it's a very tricky exception. But look, let me tell
01:06:36.020 you a story and you tell me what you think about this, because I think it's very relevant to the
01:06:41.300 concern that you just expressed. And I don't believe that the conservatives are necessarily any less
01:06:47.060 tempted by the calling of power than the leftists. That's going to vary from situation to situation.
01:06:57.700 Though I would say probably overall in the 20th century, the leftists have the worst record in
01:07:02.020 terms of sheer numbers of people killed. So I mean, it depends on how we're quantifying it.
01:07:06.740 Not really. Okay, we'll just quantify Mao. How's that? Direct death of 100 million people.
01:07:13.860 So, you know, that's a pretty stark fact. And if we're going to argue about that,
01:07:17.300 well, then we're really not going to get anywhere.
01:07:19.860 So, and you- I'm not disagreeing that the
01:07:21.620 Holodomor happened as well, the Soviet Union and China were-
01:07:24.660 20 to 50 million people in the Soviet-
01:07:25.700 Yeah, I'm not going to-
01:07:27.700 Well, it's a horrible thing. Yeah, of course.
01:07:28.500 It's a war of-
01:07:29.540 I'm just saying, for World War II, it depends on how much of the war
01:07:32.340 dead you attribute to Nazi Germany, et cetera, et cetera. But sure, like, largely speaking, I don't think that
01:07:36.580 the left beat the right because the right wasn't trying. I don't think it's because Hitler's lack
01:07:40.660 of trying led him to kill less people than who ended up dying during the Great Leap Forward or
01:07:44.740 during the industrialization of Soviet Union.
01:07:45.940 Yes, well, I also think it's an open question still to what degree Hitler's policies were
01:07:49.940 right-wing versus left-wing. And no one's done the analysis properly yet to determine that.
01:07:53.940 Well, what do we consider-
01:07:54.660 Because it was a national socialist movement for a reason.
01:07:57.140 And the socialist part of it wasn't accidental.
01:07:59.220 Well, but the so- I mean, there were no, you know, cooperatively formed
01:08:02.420 businesses that were owned by all of the people for the people and distributed to the people.
01:08:05.940 And I don't think redistribution was high on Hitler's list of-
01:08:08.420 That's true. That's true.
01:08:09.860 It was a strange mix of totalitarian policies.
01:08:12.820 I don't think it was a strange mix. I think it was a bid to appeal to
01:08:15.540 mid-left and center-left, the KPD and the German Socialist Party by calling themselves national
01:08:19.380 socialists. I think it was very much like an authoritarian, ultra-nationalist regime
01:08:22.900 that pretty squarely fits with- people get mad if you call something far-right or far-left
01:08:26.500 because they have an attachment to the terms.
01:08:28.580 And one of the things I would have done if I would have been able to hang on to my professorship
01:08:31.860 at the University of Toronto would have been to extract out a random sample of Nazi policies
01:08:38.660 and strip them of markers of their origin and present them to a set of people with conservative
01:08:43.940 or leftist beliefs and see who agreed with them more.
01:08:47.220 And that analysis has never been done as far as I know, so we actually don't know.
01:08:51.060 And we could know if the social scientists would do their bloody job, which they don't,
01:08:55.460 generally speaking, that's something we could know.
01:08:58.420 We could probably use the AI systems we have now, the large language models, to determine to what degree
01:09:04.020 left and right beliefs intermingled in the rise of national socialism.
01:09:08.100 So that's all technically possible. And it hasn't been done, so it's a matter of opinion.
01:09:13.060 Sure. I don't necessarily disagree. But that's something you could do.
01:09:17.300 Okay, so I was going to tell you this story.
01:09:19.300 Okay, well, this has to do with the use of power. So I spent time with a group of scholars
01:09:28.260 analyzing the Exodus story in an Exodus seminar recently. And so the Exodus story is a
01:09:34.500 very interesting story because it's a, what would you say? It's an analysis of the central
01:09:42.660 tendency of movement away from tyranny and slavery. That's a good way of thinking about it.
01:09:49.700 So the possibility of tyranny and the possibility of slavery are possibilities that present themselves
01:09:57.140 to everyone within the confines of their life, psychologically and socially. You can be your own
01:10:03.140 tyrant with regards to the imposition of a set of radical doctrines that you have to abide by and punish
01:10:08.500 yourself brutally whenever you deviate from them. And we all contend with the issue of tyranny and
01:10:13.300 slavery. And there's an alternative path. And that's what the Exodus story lays out. And Moses is
01:10:19.620 the exemplar of that alternative path, although he has his flaws. And one of his flaws is that he turns
01:10:25.940 too often to the use of force. So he kills an Egyptian, for example, an Egyptian noble who has slain
01:10:33.220 a Hebrew, one of Moses' Hebrew slave brothers, and he has to leave. There's a variety of indications in
01:10:40.020 the text that he uses his staff, he uses his rod, and he uses power when he's supposed to use persuasion
01:10:48.820 and legal or verbal invitation and argumentation. And this happens most particularly, most spectacularly,
01:10:57.780 right at the end of the sojourn. So Moses has spent 40 years leading the Israelites through the desert.
01:11:04.420 And he's right on the border of the promised land. And really what that means at a more fundamental
01:11:11.460 basis is that he's at the threshold of attaining what he's been aiming at, what he's devoted his whole
01:11:19.540 life to. And he's been a servant of that purpose in the highest order. And the Israelites are still
01:11:27.620 in the desert, which means they're lost and confused. They don't know which way is up. They're still
01:11:32.260 slaves. And now they're dying of thirst, which is what you die of, spiritual thirst if you're sufficiently
01:11:39.700 lost. And they go to Moses and ask him to intercede with God. And God tells Moses to speak to the rocks
01:11:46.660 so that they'll reveal the water within. And Moses strikes the rocks with his rod twice
01:11:52.820 instead, right? He uses force. And so God says to him, you'll now die before you enter the promised
01:11:59.220 land. It's Joshua who enters and not Moses. Okay. And you might wonder why I'm telling you that
01:12:05.080 story. I'm telling you that story because those concepts at the center of that cloud of concepts that
01:12:11.140 you described are stories, right? They're stories. And if they're well formulated, they're
01:12:16.260 archetypal stories. And this is an archetypal story that's illustrating the danger of the use of
01:12:22.000 compulsion and force. Now, one of the problems you're obviously obsessed by and that I'm trying
01:12:28.560 to solve is what do we do as an alternative to tyranny, whether it's for a utopian purpose in the future or
01:12:35.240 maybe for the purpose of, like, conservatives censoring music lyrics they don't approve of. And one answer is
01:12:42.120 we don't use force. We do the sort of thing that you and I are trying to do right now,
01:12:46.680 which is to have a conversation that's aimed at clarifying things. And so that's a principle that
01:12:52.960 that's something like the consent of the governed, right? It's something like, but it's also something
01:12:59.460 like you have the right to go to hell in a handbasket if that's what you choose. As long as you
01:13:04.720 don't, you know, in doing so, get in everyone's way too much. You have the right to
01:13:11.620 your own destiny, right? And so, and you don't get to use power to impose that. That's the other thing
01:13:17.460 that worries me about what's going on on the utopian front. Because the problem is, you know,
01:13:23.240 once you conjure up a climate apocalypse and you make the case that there's an impending disaster
01:13:29.520 that's delayed. And you might say, well, delayed how long? And the response would be, well, we're
01:13:34.580 not sure, but it's likely to occur in the next hundred or so years, which is pretty inaccurate.
01:13:40.060 You now have a universal get-out-of-jail card that can be utilized extremely well by power-mad
01:13:45.840 psychopaths. And they will absolutely do that because power-mad psychopaths use whatever they can
01:13:51.980 to further their cause. So here's my, this is my issue, I think. This is my issue with a lot of
01:13:57.540 people when it comes to political conversations. I think that everything you've said is true. And I
01:14:03.000 think that all of it is, it's, it's good analysis, but I feel like it just gets wielded sometimes in
01:14:09.060 one direction. And then people kind of miss that it completely and fully describes their entire side
01:14:14.960 as well. And the thing that I feel like the only solution for this is you hinted at it. It's more than
01:14:20.440 just conversation, although that's a good start. We have to go back to inhabiting similar areas. We have to go
01:14:25.880 back to inhabiting similar like media landscapes. I think that the issue that we're running into
01:14:29.840 right now more than anything else is people live in completely separate realities at the moment,
01:14:34.260 such that if we were even to describe basic reality, how many illegal immigrants came into
01:14:39.180 the United States last year? That should be a factual number that we can know.
01:14:42.120 How many do you think?
01:14:43.360 Somebody, um—the actual number is probably in the hundreds of thousands. I think some conservatives
01:14:49.460 think it's 3 million per year over the past three years because they look at like border contacts
01:14:53.720 or they look at asylum seekers and they're not looking at like crossings and lawsuits.
01:14:56.420 Yeah, I think it's 3.6 million.
01:14:57.800 Came into the U.S. and stayed?
01:14:59.440 Yes, through the southern border.
01:15:00.880 Okay.
01:15:01.720 You know the historical, you know the historical average is about a million.
01:15:05.260 I understand this. I understand this chart. Historically, there's like 13 to 15 million
01:15:08.660 people full stop in the United States illegally. That's like the history of illegal immigration
01:15:12.800 in the United States. But some, but hey, maybe I'm wrong there, right? So we can say that
01:15:17.000 that's an example of us living in a fundamentally different reality.
01:15:19.680 Well, the Pew Research Group has established quite conclusively that the variability over
01:15:25.100 the last 20 years for illegal migration at the southern border is between 300,000 and 1.2 million.
01:15:30.320 Well, Pew Research can only establish, I think, the number of people attempting to cross. I don't
01:15:33.840 know if they can know. I don't know if Pew does like census analysis. I'd have to see.
01:15:37.520 Well, I don't, well, that's, that's a different issue, right?
01:15:40.100 Sure.
01:15:40.260 Because I don't know how you measure how many illegal immigrants there are actually in the country.
01:15:44.440 I understand. I just want to point out, I just want to point out, I agree with you. I listened to a lot of
01:15:48.560 Rush Limbaugh growing up. I understand the fear of having a government agency say,
01:15:52.760 climate change, therefore, we have a blank check to do whatever we want. That's a scary-
01:15:57.360 Which is what they are doing.
01:15:58.660 The conservatives do the same thing, though.
01:16:00.660 I'm not claiming otherwise.
01:16:02.020 Yeah, but the problem is, I think people don't talk about it. So for instance, I heard,
01:16:05.660 so we can pretend now that the conservative argument was just compulsory vaccines are bad
01:16:10.580 because they infringed on my freedom. That wasn't the conservative argument. The conservative
01:16:13.460 argument was that mass deaths were going to happen, mass side effects were going to happen.
01:16:16.960 Uh, there was going to be all this corruption and stuff related to the vaccine distribution to
01:16:21.840 the crazier theories were microchips and blah, blah, blah. None of that came true. Absolutely
01:16:26.160 none of the conservative fear mongering related to the mRNA vaccines came to fruition. But now that's
01:16:30.480 all forgotten. And that was used as an excuse to-
01:16:36.160 What do you mean none of it? What do you make of the excess deaths?
01:16:39.040 For deaths related to vaccines, there are almost none. The mRNA vaccines have been
01:16:43.600 administered to-
01:16:44.160 Excess, excess deaths in Europe.
01:16:45.360 For deaths related to vaccines, absolutely.
01:16:46.560 We don't know, no, no, no.
01:16:47.360 We don't know.
01:16:48.360 We absolutely know. We absolutely-
01:16:49.360 Wait a second. This is like settled science.
01:16:50.560 What do we know? What do we know?
01:16:51.840 In terms of vaccine related deaths-
01:16:53.360 No, no, no, no, no, no, no, no. That's not my question.
01:16:56.960 Excess deaths in Europe are up about 20% and they have been since the end of the COVID pandemic.
01:17:02.640 That sounds really high to me. 20%-
01:17:04.640 Go look! Go look!
01:17:05.440 I mean, I'll check afterwards, but is this including like the Ukrainian war with Russia?
01:17:09.760 No, no, it's not including the Ukrainian war.
01:17:11.920 Okay. Are you implying that you think it's because of vaccines?
01:17:14.800 I'm not implying anything. I'm saying what the excess deaths are.
01:17:18.160 But what is your take on what's causing it?
01:17:21.840 And you said that in an encounter to me describing mRNA vaccines. You said, well,
01:17:24.400 the excess deaths are 20%. That makes sense that the implication is that the vaccines are causing it.
01:17:28.480 Okay. First of all, something is causing it.
01:17:31.360 Well, that obviously, yeah.
01:17:33.040 Something is causing it or some combination of factors.
01:17:36.000 Sure.
01:17:36.480 Now, one possibility is that the healthcare systems were so disrupted by our insane focus on the COVID
01:17:42.800 epidemic that we're still mopping up as a consequence of that.
01:17:46.480 Wait, are these excess deaths tracing back through COVID as well?
01:17:49.200 Post-COVID.
01:17:50.240 Just post-COVID.
01:17:51.280 Post-COVID.
01:17:52.080 Okay.
01:17:52.480 Right. They're terrifying. And they're not well publicized.
01:17:59.680 I think excess deaths, the fact that you're speaking to them right now seems like…
01:18:04.160 Yeah. But I ferret down a lot of rabbit holes. It's not like it's front bloody page news on the
01:18:08.960 New York Times.
01:18:09.840 Sure. But I think excess deaths, that's a metric that you can Google. And I'm pretty sure there are
01:18:13.280 like three different huge organizations that track excess deaths around the world.
01:18:16.000 There are many more than three.
01:18:17.360 Yes. In every single European country.
01:18:20.400 Right. Okay. Well, so one relatively straightforward hypothesis is that it's a consequence of the
01:18:27.360 disruption of the healthcare system, the staving off of cancer treatment, etc. The increase in
01:18:34.080 depression, anxiety, suicidality, and alcoholism that was a consequence of the lockdowns, the economic
01:18:40.000 disruption. And there's plenty of reason to believe that some of that is the case. But the other
01:18:45.600 obviously glaring possibility is that injecting billions of people with a vaccine that was not
01:18:52.160 tested by any stretch of the imagination with the thoroughness that it should have before it was
01:18:57.120 forced upon people also might be a contributing factor. Partly because we know that it led to a
01:19:03.840 rise in myocarditis among young men. And we also know that there was absolutely no reason whatsoever
01:19:11.200 to ever recommend that that vaccine was delivered to young children whose risk of death at COVID was
01:19:16.800 so close to zero that it might as well have been zero. When you're talking about a disease,
01:19:20.640 the risk of death isn't the only thing that you worry about for the disease.
01:19:23.200 So you're going to talk about transmission? Because that was another thing that the COVID vaccine
01:19:28.720 Yeah, but it didn't do anything to transmission. It absolutely did because it decreased your chance
01:19:31.760 of getting infected. It didn't get rid of transmission, but it reduced
01:19:34.800 Yeah, but it was claimed that it would get rid of transmission.
01:19:37.040 Only if you take one reading of one single quote, I think that Biden said one time where he said,
01:19:42.640 no, come on. I've heard it so many times. Because people are going to say, oh, you can't take anything
01:19:45.920 Trump says seriously. Biden one time on the news says, if you get the vaccine, you won't
01:19:49.600 That is so silly. Which was it. No.
01:19:51.200 Do you know that our prime minister in Canada deprived Canadians of the right to travel for six months
01:19:56.320 because the unvaccinated were going to transmit COVID with more likelihood than the vaccinated.
01:20:02.560 So this wasn't one bloody statement. No, no, hold on.
01:20:05.600 What I'm saying is there wasn't a statement given that if you get vaccinated, there is a
01:20:10.960 zero percent chance of transmitting the disease. The idea is that vaccines were supposed to help
01:20:14.880 because it reduces, it reduces your hospitalization, it reduces death, and it reduces transmission,
01:20:20.320 hopefully by making it so that people don't get sick or don't get sick for as long.
01:20:23.120 All three of those things, the vaccines did exceedingly well. They continue to do that to this day,
01:20:27.120 but especially for the first variant and then the Delta variant, the vaccines helped immensely here.
01:20:32.480 They were tested. The myocarditis rates are like seven out of 100,000 injections,
01:20:37.120 and the myocarditis is generally acute. And it's generally not as bad as even getting
01:20:40.560 the coronavirus itself, which can also lead to myocarditis.
01:20:42.640 It's a much worse side effect than side effects that have caused other vaccines to be taken off the
01:20:47.200 market before. That seven out of 100,000 rate of acute myocarditis or pericarditis is not a worse
01:20:54.160 side effect than those of any other vaccine. I think that is completely acceptable,
01:20:56.400 given that the disease itself is more likely to cause myocarditis or pericarditis.
01:21:00.320 Yes, I don't think the data supports that presupposition anymore.
01:21:04.160 The latest peer-reviewed studies show that that's simply not true, especially among young men.
01:21:09.520 So there is an age bracket of young men where the elevated rate of myocarditis,
01:21:13.920 acute myocarditis from the vaccine, might have been higher. But we're talking about like
01:21:17.440 three or four cases per 100,000 people. And again, myocarditis and pericarditis are generally acute
01:21:22.240 conditions. They don't last for very long.
01:21:24.000 I told you at the beginning of this conversation that the progressive leftists were on the side of
01:21:28.560 the pharmaceutical companies. It's not about being on the side of the pharmaceutical companies.
01:21:31.760 It's about...
01:21:32.000 Really?
01:21:32.880 Really? Yeah.
01:21:34.240 Well, I see. So what I see as the unholy part of that alliance with the pharmaceutical companies
01:21:40.720 is that it dovetails with the radical utopians' willingness to use power to impose their utopian
01:21:45.600 vision. Because otherwise, how would you explain it? Because the leftists should have been the ones that
01:21:50.000 were most skeptical about the bloody pharmaceutical companies. And they jumped on the vaccine bandwagon
01:21:55.440 in exactly the same way that you're doing right now.
01:21:57.280 Pharmaceutical companies have helped us tremendously throughout the...
01:21:59.520 Yeah, right. There we go. Fine. No, I don't think so.
01:22:01.600 Do you think modern medicine hasn't?
01:22:03.360 No, I don't think so.
01:22:04.400 That you're just wrong.
01:22:04.960 I think they're utterly...
01:22:05.760 You're completely wrong.
01:22:06.320 I see. So you don't think that the pharmaceutical companies who dominate the advertising landscape
01:22:12.400 with 75% of the funding are corrupt?
01:22:15.440 Corrupt. I don't... Corrupt is a very broad...
01:22:18.400 Corrupt. No, no, no. No, no. It's targeted.
01:22:19.680 Do you think that...
01:22:20.320 Do you think that pharmaceutical...
01:22:21.200 Corrupt with a tinge of malevolence.
01:22:23.200 Do you think that...
01:22:23.520 Willing to extract money out of people by putting their health on the line.
01:22:27.280 You don't believe that.
01:22:28.240 Do you think that we get effective drugs from pharmaceutical companies?
01:22:32.080 Not particularly.
01:22:33.440 Okay. Do you... So do you think that any vaccines work?
01:22:37.680 Yes.
01:22:38.240 Do you think that any...
01:22:39.760 I don't think 80 of them work.
01:22:41.360 I don't think that many vaccines at once work for babies.
01:22:45.280 I think that's a little risky.
01:22:46.960 But yet we've been on this vaccine schedule for how many decades?
01:22:50.000 Like this. Like this. Not like this. Not carefully.
01:22:55.040 I had a ton of vaccines when I was a child. I'm pretty sure that was the norm for people.
01:22:58.960 There were a ton of vaccines.
01:22:59.760 There's way more now.
01:23:02.560 Okay. And you think that...
01:23:03.760 Well, and you can understand why. I mean, look, part of it, no doubt, no doubt,
01:23:08.400 part of it is a consequence of the genuine, genuine willingness to protect children.
01:23:12.960 But the moral hazard is quite clear. And people on the left used to be aware of this.
01:23:18.160 What do you think the mRNA vaccine, the speeding up of it came from?
01:23:21.120 How do you make for the fact that it was Donald Trump that didn't work speed?
01:23:24.080 Terror. Terror.
01:23:26.160 Foolish panicking, just like we're doing with the climate issue.
01:23:29.360 So you think Trump was...
01:23:30.480 Foolish panicking.
01:23:31.120 Was he in bed with the pharmaceuticals?
01:23:32.480 Was he working with the left? Or was it just a dumb...
01:23:34.480 That was the only panicky move he made?
01:23:36.320 He didn't try to push for the mass lockdowns like other far left people would have wanted him to do.
01:23:40.000 That was just the one mistake he made was the pushing for the vaccine?
01:23:43.040 No, I think Trump undoubtedly made all sorts of mistakes and lots.
01:23:46.240 And it wasn't...
01:23:46.960 It certainly wasn't only the left that stampeded toward the forced COVID vaccine
01:23:55.280 debacle.
01:23:56.240 But it was most surprising to me that it emerged on the left.
01:24:00.400 Because the left at least had been protected against the depredations of gigantic predatory
01:24:06.800 corporations by their skepticism of the gigantic enterprises that can engage in regulatory capture.
01:24:14.800 And that just vanished.
01:24:15.920 Is it not possible that maybe people looked and they said,
01:24:18.800 hey, if all the governments, all the institutions, all the schools,
01:24:23.040 all the private companies across all the countries around the world are saying the same thing,
01:24:26.880 maybe it is the case that this vaccine just helps. Is that not possible?
01:24:30.320 Oh, sure. They probably...
01:24:31.680 Sure, of course it's possible, but that didn't mean it was right.
01:24:35.280 They use force.
01:24:36.800 They use force.
01:24:37.920 We use force for all sorts of things in terms of public health.
01:24:40.240 We don't generally use force to invade people's bodies.
01:24:43.440 How long have vaccine mandates been a thing in Canada, the United States, and the entire world?
01:24:48.400 I don't think they should have been a thing.
01:24:49.840 That's great if you don't think they should have been.
01:24:50.880 But when you say we don't generally use force, we absolutely use force.
01:24:54.400 We've enforced vaccines for a long time. It's an important part of public health.
01:24:57.840 Yes, fair enough.
01:24:58.960 We did it on a scale and at a rate during the COVID pandemic, so-called pandemic,
01:25:04.480 that was unparalleled. And the consequence of that was that we injected billions of people
01:25:10.080 with an experimental... It wasn't a bloody vaccine.
01:25:13.440 Of course it was.
01:25:14.480 No, it wasn't.
01:25:15.200 Yes, it was.
01:25:15.840 No, it isn't.
01:25:16.480 Yes, it is.
01:25:18.160 It's not...
01:25:18.720 You think it doesn't have a 100% success rate, so it's not a vaccine? What's your definition of vaccine?
01:25:21.680 Well, the point of the vaccine is to give your body a protein to train on so the immune system
01:25:24.800 works in the future.
01:25:25.440 It's not the same technology.
01:25:25.920 Who cares if it's not the same? There's plenty of...
01:25:27.360 They used the word vaccine so that they didn't have to contend with the fact that it wasn't
01:25:31.920 the same technology.
01:25:33.120 There are different types of vaccines that are different technologies.
01:25:36.560 Fine.
01:25:38.320 The mRNA vaccine is a type of vaccine technology.
01:25:38.320 This used to be vaccines. Now this is vaccines.
01:25:41.520 No, it was like this and now it's like this.
01:25:42.960 No, no, no. It was like this and now it's like this.
01:25:45.200 The mRNA technology was a radical, qualitative leap forward in technology.
01:25:53.120 You can call it a vaccine if you want to, but it bears very little resemblance to any
01:25:57.760 vaccine that went before that. And the reason it was called a vaccine was because vaccine
01:26:03.120 was a brand name that had a track record of safety, and shoehorning it into that brand was one
01:26:08.640 of the ways to make sure that people weren't terrified of the technology.
01:26:11.520 I think the reason it's called a vaccine is because they're injecting you with something
01:26:14.160 that's inoculating you against something in the future because it has proteins that resemble
01:26:17.280 a virus that infects you.
01:26:18.240 There are overlaps between the mRNA technologies and vaccines, to be sure. But they wouldn't
01:26:25.200 have been put forward with the rate that they were put forward if they weren't a radical new
01:26:29.040 technology. And it's bad in principle to inject billions of people with an untested new technology.
01:26:36.320 Isn't it also bad in principle for billions of people to get infected in a worldwide pandemic
01:26:40.480 that initially was causing a decent number of deaths, a ton of complications, shutting
01:26:44.320 down world economies?
01:26:45.600 Maybe. Maybe it was. Maybe it was.
01:26:48.000 So shouldn't we be able to engage in that analysis and figure out, like, if we look at
01:26:50.960 the side effects of the vaccine? We're not engaging in the analysis.
01:26:52.400 No, because now we're talking about whether or not the vaccines are even vaccines instead.
01:26:55.440 No, no, no. Don't play that game. That is not what I was doing. I was making a very specific and careful case.
01:27:01.200 The mRNA technology, by wide recognition, is an extraordinarily novel technology.
01:27:06.640 That doesn't make it not a vaccine, though.
01:27:08.720 Well, okay. It's a radically transformed form of vaccine. I don't give a damn. That still makes
01:27:15.440 it something so new that the potential danger of its mass administration was
01:27:23.440 highly probable to be at least as dangerous as, or more dangerous than, the thing that it was supposed to protect
01:27:30.000 against. And we are seeing that in the excess deaths.
01:27:32.880 We are absolutely not seeing that. So are you implying that the excess deaths were caused by
01:27:35.840 the vaccines? Or does it sound like- I don't bloody well know what they're
01:27:38.960 That's what you're implying now.
01:27:40.160 Well, look, if you're going to use Occam's razor, you're kind of stuck in an awkward place here.
01:27:45.360 I'm absolutely not stuck in an awkward place here.
01:27:47.280 This is the most administered vaccine, or inoculation, or whatever you're going to call it, in
01:27:50.800 the history of all of mankind. Every single organization around the world is motivated
01:27:55.520 to call this out if it was a bad thing. You don't think Russia or China would be screaming if Donald
01:27:59.600 Trump or the United States warp-sped through a vaccine that was having deleterious effects
01:28:03.120 on populations all around the world? You don't think there wouldn't be some academic institution?
01:28:06.560 You don't think there'd be more than a handful of doctors and Joe Rogan and some conservatives
01:28:09.840 saying this vaccine might have been bad if it was the case that American companies,
01:28:13.600 working with companies in Europe and Germany, especially, because that's where BioNTech is from,
01:28:17.440 in order to create or manufacture a vaccine that was causing excess deaths all around the world?
01:28:21.360 There are so many different people that would be motivated to call this out.
01:28:23.680 How do you explain- You are calling it out.
01:28:25.200 No, it's a handful of people. Where are the governments calling it out? Where are the academic
01:28:28.480 institutions calling it out? Where are the other private companies calling it out? Wouldn't you
01:28:31.280 stand to make a killing if you were a private company in Europe and you could say, look,
01:28:33.920 the mRNA vaccines for sure are causing all of these issues. Why wouldn't Putin,
01:28:37.440 why wouldn't Xi Jinping, why wouldn't anybody else in the world call this out if it was as
01:28:40.480 horrible as it was? There are plenty of people attempting to call out- Nobody credible,
01:28:44.400 and no huge institution. What do you make of the excess deaths? You haven't come up with a bloody hypothesis.
01:28:51.440 I don't even know if there are 20% excess deaths in Europe right now. If I had to guess off the top
01:28:55.760 of my head, it's going to be, like you said, one might be lingering effects on the overall health
01:28:59.680 care system. Another one might be deaths related to the war in Ukraine. Another one might be rising
01:29:03.920 energy costs that have happened for a couple reasons. But it's absolutely impossible that any
01:29:06.880 of it could be unintended consequences of a novel technology injected into billions of people.
01:29:11.680 I think that if excess, first of all, there aren't billions of people in Europe.
01:29:14.240 I didn't say there were. I understand, but you're talking about excess deaths in Europe. I'm not
01:29:17.920 aware of excess deaths that exist in other places that are completely and totally unaccounted for where
01:29:21.200 the only explanation could be the vaccine. I think if there were, I think more people would be
01:29:24.160 talking about it. Well, we have to. Well, first of all,
01:29:26.320 the number of people talking about something is not an indication of the scientific validity of a
01:29:30.800 claim. Quite the contrary. I agree with that, but for a vaccine-
01:29:32.880 Then why are you using mass consensus as the determinant of what constitutes truth?
01:29:38.560 Because I think for something- That's never been the case.
01:29:40.080 Because I think for something that was given to billions and billions of people,
01:29:43.760 if this was something that was having a measurable effect on people, it would be
01:29:47.040 it would be impossible to cover it up or ignore it. We wouldn't have to look at the one case brought
01:29:51.680 up on a documentary. It would be the one thing being talked about.
01:29:54.880 Then what do you make of the VAERS data? There are more negative side effects reported
01:30:00.640 from the mRNA vaccines than there were reported for every single vaccine ever created since the dawn of
01:30:07.760 time and not by a small margin. So it's not just the excess deaths. It's the VAERS data.
01:30:13.280 What is VAERS data? It's the database that until the COVID-19 pandemic emerged
01:30:19.680 and we had the unfortunate consequence that there were so many side effects being reported,
01:30:24.320 it was the gold standard for determining whether or not vaccines were safe.
01:30:28.960 And now as soon as it started to misbehave on the mRNA vaccine front,
01:30:34.080 we decided that we were going to doubt the validity of the VAERS reporting system.
01:30:37.360 Okay, the VAERS reporting system has never been the gold standard for anything.
01:30:39.360 VAERS reporting is just if you want to report that there is some issue
01:30:42.000 that you have after getting a vaccine. That's it. I think it's Vaccine Adverse-
01:30:45.600 What the hell do you think it was set up for?
01:30:47.120 To report adverse events that happen after a vaccine, to track and see if something was
01:30:51.760 related to the vaccine. Right. So most people, most people didn't even know VAERS existed
01:30:56.800 until after the COVID vaccine. Once people know that it exists, of course,
01:31:00.640 more people are going to engage with it. But what happens-
01:31:03.440 So it's all noise.
01:31:04.400 No, well, it could be or couldn't be. So what do you do when a bunch of stuff-
01:31:07.760 Well, first of all, you might begin by suggesting that maybe it's not all noise.
01:31:12.560 Correct. So when all of these things are submitted to VAERS, what they do is, from there,
01:31:17.760 they investigate. All you can do, all of that, all VAERS is, is I might go and get a vaccine
01:31:22.000 and maybe in three days they go, hmm, I've got a headache. I'm going to go ahead and like call my
01:31:25.920 doctor and make this report. And they'll say, okay, well, it's an adverse event after vaccine.
01:31:29.360 It doesn't mean the vaccine caused the headache. And now that more people know about this-
01:31:32.240 I had meant it. I'm just saying that VAERS is not the gold standard of determining
01:31:36.480 if a vaccine is working or not- Compared to what?
01:31:37.920 Compared to actual longitudinal, prospective, randomized, controlled trial-
01:31:42.800 You mean like the ones they should have done to the goddamn vaccines before-
01:31:45.360 Like the ones that they did do for the vaccines and they continue to do to this day. Yes, that is
01:31:48.800 correct. Yes, correct.
01:31:49.840 You really think that you're in a position to evaluate the scientific credibility of the
01:31:53.680 trials for the vaccines, do you?
01:31:55.120 Really? No, I don't. So I have to trust-
01:31:57.040 Then what are you doing? What I have to do?
01:31:58.320 I don't trust them. I looked at the bloody data.
01:32:00.560 You have- First of all, you have to trust third parties to some extent.
01:32:03.760 When you go outside- I don't have to trust-
01:32:05.280 Of course you do. You do every day. When you turn the keys in your car,
01:32:08.080 you hope your engine doesn't explode. When you're drinking water,
01:32:10.560 you hope that the public water or whatever tap or bottle water you got it out of
01:32:13.520 isn't contaminated or poisoned with cholera.
01:32:15.280 I don't do that as a consequence of consensus.
01:32:17.520 No, you- Of course you do.
01:32:18.880 No, I don't. I do that as a consequence of observing
01:32:21.520 multiple times that when I put the goddamn key in the ignition, the truck started.
01:32:25.280 How do you know it's going to start the 50th or the 100th time?
01:32:27.520 Why do you- How many times do you wear those?
01:32:28.880 Don't play Hume with me. You know perfectly well it will.
01:32:30.960 You don't know if the denim in those jeans isn't leaking into your bloodstream.
01:32:33.680 To some extent, we trust, we have to trust third-party institutions-
01:32:38.080 Except when they use force. How about that?
01:32:40.960 Especially when they use force. We trust the police officers.
01:32:43.360 We trust the judicial systems. We do. We do.
01:32:45.040 We on the left trust the police, do we?
01:32:46.720 To some extent, do we? If somebody's breaking into your house,
01:32:48.720 who do you call? That's why we defund them.
01:32:49.840 I'm not a defunder, but if somebody's breaking into your house,
01:32:52.160 you can be the biggest defund-the-police person in the world. Who are you going to call?
01:32:54.080 Are you going to call your neighbor? Are you going to call Joe Biden?
01:32:56.320 Are you going to call Obama? Are you going to call the Black Panthers?
01:32:58.640 You're going to call the cops.
01:32:59.200 Okay, so tell me this.
01:33:00.960 Tell me this then, because the core issue here is use of force,
01:33:04.080 as far as I'm concerned. You know, we examine some of the weeds around that.
01:33:09.840 Politicians throughout the world, and this would be true on the conservative side now,
01:33:13.760 in the aftermath of the COVID tyranny, because it was more a tyranny than a pandemic,
01:33:20.560 are now saying that we actually didn't force anybody to take the vaccine.
01:33:25.440 So what do you think of that claim? Like, so let's define force.
01:33:28.160 I think it's technically true, but I think it's silly.
01:33:30.640 What do you mean it's technically true?
01:33:31.920 It's technically true, in that in the United States, at least,
01:33:35.280 what they tried to do, and they weren't able to do it because the Supreme Court shot it down,
01:33:39.920 was Biden tried to make it so that OSHA, which is the body that regulates job safety,
01:33:44.480 could make it so that employees had to get vaccinated.
01:33:46.480 Or what? Or what?
01:33:48.080 Or they'd lose their job.
01:33:49.120 Okay, does that qualify as force?
01:33:51.200 That's why I said technically, but not—
01:33:52.560 Yeah, no, no, but it's a serious question.
01:33:54.640 I mean, because we need to define what constitutes force before we can—
01:33:58.560 It seems to me—
01:33:59.280 You could argue it's a type of force, sure.
01:34:01.040 I mean, I think it'd be silly to say it's nothing.
01:34:02.560 It is a type of force.
01:34:03.440 Is it the same as a cop telling you you have to do this or you're going to be killed?
01:34:06.080 No, but it's on the spectrum, sure, of course, yeah.
01:34:08.720 It's as much force as the mRNA vaccines are vaccines.
01:34:14.000 Sure. It is a type of force, and the mRNA vaccines are a type of vaccine, so sure.
01:34:18.160 Okay, so look, I really think the problem with the COVID response,
01:34:23.040 I really think the problem was the use of force.
01:34:25.600 I mean, I can understand to some degree, although I'm very skeptical of the pharmaceutical companies,
01:34:30.320 and far more skeptical than your insistence upon the utility of consensus might lead me to believe
01:34:36.480 you're skeptical of them, which is surprising, I would say.
01:34:39.360 I'm very skeptical of them.
01:34:40.560 That's why I'm glad there are multiple companies, multiple countries,
01:34:42.720 multiple academic institutions that do research, and the FDA.
01:34:44.960 Yeah, I'm very skeptical.
01:34:46.080 You should be, in any private system.
01:34:47.440 You should be skeptical of every private company, of course,
01:34:50.000 whether we're talking media, pharmaceuticals, or automobile manufacturers.
01:34:53.040 Yeah.
01:34:54.000 But skepticism doesn't mean a blind adherence to the complete total opposite of whatever it is
01:34:58.400 they're saying, right? Undoubtedly, like if you look at Alzheimer's research,
01:35:02.320 there's been groundbreaking improvement on drugs to treat Alzheimer's over the past three
01:35:06.400 years; five years ago, none of these drugs even existed. And yeah, so I mean-
01:35:09.680 How about if you're skeptical of anyone who's willing to use force to put their doctrine forward?
01:35:14.480 How's that as a rule of thumb?
01:35:15.280 Then you're skeptical of literally every single person and political ideology ever to have existed
01:35:21.040 in all of humankind. Some degree of force, you would undoubtedly believe this, right?
01:35:26.400 Some degree of force is probably necessary for any kind of cohesive society, right?
01:35:30.480 No, I don't believe that.
01:35:31.920 Of course there is.
01:35:33.360 No, I don't believe that.
01:35:33.760 Even if you had a tribe of 100, 120 people, if somebody was stealing something, right?
01:35:39.520 You have to punish that person.
01:35:40.800 I said earlier that that becomes complicated when you're dealing with the psychopathic types,
01:35:46.720 right? So that's a complication.
01:35:48.160 But I would say, generally speaking, that the necessity to use force is a sign of bad policy.
01:35:54.320 And no, I don't think, see, I'm not particularly Hobbesian. I don't think that the only reason
01:35:58.400 people comport themselves with a certain degree of civility in civilized society is because they're
01:36:03.920 terrified by the fact that the government has a monopoly on force that can be brought against
01:36:08.160 them at any moment. I think that keeps the psychopaths in line to some degree, but I think that most people
01:36:14.160 are enticed into a cooperative relationship and that formulating the structures that make those
01:36:19.360 relationships possible is a sign of good policy.
01:36:22.080 I have to ask, because I have watched a lot of your stuff in the past. I remember you speaking
01:36:27.200 very distinctly on this, that for instance, when two men are communicating with each other,
01:36:31.600 there is an underlying threat of force that kind of puts the guardrails on those particular social
01:36:36.800 interactions.
01:36:37.360 Yeah, the threat of force is don't be psychopathic.
01:36:42.320 How broad is psychopathic here? How are we defining it?
01:36:44.720 Well, I can define it.
01:36:46.240 Sure, yeah, go for it.
01:36:47.200 Well, a psychopath will gain short-term advantage at the cost of long-term relationship.
01:36:52.240 Okay.
01:36:52.560 That's really the core issue. Well, you know, you made a reference to something like that
01:36:57.120 earlier in your discussion when you pointed out that people claim to be motivated, let's say,
01:37:03.120 by principle, but will default to short-term gratification more or less at the drop of a hat.
01:37:07.440 For social tips, yeah.
01:37:08.000 Yeah, yeah, yeah, exactly. Well, the exaggerated proclivity to do that is at the essence of
01:37:14.080 psychopathy. So it's a very immature-
01:37:16.160 I'm curious, with this definition of psychopathy, does that mean like-
01:37:19.680 It's the definition of psychopathy. It's not ad hoc. It's not mine. That's the core of psychopathy.
01:37:24.960 Okay. In the United States, I think we call it all ASPD now.
01:37:28.720 No, it's separate from, that's antisocial personality disorder.
01:37:32.320 I thought that subsumed psychopathy and sociopathy.
01:37:34.400 Psychopathy is, no, psychopathy is more like the pathological core of antisocial
01:37:40.320 personality disorder.
01:37:41.360 Okay. Maybe that might be true. Okay.
01:37:42.640 That's a better way of thinking about it. Like the worst ones. A small number of criminals are responsible for
01:37:47.840 the vast majority of crimes. It's something like 1% commit 65%.
01:37:52.240 Do you think, is psychopathy something that can be environmentally induced? Or do you think this is core
01:37:57.200 to a person?
01:37:58.000 It's both. So, for example, if you're disagreeable, like you are, by the way,
01:38:03.520 your proclivity, if you went wrong, would be to go wrong in an antisocial and psychopathic
01:38:08.960 direction. That's more true of men, for example, than it is for women. That's why men are more
01:38:13.280 likely to be in prison by a lot. I think it's 10 to 1 or 20 to 1 generally. It depends on the
01:38:18.320 particular crime, with a higher proportion of men as the violence of the crime mounts.
01:38:23.840 So you can imagine on the genetic versus environment side. So imagine that
01:38:28.480 when you're delivered your temperamental hand of cards, you're going to have a certain set of
01:38:34.320 advantages that go along with them that are part and parcel of that genetic determination. And there's
01:38:39.440 going to be a certain set of temptations as well. So for example, if you're high in trait neuroticism,
01:38:44.640 you're going to be quite sensitive to the suffering of others and be able to detect that. That's useful
01:38:48.960 for infant care. But the cost you'll pay is that you'll be more likely to develop depression and
01:38:53.440 anxiety. And if you're disagreeable, if you're disagreeable, extroverted and unconscientious,
01:38:59.360 then you're the tilt, the place you'll go if you go badly is in the psychopathic or antisocial
01:39:05.920 direction. And there are environmental determinants of that to some degree.
01:39:09.520 Sure. Genes express themselves in an environment. I agree. I'm just curious about the definition of
01:39:14.720 psychopathy as short-term gain at the expense of long-term relationship, really. That's probably
01:39:19.360 the best bit. Yeah. When you look at stuff like people that are self-destructive, say people that
01:39:23.040 engage in behavior that leads to, like, obesity, is that like a type of psychopathy to you? Or is that like
01:39:27.520 something different? Or how do you define these types of things, I guess? Or how do you view that
01:39:30.480 type of thing? Well, no, no. There is an overlap in that addictive processes, one of which might lead to
01:39:40.480 obesity, do have this problem of prioritization of the short-term. So that overlaps with the
01:39:48.000 short-term orientation of the psychopath. But a psychopath is, see, an obese person isn't gaining
01:39:53.680 anything from your demise to facilitate their obesity, right? So there's a predatory and parasitical
01:40:00.960 element to psychopathy that's not there in other addictive short-term processes. Do you think,
01:40:05.760 is it possible that there are things, because then to circle back to the tribal example I gave,
01:40:10.720 isn't it possible that people can commit harms against other people where they're not necessarily
01:40:14.560 gaining from their demise, but it's just some other sort of gain? So for instance, like, say I'm
01:40:19.680 talking to some friends and I'm just gossiping or shit-talking another person. I'm not necessarily
01:40:23.360 feeling good that I'm trashing them per se. I'm feeling good because this group of friends might view me
01:40:28.000 more favorably because I have, like, gossip or something to share with them. Well, but that's the gain
01:40:32.160 right there. And you are contributing to the demise of the people you're gossiping about.
01:40:36.560 You are, for sure. But I think there's like, I feel like there's fundamentally a different
01:40:39.040 type of thought process between like, I want to tell you something juicy about this guy because
01:40:42.000 it'll make you like me, versus I want to tell you something juicy about this guy because I hate this
01:40:45.360 guy and I want him to like have a worse reputation among people. I feel like there's different
01:40:48.400 drivers for that. I would say that's an interesting distinction. I would say probably that
01:40:54.720 the hatred-induced malevolence is a worse form of malevolence than the popularity-inducing
01:41:00.960 malevolence. It's a tough one.
01:41:03.360 Yeah, the only reason I bring that up is because I feel like a lot of malevolence that we have
01:41:07.200 social guardrails for is that type of like selfish malevolence where you're not, I would argue even
01:41:13.520 the majority of malevolence in the world is usually people acting selfishly or being inconsiderate,
01:41:18.160 not necessarily like, I hate this person.
01:41:19.680 Yeah, I think that's right. I think that, well, that's why Dante outlined levels of hell,
01:41:24.960 right? Yeah, well, exactly that. And I mean, that book was an investigation into the
01:41:30.480 structure of malevolence, right? He put betrayal at the bottom, which I think is right. I think
01:41:35.040 that's right because people who develop post-traumatic stress disorder, for example, which almost
01:41:41.040 only accompanies an encounter with malevolence rather than tragic circumstances, they are often
01:41:46.960 betrayed, sometimes by other people, but often by themselves. And yes, there are levels of hell,
01:41:52.480 you know, and you outlined a couple there. So I guess then my question is just that if you have people,
01:41:57.040 so the kid that steals an orange from a stand, not because he hates the shop owner, but because he
01:42:00.800 wants the orange or he's hungry, without some type of societal, it doesn't have to be the government,
01:42:04.880 it could be family, religious, without some type of use of force, do you think that society ever
01:42:10.000 exists without- Do you use force on your wife?
01:42:13.280 Well, what are we considering force? Is withholding sex, for instance, is that considered force? Or is,
01:42:18.560 you know, saying we're going to cancel a vacation?
01:42:19.600 Deprivation of an expected reward is a punishment. So you could, well, no, but I mean, this is a
01:42:25.680 serious question. I mean, look, look, if we're thinking about the optimization of social structures,
01:42:31.040 we might as well start from the base level of social structure and scaffold up.
01:42:34.720 Sure. So it's like if a wife is upset at a husband, for instance, would that be considered
01:42:39.600 a use of force? I think a negative punishment, you're removing a stimulus to punish a person for
01:42:43.840 something. Yeah. Would you consider that like a use of force or-
01:42:45.920 I would say it would depend to some degree on the intent.
01:42:48.480 If the intent is to punish a behavior, right?
01:42:50.400 Well, if the intent is to punish, then it's starting to move into the domain of force. I mean,
01:42:55.360 look, while we've been talking, you know, there have been bursts of emotion, right? And that's
01:43:00.560 because we're freeing entropy and trying to enclose it again. And so that's going to
01:43:05.280 produce, it produces negative emotion fundamentally, most fundamentally anxiety and pain, and
01:43:11.200 secondarily something like anger, because those emotions are quite tightly linked.
01:43:14.880 Sure.
01:43:15.360 And so within the confines of a marriage, because we might as well make it concrete,
01:43:19.840 there are going to be times when disagreements result in bursts of emotion. And those bursts
01:43:24.240 of emotion don't necessarily have to have an instrumental quality, right? It's when the emotion is
01:43:29.840 used manipulatively to gain an advantage that's short-term for the person. And when that's at the expense of the other person, or even at the expense of the future self of the person who benefits, then it starts to tilt into the manipulative-
01:43:44.320 And there's a tetrad of traits.
01:43:50.800 So Machiavellianism, that's manipulativeness.
01:43:53.760 Narcissism is the desire for unearned social status. That's what you'd gain, for example, if you were
01:43:57.280 gossiping and elevating your social status.
01:44:01.760 Psychopathy, that's predatory parasitism, and those culminate in sadism.
01:44:06.160 And that cloud of negative emotion that's released
01:44:09.760 in the aftermath of disagreement can be
01:44:13.760 tilted in the direction of those traits. And that's when it becomes malevolent. And that's when the problem of force starts to become paramount. Because I think that your fundamental presupposition was both Hobbesian and ill-formed.
01:44:29.200 I do not believe that the basis for the civilized polity is force. Now, you're saying that, you know, you can't abjure the use of force entirely. And I would say, unfortunately, that's true.
01:44:39.280 But if the policy isn't invitational, if I can't make a case that's powerful enough for you to go there voluntarily, then the policy is flawed. Now, it may be that we have some cases where we can't do better than a flawed policy because we're not smart enough. And maybe the incarceration of criminals with a long-term history of violent offenses is a good example of that. We don't know how to invite those people to play.
01:45:09.280 They have a history, generally from the time they're very young children, from the age of two, of not being able to play well with others. And it's a very, very intractable problem.
01:45:20.280 There's no evidence in the social science literature at all that hyper-aggressive boys by the age of four can ever be socialized in the course of their life.
01:45:30.280 The penological evidence suggests that if you have multiple offenders, your best bet is to keep them in prison until they're 30. And the reason for that is it might be delayed maturation, you know, biologically speaking.
01:45:43.340 But most criminals start to burn out at around 27. So it spikes. It's a big spike when puberty hits, and then it stays stable among the hyper-aggressive types.
01:45:54.900 So actually what happens is the aggressives at four tend to be aggressive their whole life, and then they decline after 27.
01:46:00.700 The normal boys are not aggressive. They spike at puberty and go back down to baseline, right?
01:46:06.900 And so you don't really rehabilitate people in prison for obvious reasons. I mean, look at the bloody places.
01:46:12.540 They're great schools for crime, by and large. But if you keep them there until they're old enough, all
01:46:20.460 except the worst of them tend to mature out of that predatory, short-term-oriented lifestyle.
01:46:25.940 Yeah. And that's the force issue.
01:46:29.580 Yeah, I agree, I agree. So fundamentally, to clear my stance up, I agree that fundamentally you're not building society on force.
01:46:40.820 If for no other reason, because there'd be so much friction, it would fly apart at the seams, right? You can't force them.
01:46:45.260 You get resistance if you use force.
01:46:46.920 Yeah, fundamentally we're building off of cooperation. You want to invite people to participate in society. I agree with that.
01:46:51.320 I feel like once you start to hit certain thresholds or certain points, and you've got so many different types of people involved,
01:46:55.940 at some point, we're going to have to have force around the edges, on the guardrails, just to make sure that we don't allow—
01:47:01.940 Are you familiar with like tit-for-tat systems, essentially?
01:47:03.740 Very.
01:47:04.240 Yeah, tit-for-tat is probably a really important part of our evolutionary biological history and an important part of the animal kingdom.
01:47:09.700 And I think, to some degree, that tit-for-tat punishment is important to—
01:47:13.540 Is that force or justice?
01:47:15.440 You can call it what it is, but—
01:47:16.740 No, no, no, I'm curious what you think. This is a very serious question.
01:47:20.740 Yeah.
01:47:21.240 Because the tit-for-tat—the tit-for-tat is very bounded, right?
01:47:24.740 Yes.
01:47:25.240 It's like, you cheat, I whack you, and then I cooperate, right?
01:47:28.740 Yeah.
01:47:29.240 And I do think that there's a model there for what we actually conceptualize as justice.
01:47:32.740 Sure.
01:47:33.240 It's like, you don't get to get away with it, but the goal is the reestablishment of the cooperative endeavor as fast as possible.
01:47:39.740 Of course, I agree. But in a reductionist way, we're kind of just using justice here as a stand-in for force, right?
01:47:44.740 Well, I don't—
01:47:45.740 Because a tit-for-tat system—
01:47:46.740 That's a good—
01:47:47.740 A tit-for—a tit—so there are different types of tit-for-tat systems, right?
01:47:50.740 You've got tit-tit-for-tat, you've got tit-for-tat-tat, you've got—there's all sorts of types of systems where maybe you'll let somebody make a mistake one or two times, but you can't have a tit-tit-tit-tit-tit system because then somebody can come in and take advantage of it.
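The strategies traded back and forth here come from the iterated prisoner's dilemma. As a minimal sketch of the dynamic being described, the following compares tit-for-tat against an always-defector and against a never-retaliating cooperator (the "tit-tit-tit-tit-tit" case above). The payoff numbers (3, 5, 1, 0) are the standard textbook values and, like the function names, are illustrative assumptions rather than anything stated in the conversation:

```python
# Iterated prisoner's dilemma: "C" = cooperate, "D" = defect.
# Payoffs are the conventional textbook values (an assumption, not from the episode):
PAYOFFS = {
    ("C", "C"): (3, 3),  # mutual cooperation
    ("C", "D"): (0, 5),  # exploited cooperator vs. lone defector
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),  # mutual defection
}

def tit_for_tat(opponent_history):
    # Cooperate on the first round, then mirror the opponent's last move.
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def always_cooperate(opponent_history):
    # Never retaliates: the exploitable "tit-tit-tit-tit-tit" pattern.
    return "C"

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b = [], []  # each side's record of the opponent's past moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_a), strategy_b(hist_b)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))         # (30, 30): cooperation is stable
print(play(tit_for_tat, always_defect))       # (9, 14): exploited once, then mutual defection
print(play(always_cooperate, always_defect))  # (0, 50): never retaliating invites exploitation
```

Run this way, tit-for-tat sustains cooperation with itself, gets burned exactly once by a defector before retaliating, and the strategy that never punishes is exploited every round, which is the failure mode the speakers are pointing at.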
01:47:59.740 Yes, which is the problem with the compassionate left, by the way.
01:48:02.740 Exactly.
01:48:03.740 To some extent, sure, it can be. Or a problem with the right that's far too forgiving of Donald Trump.
01:48:07.740 But I would say that that tat part, the—you can call it justice. I think justice is a perspective on force, right?
01:48:14.740 Where some people might consider a force to be just, like the cop that arrests the murderer, and other people might consider that force, that tat, to actually be injustice because the murderer was responding to environmental conditions, blah, blah, blah, or was—
01:48:25.740 Yeah, that's a stupid theory, that responding to environmental conditions theory.
01:48:29.740 Because here's why—
01:48:30.740 Well, I mean, it depends.
01:48:31.740 No, it's not.
01:48:32.740 Well, I mean, because essentially that's Rittenhouse's case. Self-defense is responding with violence.
01:48:36.740 So if you assume that there's a causal pathway from early childhood abuse to criminality, let's say, which is the test case for environmental determination of the proclivity for the exploitation of others—
01:48:49.740 Okay.
01:48:50.740 —then it spreads near exponentially in populations. That isn't what happens. So here's the data.
01:48:57.740 Most people who abuse their children were abused as children. But most people who are abused as children do not abuse their children. And the reason for that is because if you were abused, there's two lessons you can learn from that. One is identify with the abuser. The other is don't—
01:49:15.740 Never going to—
01:49:16.740 Right, exactly. And what happens, and if this didn't happen, every single family would be abusive to the core very rapidly.
01:49:23.740 Yeah.
01:49:24.740 What happens is the proclivity for violence is self—it dampens itself out as a consequence of intergenerational transmission. So the notion that privation is a pathway to criminality, that's not a well-founded formulation. And there are an infinite number of counter examples, and they're crucial.
01:49:44.740 Some of the best people I know, and I mean that literally, are people who had childhood so absolutely abysmal that virtually anything they would have done in consequence could have been justified.
01:49:57.740 You know, and they chose not to turn into the predators of others. And that was a choice, and often one that caused them to reevaluate themselves right down to the bottom of their soul.
01:50:09.740 And so that causal association of relative poverty even with criminality—we know also, we know this too. You take a neighborhood where there's relative poverty, the young men get violent.
01:50:22.740 But they don't get violent because they're all hurt and they're victims. They get violent because they use violence to seek social status.
01:50:29.740 And so even in that situation, it's not, oh, the poor, poor. It's no wonder they're criminal because they need bread. It's like, sorry, buddy, that's not how it works. The hungry women feeding their children don't become criminals.
01:50:42.740 The extraordinarily ambitious young men who feel it's unfair that their pathway to success is blocked become violent. And that's 100% well documented, and generally by radically left-leaning scholars.
01:50:55.740 Sure. I don't disagree with any of that. Wealth inequality in areas is a much better predictor of crime than poverty.
01:51:00.740 Right, but it's a very specific form of crime. It's status-seeking crime by young men. Right? Well, but that shows you what the underlying motive is. It's not even redress of the economic inequality.
01:51:12.740 It's actually the men striving to become sexually attractive by gaining position in the dominance hierarchy. There's nothing the least bit economic about it.
01:51:19.740 I think you have to be really careful with that assessment, though, because you can say that it's not economically—it's not seeking economic—
01:51:26.740 Why do you have to be careful? The biggest predictor of a male—
01:51:29.740 Well, because we're assuming that people that commit crime in these types of circumstances are status-seeking and not trying to seek economic remedy.
01:51:36.740 That's exactly what we're assuming.
01:51:38.740 But it might be the case, for instance, that in economically prosperous areas, that the men there aren't actually seeking economic prosperity.
01:51:45.740 They're also just trying to elevate status, but they do it through economic prosperity. It's potential, right?
01:51:49.740 They do it with a longer-term vision in mind. Sure, they're trying to elevate. I wouldn't disagree with that in the least.
01:51:55.740 But they do it with a much longer time horizon in mind. And we know this partly because there have been detailed studies of gang members, for example, in Chicago,
01:52:04.740 who are trying to ratchet themselves up the economic ladder, but they do it with a short-term orientation.
01:52:09.740 Most of them think they're going to be dead by their early 20s. Sure.
01:52:12.740 So they're trying to maximize short-term gain.
01:52:15.740 It has nothing to do with the redress of economic inequality, except in the most fundamental sense.
01:52:20.740 And it is status-driven because they're looking for comparative status—
01:52:23.740 Sure, I understand what you're saying. I don't think any human being has a baked-in desire to seek economic prosperity.
01:52:29.740 I think that that's like a third-order thing that we look for.
01:52:32.740 And fundamentally, it's probably more like safety, security for ourselves, and then status-seeking for other things.
01:52:37.740 I think that changes when you have children.
01:52:39.740 Um, well, I mean the safety and security would extend to your children.
01:52:41.740 Because your status is irrelevant, or starts to become irrelevant at that point.
01:52:44.740 I mean, depending on how you view your status, right?
01:52:46.740 You can't do that every time we have a discussion.
01:52:51.740 Well, I'm just saying, for instance, one of the important things for my child is to be able to send my child to a good school.
01:52:55.740 I need to have an elevated status, right? I need to be able to buy a house in the right school district, or I need to be able to pay for the education.
01:53:00.740 Right, but you're not telling me, I hope, that the driving factor behind your desire to care for your children is an elevation in your status.
01:53:09.740 No, but I'm saying that the elevation of status might be what allows you to take care of children.
01:53:13.740 So, for instance, one of the biggest predictors of getting married is already achieving—
01:53:15.740 Is it status or position?
01:53:17.740 Well, that's what I'm saying. I'm saying there's like a—
01:53:20.740 I know, they're confounding.
01:53:21.740 All of these kind of play into—yeah.
01:53:22.740 Okay, look, we're running out of time.
01:53:24.740 You're smart, you're sharp.
01:53:27.740 That tit-for-tat thing, I was just saying that the tat thing, there is some underlying, built into probably our genes, right?
01:53:33.740 Because we see it all throughout the animal kingdom, that there's some level of punishment or some level of force.
01:53:37.740 Justice.
01:53:38.740 You can call it justice.
01:53:39.740 No, but I think it's the right—
01:53:40.740 It's justice when you're the tatter, not when you're the titter, though, right?
01:53:43.740 No, no, no.
01:53:44.740 When you're the titter, it's just retribution.
01:53:45.740 No, no, no, I don't think that's true either.
01:53:47.740 Look, if you read Crime and Punishment, for example, one of the things you see that emerges when Raskolnikov gets away with murder, and it's a brutal murder, and he gets away with it, it's completely clear, and he has a justification for it.
01:53:59.740 And what happens as a consequence is that that disturbs his own relationship with himself so profoundly that he can't stand it, such that when a just punishment is finally meted out to him, it's a relief.
01:54:12.740 And that's not rare.
01:54:14.740 And that is—like, there isn't anything more terrifying—this is why Crime and Punishment is such a great novel—there isn't anything more terrifying than breaking a moral rule that you thought you had the ability to break and finding out that you're somewhere now that you really don't want to be.
01:54:27.740 And then that, you know, there's nothing worse in your own life than waiting for the other shoe to drop.
01:54:34.740 If you've transgressed against a moral rule and now you're an outsider because of that, you live in no man's land, the fact that you have just retribution coming to you, that can be a precondition for your atonement and your integration back into society.
01:54:47.740 But it's probably important to note that, depending on the system you exist in, those moral transgressions just aren't, right?
01:54:53.740 So to take it back to—I'll use your leftist example—you might consider a threat of force for somebody to get a vaccine to be a highly immoral thing that might be a transgression against some fundamental moral thing.
01:55:04.740 But a person on the left might think that they're actually satisfying their moral requirement to society by doing so.
01:55:08.740 Much the same as a child soldier or—or not—I won't use child soldier—but maybe an older person that's committing intifada or some kind of Islamic terrorist thinks that they're fulfilling some moral calling as well.
01:55:17.740 No doubt. No doubt that that's the case. That's why I was focusing in on the use of force is that I think it's a rule of—a good rule of thumb policy that if you have to implement your goddamn scheme with force, then there's something wrong with the way it's formulated.
01:55:31.740 Does it bother you that every religious— There's no reason we couldn't have used a purely invitational strategy to distribute the vaccine.
01:55:37.740 It would have been much more effective. And it was bad policy, rushed: "We're in an emergency. We have to use force."
01:55:43.740 It's like, no. No, you weren't. It wasn't the kind of emergency that justified force, not least because behavioral psychologists have known for decades that force is actually not a very effective motivator.
01:55:54.740 It produces a vicious kickback. So, you know, one of the things—and this is going to happen for sure, you know—is that people stopping using valid vaccines, as a consequence of general skepticism about vaccination, is going to cause, in my estimation, over any reasonable amount of time, far more deaths than COVID itself caused.
01:56:16.740 You violate people's trust in the public health system at your great peril. And you do that by using force. And we did that.
01:56:23.740 And so you can see already that there are hordes of people who are vaccine skeptics, this generalized skepticism that, to some degree, you were rightly decrying.
01:56:32.740 It spreads like wildfire. And no wonder, because if you make me do something, I'm going to be a little skeptical of you for a long time.
01:56:39.740 You know, this conversation, we're here voluntarily. Like, we're trying to hash things out, and in good faith, you know. But neither of us compelled the other to come here, and neither of us are compelled to continue.
01:56:50.740 And so that makes it a fair game. And a fair game is something that everyone can be invited to. And I suppose that's something that's neither right nor left, you know, hopefully, right? Something we could conceivably agree on.
01:57:01.740 And I also think that I don't have any illusions about the fact that there are people on the right who would use power to impose what they believe to be their core, what would you say, their core idol.
01:57:19.740 Of course, the temptation to use force is rightly pointed to by the leftists who insist that power is the basis for everything.
01:57:29.740 It isn't the basis for everything. That's wrong. It's really wrong. But it's a severe enough impediment to progress forward that we have to be very careful about it.
01:57:38.740 So, look, we have to stop. I want to know if there's anything else you'd like to say before we stop, because unfortunately, we have to stop rather abruptly.
01:57:47.740 And so...
01:57:49.740 I think, yeah, I feel like we got pretty far into this.
01:57:54.740 What are you trying to accomplish? Let's stop with that. We found out a little bit about who you are.
01:58:00.740 I mean, you formulated your proclivity, to some degree, in terms of delight in argumentation or facility at it, which you certainly have.
01:58:12.740 The danger in that, of course, is that you can be oriented to win arguments rather than to pursue the truth.
01:58:19.740 And that's the danger of having that facility for argumentation.
01:58:22.740 Sure.
01:58:23.740 But what are you hoping to accomplish by engaging in conversations like this in the public sphere?
01:58:29.740 Elevation of status, you know? Absolutely.
01:58:31.740 That's one possibility.
01:58:32.740 No, I feel like I think debate or argumentation is good because it forces two sides to make their ideas somewhat commensurate to the other.
01:58:40.740 If two people are having a conversation, they have to be able to communicate said ideas to the other person.
01:58:44.740 Otherwise, it's just a screaming match.
01:58:46.740 And I think there is a good, for the sake of, like, just being bipartisan or having a collection of people in a certain area and having different people together, just that in and of itself without anything else happening, I think produces a good, at least for a democratic society.
01:59:00.740 For instance, like, I would agree that school, maybe not faculty, but administrators, are very, very, very, very, very far left today.
01:59:08.740 Dangerously so.
01:59:09.740 I don't have to talk to you about this, obviously.
01:59:11.740 But I think part of the responsibility for that rests at the feet of conservatives who, instead of maintaining participation in the system, decide that they're going to throw their hands up and disengage.
01:59:20.740 When I go and I see-
01:59:21.740 Or be forced out.
01:59:22.740 Or be forced out, sure.
01:59:23.740 As in my case, for example.
01:59:24.740 That's fine.
01:59:25.740 Yeah, sometimes it can happen.
01:59:26.740 Often.
01:59:27.740 But I think that rather than accepting being forced out, or rather than encouraging other people to disengage, the engagement has to happen.
01:59:35.740 It can't be a, I'm losing faith in the system, so all of us are going to go here and do our thing.
01:59:39.740 That's right.
01:59:40.740 It has to be like, no, we're going to be here in these conversations, whether you like it or not, because in a democracy, sometimes the guy you don't like wins.
01:59:45.740 Sometimes the policy that you don't like is enforced.
01:59:47.740 Sometimes a guy you don't like is somebody you have to share an office or a classroom with.
01:59:51.740 And we need to be okay with that.
01:59:52.740 And I'm worried that, like, the internet is driving people into these, like, very homogenous but very polarized groups.
01:59:58.740 The data on that, by the way, aren't clear.
02:00:00.740 Like, whatever's driving polarization doesn't seem to be as tightly related to the creation of those internal bubbles as you might think.
02:00:08.740 Like, I've looked at a number of studies that have investigated to see whether people are being driven into homogenized information bubbles.
02:00:16.740 And it isn't obvious that that's the case directly, although the polarization that you're pointing to, that you're concerned about, that seems to be clearly happening.
02:00:25.740 So, and why that is, well, that's a matter of, you know, intense speculation.
02:00:30.740 I feel like the homogeneity, I feel like it's not so much, this is not research-based at all, just a total feeling.
02:00:36.740 Yeah.
02:00:37.740 So I'll admit that.
02:00:38.740 But the feelings that I have is it's not necessarily that homogeneity has increased.
02:00:42.740 It's that homogeneity has increased as a byproduct of the bubbles becoming larger.
02:00:46.740 So, for instance, it might be that I'm from Omaha, Nebraska.
02:00:49.740 It's a town in, or city, really, in Nebraska, right?
02:00:52.740 It might have been that 50 years ago, there were bubbles in Omaha, and there were different bubbles in Lincoln.
02:00:59.740 And there might be bubbles in Toronto, or neighborhoods in Toronto, or there might be bubbles in Vancouver.
02:01:04.740 But now, as the internet exists and things become more internationalized, these bubbles are, it's not just a bubble that exists in these cities.
02:01:11.740 Now the bubbles have come together, and as a result of them coming together-
02:01:13.740 Yeah, well, that's another gigantism problem.
02:01:15.740 Sure, yeah.
02:01:16.740 Or a globalization problem or a communication problem.
02:01:18.740 But you run into this issue where somebody might be in a particular city or state and have a really strong opinion about what AOC says, but they don't know anything about their local political scene.
02:01:26.740 And I think that that's an issue because the bubbles have gotten so large, and they're encompassing so many people now, and you're expected to have, like, a similar set of beliefs between all of these different people now that might live in totally different places.
02:01:37.740 That's, I think, a big issue we're running into.
02:01:39.740 Yeah, well, that could be—we'll close with this, I think—that might be one of the unintended consequences of hyperconnectivity, right?
02:01:47.740 Is that we're driving levels of connectivity that get rigid and that we also can't tolerate.
02:01:53.740 Uh-huh.
02:01:54.740 Yeah. All right, well, that's a good place to stop.
02:01:56.740 Well, thank you very much for coming in today.
02:01:58.740 That's much appreciated.
02:01:59.740 And you're a sharp debater and good on your feet, so that's fun to see.
02:02:05.740 And I do think that your closing remarks were correct: the alternative to talking is fighting.
02:02:13.740 Uh-huh.
02:02:14.740 Right.
02:02:15.740 So, when we stop talking, it's not like the disagreements are going to go away.
02:02:18.740 Yeah.
02:02:19.740 We will start fighting.
02:02:20.740 Yeah.
02:02:21.740 Right.
02:02:22.740 And talking—
02:02:23.740 Right, absolutely.
02:02:24.740 And talking can be very painful, because a conversation can kill one of your cherished beliefs, and you will suffer for that, although maybe it'll also help you.
02:02:32.740 Uh-huh.
02:02:33.740 But the alternative to that death by offense is death.
02:02:37.740 Right.
02:02:38.740 Yeah.
02:02:39.740 Right.
02:02:40.740 So, better to substitute the abstract argumentation for the actual physical combat.
02:02:43.740 For sure.
02:02:44.740 Right.
02:02:45.740 Sometimes, like, the worst relationships aren't the ones where couples fight a lot.
02:02:47.740 Yeah, that's right.
02:02:48.740 Yeah, that's right.
02:02:49.740 The really bad ones are where they don't fight ever.
02:02:50.740 Well—
02:02:51.740 And then, all of a sudden, there's a—
02:02:52.740 Yeah.
02:02:53.740 The couples who fight and reconcile.
02:02:54.740 Exactly.
02:02:55.740 Yeah, yeah.
02:02:56.740 Yes, exactly.
02:02:57.740 All right.
02:02:58.740 All right.
02:02:59.740 Well, that was good.
02:03:00.740 Thank you very much.
02:03:01.740 Yeah, thanks.
02:03:02.740 And for everyone watching and listening on the YouTube platform, thank you very much for your
02:03:03.740 time and attention.
02:03:04.740 And we're going to spend another half an hour or so on the Daily Wire side, so if you're inclined,
02:03:10.740 tune into that and we'll find out a little bit more about the background of our current interviewee,
02:03:17.740 Destiny.
02:03:18.740 See you later, guys.