The Jordan B. Peterson Podcast - August 30, 2018


Gregg Hurwitz - An Invitation to the Intellectual Dark Web


Episode Stats

Length

1 hour and 31 minutes

Words per Minute

194

Word Count

17,766

Sentence Count

994

Misogynist Sentences

9

Hate Speech Sentences

15


Summary

Gregg Hurwitz is a New York Times #1 internationally best-selling author of 20 thrillers, including Out of the Dark (coming out in 2019), and two award-winning thriller novels for teens. His novels have won numerous literary awards, graced top-10 lists, and been published in 30 languages. He has written screenplays for or sold spec scripts to many major studios, including The Book of Henry, and has written, developed, and produced TV for various networks. He is also a New York Times best-selling comic book writer, having penned stories for Marvel (Wolverine, Punisher) and DC (Batman, Penguin). He has published numerous academic articles on Shakespeare, taught fiction writing in the USC English Department, and guest lectured for UCLA and Harvard in the United States and internationally. In the course of researching his thrillers, he has sneaked onto demolition ranges with Navy SEALs, swum with sharks in the Galapagos, and gone undercover into mind-control cults. He grew up in the Bay Area, wrote his first novel while completing a BA at Harvard and a Master's in Shakespearean tragedy at Trinity College, Oxford, was Harvard's undergraduate scholar-athlete of the year for his pole vaulting, and played college soccer in England as a Knox fellow. In this episode, Dr. Jordan B. Peterson sits down with Hurwitz to discuss his recent work in the political realm: depolarizing political discourse, rebranding the Democratic Party around concrete solutions, and making the case for intelligent conversation across the political spectrum. The episode opens with a promotion for Dr. Peterson's new Daily Wire Plus series on depression and anxiety.


Transcript

00:00:00.000 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.000 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.000 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:19.000 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.000 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.000 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.000 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.000 Let this be the first step towards the brighter future you deserve.
00:00:52.000 Welcome to the Jordan B. Peterson podcast.
00:00:59.000 You can support these podcasts by donating to Dr. Peterson's Patreon, the link to which can be found in the description.
00:01:06.000 Dr. Peterson's self-development program, Self Authoring, can be found at selfauthoring.com.
00:01:18.000 Hi, everybody.
00:01:46.340 I'm here today again with Gregg Hurwitz, an old friend of mine, a former student from Harvard.
00:01:52.400 We've talked before and we've got some exciting things to talk about today. I'll give you some background on Greg first.
00:01:58.280 Greg is a New York Times number one internationally best-selling author of 20 thrillers,
00:02:04.140 including Out of the Dark, coming out in 2019, and two award-winning thriller novels for teens.
00:02:09.560 His novels have won numerous literary awards, graced top 10 lists, and have been published in 30 languages.
00:02:17.500 He's written screenplays for or sold spec scripts to many of the major studios, including the Book of Henry,
00:02:23.840 and written, developed, and produced TV for various networks.
00:02:26.760 He's also a New York Times best-selling comic book writer, having penned stories for Marvel (Wolverine and Punisher) and DC (Batman and Penguin).
00:02:37.240 He's published numerous academic articles on Shakespeare, taught fiction writing in the USC English Department,
00:02:43.220 and guest lectured for UCLA and for Harvard in the United States and internationally.
00:02:47.120 In the course of researching his thrillers, he has sneaked onto demolition ranges with Navy SEALs,
00:02:53.560 swum with sharks in the Galapagos, and gone undercover into mind-control cults.
00:02:59.360 Hurwitz grew up in the Bay Area.
00:03:01.600 While completing a BA from Harvard and a Master's from Trinity College, Oxford, in Shakespearean tragedy,
00:03:06.920 he wrote his first novel.
00:03:08.580 He was the undergraduate scholar-athlete of the year at Harvard for his pole vaulting exploits,
00:03:14.000 and played college soccer in England, where he was a Knox fellow.
00:03:19.040 So that's Greg.
00:03:21.360 Hello, everybody.
00:03:23.280 It's quite an intro.
00:03:24.240 I think I'm just going to have you follow me around everywhere and read that before I enter a room.
00:03:28.380 Just going to make sure everyone dislikes me.
00:03:30.520 Yeah, exactly.
00:03:31.720 That's right.
00:03:32.140 That's the sort of intro that makes...
00:03:33.580 Everyone hopes you're a real son of a bitch after reading an intro like that,
00:03:36.660 so they can morally dislike you.
00:03:39.320 Yes.
00:03:39.760 The best thing.
00:03:40.640 There's a lot to morally dislike, Jordan, as you know.
00:03:43.040 Yeah, yeah.
00:03:44.260 So we're not going to talk too much today about Greg's previous exploits.
00:03:49.060 Instead, we're going to talk about some of the work he's been doing in the political realm recently
00:03:52.720 and the reasons for doing that.
00:03:54.400 So, Greg, I'm going to let you take that away.
00:03:57.320 Well, so, look, I write thrillers, and I have...
00:04:01.140 The shorthand that I always give is that about half of my friends are born-again Christian Navy SEAL snipers,
00:04:05.940 and half of my friends are gay ACLU lawyers.
00:04:08.760 And I tour equally in the red and blue states.
00:04:11.080 My readership is very, very diverse.
00:04:13.260 And I think because of that, I have a real ear for the buzzwords and phrases and ideological shorthand
00:04:20.260 that really shut people down, that just make them stop listening.
00:04:25.220 My own politics, I'd say I'm a bit more of a classical liberal.
00:04:29.400 And one of the things that I talk a lot about is there are certain phrases that we've grown up with from both sides
00:04:35.720 that just make people completely stop.
00:04:37.480 And it's not dissimilar to what we're trying to do when we're writing, right?
00:04:40.500 If you write and use buzzwords and catchphrases and cliches, it's less interesting.
00:04:45.360 If you already know what you think before you start writing, and there's no room for discovery along the way,
00:04:50.580 then you're not...
00:04:51.040 It's not real writing.
00:04:52.020 It's propaganda, right?
00:04:53.160 It's Ayn Rand.
00:04:54.040 You're never going to be surprised by something in the third act.
00:04:57.460 And so there's a real overlap for that that I found.
00:04:59.780 And so one of the things that I've been trying to do, given how polarized I have found discourse to be
00:05:05.180 and how I found myself moving between vastly different worlds,
00:05:09.000 is to try and commit myself to ending or contributing to ending polarizing discourse.
00:05:13.880 And so part of that is that I've been focused on trying to rebrand the Democratic Party,
00:05:20.560 working with a lot of different candidates and entities in an effort to get away from gridlock and more towards solutions.
00:05:27.100 I have a view that's a lot like Jonathan Haidt, who you've interviewed at length,
00:05:31.680 who I think gives a great description of the need and necessity for both conservatives and liberals,
00:05:37.340 in that conservatives like walls around things, right?
00:05:40.540 They like walls around gender.
00:05:41.840 They like walls around countries.
00:05:43.740 Build a wall is sort of a brilliant Republican slogan.
00:05:47.040 And the job of liberals is to say, hang on a minute.
00:05:50.260 If that wall is too impenetrable, you're not going to have new ideas and people,
00:05:54.420 and then we'll stagnate and die.
00:05:56.620 And so for me, and you and I have talked before about how it's also the case for inventors and entrepreneurs
00:06:00.920 who kind of create things, which tends to be more of a liberal leaning,
00:06:05.240 but if you want them run effectively, often you need conservative managers to come in.
00:06:08.800 And so I want there to be stronger parties on both sides.
00:06:14.060 I mean, I want there to be better discourse.
00:06:15.900 I think if there's a stronger Republican Party, it makes the Democratic Party have to raise their game
00:06:21.120 and come up with solutions.
00:06:22.480 And the one thing I will say that has been common to everyone I talk to
00:06:26.340 is nobody is thrilled with the level of political discourse right now.
00:06:29.860 So there's a couple of things there that are really interesting.
00:06:32.740 I mean, you talk about the necessity for high-quality representation on both sides of the political continuum,
00:06:44.940 let's say.
00:06:45.940 And there are really powerful temperamental and philosophical reasons for that.
00:06:50.920 So, you know, you mentioned that the liberal types tend to be entrepreneurs and creators,
00:06:55.480 and the conservative types tend to be managers and administrators.
00:06:58.320 So that's pretty well documented in the psychological literature.
00:07:01.280 But you see echoes of that, and you mentioned the border issue as well,
00:07:05.160 because, of course, to categorize things, we have to put boundaries around them.
00:07:08.980 But boundaries can artificially separate and make things stagnant as well.
00:07:13.080 So we have to have a continual discussion about exactly where the boundaries should be drawn
00:07:16.680 without ever also presuming that we should just do away with boundaries altogether,
00:07:20.720 because that does away with category.
00:07:22.460 And then with regards to hierarchies, well, you know, it seems to be the job of the right wing
00:07:28.260 to put forward the proposition that hierarchies are necessary from an organizational perspective.
00:07:34.780 So if a bunch of people decide to go do something of value,
00:07:38.700 let's say it's of collective value as well as individual value,
00:07:41.940 then inevitably when they operate cooperatively and competitively,
00:07:45.860 they're going to produce a hierarchy.
00:07:47.080 And the hierarchy is actually a tool to accomplish that task in the social arena.
00:07:53.420 But the problem with that is that a hierarchy can become over-rigid and corrupt,
00:07:58.220 and it can dispossess people without that dispossession being a necessary function of the task
00:08:06.840 that the hierarchy is attempting to perform.
00:08:09.360 Yeah, it's always both, right?
00:08:11.240 What's that?
00:08:11.980 It's always both, right?
00:08:13.500 I mean, it's always that we need it to organize, and there always are inequities that are a result of it.
00:08:18.940 That's right.
00:08:19.300 And the thing that I have found is that when people lock in the political conversations
00:08:23.480 from one side or the other, it becomes all or nothing.
00:08:26.340 And for me, nothing gets accomplished in all or nothing.
00:08:29.080 And so to straw man, you know, Democrats that all of them just want to tear down the whole system,
00:08:34.600 and, you know, those are some of the loud voices on Twitter,
00:08:37.540 isn't helpful any more than the claim that Republicans have no interest whatsoever in the people who are dispossessed.
00:08:43.820 And so a lot of what's important is to look at, you know, conservatives who are, you know, thoughtful.
00:08:51.120 It's not like they love the fact that people are dispossessed.
00:08:53.720 So the question becomes, how do we all help that, and how do we actually have a discussion,
00:08:57.500 and how do we get away from the fact that compromise towards solutions is somehow a betrayal?
00:09:05.440 That if you compromise anything from the furthest extreme, you're sort of betraying your own ideology and your own people.
00:09:11.020 Yeah, well, I think partly you do that by focusing on specific problems instead of broader ideological solutions.
00:09:16.200 And that's, so part of what we're trying to discuss today and to contribute to is the re-centralization,
00:09:23.620 let's say, of the political parties, and also to make a case for the necessity of intelligent discourse,
00:09:27.860 knowing that people on both sides of the political spectrum have something intelligent to say.
00:09:32.300 So the conservatives have every reason to put forward the proposition that hierarchies are both inevitable and useful,
00:09:39.620 but also to keep firmly in mind that they can become corrupt and that they do tend to dispossess.
00:09:44.560 And the liberals have every reason to keep in mind that hierarchies are absolutely necessary in order to get things done,
00:09:50.620 but that the dispossessed need a voice so that the system remains both permeable and fair.
00:09:57.260 And it's really necessary for both sides to have some respect for the position of the alternative temperamental type, let's say.
00:10:06.160 Well, that's where we have to meet in rational sort of enlightenment discourse, or else we can't win.
00:10:11.980 You know, you've talked a lot about the big five personality traits and the ways that it orients differently in politics.
00:10:18.620 So it's like liberals are higher in empathy, right, and higher in openness.
00:10:21.740 That's why a lot of artists you find tend to be liberal, you know, at least the good ones.
00:10:27.500 Like we have Bruce Springsteen on the other side.
00:10:30.360 It's hard for President Trump to fill up an arena with people who are the artists of a certain caliber.
00:10:37.280 And, you know, it doesn't mean that that higher empathy is better.
00:10:41.880 And I think often when liberals are talking, they're trying to push through only empathy and higher empathy.
00:10:47.260 And they're trying to educate people in the higher empathy.
00:10:49.320 Well, it's a fixed psychological trait.
00:10:51.280 And the other thing is, you know, Republicans or conservatives tend to be higher in conscientiousness.
00:10:56.820 And conscientiousness codes for a bunch of things that are really useful.
00:11:00.220 You know, codes for better health, better finances, more stable marriages, and longer lifespan.
00:11:05.480 So one of the things when I'm advising, you know, Democrats is like you can't – if you – you have to win an argument on the merits.
00:11:12.260 You have to make a better argument if you're losing people.
00:11:14.640 You can't just go to them and say, be more empathetic.
00:11:17.460 That's not an argument, right?
00:11:18.920 It's like going to President Clinton and saying, can't you just be an introvert?
00:11:23.440 These are sort of fixed traits.
00:11:24.840 And the other thing is, a lot of conservatives are doing fine over there with their stable marriages and longer life.
00:11:30.980 Well, and the empathy issue is interesting as well because it's easy for that to be regarded reflexively as a virtue.
00:11:36.720 But an excess of pity, let's say, can be destructive.
00:11:40.660 There's a whole psychoanalytic literature on the negative consequences of fostering over-dependence, let's say.
00:11:46.960 And that's a real problem.
00:11:47.980 And there is a managerial literature, too, that suggests that less agreeable managers do better,
00:11:52.340 as well as less agreeable people in general having higher incomes because they're better at bargaining for themselves.
00:11:58.060 So it is really necessary to give some consideration to the fact that each of these temperaments has marked advantages and disadvantages
00:12:05.080 and a proper niche where it can function and other places where it actually constitutes a problem.
00:12:10.980 Well, and that's where we have to rise above ourselves and meet.
00:12:14.860 Like, where do we meet?
00:12:15.880 We meet in freedom of speech.
00:12:17.320 There's a reason it's the First Amendment.
00:12:18.820 You know, if I was dumb enough to be a single-issue voter, that would be my single issue.
00:12:23.600 Because part of it is, as a classical liberal, I tend to approach things, I have a strong amount of empathy.
00:12:30.880 But I also realize that that's not a trait that I can map on to everybody else when I'm having a discussion,
00:12:35.680 that I have to actually meet and make arguments for people who approach the world in different ways
00:12:41.680 and have different kinds of successes to make higher conscientious arguments in a way that appeals more broadly.
00:12:47.300 And, I mean, that's free marketplace of ideas.
00:12:49.440 We have to meet and figure out and discuss these matters.
00:12:52.640 And if we're not doing it in words, we're doing it with fists and knives.
00:12:55.040 Well, the other problem with agreeableness, let's say, and empathy is that the empathetic identification
00:13:02.440 tends to make the person who's experiencing it feel immediately virtuous because they're on the side of the weaker party, let's say.
00:13:10.640 And there's obviously some utility in that.
00:13:12.600 But it's not really, it doesn't come with a set of solutions to complex problems.
00:13:19.600 You know, it seems to me that agreeableness is a pretty good virtue for small units like the family,
00:13:24.460 where egalitarian distribution is of extreme importance.
00:13:27.780 But it doesn't seem to work very well at higher levels of complexity.
00:13:32.580 So, for example, it's conscientiousness that predicts workplace proficiency.
00:13:37.480 You know, it's the second best predictor after IQ.
00:13:40.160 Agreeableness is actually somewhat negatively correlated unless the domain has to do with direct personal care.
00:13:46.560 And so I don't think agreeableness scales very well, which is why conscientiousness has to enter into the picture.
00:13:52.280 Yeah, I view it as like a very helpful motivator for myself.
00:13:55.780 I look at a problem.
00:13:56.940 My approach tends to be empathy for people who have been left out or have been,
00:14:02.140 who don't have the right end of the dominance hierarchy.
00:14:05.760 And for me, it's a motivator to look at actual solutions and problems and issues.
00:14:10.980 And there's a lot of real concrete things that can be attacked.
00:14:13.920 But if you don't attack them from a perspective that's morally condescending, you can actually get shit done.
00:14:19.020 Well, it also, it's also very helpful to increase the resolution of the problem, right?
00:14:24.100 And to stop trying to solve every problem with one brushstroke.
00:14:28.560 Most things are really complicated.
00:14:30.180 So intelligent political discourse should involve decomposing a problem like poverty,
00:14:36.560 which isn't a problem, but a set of a thousand problems,
00:14:39.400 into each of those thousand problems and then trying to generate creative solutions to each of them and then to test them.
00:14:45.780 And so the resolution of the discussion has to be increased.
00:14:49.580 And one of the things that you and I had talked about was the possibility that these longer form,
00:14:54.040 the longer form discourse avenues that are available on the new media,
00:14:58.540 like what we're doing right now,
00:14:59.880 might enable us to identify and promote politicians who are capable of high resolution discussion,
00:15:05.800 who have real solutions to real problems instead of having to compress everything into a six second soundbite.
00:15:11.400 Well, the other thing that I'm finding really heartening is a lot of the candidates that I am working with,
00:15:16.380 interacting with, and at this point that's in the hundreds, are very focused on actual solutions.
00:15:21.700 And I'm trying to get a handle on, you know, the cultural conversation, whatever the hell that is.
00:15:27.400 It benefits by blowing the extremes up and it feels like that's almost all that we're hearing about.
00:15:32.540 And when I'm dealing with actual candidates, a lot of them are dealing with anti-corruption,
00:15:37.440 healthcare, jobs, and education.
00:15:39.240 Like they're aware that that isn't the play.
00:15:41.180 And so there's this amplifying effect and then we have a lot of sort of like this cultural conversation
00:15:47.120 about the cultural conversation that things become very kind of meta.
00:15:51.140 And the level that candidates are operating at and the level that people are, you know,
00:15:55.620 I'm dealing a lot with candidates in different states.
00:15:58.540 People are having a hard time right now and they need hardcore solutions.
00:16:01.500 And they actually don't give a shit about everyone on Twitter and Facebook who's, you know,
00:16:06.300 preaching to everyone they haven't already unfriended.
00:16:08.220 They need solutions.
00:16:09.940 They need a government that's functional.
00:16:11.360 They need, they want more transparency and lack of corruption.
00:16:14.440 There's a very common set of things that govern people, but there's a sort of amplified effect
00:16:19.740 from the extremes of the party that are gathering the attention.
00:16:25.240 And the biggest tool that we have in a democracy in some ways is our attention and where we focus
00:16:30.180 it.
00:16:30.380 And if that's being hijacked by, you know, bullshit conversations and straw men, I mean,
00:16:36.300 I would love to engage with, with reasonable, you know, Republicans, a lot of whom are my
00:16:42.960 friends and are my colleagues to say, look, I know we can go and push everything to extremes
00:16:47.720 and fight and debate and score points.
00:16:49.840 But like, you don't like some of this shit anymore than I do.
00:16:52.740 Like what's an actual conversation from our different perspectives, assuming I don't have all
00:16:56.760 the answers that we can come up with certain solutions that's going to take care of people
00:17:00.480 who are not doing as well, you know, who are not keeping up with
00:17:04.800 the economy because that's not good for anybody.
00:17:07.400 That's not a game that is iterable across multiple games and generations.
00:17:11.300 If more and more people are struggling more and more, that ain't good for anybody, even the
00:17:15.940 people at the top.
00:17:16.720 So how do we seek to kind of balance that out for people who don't have a fair shot and don't have
00:17:21.840 Even from a conservative perspective, you could say, well, how do we stabilize the hierarchy?
00:17:26.740 So that the tendency for people to stack up at the bottom of the multiple hierarchies
00:17:32.080 doesn't become so extreme that the entire system starts to
00:17:36.080 shake and tremble, which obviously isn't good for anyone at all.
00:17:39.740 We're seeing a lot of shaking and trembling.
00:17:41.500 I mean, I think a lot of people in the last election did not feel heard or seen or represented.
00:17:46.720 In the choices of candidates.
00:17:47.900 And a lot of people were like, I don't care what it is.
00:17:50.680 It's not going to be business as usual.
00:17:52.760 Right.
00:17:53.060 So the solution isn't, in my estimation, to denigrate people who chose to vote for President
00:17:58.000 Trump.
00:17:58.560 I think the solution is to talk about the fact that we need to do better if we want to
00:18:04.460 promote a, if the Democrats want to promote a viable alternative, we have to do better with
00:18:09.080 what our messaging is.
00:18:10.040 We have to do better on messages that are anti-corruption and we have to make a better argument.
00:18:14.280 Yeah.
00:18:14.360 Well, the thing is, you know, I was on Bill Maher a while back and a lot of the panelists
00:18:19.260 on his show were really taking potshots at the Trump supporters and, you know, with
00:18:24.100 the typical sort of pejoratives that are leveled at them.
00:18:27.880 And I thought that was extremely dangerous because they're basically dissing, let's say,
00:18:32.060 to use a terrible cliche, about 50% of the American population and the same percentage
00:18:37.160 that's been voting Republican for about 20 years.
00:18:39.440 And to just out of hand dismiss your, the people who are voting differently than you
00:18:44.420 as somehow primitive or primordial or foolish or, or ill-advised is very, very dangerous.
00:18:50.740 And also more than that, it's also an abdication of responsibility because the, the Republicans
00:18:57.240 might've won the last election or Trump might've won the last election, but certainly the Democrats
00:19:01.560 lost the last election.
00:19:03.240 And so I would have, I thought that their discourse would have been a lot more productive if they
00:19:07.620 would have focused on the failings of the democratic party.
00:19:10.180 And that's also a lot more useful because if you can figure out why you didn't succeed,
00:19:15.200 even though it was very close election, if you can figure out what mistakes you made,
00:19:18.340 then you could rectify it.
00:19:19.520 Well, that's exactly, I mean, that's exactly along the lines of the things that you and
00:19:24.500 I talk about at length, which is personal accountability.
00:19:26.760 It's like you get in a fight with your wife or your husband is the best thing to do to
00:19:31.380 point out the 50% or 80% or 90% where they screwed up or to actually reflect and think about
00:19:37.400 whatever percent it is at whatever number that you can do something better.
00:19:40.840 That's actually within your control.
00:19:42.740 And so for me, that's a scalable notion.
00:19:45.560 You know, I have 20 friends, too white and varied and smart who voted for Trump to dismiss
00:19:50.180 them all as idiots.
00:19:51.180 It's a very, you can't denigrate it.
00:19:53.480 And plus we're married to them.
00:19:54.820 That's 50%, 49.9% of our population.
00:19:58.800 We are married to them and we have to figure out how to talk and to come up with solutions
00:20:02.960 that make sense and that everybody can.
00:20:05.920 And so for me, a lot of it is to look at what the impulse was that was underlying that
00:20:10.760 and to figure out where Democrats can be better at seeing and hearing people.
00:20:15.140 And furthering arguments in ways that feel legitimate and connect with people and to
00:20:19.020 figure out what the messaging is.
00:20:20.300 And you always clean up your side of the house first.
00:20:22.900 And it doesn't mean that there's a number of policies that I'm highly critical of President
00:20:27.520 Trump.
00:20:28.000 It doesn't mean that I denigrate his followers or the people who voted for him or that I
00:20:32.840 will dismiss any tendencies that they might have had or hesitations that they might have
00:20:36.460 had about voting otherwise.
00:20:37.480 Well, the right message should be something like not so much why your opponent is worse,
00:20:43.960 but by why you're better, why your solutions are better, why the grass is greener on your
00:20:49.880 side of the fence, why there's less corruption occurring under your watch and also marketing
00:20:55.280 the fact that the solutions that you have are both reasonable and practical.
00:20:59.400 And I mean, that does require a more elevated form of political discourse.
00:21:02.380 And it would be really nice if that could be facilitated by these long forms that are available.
00:21:07.780 Well, we hear the refrain all the time of like, the Democrats don't have a message.
00:21:11.400 Like saying that President Trump is awful is not a message.
00:21:14.700 And I've been really heartened with the conversations that I'm having with candidates right now that
00:21:19.220 there's a huge focus on kitchen table issues.
00:21:21.820 There's a huge focus on trying to tackle health care and education and jobs in ways that are more
00:21:26.760 innovative.
00:21:27.160 When you talk to the actual candidates who I want to start to pull into more and more
00:21:31.480 conversations, I have an enormous amount of optimism.
00:21:35.180 And the other thing is, a real money-where-your-mouth-is stance against corruption.
00:21:39.920 And there's something called the Disclose Act that a lot of Democrats are working on,
00:21:44.040 which is fully disclosing all of your donors, you know, releasing taxes, taking a pledge against
00:21:50.820 dark money.
00:21:50.820 There's a whole number of different resolutions, you know, fighting voter suppression.
00:21:54.880 And that means whether we lose or not, it's not about fighting voter suppression to sort
00:21:59.300 of tilt the vote in a direction.
00:22:00.920 It's like, we need the right to win elections or lose elections fairly.
00:22:05.420 And so there's a lot of positions that are being put forth right now that I feel are really
00:22:09.620 heartening, including a very strong anti-corruption stance, because you can't, you have to clean
00:22:15.080 up what your side is.
00:22:16.360 And in this wave of candidates that I'm looking at, it's been really impressive to me how they're
00:22:22.340 starting to further and articulate what the argument can be from that side.
00:22:27.180 And I think that's where we can also start to find common ground to talk about, oh, there's
00:22:31.140 a corruption.
00:22:31.740 I mean, there's a lot of Republicans who aren't thrilled with the level of discourse right
00:22:34.800 now.
00:22:35.580 Well, one of the things that struck me when I've been talking to you lately is when we've
00:22:39.980 been talking about what happened with the Democratic Party is also the idea that the radicals,
00:22:44.420 the identity politics types, managed to take over the narrative to a large degree, partly
00:22:48.940 because the extremism is more attractive, let's say, among a dying mainstream media.
00:22:53.780 It's clickbaity and easy to attract attention to and loud and cinematic and all of those
00:22:59.360 things.
00:23:00.020 But also that the centrist Democrats seem to have lost faith in their central narrative or
00:23:05.380 perhaps have failed to produce one over the last 15 years.
00:23:09.540 And that left a void into which the extremist narrative, the oppressor-oppressed
00:23:16.360 narrative, let's say, or oppressor-victim narrative, has been able to slide in and dominate because
00:23:22.320 of that void.
00:23:23.560 And so that's another thing that needs to be seriously addressed.
00:23:26.120 It's like the center has lost its narrative.
00:23:28.320 And that's something I'd like to see the Democrats deal with.
00:23:33.200 Now, in the first ad that you generated, you had a farmer talk, if I remember correctly,
00:23:40.040 you had a farmer talk about success and about what that might mean if you could attain
00:23:43.800 it individually.
00:23:45.200 I think where we have fallen down as Democrats and where we are now reorienting is the discussion
00:23:53.100 of aspirational values, meaning, is it a rigged game?
00:23:56.560 Like, we can absolutely point to things that are disadvantages, you know, but we can't just
00:24:00.700 keep saying it's a rigged game.
00:24:01.880 The thing that we have to say is all that we're in favor of are fair rules at the starting
00:24:06.260 gate.
00:24:06.660 And if you work harder, if you're an innovator, if you bust your ass, good on you, like go
00:24:12.840 buy a mansion and three cars, you know, go start a second business.
00:24:16.120 We have to be rewarding of success and speaking to people's aspirations.
00:24:20.260 And, you know, we talk a lot about income inequality.
00:24:22.720 And the problem with that is that the opposite of it is income equality.
00:24:26.440 Well, nobody wants that.
00:24:27.560 If you look at the average salaries and incomes of people in the Senate and Congress, they're
00:24:32.120 doing just fine.
00:24:33.220 They don't want that either.
00:24:34.320 And so for me, I think it's much more powerful to talk about income injustice, because it
00:24:38.720 means that people have a fair shot at the starting gate, and then they can differentiate
00:24:42.900 themselves by their own, you know, abilities and their own willingness to work.
00:24:47.340 And for me, it's about the starting gate and making sure that we look at things that
00:24:50.680 are there.
00:24:52.060 Yeah, well, and also that emphasis that you just placed on rewarding actual achievement.
00:24:56.600 We also have to be in a situation where we can admit that at least one of the factors
00:25:00.520 that differentiates people as they move up competence hierarchies is the amount of effort
00:25:04.920 they put into things, and that some of the hierarchies that exist are actually based on
00:25:09.100 competence and not merely on oppression.
00:25:13.200 And, you know, that's become anathema on many university campuses, where
00:25:19.540 you're not allowed, in fact, to state that some of the people who have
00:25:24.840 made it have done it as a consequence of working extra hard, because I guess that
00:25:29.720 undermines the narrative that the whole game is rigged.
00:25:31.820 And the truth of the matter is the game is partly rigged, like all games always are.
00:25:37.200 And that is unfortunate.
00:25:38.160 But you can't throw the whole damn game out because it's partly rigged, unless you think
00:25:42.800 you can put in place a game that isn't going to be rigged.
00:25:45.240 And good luck with that.
00:25:46.560 Yeah, I mean, there is a very interesting interview between Eric Weinstein and Ben Shapiro
00:25:52.240 on the podcast.
00:25:53.280 And what's so interesting is they're from completely opposite sides of the spectrum.
00:25:56.120 I think doesn't Eric define himself as a socialist?
00:25:59.220 Well, he's certainly on the left.
00:26:00.980 Yeah, I mean, so, but you have them discussing, and it was very, very civil discourse.
00:26:05.140 And I think Eric can do a much better job than I can of discussing the ways in which
00:26:09.280 the issue isn't that there is a dominance hierarchy and that people differentiate themselves
00:26:13.380 based on their hard work.
00:26:14.900 The issue is sort of what we've seen with the separation of the working class and people
00:26:18.420 at the very, very top.
00:26:19.760 You know, he points to a lot of trends about what's happened with workers' wages.
00:26:23.760 You know, a CEO a few decades ago made 30 times the average worker.
00:26:27.260 Now it's at 300.
00:26:28.660 It's not about overall people doing well.
00:26:30.780 There's an increasing exponential separation between, you know, working people and people
00:26:36.460 who are at the top.
00:26:37.440 And I think that's the skew that's problematic.
00:26:39.280 And the fact of the matter is, rather than pushing away, and, you know, we could talk a
00:26:44.780 little bit also about, you know, conservatives like Ben, but rather than sort of denigrating
00:26:50.140 them, there's a lot of good brains on both sides of the fence.
00:26:53.060 And like, wouldn't it be absolutely lovely if we have reasonable conversations about what
00:26:58.280 we all want to do about that rather than the position having to be income equality versus,
00:27:03.860 you know, only self-reliance and pull yourself up by your bootstraps?
00:27:06.760 Because there's some people who can't do that.
00:27:08.440 There are some fundamental problems with the system that's leading to people having a lot
00:27:13.060 harder time.
00:27:13.980 And everyone agrees with that.
00:27:15.760 And if we can talk about the details, you know, it's sort of like when we talk about
00:27:19.380 universal healthcare.
00:27:20.420 I feel like you come in the door and the minute you say it, people are for and against.
00:27:24.180 And for me, there's a lot of other arguments to make.
00:27:26.120 It's like, look, we already have universal healthcare in this country.
00:27:30.000 It's called emergency rooms.
00:27:31.240 It's un-American and wrong to let people die on the streets.
00:27:33.900 We don't do that.
00:27:35.240 But if people are uninsured, the hospital passes on the cost to insurance companies who
00:27:39.420 raise our premiums.
00:27:40.800 So the average cost of an emergency room visit in the United States is $1,233.
00:27:45.640 The average cost of a vaccination is $19.
00:27:48.400 We're all paying for it anyways.
00:27:49.840 And there's also a lot less risk of public health hazards and other things if we can
00:27:54.620 figure out a way to come up with a medical system in which people are getting the care that
00:27:58.560 they want.
00:27:59.100 Preventative care is infinitely cheaper than trying to play catch up in emergency rooms.
00:28:03.840 So you don't just walk in the door and say, I'm for universal healthcare, or I'm opposed
00:28:08.180 to universal healthcare.
00:28:09.260 Because a lot of people say, look, I was out of work and having a hard time and I had to
00:28:12.960 figure out paying my own healthcare.
00:28:14.740 We have to frame it in a way of saying, this is good for the robustness of the whole community.
00:28:18.420 And I think there's a lot of arguments and discussions from both sides that we need all those brains
00:28:24.340 pulling together.
00:28:25.700 And it's one of the things as I'm watching a lot of the IDW stuff, I would love to see
00:28:30.480 increasingly, increasing conversations like the one that Eric and Ben had, where there's
00:28:34.900 two people from the opposite side who are engaging very reasonably and seeing each other's points
00:28:39.700 and trying to figure out what skills they can bring from their respective sides of the
00:28:43.680 proverbial aisle.
00:28:44.620 So you think you see within the Democratic Party an attempt, at least, to increase the
00:28:50.920 resolution of the discussion and to move away from the more polarizing discourse.
00:28:54.700 You see that starting to develop.
00:28:56.560 Absolutely.
00:28:57.060 So tell me some of the concrete things you've seen.
00:29:00.760 Well, I mean, you know, I've been meeting with a lot of candidates.
00:29:04.040 I found there's a candidate who I really like, Joseph Kopser, running in Texas,
00:29:07.820 who spent, you know, 20 years in the Army, he's a Ranger, I believe he has a Bronze Star, he's a
00:29:13.180 graduate of, you know, Harvard's Kennedy School, he was a professor at West Point.
00:29:17.440 And what was amazing with him is like, he was talking about universal healthcare, he started
00:29:21.320 his own business, a private business that led to tons of jobs. And he's this incredible person.
00:29:27.000 And he made an argument in a way I hadn't ever thought about. When he's talking about
00:29:30.040 healthcare, he was like, look, I have my healthcare from the army for life, that allowed me to take a risk
00:29:36.240 and to go out entrepreneurially to start a company that then led to tons and tons of jobs.
00:29:41.700 Yeah, you know, there's evidence in Canada, our rate of entrepreneurial development in Canada
00:29:46.180 per capita is actually slightly higher than it is in the US. And analysis has indicated that one of
00:29:51.420 the reasons for that is the provision of universal healthcare. People can take entrepreneurial risks
00:29:56.580 in Canada without losing their fundamental safety net. So you can make a conservative argument for the
00:30:02.180 provision of a certain level of underlying security, let's say.
00:30:07.040 Right. And it was a great argument for me. I mean, the other thing is, I'm seeing a lot of
00:30:11.800 candidates who are Democrats who are very upfront with their support for the Second Amendment. And
00:30:17.060 it's like, look, the Second Amendment is an amendment, you can't just get rid of an amendment. And I know
00:30:22.440 that as a classical liberal, when people come after freedom of speech, I get really pissed, right? And
00:30:28.460 people who are living in different regions, in different states that have different cultures,
00:30:33.320 when they hear people start to attack the Second Amendment, or when it's global, where it's all or
00:30:38.440 nothing, there's an inherent threat that comes with that. And a lot of it for me is to look at and go,
00:30:43.980 look, every amendment has certain limitations on it, right? So you can't stand up on an airplane and
00:30:50.440 scream that you have a bomb. You can't threaten the President of the United States that you're going to
00:30:53.340 kill him. The First Amendment, we have parameters around it. We already have them in place around
00:30:58.440 the Second Amendment. Everybody knows you can't own a bazooka. Everyone knows you can't own a dirty
00:31:02.560 bomb. So instead of coming in and sort of threatening people in their way of life, if they have handguns,
00:31:08.260 if they're hunting, if they are weapons collectors, if we can boil the resolution down to say that
00:31:13.660 on every amendment we have, the First and the Second included, there are certain limitations
00:31:18.540 that we've placed for the safety of the community. So how about if we just talk about
00:31:22.440 violent history checks? Let's just talk about that one piece. So 90% of Americans are in favor of
00:31:28.960 that and 73% of the NRA. And so to have a candidate stand up and say, you know, I love the First
00:31:35.120 Amendment. I fully support the Second Amendment. Believe me, I have a lot of friends who are Army
00:31:39.000 Rangers, Green Berets, and SEALs. I know that culture well. And rather than having some sort of
00:31:44.260 frontal attack on it, to sort of say, well, what's reasonable? If you can't yell fire in a crowded
00:31:48.860 theater, how about if we only discuss that, and I'm standing with 90% of Americans and 73% of NRA
00:31:54.420 members, it's a much more respectful conversation. And it's a much more solution-based conversation.
00:32:00.280 It's also a humbler conversation, because it requires a certain amount of appreciation for
00:32:06.780 incremental change, right? Because it's very ego-inflating to do a great thing, to make a massive
00:32:14.520 change, and much less so to work in the background to make a small change. But small changes are solid,
00:32:20.660 and they tend not to produce terrible negative consequences. And to work incrementally is to work
00:32:28.020 realistically and properly, and I would also say meaningfully. And that's another element that
00:32:35.660 needs to be introduced back into the discourse. It's like, well, why don't we, you know, the people
00:32:40.980 who established the American system were incrementalists, fundamentally. And they knew
00:32:47.040 perfectly well that we had a flawed system that was always going to be flawed, and the best we could
00:32:50.940 do was tinker away at it incrementally. And that's a really lovely sentiment. I think it's very mature
00:32:57.840 and wise to understand that what you're working on is like a highway. It always needs repair. There's
00:33:02.920 always going to be construction on it. There's always going to be blockages on it, because it needs to be
00:33:06.620 maintained and updated. But you don't scrap the whole thing with every move.
00:35:55.000 Part of the problem with the discourse now too is that every issue becomes code for
00:36:04.620 an ideological position. Like I have a real hard time with all the environmentalist noise that I hear
00:36:11.220 on the mainstream media in particular because I always see it as code for a very fundamental
00:36:16.060 anti-capitalist ethos. And so I can't separate the damn wheat from the chaff. I mean, I know because
00:36:21.780 I've read a lot about ecological issues that there are foolish things that people are doing in the world
00:36:26.720 for a variety of reasons. Like overfishing is a real catastrophe and so is deep ocean trawling as far
00:36:32.780 as I'm concerned. But when I read general environmentalist claims, and these include climate change claims,
00:36:38.960 I can't separate it from a fundamental anti-capitalist ethos. And so I can't decide,
00:36:43.800 I can never decide whether I can trust the information.
00:36:46.820 And there's that Rousseauian underlay too, that like man is bad and nature is good.
00:36:52.100 That's not a rallying cry to say, let me pitch original sin to you.
00:36:57.020 The argument to make is to say, what are the effective short-term gains, and additionally
00:37:01.380 the middle- and long-term gains, that can be made in business for clean rivers and clean oceans and for fishing industries?
00:37:06.640 And also to talk about the things we've accomplished. I mean, we closed the ozone hole
00:37:11.520 and companies didn't go out of business. There are actually solutions. You know, we basically
00:37:17.340 eliminated polio from the face of the earth. There's entrepreneurial and innovative solutions
00:37:22.300 for things that can also be pro-business.
00:37:25.020 Yes. And we've reforested the Northern Hemisphere. You know, there's more forest in the Northern
00:37:29.280 Hemisphere than there was a hundred years ago.
00:37:30.800 So you said this thing about, you know, incremental change. And it's another thing I've been thinking
00:37:35.660 about a lot, which is when we walk in the door and say, I'm for gun control, or I'm pro-life
00:37:42.320 in every circumstance, or, you know, to come in with an absolute position.
00:37:50.760 One of the things that's really tricky is if you're not willing to have a conversation,
00:37:54.360 if you're not willing to cede ground, there are no terms under which you feel that the negotiation
00:38:01.120 or compromise can be satisfied. And that gets really dangerous. And it's funny because it's
00:38:06.220 something that I always advocate for when people talk about sort of a broad sweeping systemic
00:38:11.520 problem or injustice. The problem is, it doesn't boil down to a level of resolution where you can
00:38:17.080 get traction from all the people that you need, because people feel like, well, what's the solution?
00:38:20.680 If the solution is to throw everything away, positive and negative and change everything,
00:38:24.940 you're not going to do it. And it's a problem we saw with Occupy Wall Street. And, you know,
00:38:30.620 it sort of spread from generalized corruption. And incidentally, they were targeting sort of the
00:38:36.200 banks and the bigger systems almost more than they were the government who was allowing a lot of it.
00:38:41.720 But then as it kept rolling into issue after issue after issue, it starts to get very complex of like,
00:38:46.820 well, if we concede and make changes, are there any terms under which you'll be satisfied?
00:38:51.380 And I think that's one of the aspects of coming in on positions of like,
00:38:55.300 pro or against universal healthcare, right? Pro or against the Second Amendment.
00:38:59.120 There's no ground to be had there because everyone's just digging in their heels.
00:39:03.180 And I feel like it's also a bit of a product of social media and the fact that we've all been
00:39:08.320 sort of infantilized by these likes and dopamine hits that keep driving virtue signaling.
00:39:13.800 And that goes for both sides. It goes for both sides. If you can get a real slam or dig in on your
00:39:18.520 opponent, then good for you. And meanwhile, in the middle, there's a ton of Americans,
00:39:22.520 there's a ton of kids in education systems that need real solutions. And it doesn't matter how much
00:39:28.340 we're signaling what our beliefs are in our respective bubbles; it doesn't come down to real
00:39:33.100 solutions.
00:39:33.400 Well, it's also much more difficult to generate real solutions, because it means you actually
00:39:37.400 have to have some domain knowledge. And it isn't obvious, for example, that the classic
00:39:43.000 forms of media that politicians have used to interact with the populace have rewarded
00:39:49.140 high resolution knowledge, you know, because it becomes, well, it's easy for it to get tangled up
00:39:55.120 in the details for that kind of discussion to get tangled up in the details. But these longer form
00:39:59.660 conversations might allow people to unpack some of that in a more intelligent way.
00:40:04.000 Well, yeah, for instance, there was a great discussion between Dave Rubin and Ben Shapiro at
00:40:08.880 one point, talking about abortion, and they're on opposite sides of it. And what was really interesting
00:40:13.600 was it was civil, and there was no denigration of the opposing side. And also was saying, look,
00:40:19.500 I don't have an exact answer of when, for me, it becomes uncomfortable. Like, there is some point
00:40:25.400 where it moves from something he's comfortable with to a point where he's not comfortable with abortion. There's
00:40:31.700 a certain date when that would happen, or a certain point in the trimesters. And so it's hard. There aren't
00:40:38.500 super clear and easy answers. And, you know, Ben has a different position. Obviously, he's much more
00:40:43.720 conservative on that front. But that's a position, it's weird. I have a lot of friends who are born-again
00:40:48.060 Christians. And it's like, that's a position you can't just dismiss, like, you have to talk
00:40:53.680 about it, like, that's a human position for people who have that belief. I mean, so for me, what it
00:40:58.140 really comes down to, when there's these sort of different positions on stuff, that's really
00:41:02.740 fundamental for people, is to say things, and I feel like Democrats have to talk at least in ways
00:41:07.600 that are more humanizing, which is: I don't actually know what I would do in that circumstance,
00:41:11.580 if my wife, you know, was pregnant or a girlfriend, like, it's a really hard choice. It's not something
00:41:19.780 that's just flippant of, like, oh, it's a cluster of cells, and it carries no emotional weight.
00:41:23.860 I just don't find that to be true. But for me, there's a certain point where I don't think that's
00:41:27.960 ever true for anyone who has an experience like that. Right. And I feel there's a right way to approach it.
00:41:33.780 Well, I think that the answer is like, look, I don't know what I would do. I don't claim to have all
00:41:38.160 the answers morally. The one thing that I believe is I don't want the government coming into your
00:41:42.780 house and into your bedroom and telling your wife or you what to do. Like, that's something I feel
00:41:48.380 like it's so deeply personal. And sometimes all you have are shitty choices and choices that aren't
00:41:53.160 good. And the flip side for me is I don't know what the alternative is when we're really talking
00:41:57.640 with specifics. I don't know what government-mandated pregnancy looks like. So if a daughter or a niece
00:42:03.500 who's, you know, 18 is pregnant and doesn't want to have a child, I don't know what the solution
00:42:07.900 is to that. Like, how does the government then mandate that and oversee that? And so these are
00:42:13.760 conversations that, you know, I think are legitimate conversations to be had, but from
00:42:20.420 a position of respect rather than, you know, sort of slogans, or the claim that people who
00:42:26.300 are pro-life are anti-women, because there's a lot of women who I know who have, you know,
00:42:31.300 deeply held pro-life positions. It has to be a bit of a conversation. For me, I feel like
00:42:35.780 these are really personal choices. I have a strong libertarian reaction to it. And I also don't
00:42:40.660 understand what the solution is going to be. If that does become something that somebody doesn't
00:42:45.920 want to have a child, I don't understand how the government mandates that and what that looks
00:42:49.340 like for those nine months. And so, you know, it's not an approach that's dismissive. That's all or
00:42:54.920 nothing. It's an approach of having a conversation about, you know, differing values and we have
00:42:59.000 differing values and we have to figure out how to, how to talk about them in ways that aren't
00:43:03.240 dehumanizing. So what do you think, what's a good plan for the next few months? What are you,
00:43:09.440 what are you going to try to do? Well, I have a number of plans. I mean, one of them is to keep
00:43:14.060 Democrats focused on nuts-and-bolts issues. Also, there's a lot of Republican voices that I
00:43:23.100 think are really important. I mean, I really like Evan McMullin. I really like Rick Wilson. There's a
00:43:28.960 number of commentators who have been, been really interesting and compelling. I listen to the broad
00:43:34.220 spectrum of everything. I wake up every morning and, and also read the Drudge Report and watch Fox
00:43:39.560 News to have an understanding of, of what opposing narratives are so that I can, I can understand the
00:43:45.880 full sort of spectrum on what's bothering people and what isn't. And Shapiro is a really good
00:43:51.140 voice as far as I'm concerned, because he's as intelligent an articulator of the social conservative
00:43:56.600 position as you could hope for. And so he's a great whetstone against which Democrats could
00:44:01.580 sharpen their knives. Well, because if they can, if they can confront him in a reasonable argument
00:44:06.420 successfully, then they've really got the argument down because Shapiro is very fast. And here's the
00:44:12.380 thing with Ben, he's got a great brain. I actually would like it if he was playing less defense and
00:44:19.020 actually could be engaged the way that Eric has to be like, look, what is your, what are your solutions
00:44:23.960 and thoughts on this front? There are some things he's said that I deeply disagree with. Like, I don't
00:44:28.720 believe that all trans people are mentally ill. He has some positions, but nothing that I've seen
00:44:34.700 indicates to me that he's someone who can't conceivably be reasoned with. He's also pointed
00:44:39.560 out in places where he believes he's screwed up. He's written whole articles about reversing
00:44:43.660 his position on things, or stuff he said that was dumb. And well, those are his words, not mine.
00:44:49.600 But, um, you know, it's like the Ralph Waldo Emerson, a foolish, a foolish consistency is the
00:44:54.960 hobgoblin of little minds. It's like how many of us have a completely consistent set of ideas.
00:44:59.680 And so I feel like with Ben, and there was that big issue where Mark Duplass came out and sort of
00:45:04.860 said, well, why don't we listen to him? And there was a big uproar and then he apologized. And I think
00:45:09.760 that's an interesting thing to discuss because from my perspective, you have three views. If you're
00:45:13.860 liberal, on Ben Shapiro. One is that he's an alt-right Nazi, which I don't believe. I don't believe
00:45:18.720 that he's that far alt-right; I should say fascist, not Nazi. Yeah, and that's, we need to define
00:45:24.180 what the hell the alt-right is too. You know, the alt-right is basically a cover story for white
00:45:28.180 supremacy and for an ethno-nationalist state. And to target everybody who's conservative as an alt-right
00:45:33.580 figure, which is something that's happened to me repeatedly, is absolutely counterproductive.
00:45:38.260 Right. Because you have whole lectures on your opposition to the alt-right. I mean, it's very easily
00:45:44.620 disprovable. But with Ben, like, let's say for the sake of argument that that's who he is,
00:45:49.820 which I don't believe, why would you not want to pay attention or follow him or keep an eye on him
00:45:54.780 or know what he's saying? It's not like he's some crackpot, you know, he's someone who's very
00:45:58.740 influential and has a lot of followers. So even if it's the case, which I don't believe it is,
00:46:02.680 I don't understand how not paying any attention to what's going on with him is helpful.
00:46:06.960 You know, I got to say one of the highlights- This is actually the error that I think Apple
00:46:11.200 and Facebook and Spotify and so forth just made with Alex Jones, you know, because they think
00:46:16.300 they banned Alex Jones, but that isn't what happened. What happened was they pushed his
00:46:20.280 millions of followers underground. That was a very bad idea. And they're making a martyr out of him.
00:46:26.960 They're justifying, you know, he's a little on the paranoid side, let's say. He's a real conspiracy
00:46:31.760 theorist. And the worst thing you can possibly do to someone who's paranoid is to generate a
00:46:36.960 conspiracy to take him out. Well, and my belief is the best thing you can do, and Alex Jones has
00:46:42.880 said virtually nothing that I agree with on any front, is to give him as much airtime and as much
00:46:47.920 rope as possible. Or to at least allow him access to that in the same way that everyone else has
00:46:53.120 access to it, you know, because like I, I don't watch Alex Jones, but there are a number of people
00:46:57.680 on the right that I keep an eye on, or the farther right, because I want to see what they're up to,
00:47:02.160 you know, and as soon as that becomes impossible, then you have no idea what those people are up
00:47:06.860 to. And how the hell are you supposed to be able to deal with it if you can't figure out what it is?
00:47:11.340 Well, or if you're not watching it. Like, I'm a big fan, like, I'm, I'm a big fan of the aspect of
00:47:16.440 the ACLU that was the Skokie aspect, which was, we show up if Nazis want to march, right? And we will
00:47:22.680 protect their right to peacefully march. And it's like, I, people who I despise, it's like, get out
00:47:29.900 there. At a point, you'll be crushed by the free marketplace of ideas. I have that belief. At a point,
00:47:34.560 but you have to allow for it. And the more that we martyr and shut down, I think, we don't know how
00:47:41.020 many people were watching Alex Jones just to keep an eye on him. Right? No, I don't imagine it was a
00:47:46.620 tremendous number of people, but it wasn't none. Right, right. Well, look, you and I have talked
00:47:52.760 about this. One of the highlights of my undergraduate years was when the bell curve came out, and there
00:48:00.100 was a debate at the Kennedy Center, and Dick Herrnstein, who I had as a professor, had passed away. And
00:48:06.260 Charles Murray was sort of left to defend it against Stephen Jay Gould. And it was a very high level of
00:48:11.280 discourse. And Henry Louis Gates gave the introduction, things were very, very heated on campus because of
00:48:16.940 the claims about racial differences in IQ. It was a very flashpoint issue. And Skip Gates gave a
00:48:22.740 spectacular introduction. And he said, look, there's a lot of sentiments roiling around this issue. And
00:48:28.920 the best possible thing that we can do is to dispute and defend and dismiss the aspects of this that
00:48:35.180 aren't correct here, openly and publicly. And there was a big debate, a lot of the student groups stood up
00:48:41.780 and walked out after that entreaty from one of the, you know, amazing intellectual minds of, you know,
00:48:50.740 in the country. And so I just went down and sat in the front row because people had piled out. It's
00:48:55.500 like, I wanted to hear the debate. I wanted to hear the sides from all different arenas. And it was
00:49:00.440 totally, totally captivating. But his point was, if someone's a crackpot, and they don't have
00:49:06.840 any followers, and it's a stupid racist bellowing things, it's fine to ignore people. There's a certain
00:49:12.600 point where when it's Dick Herrnstein and Charles Murray, there's a point where that needs to be debated
00:49:17.800 publicly and openly. And the fact that the chair of the African American Studies Department was
00:49:21.660 advocating for that, to me, I thought was really important and should be heeded, that we need these
00:49:26.480 things, these debates to be had, and the aspects that are incorrect need to be disproved in an
00:49:33.460 enlightenment way, in a very intense and public way. And so that's something that I've always sort of
00:49:39.060 held to, is I don't, you know, and I'm, look, I mean, there's also part of this about,
00:49:43.360 it's sort of a toughness argument. My grandfather went down during the civil rights movement,
00:49:48.400 he was a Jewish lawyer from Manhattan, and went down to the deep south to stick up for,
00:49:53.540 to represent African Americans who'd been convicted of crimes against white women. There's
00:49:59.160 one case in particular that was totally ridiculous. And he went down and people tried to run his car off
00:50:04.340 a cliff and kill him. And so part of it for me too, is like, I'm from that mindset of like tough
00:50:10.380 liberals who want to, who actually can stand up. We need, we need people to be toughened on all
00:50:16.200 sides of discourse. And I feel like, you know, when people were burning copies of Ulysses,
00:50:21.340 I feel like we're supposed to be the ones with the fire hoses putting out the fires. We're supposed
00:50:25.240 to be the ones who can be really tough and stand our ground. And if we can't do that, if we're going
00:50:29.300 to duck from those fights and that hardening of a, of a, of a sort of intense, positive,
00:50:35.280 really tough version of, of what the points are that we're arguing, then it's not, it's not a
00:50:41.140 legitimate case. We need to be, that's who we need. It's a tough time and we need tough discourse.
00:50:46.160 Yeah. Well then, and that is exactly why you should seek out the most articulate
00:50:49.740 exponents of your, of your opponent's position. You know, I just had that experience talking with
00:50:55.140 Sam Harris and we talked for 10 hours and, you know, there are things we agree about. One of the
00:51:00.040 most fundamental things is that both Sam and I are concerned that our ethical structures,
00:51:05.640 it would be better if our ethical structures were grounded in something that was self-evidently
00:51:10.040 solid. And so he wants to do that. He wants to ground ethics in the realm of fact. And I can,
00:51:15.160 I can understand that. It's, it's a perfectly valid desire. I don't believe it's possible in the way
00:51:19.940 that he sees it as possible. And we talked about that a lot, but I learned a tremendous amount as a
00:51:24.760 consequence of discussing things with him over a 10 hour period, because it really pushed me.
00:51:31.020 Yeah, no, you need to be pushed, man, because, absolutely, you know, people will pick all kinds of
00:51:35.960 holes and you need to go in understanding that you might not have all the answers.
00:51:40.000 Well, that you don't have all the answers. And that's what I think is interesting is
00:51:44.480 your approach to that is to ground things in an evolutionary model, which I think is a pretty
00:51:48.780 good bedrock, right? I mean, that's been your sort of approach to this is to talk about like a sort of
00:51:54.700 archetypal and evolutionary underlay. But Sam's unbelievably articulate and sharp. I mean,
00:52:00.620 I had the pleasure of having dinner with you and him and Douglas Murray in London. And it's,
00:52:05.200 it's a vibrant, respectful, sometimes disagreeable mode of conversation that's incredibly fruitful.
00:52:12.280 It's also, it's also stressful as hell, you know, because you can definitely get put on the spot and
00:52:17.700 have your cherished assumptions undermined as well as look like a fool because, you know,
00:52:23.680 some of the time, even if you have an argument well thought through, in that kind of discourse,
00:52:28.980 you don't have it immediately at hand, you forget about it. And then, you know, you appear as if
00:52:34.560 you're a fool. And so you put a lot on the line in the discussion like that. But what you do is
00:52:38.640 at least no one's paying attention to any of this story. So don't worry. Yeah, for you to have a
00:52:43.480 moment where Sam Harris goes, ha, gotcha. Right, right. So, well, so one more thing.
00:52:48.600 One more quick thing before we move on with this. So there's a lot of proposals that I'll get or that
00:52:53.560 are floated. Let's say there's an economic idea that somebody has in my group that wants to be
00:52:59.040 passed on. The first thing that I do with it is I send it to, I have a friend who's a really sharp
00:53:04.520 libertarian. I have a friend who's a hardcore kind of Wall Street Republican. And I have another
00:53:09.540 friend who's kind of a big money person. So they represent sort of the different poles of economic
00:53:15.000 policy from the things that some of the policies that might be interesting to me. The first thing
00:53:20.220 I do is send it to all of them and go, poke as many holes in this as you possibly can, and tell me what
00:53:24.720 all the blind spots are. And that's not only for me to cover all those arguments. It's also for me to be
00:53:30.280 able to learn and see if there's a better approach to it, if it's fundamentally flawed. And then certainly
00:53:34.880 if I'm going to advocate something that's a new policy, I'm not strong on economics. I mean,
00:53:39.240 it's probably my weak spot, but it's to sort of know, if we're going to advocate it, what are all the
00:53:44.060 pitfalls we're falling into and what are we potentially missing? And I find very few people
00:53:48.800 are taking an approach like this. And it's one of the things that I think is exciting. It's funny
00:53:52.880 when the IDW has this sort of fear of an alt-right overlay, but the majority of you guys skew pretty
00:53:59.660 liberal. I mean, if you look at Joe, if you look at Dave, certainly, if you look at Eric and Brett,
00:54:05.140 I mean, it's gotten more of a skew. But what's nice is ideas can be parsed in a way
00:54:09.980 where as much as it is embarrassing when you're put on the spot and you're wrong,
00:54:14.020 no one's looking to twist the knife. You know what I mean? Everyone's kind of saying,
00:54:17.780 yeah, I disagree with this. It's high disagreeableness, but it's respectful discourse.
00:54:22.620 Yeah, well, it's also best based at least to some degree on trust. You know, if I can assume that
00:54:27.780 you're different than me temperamentally, but that you're actually striving towards the truth,
00:54:32.200 then we can have a discussion. And I certainly feel that with Sam. Yes. I mean, I believe that he's
00:54:39.580 an honest person. And, you know, or I believe that he's as honest as me, let's say, which is more to
00:54:46.300 the point. Right. Because, well, maybe an honest person is too high a barrier for anyone to actually
00:54:52.000 achieve, given all of our faults. But I don't believe there's any evidence that he's striving with
00:54:56.780 less intensity towards the truth than I am. Right. Well, to get back to, I think the furthest
00:55:02.900 right in the group is clearly Ben, you know, and to get back to that point, it's like there's sort
00:55:07.440 of three approaches. Either he's a horrible human, and he's a hardcore alt-righter, in which case,
00:55:12.960 if you believe that, why from a Sun Tzu perspective, Art of War, would you not want to know your enemy
00:55:17.760 and figure it out? Or he's somebody who has views and ideologies. I disagree with him on a number of
00:55:23.920 fronts, you know, but he's somebody who is interested in the truth and is willing to admit
00:55:29.620 when he has gone too far, or not gone too far, that's the wrong phrase. He's willing to admit
00:55:34.560 missteps that he perceives that he's made, and is also willing to take in new information. And it's
00:55:39.420 like, I'd love to actually have a discussion with him about something, a discussion, not a debate,
00:55:44.180 because he's a world-class debater, and he would probably obliterate me. But an actual meaningful
00:55:48.380 discussion about some of the stuff where I see there to be differences. Yeah. Well, I think he's,
00:55:53.340 I think he might be in for that. Like, you and I have discussed the possibility of finding high
00:55:59.560 resolution, intelligent, centrist Democrats, and getting some of these IDW people to interact with
00:56:04.480 them. Yeah. And so, you know, that looks like it's a real possibility and be interesting to see what
00:56:09.200 that would do to the political discourse. And I guess this is a step on that, a step in that direction.
00:56:13.680 I talked to Ben, and to Dave Rubin, and to Joe, or to Sam, and, you know, hypothetically,
00:56:24.840 they're at least willing to look into this as a possibility. And it would be very interesting as
00:56:29.720 well. Go ahead. Go ahead. Sorry. No, well, it would also be very interesting to see how this would play
00:56:35.520 out with regards to long-form discourse, because we need to find politicians. You know, one of the
00:56:40.000 things Joe told me when I was talking to him, Joe Rogan, when I was talking to him about his three
00:56:45.220 hour interviews, I said, well, does that ever not work out for people? And he says, well, yeah,
00:56:48.800 I've had guests on who just run out of steam at about 45 minutes. They don't have anything else to
00:56:54.120 say. And we don't need politicians who run out of steam at 45 minutes. We need people who can engage
00:56:59.340 in intense discussion for three hours without running out of detail, without running out of ideas.
00:57:03.660 Right.
00:57:04.460 A detailed grasp of the political realm. And we need to be able to see that. So we could move
00:57:09.040 away from these like six minute CNN interviews. And well, yeah, it's moving away from the tweets
00:57:14.500 and the nonsense, which to my mind is inherently polarizing. It's like, look at what this, you know,
00:57:20.960 asshole, hardcore lefty did on a college campus, or look at this most egregious example of something
00:57:26.300 that's over here. And I'm not saying those things aren't important, or that we don't have to pay
00:57:29.820 attention to them. But the vast majority of people in America right now need real nuts and
00:57:35.980 bolts solutions. And there's a corruption. Maybe there's a technological rule here.
00:57:42.640 So imagine that this is like, we could call this the Peterson principle, let's say. The narrower the
00:57:47.880 bandwidth, the more likely polarized information will be delivered over it. I think it's great,
00:57:52.880 but I think we should call it the Hurwitz principle. You'll have to come up with a more elegant
00:57:58.540 formulation then. But it seems to me that that's highly probable, because if you have to compress
00:58:03.300 complex information into a very narrow channel, and broadcast TV would certainly qualify, and Twitter,
00:58:09.860 of course, even worse, then you have to radically oversimplify it.
00:58:15.520 Right. And I'm always torn between this, because like I said, I have a lot of friends, I mean,
00:58:21.720 across the gamut of political orientations. I read a lot of news and follow a lot of people
00:58:26.140 across the political orientation, like all the way left to right, not far right or very far left,
00:58:32.100 but I'm talking about within the reasonable poles. And it's just, it's amazing to me how skewed it is.
00:58:40.660 If you turn on MSNBC and then Fox News, we're in two different worlds. If you read the Huffington
00:58:45.160 Post and the Drudge Report. And so it's very easy to get, like, I think I have more of an awareness of what
00:58:50.820 the triggers are for conservatives and what the triggers are for liberals, but also the fact that
00:58:57.100 these triggers are getting this outsized airtime constantly by this condensed means of information.
00:59:03.780 And I think that's where, like, the IDW view, like, there's way more listeners to you guys than
00:59:11.000 makes any fucking sense. Like, the fact that you had ten two-and-a-half-hour lectures on the
00:59:15.620 psychological underpinnings of the Bible, and that more than two people watched that makes no sense.
00:59:21.160 And so there's a ton of people who are really curious for more in-depth analysis and it's not
00:59:25.640 being met elsewhere. And I think that's where the polls have been missing. I think that that's
00:59:29.500 proverbially the group of people who are being missed and who aren't being counted.
00:59:36.360 And, you know, it's really important that we have these more in-depth conversations about it.
00:59:43.060 So that's our little conspiracy plan, so to speak, then, is to take the IDW.
00:59:47.280 Right, right, is to inquire if the people who've been loosely grouped together as the IDW are willing
00:59:57.040 to engage in serious discourse with, to begin with, with centrist Democrats. It doesn't have
01:00:01.800 to be limited to that, but I think that's a nice counterplay, too, because-
01:00:04.820 Well, for me, it can even be Democrats who aren't centrist, who are tilting even left
01:00:09.720 of that, who are willing to engage in reasonable discourse and might even have, like, different
01:00:16.000 districts and states have different notions. And it might really make sense for an aspect
01:00:20.820 of the party that's further left to test something in San Francisco or New York and see what new
01:00:25.220 ideas come out of it, right? It's just, for me, it can't be with moral condescension and it can't
01:00:30.420 be a purity test. You know, you and I've talked so much about how the left and the right get off
01:00:34.300 track in different ways. Yeah. The right gets very fascistic and you see them coming. And it's,
01:00:39.500 it's, you know, fascistic overlay. We know what that is historically. It's, it will go heavily
01:00:44.960 towards voter suppression. There's a lot of issues that are very highly problematic on that side of
01:00:49.140 the fence. And when the-
01:00:50.420 It gets, it gets, it gets ethnocentric in the superior sense, right? Yeah. Yeah. And, you know,
01:00:57.220 or, or, you know, illegal or hardcore power grabbing in a way that's effective, let's say.
01:01:03.460 And with the left, the left, you know, cannibalizes and it's purity tests and it goes kind of crazy
01:01:08.600 that way. I think that we are seeing from the Democrats now an awareness that real solutions need
01:01:14.220 to be offered. Again, anti-corruption is a huge thing that I'm seeing Democrats start to tackle in a
01:01:19.020 very serious way, healthcare, jobs, education. And it's really important that these things get
01:01:24.060 brought out. And so for me, it's like, we can test things that are further, that are further left and
01:01:29.220 see if they work or don't work. Yeah. But we also need to do that from a position that isn't morally
01:01:34.100 condescending in a position that's also appreciative of different cultures in different places around
01:01:38.620 the country. What's going to work in New York isn't going to work in Alabama. And liberals are so good
01:01:44.680 at, I think, understanding and appreciating other cultures, you know, like it's one of the things
01:01:50.800 that is sort of a trademark, but it's interesting to me that, that, I think we have fallen down on
01:01:55.440 our understanding of different cultures within the United States. And so it's this very interesting
01:02:00.040 divide that like, if it's foreign or perceived as exotic, it's all good, but like we need a better
01:02:04.920 understanding of certain cultures also, you know, Alabama and Texas. And the right certainly is guilty
01:02:11.860 of this too. I mean, the right treats Chicago as if it's some like horrible, you know, no man's land.
01:02:18.560 I'm not saying this is a problem only of the left, but I'm saying we can really benefit with
01:02:22.260 conversations. I'm talking to a lot of candidates in Oklahoma, in Orange County, in Texas, who have an
01:02:28.540 understanding of their constituents and what they want to do is help them and to present a reasonable
01:02:33.080 alternative of real solutions to them about real issues and not the bullshit that everybody is
01:02:37.700 engaging in. It's absolutely essential. And if we, if we offer all that and lose, then tough luck,
01:02:43.600 we got to make a better case. We got to make a better argument. Like we screwed up. We got to try
01:02:47.780 harder. Yeah. And by the way, the other thing, Jordan, that's really heartening. I'm seeing with a lot
01:02:53.720 of the actual candidates, as opposed to the noise is more and more of an emphasis on personal
01:02:58.780 accountability. You know, it's, it's about helping people and creating new opportunities
01:03:02.780 for them economically, but in a way that empowers them with the freedom to sort of make their life
01:03:09.340 or make their mark. If that's a freedom that they can seize, it's about sort of trying to level a
01:03:14.020 baseline. Like we talked about how healthcare can be super helpful for business and for entrepreneurial
01:03:18.400 ventures. We talked about why it's essential for the robustness of communities, but it's about
01:03:23.620 setting the terms under which people can be free to make choices and to perform at a level that they
01:03:29.280 want to, to build their whole future. So it's not that it's a shift from sort of us running ahead
01:03:34.540 and blocking for people to us trying to, to lift the constraints so that people can have better
01:03:40.020 choices for their, for their families, for their kids and their education system, for their own work
01:03:44.820 so that they can do better and make their own mark. And because a lot of what we're standing on,
01:03:49.600 a lot of the successes that we're standing on as a country are on the shoulders of working men and
01:03:53.980 women. That's a lot of what's accounted for the things that we've done well. And in fact, a lot
01:03:58.520 of the world, frankly, a lot of the world has benefited enormously from the working people of the United
01:04:03.500 States. Yeah. You know, in a way that I don't think gets nearly enough recognition. And so we,
01:04:08.860 I, I, I see the, the accruing financial success of the poor around the world, which is happening at a
01:04:15.800 very rapid rate is in large part, a consequence of the, the sacrifices made either willingly or
01:04:23.460 unwillingly by the American working class over the last 30 years. Absolutely. The working class
01:04:28.200 that's, it, it, it's competition with the world has been opened up at the level of the working class
01:04:34.540 first. And that's been unbelievably beneficial to the rest of the world and probably a pretty good
01:04:39.600 medium to long-term strategy because, well, and the American working class, it's like they have,
01:04:44.640 they've built affluence for people around the globe. Like they're heroes. There's no,
01:04:51.080 there's no infantilizing of them in a way that's derogatory. And it's like, it's our job to make
01:04:56.860 sure that they can have choices for their future, that they, that they are free to compete, at least
01:05:01.700 with the same rules that everybody else competes to differentiate themselves because they've always
01:05:06.260 kicked ass. I mean, it's an, it's an amazing, amazing workforce and they've always done well.
01:05:11.180 And it's like, if people are complaining about, you know, entitlements, it's like,
01:05:15.520 we can also look at corporate subsidies. It's like, it's not about one tilt or the other. It's
01:05:20.680 about the same rules that people need to differentiate themselves. And it's about creating
01:05:24.740 more freedom for them to reach their accomplishments. Well, it seems to me that the Democrats would have
01:05:29.100 a lot more success and, and, and also be able to generate a counter narrative to the radicals on the
01:05:36.280 left if they made the assumption that there's all sorts of good things about free market capitalism,
01:05:42.000 but crony capitalism is not one of them. No, and that's part of your emphasis on, let's call it
01:05:48.900 regulation of corruption. It's like the free enterprise system works real well, except when it gets warped
01:05:56.200 and twisted. And so we have to remove the warping and twisting so that people do have the opportunity
01:06:00.940 to compete and to cooperate as freely as possible. And that's in everyone's interests, including
01:06:05.860 radical capitalists, because to the degree that the system is corrupt, then that reflects badly on
01:06:11.460 the system and produces a tremendous amount of opposition towards it. Absolutely.
01:06:16.000 We want that.
01:07:30.580 And you want to be, it's like, if you can't win fairly, what are you doing in the game?
01:07:39.340 Like, that's it. It's, to me, it's a very tough argument. That's like, if you can't sign a pledge,
01:07:44.540 which, which has been put forth, it's called the Disclose Act. There's a number of variations of it.
01:07:49.160 It's no dark money, you know, no, it's a strong stance against interference by hostile actors.
01:07:55.800 It's no voter suppression. It's getting universal health care, universal voter registration.
01:08:02.000 There's a lot of stuff of like, let's get everybody out voting. Let's lift the suppression
01:08:05.820 and voter intimidation, and then win fair and square. And it's like, for me, it's like, I want
01:08:10.600 the right to win fairly, and I want the right to lose fairly. If we make a better argument,
01:08:15.440 then let's do it. But, you know, it's also anti-gerrymandering, which is a really big problem.
01:08:19.800 And there's a lot of people who vow to not participate in gerrymandering,
01:08:23.000 and to also have it be either bipartisan committees, or judges who determine the
01:08:27.720 boundaries. Yeah, well, it's really, it's really quite an outrage that it's not bipartisan committees.
01:08:32.060 It's ridiculous. I mean, imagine stepping onto, you know, into a boxing ring and telling someone
01:08:36.840 that you're going to fight them, but they can only step in certain areas and you can go anywhere.
01:08:40.320 And they can only hit you with a jab, but you get to use your full battery of shots. It's like,
01:08:44.380 man up, right? Step into the arena. If you think you can win, then win fair and square. And there's a,
01:08:49.560 there's a big emphasis, I think, that's not getting enough press on Democrats cleaning up
01:08:54.220 our side of the aisle in a way that is accountable and that is, is going to be clear. And there's no
01:09:02.820 way you should disagree with that. I believe if you're a hardcore Republican and you want
01:09:07.080 to win, you should be aware that you can win fairly and under the same rules. And that, I mean,
01:09:11.660 we can all agree on that. Yeah. Yeah. Well, it would be nice if the Democrats too,
01:09:16.280 could help everyone figure out what the problems actually are. Because right now,
01:09:21.960 mostly what we see in the media is a discussion of pseudo problems. Exactly. And so I'd like to see
01:09:28.920 what, what are the, what are the real challenges that confront us at the moment? And what are the
01:09:34.380 proper solutions to those challenges? Yeah. It's so funny because I feel like the, the racial
01:09:40.160 discourse or discourse around racism has gotten so loaded and so complicated since president Trump
01:09:48.160 has, his, his run or has taken office. And it's, I feel like all sides of the issue are like, there's
01:09:55.220 nowhere you can step without hitting a landmine. And for me, it's really important to boil it down to,
01:09:59.340 I went with a program called Defy that goes and runs an entrepreneurial program in the prison
01:10:05.840 system. And it's absolutely unbelievable how, you know, what I call the, um, the prison industrial
01:10:13.800 complex works, the privatization of prison. And you can sidestep all the arguments about who's doing
01:10:18.640 what and who's responsible. African-Americans in, you know, who are up for drug charges, the exact
01:10:24.840 same drug charges as, as whites, are 600% more likely to get put in prison. Like that's a straight
01:10:31.320 stat. That's not, well, they're doing more crime. There's no argument to be had about that.
01:10:35.080 There, there are certain unbelievable biases of how, you know, African-Americans and people of
01:10:40.980 color are put into a prison system, which profits from them and which they're doing a lot of work
01:10:46.320 and basically, you know, no wages. It's very problematic. Anybody would agree with that across
01:10:52.460 the spectrum. I think if you look at those numbers and it's patently unfair, if there's one thing,
01:10:57.220 we need to have an intelligent discussion about drug policy. That would be a real start.
01:11:01.700 And about the prison system and who it's skewed towards and the ways in which it is unfairly
01:11:07.040 implemented. And that's a no shit major problem. And if you, if you discuss the actual hardcore facts
01:11:13.980 and if you can boil it down to those things, no one's going to disagree with that.
01:11:18.400 So, you know what, you know what Bjorn Lomborg did? This is really interesting. He's a,
01:11:22.280 he's a Scandinavian. Um, he used to be an environmentalist. He still is, but he calls
01:11:27.260 himself the skeptical environmentalist now. And I really liked Bjorn Lomborg. He wrote a great book
01:11:31.780 called How to Spend $50 Billion, and then a few years later, $75 Billion, to Make the World
01:11:37.540 a Better Place, a book I would highly recommend. This is what he did. And this is a really nice
01:11:41.140 procedural, um, uh, achievement. He looked at the UN. I hope I got this story right. It's been a while
01:11:49.040 since I read it. The UN puts out a lot of development goals, say, like, I think they have 150
01:11:54.420 development goals, something like that. And that's too many goals. You can't do 150 things.
01:11:59.480 And I worked on a UN committee for a while and we were trying to prioritize the goals because you have
01:12:04.300 to prioritize things in order to accomplish them. And one of the things we discovered was that you
01:12:08.620 couldn't prioritize the goals because every goal had its constituents and they were outraged if any other
01:12:14.100 goal was prioritized over theirs. And that was a terrible thing because, well, that's like tragedy of
01:12:19.500 the commons because the goals couldn't be prioritized and they couldn't be, then they couldn't be
01:12:23.820 accomplished essentially. So Lomborg figured out a way around this. So his idea was this. Okay. So
01:12:30.800 there's a bunch of things that need to be done because there's a bunch of problems out there in
01:12:34.720 the world. And, uh, but there's too many to simultaneously address. So we have to figure out
01:12:39.160 where we're going to start. So what we're going to do is a cost benefit analysis. So we're going to
01:12:43.620 concentrate on doing as much good for the least amount of money as we can. And that's going to be our
01:12:48.320 initial, um, strategy because why not? Right. Um, because money is a scarce resource and it involves
01:12:54.980 human labor and human intelligence and all of that. And so you don't want to waste it. And, but he
01:12:59.580 didn't believe that he would be capable of doing the prioritization. So what he did was get 10 teams
01:13:04.600 of economists together, some of which were headed by world-class economists, people who'd won Nobel
01:13:09.380 Prizes. And he had them organize teams and then he had them go through the development goals and
01:13:14.660 prioritize them and make an economic case. And so then you had 10 separate cases. He said, here's
01:13:19.520 the rule. You've got $50 billion. You can spend it on whatever you want, but you have to justify it in
01:13:25.080 terms of cost benefit analysis. And they had to choose from this list. And so then he got each
01:13:29.240 team to generate a list of their priorities. Then he averaged across the lists and came up with a
01:13:33.960 final list of, of both problems, which is a big deal, right? Because the first issue is what are
01:13:40.080 the problems that we need to solve? Like, you know, you might think, well, what politicians should be
01:13:45.240 generating his answers. It's like, no, no, no, that's not right. If you're trying to solve a complex
01:13:50.000 problem, a set of the complex problems, the first thing you want to do is formulate the problems
01:13:54.080 properly. And then you work towards, towards defining what the solutions might be. So it'd be really
01:13:59.560 interesting to see if we could have an intelligent political conversation that would concentrate on
01:14:03.920 on defining what the actual problems are. And it's not capitalism versus socialism, right? That's
01:14:10.680 not a problem. That's fun: everyone gets to scream at each other and then be superior and get
01:14:16.800 in zingers and get retweets. Without having to know anything about anything, right? And there's no
01:14:22.600 actual motivation towards trying to come up with solutions to the vast majority of people who have
01:14:27.780 who are dealing with real problems. And that's the part that pisses me off is it's like, it's this
01:14:32.100 masturbatory venture of discussing the overall culture. And, you know, if you turn on the news,
01:14:37.100 it's these little soundbites of the most, you know, sort of tabloid nature of the furthest extremes
01:14:43.500 versus nuts and bolts discussions of what actually makes sense.
01:14:46.580 Yeah. Okay. So what we need to do, let's say, generally speaking, is we got to figure out what the
01:14:51.140 problems are. Like what are the most serious problems that currently confront us as a society?
01:14:55.640 That'd be a nice start. Like what are the 15 most, most crucial problems that confront us as we move
01:15:02.760 farther into the 21st century? That's a hard, that's a hard question. And then
01:15:06.840 I think the first, okay, no, go ahead. Well, I mean, the first thing for me looking at the system is
01:15:14.240 political corruption. And I'm not aiming that at either party. I mean, I have my opinions about
01:15:20.080 which party does that more, but neither one is perfect. And the average rating of Congress right
01:15:25.300 now is 15%. I'm sorry, it's 18%, the approval rating of Congress, which, as O'Rourke says,
01:15:31.160 is slightly above syphilis. It ain't good. And if you don't believe that the people in government
01:15:36.500 are serving your needs and are instead, you know, more interested in furthering the big money donors
01:15:42.400 over the interests of people, which is proven out time and time again, that's a huge problem.
01:15:47.520 When trust is eroded that the people representing you are representing rich donors rather than average
01:15:52.440 Americans and seeing to your needs, you can't get anywhere. And I think that's part of what's so
01:15:57.720 important about this pledge that a lot of Democrats are working on. It's not, it's a pledge to say,
01:16:02.400 here's what we, here's what we will disclose. And here's ways that we will clean up our side of the
01:16:06.740 aisle. And I think if it's, it's essential, it's essential that you start with your own reflection
01:16:11.760 of fixing the things that you need to fix in order to make a better argument. I, for me, corruption
01:16:16.600 in government's the top of the list here. But go ahead.
01:16:21.020 Well, I was thinking more of it procedurally, not, not so much in terms of what the problems might be.
01:16:27.860 I mean, one of the problems I see emerging, well, one of them would be this increase in political
01:16:33.100 polarization, which I think is very, very dangerous. I think another potential problem on the horizon is
01:16:38.220 something like, as our society becomes more cognitively complex, which is happening very,
01:16:43.200 very rapidly, the number of people, the proportion of people who are likely to be dispossessed as a
01:16:48.160 consequence of a lesser ability to deal with that kind of complexity, that problem is
01:16:58.340 going to increase. And that's across the board. And it isn't obvious what to do about that. I don't
01:17:03.500 really buy guaranteed basic income solutions, in part, because I don't think that money is actually
01:17:08.960 the solution to the problem of what to do in life. Like money can stave off.
01:17:13.240 Well, in some ways, it's the opposite. If you lose a goal that you're striving for, that makes life tough.
01:17:20.080 And this is a conversation that I think should be a roundtable with Democrats and conservatives and
01:17:25.600 Republicans to say, what's a pro capitalist argument for how we're going to train people and bring them
01:17:30.700 along? Right? What's, you know, a liberal argument for having concern for people
01:17:36.240 who don't have the same advantages from the same background? Yeah, well, you know, that the Silicon
01:17:39.880 Valley types, they tend to skew pretty heavily liberal. And that's because they're generally creative
01:17:44.220 and entrepreneurial. But they're very rapidly building a world in which people who are at the
01:17:49.280 lower end of the cognitive distribution are going to be adrift. Right? And we need to fix that from both
01:17:54.780 sides. So rather than saying, all or nothing, right? Like tough luck, capitalism, free market,
01:18:00.680 forget it. Or, you know, we need to support every single person in every single way. That has to be
01:18:06.340 the subject of complicated, in-depth conversations with people weighing in from a variety of different
01:18:10.460 political views. It just has to be. It's so complicated that we need,
01:18:17.320 we need opinions and brains from all sides of the issues. That's for sure. We definitely might also
01:18:22.040 need different test cases. Well, you can also think about that, those tough conversations,
01:18:27.940 you know, like between someone who leans very liberal and someone like Ben Shapiro. You can think
01:18:32.220 about those as stress tests. Right. Your damn stupid solution, which you've put a lot of effort into
01:18:37.940 elaborating, let's say, can't withstand the stress of a single argument with someone who disagrees with
01:18:43.220 you. What the hell do you, what makes you think that's going to work when you implement it in the real
01:18:46.760 world? Obviously, it's going to fail merely because those people are going to object to it,
01:18:52.400 independent of its objective merits. And so if it can't fly with your opponents, I mean,
01:18:58.820 I'm not talking about the fringe opponents, because nothing flies with the fringe except what the fringe
01:19:03.940 believes. But it's... I also think that the fringe is overly represented, and that's what's so hard.
01:19:09.860 Like when Mark Duplass apologized for saying people should follow Ben Shapiro, right? There
01:19:16.040 was an apology he put forth. It was... I haven't worked with Duplass. I mean, I've heard he's a
01:19:21.260 terrific guy, and I think that his aim was sort of furthering an end to polarization in
01:19:27.580 discourse. I don't know who... Who are all those people who are personally deeply wounded by him
01:19:34.440 saying that you should just follow somebody, whether you agree with him or not? And I also don't know
01:19:38.500 who the apology is geared towards. Like, who is he apologizing to? And if we have sort of a mob
01:19:44.760 dictate... I mean, the one thing I say a lot, and this is especially true in Hollywood, is there's a
01:19:49.680 huge and widening gap between conversations that people are having in public and what they really
01:19:53.840 believe in private. Yeah, that's the sign of a creeping totalitarianism. That's really, really
01:19:58.560 dangerous. Yeah. And it's also from both sides. I'm not just saying that this is a liberal value. I
01:20:03.960 think there's a lot more people who have more complicated views that they don't feel safe to
01:20:09.200 articulate, because of fear of complete reputational slander, right? They'll be out of work. It'll
01:20:15.880 be everything else. And so I don't... It's a very interesting thing. I have a lot of sympathy for
01:20:22.500 people who step out and say something and then sort of retreat and apologize because I feel like the
01:20:27.480 force of a mob response from the fringe, even if it's sort of pluralistic ignorance, right? I mean,
01:20:34.060 90% of people might agree with you, but if the 10% are loud and it's all that you're seeing
01:20:38.060 everywhere, it has the power to warp reality. Yeah. That's how mind control cults function.
01:20:43.560 You know, mind control cults, one of the first things they do if they can get you off to a compound
01:20:47.140 is privacy deprivation. And everywhere that you look, there's two versions of you; they try and split you.
01:20:52.780 I went undercover into mind control cults to research one of my novels. And when you're there,
01:20:57.040 everything that I say that is cult Greg, they're splitting cult Greg from real Greg.
01:21:03.220 And everything that I say, I'm like, wow, this is fun. It's sort of love bombing. Everyone leans in,
01:21:07.660 it's direct eye contact. It's sort of like non-sexual touching. Everyone's engaged. But if I say,
01:21:12.340 you know, this is weird, I kind of, you know, miss being in contact with some of my other friends.
01:21:16.240 It's this sort of wind chill where everyone pulls back. And that's what I think is happening, but from a
01:21:23.120 smaller percentage of people, but they're really, really loud. And if you see it all of a
01:21:29.220 sudden, it seems to be the reality when you're getting hammered. And there are these sorts of apologies
01:21:33.740 that I don't believe when a lot of people make the apologies. They read flat to me. It doesn't feel
01:21:39.360 heartfelt. And also like, let's say you really did screw up. Like part of what we need to do is screw
01:21:45.740 up. Part of what we need to do is think about things and say things and overstep and go, wow,
01:21:50.100 that was kind of a screw up. But what's a blanket apology to society versus individually saying,
01:21:56.740 hey, I put forward this position. I wish I'd articulated it better. I don't always articulate
01:22:01.640 myself perfectly. Which is probably what I should have said. Rather than I'm so sorry to have had a
01:22:06.600 thought that isn't appropriate. And we're seeing this across the boards. And I think it's really
01:22:11.620 being... Well, part of what needs to be done, and this is something we've already talked about,
01:22:15.000 is to generate better dialogue that isn't like that. And hope that the mere fact that it's of
01:22:21.660 much higher quality is sufficient to attract people to it and away from this other nonsense.
01:22:26.680 Yes. It's hard because, as you said, attention spans are shorter. Like I have people
01:22:30.460 who know that we're good friends. You know, our work is kind of tangled up. We mention each other
01:22:35.940 in our books. We've known each other, shit, since I was, you know, 20 years old. And people will say,
01:22:41.380 well, he believes in enforced monogamy. Like, dismissed. It'll be one thing. And it's like,
01:22:46.740 it's an anthropological term or sociological term with specific meaning that takes one Google search
01:22:53.420 to sort of prove that it's not that Jordan Peterson is running around like in a World War
01:22:58.060 II propaganda poster, like dragging women off into the caves of...
01:23:02.820 Giving them to useless men.
01:23:04.920 Right. I mean, but it's so disprovable. But it's also so damning on a format like Twitter,
01:23:09.820 where you have, you know, 280 characters.
01:23:12.380 Yeah.
01:23:12.880 And so it's that reputation thing where, like, if someone associates with you, they should never
01:23:19.040 be discussed again. If you say people should follow Ben Shapiro, whether you agree with him or not,
01:23:24.120 you have to be bombarded and shamed until you apologize for even saying that you'd do that.
01:23:28.360 And it's like, look, I've learned the most from my friends who disagree with me the most.
01:23:33.680 I have a wide range of friends. It's so funny that one of the stories that I tell
01:23:37.420 after President Trump was elected, I was in Vegas on a book tour. And one of my friends
01:23:42.160 was a crusty old armorer and sniper from the spec ops community. He passed away recently,
01:23:47.880 but he was just a great friend and a great guy. And I did a lot of shooting with him for the
01:23:53.820 books. I did a lot of research and he wanted me to see his ranch. And we pull up to his ranch
01:23:57.440 and he has a tank parked in his front yard, and sticking out of the end of it is a Trump
01:24:03.700 Pence sign. So at that exact moment, my wife sends me a text, and it's her and my daughter
01:24:10.500 at, like, one of the, you know, thousand women's marches, and they've just come up on Miley Cyrus.
01:24:15.420 So they're like, my daughter's posing with Miley Cyrus. And the text sort of comes in and my buddy
01:24:19.700 looks at me, he goes, what's that? And I looked up at him. I said, nothing. But then I showed him
01:24:25.500 and we had a good laugh about it in sort of this way that, that it's like, you need people around
01:24:31.980 who have a whole variety of things. And one thing that was interesting with him was around
01:24:35.000 the time of the Muslim ban. And I was talking to him a lot about that. And I, I'm, I'm personally
01:24:39.660 very opposed to the Muslim ban, for instance, but there's, there's a lot of reasons. If you talk
01:24:43.900 to people in the military, there's a lot of problems recruiting translators now, like that
01:24:49.480 has a negative effect that rippled through the intelligence community. It's not good. And you
01:24:54.680 know, and whether I have friends who are vehement advocates of it, but I'm arguing with them
01:24:59.400 saying, what is the actual ground truth of how that is affecting things? And then there's other
01:25:03.880 people in the military who were like, this has been a total shit show. It's really hurting our
01:25:07.900 intelligence efforts and our allies. I feel like that's the level of the discussion. Or one of the
01:25:12.920 other things for me, that's very relevant is, you know, I don't know if you, if you know this,
01:25:17.260 but there's 15,000 Muslim physicians from the seven banned Muslim countries in America,
01:25:22.200 and they predominantly work in, in rural red state districts. They're providing the majority of,
01:25:28.820 not the majority, they're providing a ton of medical care to a lot of people in red states.
01:25:33.380 And by the way, those are the people, if they fled Aleppo, they hate Bashar al-Assad and extremism.
01:25:39.220 Like this is, this is, this is exactly the sorts of people who you want to have in these areas.
01:25:44.580 And if they're pulled out, we're going to have a lot of rural areas that
01:25:49.600 are not going to have access to medical care. Because more affluent physicians, or physicians from America,
01:25:56.300 don't always want to go to these regions. There's a lot of, like, trouble recruiting physicians there.
01:26:01.020 So you have these people who are, who are great sort of citizens and representatives who hate
01:26:06.220 extremism, who are treating people in deep red states. And it's like, that's really important.
01:26:10.460 And you pair that with the Intel picture that I think is deeply problematic.
01:26:14.860 And you have a sort of argument that isn't just, you should have more empathy for Muslims,
01:26:20.300 which then just deteriorates into, well, they're terrorists. And like, it just devolves, rather than
01:26:24.940 looking at sort of the temperament of the people who are here. And you have to have conversations
01:26:29.180 with people from all sides of the, like, I had a lot of conversations with people in the military
01:26:33.740 who support it or are against it to kind of start to zero in on where my position is.
01:26:38.780 Okay, so the plan is something like this. So we'll, we've recorded this video, I'm going to post it as
01:26:46.380 a public invitation to the intellectual dark web. I think that's what I'll title it or subtitle it.
01:26:51.340 The idea would be to bring on people, Democrats, at least to begin with, because that would
01:26:57.900 kind of run counter to the IDW reputation, let's say. Democrats who are capable of
01:27:05.340 and willing to engage in long-form discussions about complex, high-resolution issues, and to provide
01:27:11.580 a more, at least in part, a more centrist alternative to the radical leftists, but also to bring on
01:27:17.340 people with more left-leaning views to have them try their arguments out in a more fully
01:27:26.380 fledged manner, something like that. Yeah, I mean, I think, how does anyone lose? Like if someone's
01:27:31.820 conservative and not moving, they should know what the reasonable version is of the argument
01:27:36.140 they're rejecting. Like, good for you then, you know? Yeah. All that they're seeing is reports of
01:27:41.260 like students gone wild on college campuses and the craziness that, I believe, is a problem.
01:27:47.340 I mean, you've certainly, you've dealt with protests and you've dealt with that sort of
01:27:52.540 like brutal groupthink and anti-First Amendment, you know, like sort of shutting down speech. It's always
01:27:59.980 tricky to talk about freedom of speech and First Amendment because it's unclear always where that
01:28:04.860 lapses into being legal or constitutional versus just people articulating things loudly. But I have
01:28:10.220 no question that that's a problem. But I think that there's, I think that it is overrepresented and
01:28:14.940 amplified by the media and by both sides of the media. Yeah, I think so too. Sexier topic. And I think we
01:28:21.020 should get people in who are willing to have conversations that are solution geared so that if people
01:28:26.780 realize that, you see, for me, part of what my interest is, is in sort of trying to transcend
01:28:31.820 politics and getting to a place where like, I don't care if you're a Republican or Democrat
01:28:36.140 as much as I care if you're offering real solutions. Yes, right. Exactly. So then the other element of
01:28:40.780 this would be to figure out what the problems are, to see if we can come to some sort of consensus about
01:28:45.660 where the challenges that confront us really lie. Right. And if there's a Democrat running for office who is
01:28:51.820 very embedded in dark money and beholden to donors, and there's somebody who's a cleaner
01:28:59.020 choice who's running, who's a Republican, it's like, let's have that out. You know, it's not,
01:29:03.340 this is, this is about actually starting to figure out solutions for people in terms of the
01:29:09.900 level of discourse that we can have and the solutions that need to be offered. Because while
01:29:13.340 we're here talking about all this stuff, there's a lot of people dealing with a lot of real life problems.
01:29:17.900 Okay. Okay. So I've started to introduce you to a number of people. You've met Sam and you met Dave
01:29:25.900 Rubin. Um, you met Douglas Murray briefly. Who else, who else have I introduced you to?
01:29:32.460 Um, a number of other peripheral people or, or other crews, but I think, you know, I, I,
01:29:39.660 there's a number of people I'd like to talk to to start to get this viewpoint. Um, and the viewpoint
01:29:45.340 that's more aligned with my politics represented. Um, so, you know, we'll keep, we should, we should
01:29:51.420 keep at it. And we, I think that the aim for me, there's a couple of aims. I mean, the aim is to sort
01:29:55.500 of end polarizing discourse, raise the level of conversation that can be had for actual solutions
01:30:01.420 that we need for actual people. And the sort of long-term aim for me is just to eliminate
01:30:09.660 the levels of corruption in the government. And that's across the board, on either side. That's a long-term
01:30:14.460 aim for me is like, who's willing to sign on to the DISCLOSE Act? Who's willing to say that they're
01:30:18.460 going to disclose all their donors, who paid for every ad? Who's willing to not take PAC money as
01:30:23.820 individual candidates? Um, you know, who's willing to stand up against gerrymandering? And that to me
01:30:29.260 is party agnostic. That's who's willing to stand on to that. Yep. Okay. Well, sounds like a plan.
01:30:39.580 All right. All right. I'll start, I'll start the ball rolling on my end. I've got some people that I
01:30:45.340 want you to meet on the Republican end of the distribution as well. Great. I'm happy to meet
01:30:50.540 with anyone who is, you know, reasonable and willing to have discourse instead of chest beating.
01:30:55.580 Yep. Okay. Good. Good talking to you, Greg. Bye.
01:31:25.580 Bye.