The Jordan B. Peterson Podcast


151. Build a Better Democrat? | Gregg Hurwitz


Summary

Gregg Hurwitz is an American novelist, scriptwriter, and producer. In the years leading up to the presidential election, he worked with an independent team of Hollywood writers, producers, and directors to design and promote a moderate political message for the Democrats. In this episode, recorded on December 20th, 2020, before the most recent events on Capitol Hill, he talks with Dr. Jordan B. Peterson about that work, about what went wrong with the Democratic Party, and about making good-faith arguments across the political aisle. Hurwitz is a former student of Peterson's from Harvard and the author of the Orphan X thriller series; the newest Orphan X novel is out on January 26th, and there is a pre-order link in the description.


Transcript

00:00:00.960 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.800 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:53.820 Welcome to the Jordan B. Peterson podcast.
00:00:56.080 I'm Mikhaila Peterson.
00:00:58.220 This is episode 2, season 4, an in-person podcast yet again.
00:01:03.400 This episode is between Gregg Hurwitz and Jordan Peterson.
00:01:06.660 It's called Build a Better Democrat, and was recorded December 20th, 2020, before the most recent events on Capitol Hill.
00:01:14.540 Gregg Hurwitz is an American novelist, scriptwriter, and producer.
00:01:18.700 He has my favorite thriller books, hands down, no question about it.
00:01:22.580 If you like the Bourne movies, he writes novels kind of similar to that, that are absolutely riveting.
00:01:28.400 I'd recommend the Orphan X series.
00:01:30.840 He actually has the newest Orphan X out on January 26th.
00:01:35.240 There's a pre-order link in the description that I would highly recommend checking out.
00:01:39.540 Dad was also Greg's undergrad thesis advisor when he worked at Harvard.
00:01:43.880 That's how they know each other.
00:01:44.820 But more to the point of the episode, in the last few years leading up to the presidential election, Greg has been working with an independent team of Hollywood writers, producers, and directors to design and promote a moderate political message for the Democrats.
00:01:59.460 Worth a listen.
00:02:00.200 And if you prefer to watch, the video version will be up on YouTube tomorrow, Monday, January 18th, 2021, on Dad's YouTube channel.
00:02:09.460 This episode is brought to you by two awesome companies.
00:02:12.320 One, Headspace.
00:02:13.760 I love Headspace.
00:02:14.840 It's an app that does guided meditation.
00:02:17.100 When I was younger and stupider, I thought meditation was a complete waste of time, which is ridiculous given how long people have been practicing meditation.
00:02:24.760 It can be hard to get into, especially if you have 10,000 things buzzing around your head at one time.
00:02:29.360 Headspace simplifies it.
00:02:31.640 You also don't have to do it for very long to get the benefits.
00:02:34.880 There are scientifically backed benefits to meditation.
00:02:37.980 I started with a 10-minute segment when I wake up.
00:02:42.040 Highly recommend this rather than instantly checking social media or your phone and starting your day stressed out.
00:02:48.600 I've found it very helpful.
00:02:50.380 I've been using Headspace for years before these advertisements started.
00:02:53.880 It's simple, it's easy, and it works.
00:02:55.780 You deserve to feel happier, and Headspace is meditation made simple.
00:03:00.040 Go to headspace.com slash jbp.
00:03:03.020 That's headspace.com slash jbp for a free one-month trial with access to Headspace's full library of meditations for every situation.
00:03:11.560 That's the best deal offered right now.
00:03:13.500 Head to headspace.com slash jbp today.
00:03:16.200 The second company that's gracious enough to sponsor this episode is called Thinker, T-H-I-N-K-R dot org.
00:03:24.160 They summarize the key ideas from new and noteworthy nonfiction, giving you access to an entire library of great books in bite-sized form.
00:03:32.460 You can read or listen to hundreds of titles in a matter of minutes, from old classics like Dale Carnegie's How to Win Friends and Influence People,
00:03:39.280 to recent bestsellers like Never Split the Difference.
00:03:42.900 I use it frequently after I read a book to try and really remember the key points, especially before a podcast episode.
00:03:48.760 If you want to get into reading and you don't have enough time, or let's be honest, possibly not enough self-discipline,
00:03:54.620 Thinker.org can really help you out.
00:03:57.020 If you want to challenge your preconceptions, expand your horizons, and become a better thinker,
00:04:01.020 go to thinker.org, that's T-H-I-N-K-R dot org, to start a free trial today.
00:04:07.460 Again, that's T-H-I-N-K-R dot org.
00:04:10.280 Enjoy this episode.
00:04:11.820 Remember to rate, hit subscribe, and don't forget to check out Greg's books if you're looking for thrillers.
00:04:31.020 I'm talking today with Gregg Hurwitz, resident of Los Angeles, California, a former student of mine from Harvard,
00:04:44.060 and someone I've known for a long time.
00:04:46.960 Greg's a novelist, although he has very many other occupations, which we'll talk about today.
00:04:53.000 It's a pleasure to see you, Greg. It's been a while since we talked.
00:04:56.760 Good to see you too, Jordan.
00:04:57.800 Maybe we could start by you just outlining some of the things that you do,
00:05:01.460 and then I think we'll focus on the political stuff more today,
00:05:04.820 not necessarily from a political perspective, though.
00:05:08.720 Well, I came in from novels.
00:05:11.040 I'm a novelist, I write the Orphan X series,
00:05:13.820 and I've also worked in screenplays and TV and comics and some other stuff.
00:05:18.920 And I started to get involved in politics around 2016,
00:05:22.560 in large part because before that, I kind of thought democracy would be fine without me.
00:05:29.040 I didn't really feel any responsibilities as a citizen.
00:05:32.260 I kind of had a lot of opinions, but didn't do a whole lot about it.
00:05:36.320 And one of the things that I wanted to do when Donald Trump was elected,
00:05:40.580 he was not a candidate or a president to my liking or who was a match with my value set.
00:05:49.960 And the first thing that I asked myself, it's funny, you give that lecture about the Old Testament,
00:05:54.200 that one of the answers that Old Testament answers is always like,
00:05:58.660 God's angry, we screwed up.
00:06:00.280 And so I really took that approach all the way down.
00:06:03.980 I thought rather than starting to go on offense and tackle people who voted or thought differently than me
00:06:13.140 or had different ideologies, I would try and think about the failings of the Democratic Party,
00:06:18.860 the status quo, all the parts of society that I was part of,
00:06:23.820 and how badly we would have to have fallen short for him to be seen as a viable and preferable alternative
00:06:31.420 to the candidate that we were putting forth.
00:06:33.820 And so I started to work with a lot of candidates.
00:06:35.660 I was mostly interested in candidates in purple districts, talking to red voters, right?
00:06:41.160 And so for the midterms, we worked with 30 candidates, Democrats in deep red districts,
00:06:46.640 talking about making good faith arguments the way it's supposed to be, right?
00:06:51.240 I have an opinion, I have a preference in political party to make good faith arguments to people
00:06:57.420 to try and win them over to a different point of view.
00:06:59.720 And we had a lot of success.
00:07:01.180 I'd say that of the 30 candidates that we worked with, 21 won in terms of flipping those seats.
00:07:06.820 Well, we have foreign viewers here.
00:07:08.600 So when you talk about deep red states, deep blue states, purple states, what do you mean?
00:07:14.860 Republican versus Democrat, right?
00:07:16.840 I wasn't interested in figuring out how...
00:07:18.500 I wasn't...
00:07:19.600 I'm not interested in any conversations that take place in the bubble of like-minded people.
00:07:26.100 So I was interested in races in Oklahoma and New Mexico and Ohio and Virginia.
00:07:33.340 And so we really went there.
00:07:34.760 And long story short off that, we started to...
00:07:37.440 I wrote a bunch of op-eds.
00:07:38.680 I wrote one with you for the Wall Street Journal.
00:07:40.300 And I did a lot for the bulwark trying to talk across the aisle.
00:07:45.440 And I went out and talked to, I think, about a 360-degree arc of Americans, whether it was
00:07:50.820 military, evangelicals, Black Lives Matter, Hispanic — Texas Mexicans, a different population
00:07:58.520 than Miami Cubans, right?
00:08:01.560 A different population from California Mexicans.
00:08:03.820 And really talking to different groups and listening and figuring it out.
00:08:06.880 And I wound up doing about 200 digital and television commercials.
00:08:10.840 All this political work was pro bono with a small team of us here.
00:08:15.360 Yeah.
00:08:15.380 Do you want to describe the team?
00:08:17.520 Yeah.
00:08:18.360 It's me.
00:08:19.780 It's Marshall Herskovitz, who's a TV showrunner and creator.
00:08:23.020 He created thirtysomething.
00:08:24.700 Billy Ray, Oscar-nominated screenwriter.
00:08:27.340 He wrote Captain Phillips.
00:08:28.900 He just did The Comey Rule.
00:08:30.620 Shawn Ryan, the creator of The Shield, the TV producer.
00:08:33.080 And Laeta Kalogridis, she has a ton of credits, from Shutter Island to Avatar, which she worked on
00:08:42.040 and wrote a good amount of that with James Cameron.
00:08:44.800 And what was interesting was, in terms of the Hollywood system, after Trump was elected,
00:08:49.840 I think the Democrats were humbled.
00:08:52.880 And then they're always willing to meet with Hollywood-y people.
00:08:55.120 But the washout rate was, there weren't a lot of people who were interested in having
00:09:01.120 different kinds of conversations.
00:09:02.260 And I decided if I could actually get in front of Democratic leadership, and Marshall, too,
00:09:06.340 was on that first trip with me, that I would say exactly what I thought all the time to
00:09:10.940 the best of my ability.
00:09:12.220 Okay, so let me walk through this.
00:09:14.500 So, um, a couple of years ago, maybe that was in 2016, about, you had some political
00:09:23.300 awakening, let's say, and I guess that was attendant on Trump's election.
00:09:27.780 And your response to that was, how did the Democrats sink so low as to allow this to happen?
00:09:33.760 Is that a reasonable way of summarizing it?
00:09:36.600 Rather, what the hell's wrong with all those Trump voters?
00:09:39.520 Yeah, and like, what, let me start to explore in earnest, my confirmation biases and blind
00:09:45.460 spots and talk to everybody who has a different perspective or point of view than me, in earnest
00:09:51.460 to try to figure that out.
00:09:52.920 Yeah, well, you guys decided that you were going to produce messages for the Democrat Party.
00:10:00.760 Yeah, and that's, that was...
00:10:02.360 And do that on your own accord, in some sense, or on your own, on your own, on your own dollar,
00:10:08.080 but also independently?
00:10:10.780 Yeah, I mean, the line we used was, and I remember sitting in my living room talking
00:10:14.840 about this, I said, we asked for no money, no credit, and no permission.
00:10:18.820 And you said, to me, that's exactly what Orphan X does, my, my protagonist of my thriller series.
00:10:25.340 It was this really funny confluence of my political life and the things that I was writing in the
00:10:30.720 fiction world.
00:10:31.600 And what we realized is, we can't go — we couldn't go through... Everything we did was on our own.
00:10:35.860 We raised our own money. We, one of the things we realized is the cost of admission for getting
00:10:40.800 through messaging that I thought was more persuasive — making good faith persuasion arguments —
00:10:46.700 but also that was fair. Every single economic fact that I put in any of the 200 commercials
00:10:52.300 that I produced, I ran through a friend of mine who's like a Wall Street Republican.
00:10:56.840 Like I always wanted opposition fact testing. We tried to do nothing that wasn't fair.
00:11:02.280 I'm not suggesting we got this right all the time, but I tried to not do, I didn't want ads
00:11:07.100 that went after Trump's kids in certain ways that were off bounds and personal. I was trying,
00:11:12.200 you know, cause look, you're, if you're, if you're messaging and making propaganda is really
00:11:17.180 what it is. That's, that's Goebbels. You're in Goebbels arena. That's dangerous stuff. You got to
00:11:22.640 take it really, really seriously to try to engage and make arguments without getting corrupted by what
00:11:29.240 that is. Yeah. Well, that's, that's why it's dangerous is that you don't understand. People
00:11:33.340 don't understand when they start to mess with the truth that they're starting to mess with their
00:11:36.900 own psyches because you, if you start playing in the domain of deceit, you'll get tangled up in that
00:11:43.420 so fast and make your head spin. And then you undo yourself. I mean, you can undo yourself even if
00:11:48.780 you stick pretty close to the truth. Okay. So, I mean, what happened, what, what you guys did and the way
00:11:54.100 you went about it has struck me as quite, I don't know, unbelievable, I guess. And that's why I want
00:12:00.780 to dwell on it a bit. So you decided that you had a political responsibility. You organized yourself
00:12:06.480 with a group of people, a group that was much larger to begin with, but that shrank quickly to
00:12:11.180 those that were actually dedicated over some long period of time to putting a lot of work into this.
00:12:17.520 And it's not surprising you got a bunch of attrition as a consequence of that. Then you decided
00:12:21.660 that you would make messages that were in alignment with the, at least in principle with the Democratic
00:12:26.960 Party, but you didn't get permission from the party brass, so to speak, to do that. You did that
00:12:33.620 independently. Well, there's a weird, well, two things about the attrition rate. One of them was I
00:12:39.660 quickly discovered that a lot of people who are interested in the sort of loudest online outrage are
00:12:46.380 equally devoted to the status quo as the opposition. And so one of the things I came to very quickly
00:12:51.320 was, what matters much more — much more than language policing, right, and permission
00:12:58.260 structures of who's allowed to say what — is an orientation toward people's intentions and the actual
00:13:03.200 outcomes. And that's one way you can assess the groups of people of whether someone's going to be
00:13:08.360 useful. If you roll up your sleeves and get in to actually get something done, whether that's winning
00:13:12.660 a race in Oklahoma, right? Or trying to talk in good faith and respectfully to voters in Western
00:13:19.140 Pennsylvania, it's going to be messy. You have to, there's no. Okay. Describe that. What do you mean
00:13:26.240 messy? Like what's messy about it? We've talked a little bit about the psychological consequences of
00:13:32.400 this, this kind of action, even these kinds of discussions. By messy. I mean, I mean, good meaning
00:13:40.080 I don't, I'm the, the further along I get with this, the more convinced I am that you cannot have
00:13:45.920 a perfect conversation where everyone is, is contained and all the language goes seamlessly
00:13:53.020 about race, about gender and about class in America. And so when there's too much constriction
00:14:00.060 around language from the left and, and, or from the right, basically it's an, it, they're,
00:14:06.120 they're barking around the perimeter of the fertile solutions. They're barking around the
00:14:12.360 perimeter to make sure that nobody can have the kinds of conversations that you need to have.
00:14:16.440 You have to talk about those things imperfectly. You have to. So why would be, why would people be
00:14:21.540 motivated to not allow that to happen? Do you think? Well, because look, so for the, there there's
00:14:28.260 different skews and everything is a generalization, right? So I'm going to generalize a little bit.
00:14:33.260 I think that, that there's in the far right, we see a kind of corruption and ossification around
00:14:40.380 sort of Donald Trump and what he represents, but he was saying things that hit people in a way that,
00:14:46.060 that were things that they weren't allowed to say. I have a whole bunch of theories about the
00:14:49.960 Republicans. I'm going to keep it focused on my looking in the proverbial mirror. I think that a lot
00:14:55.680 of the language policing of the left is actually a way to maintain the status quo — because what status
00:15:02.460 quo, and to whose advantage? Let's say that you're a rich Hollywood elite, much like me,
00:15:14.660 right. Or, you know, or somebody who is, who is in, in the kinds of groups that, that I move in,
00:15:21.860 that, that, that you move in, but let's say further left of you, like I am, or more, you know,
00:15:26.680 we're both liberal, but if you can pull, if you can talk and have all of the lingo and know exactly
00:15:34.800 what the permission structures are, and you say Latinx instead of Latino, and you do all this
00:15:39.660 stuff. And in a way, what you're doing is, is you're, you're making sure that the conversations
00:15:44.380 that are the real conversations that bring change that are messier don't necessarily occur. But if you
00:15:51.040 have all the language down, you can sort of maintain your position and your money and your relative
00:15:56.280 stature. Yeah. So you can, you can assume that if there was a solution that was being proposed,
00:16:01.340 you'd be part of the solution and not part of the problem. You single that with, you signal that with
00:16:06.200 the language, but you're also, you're, you're, you're casting like, look, I made a, I'll give you an
00:16:11.880 example. I made a video about the, for me, I was, I was exceedingly opposed from day one to
00:16:18.840 messages of chaos from the democratic party. Right. I think conservatives particularly have
00:16:23.840 a reaction to, uh, to chaos. I think they have a legitimate reaction when people announce sort of,
00:16:30.680 you know, police free zones in Seattle and in Portland. And from day one, I was saying this
00:16:36.440 whole notion of sanctuary cities doesn't make sense to me for a variety of reasons. Let's say we have
00:16:40.780 the next president and people decide that voting rights are not going to be applied to in, you know,
00:16:45.740 Birmingham, Alabama, right. And they're going to be a sanctuary city for that. There's all these
00:16:50.560 complexities around it. I made some commercials about black leadership calling for a lack of
00:16:59.600 violence in the protests. Keisha Lance Bottoms, the mayor of Atlanta, gave a speech that I think was a
00:17:05.300 speech with the most thundering moral authority that I've heard from a public figure when Atlanta
00:17:09.620 was tearing itself apart. It's an extraordinary speech. I referenced — tried to reference — other people.
00:17:14.080 The only blowback that I got from that was from, um, incredibly affluent, um, sort of coastal elite
00:17:25.380 saying, how dare you, you know, selectively choose, you know, African American people decrying violence,
00:17:34.940 you know, when they watch somebody get murdered and they're protesting how they can, and it's the
00:17:38.680 epitome of white privilege and it's all this stuff. And what's interesting is I've long thought that
00:17:42.540 Trump works through projection — like, everything with Trump that he, that he makes as
00:17:48.120 a claim about others, there's a lot of projection that goes on. And I've increasingly seen that from
00:17:51.620 aspects of the left where I thought, wow, how far do you have to be removed from the ramifications of
00:17:58.020 violence to not be worried? Like how many houses and mansions and security guards and gated communities
00:18:05.180 do you have to have access to, to be unconcerned with violent action, whether that community is a
00:18:10.360 community of color, right? Whether it's a white working class community to, to simply say violent
00:18:15.400 protest is something that we're not for. Like, how dare you advocate that when you're rich enough
00:18:20.320 to never have to be there when the tourist violent, um, protesters leave. And the, let's say the black
00:18:27.900 community is left there with the wreckage of their community. Like to be opposed to that message is
00:18:32.720 basically saying, I want to keep letting people protest as loud as they want. It's in a way that won't
00:18:37.760 ever affect me or my, my children aren't at risk. My, my family's not at risk. My house doesn't feel
00:18:42.120 at risk, but I'll use all the right language so that I can be protected and sort of maintain all
00:18:47.020 of that. And when you're trying to wade in to really like win an election so that we don't, you know,
00:18:52.260 the African-American community doesn't have to contend with another, with more, I'll call it, um, more
00:18:58.700 voter rights being thrown out, like real concrete issues. There's real concrete issues there.
00:19:03.600 But if you can chirp about something, that's a slogan like that, you don't have to get into the
00:19:07.720 real solutions or fixes, but at the same time, right. But you can, you can take on the, you can
00:19:13.140 take on the assumed status of someone who's actually working to solve the problem. I think a lot of
00:19:18.880 that, a lot of politically correct language, I don't know. I guess that would be language that's in
00:19:24.340 alignment with, uh, with, with any given doctrine is an attempt to take on the moral virtues of that
00:19:33.420 doctrine without necessarily having to bear any of the responsibility for actions in alignment with
00:19:39.880 that doctrine or to bear any responsibility for the consequences. Like I was furious. I was furious
00:19:48.240 when the protests erupted with George Floyd, there was video after video of African-Americans
00:19:52.480 protesting. Some of them were like telling, you know, turning in people who were either, you know,
00:20:00.220 anarchists who were throwing bricks into and committing property damage of saying, you know,
00:20:05.680 grabbing people, handing them over to the police. A lot of people in the African-American community were
00:20:09.400 like, this is our community. We live here. And of course, I'm not implying that nobody in the
00:20:14.060 African-American community, um, crossed the line in the course of those protests. I'm not saying that,
00:20:20.300 but I'm saying there was an awareness within that community that when the cameras are gone and lights
00:20:24.680 go up, nobody's going to come in and rebuild that community. And when all the tourists leave,
00:20:29.740 and everybody's had their march and their protests, they have to contend with it. And there was a
00:20:33.820 measure of discipline in that community, whether it was Keisha Lance Bottoms. I think the,
00:20:37.520 the president of the NAACP in either Oregon or Washington had a great op-ed. Killer Mike,
00:20:43.660 the rapper, was out there saying we cannot have violence. We're not tearing down our own city.
00:20:47.940 This isn't civil disobedience. The point of civil disobedience of course, is that you bear the cost,
00:20:54.240 you bear the moral responsibility of your transgression. Right. Exactly.
00:20:57.740 The African-American community understood this by and large. And a lot of the loudest voices who were protesting
00:21:03.700 against it, who were, were for me was a frustration were from incredibly affluent. And here I'll use the
00:21:11.120 word privilege, which I don't like to use, um, people in the white community. And that for me was,
00:21:17.140 it's a similar kind of projection as I would see Trump doing. Like there's screaming about privilege
00:21:21.360 all the time. And you're like, how, how do you not understand that, that destruction of property,
00:21:26.240 destruction of small businesses, risks to families. Look, somebody I'll give you a stat.
00:21:30.740 That's an interesting stat here. The average voter who voted for Obama and then Trump thinks about
00:21:37.700 politics on average, four minutes a week, four minutes a week. Right. So people in the bubble
00:21:43.760 think, don't think about politics four minutes a week. And so four minutes a week is about what you
00:21:48.720 can manage to worry about the emoluments clause and Russian hacking. When you're at the bottom of
00:21:54.640 Maslow's hierarchy of needs, right? You got a sick kid, you're out of health insurance and you don't
00:21:58.520 have a job. You might have a special needs kid. You might have a parent in a home. You have COVID
00:22:03.060 hitting. You don't have time for any of this. You don't have time to have the kinds of conversations
00:22:09.360 around nuance of, of weather. And when everyone was shocked about the, about the Latin vote,
00:22:15.160 I was just thinking how many of you people actually have friends and family who are Hispanic,
00:22:19.620 who you talk to. I mean, the joke was that the big shock was that Biden won the Latin X votes and
00:22:26.100 Trump won the Latino vote. A lot of the Latino community. I mean, I mean, they don't, what do
00:22:31.640 you think accounted for Trump's attractiveness to the Latino community? This kind of ties back in,
00:22:36.300 this ties back into a broader question I want to ask you. It's like, um, I've been interested in
00:22:42.880 what you've been doing and supporting it to the degree that I've been able to, and to the degree that
00:22:48.060 that's useful, I suppose, because I was very interested in your, um, willingness to look at
00:22:56.420 what had gone wrong with the Democratic Party and to try to fix that. That seemed to me to
00:23:02.320 be a win, no matter that's a win for everyone, no matter where they are on the political spectrum,
00:23:08.580 because the higher, the function of both parties, the better the political outcome, as far as I'm
00:23:15.120 concerned, right? You want as little stupidity as possible all across the spectrum. So it seemed to
00:23:20.260 me that re reducing some of the foolishness that characterized particularly the radical left,
00:23:26.340 the careless radical left within the democratic party and focusing on a more pragmatic, let's say,
00:23:33.120 but also wiser and less resentment driven strategy would be a good thing overall. Um,
00:23:40.240 so that opens up the broader can of worms, which is what exactly had the Democrats done
00:23:47.140 so badly that they lost to Trump. Well, so to me, there's a couple of things
00:23:54.860 and we can talk about the Hispanic vote. We should talk about, let's talk about that specifically and
00:23:59.780 the broader question in general. Well, so look, if you, I mean, I have friends and family who there's,
00:24:07.400 there's such an array of, we talk about the Hispanic vote, like it's some monolith,
00:24:11.500 right? It's not remotely that Cuban Americans are like anything ever resembling socialism.
00:24:17.160 I will never vote for you. And if you compare Trump to Fidel Castro, read a fucking book.
00:24:22.000 That's basically the attitude of the Cuban Americans, excuse my language. It's, it's, it's,
00:24:27.100 and they, they say, I don't care what he calls us. I don't care what he does to us. The only thing
00:24:31.160 that we have learned that we learned out of that is that the only power that you can trust is economic
00:24:37.080 power. The rest of it's an illusion and socialism wants to come in and threaten that. I want business
00:24:42.080 opportunity, right? I want less regulations. They won't go near us. It's very, very different.
00:24:47.580 And the Hispanic community is it's, it's incredible. You think that's particularly true
00:24:51.460 of the Cuban Americans, Cuban, Venezuelan Americans, Venezuelans. Yeah. Well, they have reason for it.
00:24:57.760 Like, like my most kids, a lot of the most conservative friends and associates who I have,
00:25:02.840 whether it's, it's people who are friends of mine, whether it's workers or Mexican Americans in LA,
00:25:07.960 they also, they don't want Mexico to come over here. They don't want open borders. Many of them,
00:25:13.840 they left that. Why is that hard to understand? They tend to be, you know, cat Catholic families.
00:25:20.780 So if you think about politics for four minutes a week and somebody comes in all of a sudden,
00:25:25.100 and they're talking about socialism, defunding the police, and then announcing all sorts of gender
00:25:30.360 complexities, you know, and I say, this is somebody with a, you know, I always, I preface it to say,
00:25:37.500 you know, I have a, I have a trans godson, uh, you know, lesbian sister. This is not like where my
00:25:43.060 personal politics are for what people should be allowed to do and where my personal politics fall
00:25:49.280 are very different than what I think the priority and the, and the ranking of discussion is if you're
00:25:54.880 going to go talk to somebody who thinks about politics for four minutes a week and bring up
00:25:58.280 elaborate critical race theory and, and like, and start to talk to them about the fact that boys
00:26:03.980 aren't boys and girls aren't girls, and they should just announce this and have announcements at the
00:26:08.040 age of 18. I don't think any Democrats grasp when you think about politics four minutes a week,
00:26:13.540 and they talk about Trump and his transgressions, which I believe are more damaging and dangerous
00:26:18.720 than those of the left. But I don't think anybody has any idea of the kind of transgressions that
00:26:23.180 that represents to people who are either on the center or on the right.
00:26:27.720 Well, the four minutes a week thing really is interesting too, because one of the things I was
00:26:31.700 really struck by over the last four years with all my encounters with journalists, many of which were
00:26:37.980 good, by the way, I had lots of good encounters with journalists, but the worst encounters I ever had
00:26:42.740 were always, almost always with journalists as well, is that the journalists think about the world
00:26:48.940 politically all the time. Like they're every single decision they make every, I mean, obviously,
00:26:55.580 this is a generalization. But if you're in that world, everything is political. But for the typical
00:27:01.460 person, that's just not the case at all. And that's actually good. One of the best political science
00:27:06.700 theories I ever read was predicated on the idea or put forth the idea that in a highly functioning
00:27:14.040 political system, especially a democratic system, the less people think about politics,
00:27:19.900 the better the system is working. And towards the end, I didn't think politically at all. I'm not
00:27:25.920 even interested in politics. You know, I didn't, I didn't, I mean, it's, I couldn't agree with that more.
00:27:33.840 I mean, one of the things I think a lot about is I have, I have a friend, one of my closest friends
00:27:37.200 who you've met, born again, Christian, he was raised as a son of a missionary, um, all through other parts
00:27:42.560 of the world. And, you know, but he lives in LA. He, he, he worked a bit in the industry, a very,
00:27:48.800 um, rounded conservative friend of mine. He has gay friends, friends from whatever,
00:27:54.400 but he went in the booth and told me during the election in 2016, he said, I just went in and I
00:27:58.960 thought, forget it. I'm voting for Trump. I can't, I can't bring myself to vote for Hillary Clinton.
00:28:03.820 I was really angry at him at first. Cause it was like, and then I realized I shouldn't say really
00:28:09.840 angry with him, but I was, I realized that I didn't understand that for the things that I saw
00:28:15.240 for the clouds, I saw massing on the horizon with Donald Trump. And we're seeing some of that here
00:28:19.960 with his, the legal threats to the election, trying to undermine election security, his own large,
00:28:25.340 largely appointed Republican judges shooting a lot of that down. There's a lot of things we don't
00:28:29.400 need to get into all that. Cause everyone can have an answer for everything that I say,
00:28:32.760 but the realization I had with him was, Oh my God, he is a canary of a particular coal mine.
00:28:40.160 He's a guy who rides a motorcycle. He likes guns. He likes kind of different kinds of freedoms. He,
00:28:44.920 he in a different, he has a different relationship with freedoms versus security than I do.
00:28:49.500 I'm a canary down a different coal mine, right? Part of that might be from me looking at the sort of
00:28:54.400 authoritarian shadowiness that I saw coming in with Trump. That's what I alert to. I can't decide
00:29:01.740 that my friend who I know and love and who has been in my house and accepts my friends,
00:29:06.160 my family, everybody, and has a broad range of friends and family. I can't determine that he's
00:29:10.740 either foolish or dumb or wrong or a bad person anymore. I can't determine that he's an ignorant
00:29:17.620 canary down an ignorant coal mine, right? Cause if he's my friend and I'm that close to him and he's
00:29:22.120 here in LA and that's a choice he made, I better listen to what that was, even if it was just a gut instinct for
00:29:28.900 him. And so then I was thinking about this a lot. And one of the things that I think has been a
00:29:32.980 blessing of the Trump presidency is there's some conversations we're having now that are,
00:29:37.580 that are awful and hard. Like it's sort of like, you know, it's, it's, we talk about this all the
00:29:43.260 time, obviously with Jung, with Freud — it gets, you go through hell before you get anywhere else.
00:29:47.320 We wouldn't be having any of these conversations. If we were now in year four of a Hillary Clinton
00:29:51.820 presidency, we're having different conversations. They're worse right now in a lot of ways about race,
00:29:57.720 about class, but the fact that has stuck with me the most. And one of the things I'll say is I went
00:30:03.460 in open eyed all the way down to assess my party in the political situation. I've only gotten more
00:30:09.780 disillusioned and angry with the democratic party. Okay. Okay. So, okay. So let's, let's go return to
00:30:15.600 that. Okay. I'm going to keep that in mind. Let's return to that. So you put together this team or
00:30:22.300 this team was organized to produce messages that would support the Democratic Party
00:30:29.460 fundamentally, but the, but the overarching philosophy was one of self-criticism. Let's say
00:30:36.000 if the self includes the democratic party and what other, what are the rules? What were the other
00:30:45.260 rules for the messaging? See, I don't think people are going to understand exactly what you did.
00:30:49.820 You made these ads, but you went out and did it with your own team. And so who were the ads
00:30:54.640 generated? How are the ads generated? Who were they targeted to? What was their consequence? And
00:31:00.400 what were, what were the rules that you used and agreed on when you were making the ads and how did
00:31:06.360 you agree on them? Sorry, that's a lot of questions, but part of this is it was so, it was all
00:31:12.260 entrepreneurial, Jordan. It was all outside of the political. I'd still be waiting for the first
00:31:17.020 approval from the D-triple-C to do my first, you know, $2,000 commercial. We couldn't wait for it.
00:31:24.000 The, the, the fiefdoms and bailiwicks within the party and the institutional, just bureaucratic mess
00:31:31.120 is sufficient that, that a lot of what got done, got done with a network of people entrepreneurially
00:31:36.440 and free market. Right. That's pretty funny. Real Republicans. Yeah. Yeah. And it gave rise
00:31:43.540 to it. And really all that it was, was, was our own ethical bearing. You know, I, I ran the thing.
00:31:49.620 And so I, we did testing to make sure we were, that the ads were effective, that we weren't just
00:31:53.980 shouting at each other on Twitter and getting the most likes. And how would you define effective?
00:31:58.220 How do you know? I mean, we, there was a woman who is incredible who did, you know, we did testing
00:32:05.180 focus groups. We saw how they move people. I mean, I can send you deck after deck after deck
00:32:10.260 of the analysis. Okay. So you were looking at, you were looking at pre- to post-exposure shift
00:32:16.100 in political attitude as a consequence of the advertisements. And what was nice was that our
00:32:20.820 gut instinct — me, by that I mean me, Marshall, Billy, Laeta, Sean — our gut instinct was we're not
00:32:27.400 going to make Trump bashing ads. We made some when they were fair. That was a big, important thing.
00:32:32.980 Like I did the one with, for Republican voters against Trump, where it was, uh, it was just
00:32:38.300 Reagan's city on a hill speech. And I just showed Reagan, I just showed Trump doing the opposite
00:32:42.580 in every regard. For the first time in our memory, many Americans are asking, does history still have
00:32:49.260 a place for America? There are some who answer no, and we must tell our children not to dream as we
00:32:54.840 once dreamed. Together tonight, let us say that America is still united, still strong,
00:33:02.420 still compassionate, still willing to stand by those who are persecuted or alone. For those who
00:33:08.300 are victims of police states or government-induced torture or terror, let us speak for them. I believe
00:33:14.040 we can embark on a new age of reform in this country that will make government again responsive
00:33:18.720 to people. We can fight corruption while we work to bring into our government women and men of
00:33:24.320 competence and high integrity. Tomorrow, you will be making a choice between different visions of
00:33:29.440 the future. Are you more confident that our economy will create productive work for our society?
00:33:35.540 Or are you less confident? Do you feel you can keep the job you have or gain a job if you don't have
00:33:41.100 one? Are you pleased with the ability of young people to buy a home? Of the elderly to live their
00:33:46.220 remaining years in happiness? Of our youngsters to take pride in the world we have built for them?
00:33:51.480 Are you convinced that we have earned the respect of the world and our allies? Let us resolve tonight
00:33:56.680 that young Americans will always find a city of hope in a country that is free. And let us resolve.
00:34:02.800 They will say of our day and of our generation that we did keep faith with our God, that we did act
00:34:08.700 worthy of ourselves, that we did protect and pass on lovingly, that shining city on a hill.
00:34:22.500 Going online without ExpressVPN is like not paying attention to the safety demonstration on a flight.
00:34:28.040 Most of the time, you'll probably be fine. But what if one day that weird yellow mask drops down from
00:34:33.320 overhead and you have no idea what to do? In our hyper-connected world, your digital privacy isn't
00:34:38.660 just a luxury. It's a fundamental right. Every time you connect to an unsecured network in a cafe,
00:34:43.920 hotel, or airport, you're essentially broadcasting your personal information to anyone with a technical
00:34:48.980 know-how to intercept it. And let's be clear, it doesn't take a genius hacker to do this.
00:34:53.540 With some off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords,
00:34:58.420 bank logins, and credit card details. Now, you might think, what's the big deal? Who'd want my data
00:35:03.940 anyway? Well, on the dark web, your personal information could fetch up to $1,000. That's
00:35:09.500 right, there's a whole underground economy built on stolen identities. Enter ExpressVPN. It's like a
00:35:15.460 digital fortress, creating an encrypted tunnel between your device and the internet. Their
00:35:19.840 encryption is so robust that it would take a hacker with a supercomputer over a billion years to
00:35:24.740 crack it. But don't let its power fool you. ExpressVPN is incredibly user-friendly. With just
00:35:29.880 one click, you're protected across all your devices. Phones, laptops, tablets, you name it.
00:35:34.700 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop. It gives me peace of
00:35:39.660 mind knowing that my research, communications, and personal data are shielded from prying eyes.
00:35:44.580 Secure your online data today by visiting expressvpn.com slash Jordan. That's E-X-P-R-E-S-S-V-P-N.com
00:35:52.680 slash Jordan, and you can get an extra three months free. Expressvpn.com slash Jordan.
00:36:01.660 Starting a business can be tough, but thanks to Shopify, running your online storefront is easier
00:36:06.700 than ever. Shopify is the global commerce platform that helps you sell at every stage of your business.
00:36:11.780 From the launch your online shop stage, all the way to the did we just hit a million orders stage,
00:36:16.900 Shopify is here to help you grow. Our marketing team uses Shopify every day to sell our merchandise,
00:36:21.840 and we love how easy it is to add more items, ship products, and track conversions. With Shopify,
00:36:27.780 customize your online store to your style with flexible templates and powerful tools,
00:36:32.340 alongside an endless list of integrations and third-party apps like on-demand printing,
00:36:36.820 accounting, and chatbots. Shopify helps you turn browsers into buyers with the internet's best
00:36:41.720 converting checkout, up to 36% better compared to other leading e-commerce platforms.
00:36:46.360 No matter how big you want to grow, Shopify gives you everything you need to take control
00:36:50.740 and take your business to the next level. Sign up for a $1 per month trial period at
00:36:55.600 shopify.com slash jbp, all lowercase. Go to shopify.com slash jbp now to grow your business
00:37:02.660 no matter what stage you're in. That's shopify.com slash jbp.
00:37:06.960 In today's chaotic world, many of us are searching for a way to aim higher and find spiritual peace.
00:37:16.240 But here's the thing. Prayer, the most common tool we have, isn't just about saying whatever
00:37:20.540 comes to mind. It's a skill that needs to be developed. That's where Hallow comes in.
00:37:25.340 As the number one prayer and meditation app, Hallow is launching an exceptional new series called
00:37:30.220 How to Pray. Imagine learning how to use scripture as a launchpad for profound conversations with God.
00:37:36.100 How to properly enter into imaginative prayer. And how to incorporate prayers reaching far back in
00:37:42.240 church history. This isn't your average guided meditation. It's a comprehensive two-week journey
00:37:47.480 into the heart of prayer, led by some of the most respected spiritual leaders of our time.
00:37:52.680 From guests including Bishop Robert Barron, Father Mike Schmitz, and Jonathan Rumi,
00:37:57.300 known for his role as Jesus in the hit series The Chosen, you'll discover prayer techniques that have
00:38:02.100 stood the test of time while equipping yourself with the tools needed to face life's challenges
00:38:06.440 with renewed strength. Ready to revolutionize your prayer life? You can check out the new series as
00:38:12.180 well as an extensive catalog of guided prayers when you download the Hallow app. Just go to
00:38:17.200 Hallow.com slash Jordan and download the Hallow app today for an exclusive three-month trial.
00:38:21.980 That's Hallow.com slash Jordan. Elevate your prayer life today.
00:38:30.900 In the op-eds, I was equally harsh on Democrats and Republicans both.
00:38:37.440 But, you know, so part of it was for me, as I said, we can never lie to any standard. I don't want to bend
00:38:44.320 the truth. I don't want to lie. I went out and got a hardcore lifelong Wall Street Republican to do all the
00:38:49.000 fact-checking. And it was down to, like, even if we were kind of bullshitty about something.
00:38:53.980 So, you mean he'd respond? Sorry?
00:38:59.460 You asked him to tell you if he thought that you had been
00:39:02.880 even playing with the truth rather than breaking it.
00:39:07.200 Oh, and I paid him as a researcher to say, you know, here's the claims we're making. Are they fair?
00:39:12.820 Check multiple sources. And I wanted somebody who expressly was not a Democrat to do all of that.
00:39:17.740 Yeah, well, that seems to me to be, you'd want to have someone like that around you if you're making
00:39:22.060 complex political decisions. That's right. You need an enemy to tell you, to point out your
00:39:28.340 weaknesses. Okay, so you set up this crew, which was quite large to begin with, and then got smaller.
00:39:35.720 You went and met with Democrat leaders.
00:39:37.820 Yeah, we spoke at, we spoke, Marshall and I addressed both caucuses, and we still, we, in an
00:39:44.480 ongoing way, we do candidate training and we deal with, we deal with leadership also.
00:39:48.880 So, how is it that you managed to, look, one of the things you said was that had you waited around
00:39:55.740 for permission, you'd still be waiting. And then another thing that needs to be pointed out to people
00:40:01.660 is that without permission, you could go ahead anyways and make your political statement, right?
00:40:07.780 You just had to go do it. But you still have positive relationships with the Democratic Party
00:40:16.520 per se. Now, isn't that weird? How the hell do you account for that? Why not?
00:40:20.560 Because I was, because I was fair and I was respectful. And what we did was, you know,
00:40:26.380 it's like that judge's description of pornography. I don't know what it is, but I know it when I see
00:40:29.960 it. And so Marshall and I would say, look, we know how to make a unifying, uplifting message
00:40:35.220 that's positive, that brings back sort of core democratic values and can speak across the aisle
00:40:40.880 to people with a psychological profile that's more conservative, with an understanding and respect
00:40:46.280 for the fact that conservatives and liberals in concert are what holds society together.
00:40:51.020 Now, Marshall and I have an opinion that is that the bad, the dangerous aspects of the left are
00:40:58.880 embodied mostly in academia, culture, let's call it journalism, and then a small tiny cabal of very
00:41:08.220 far left members of the Democratic Party. And to some extent, it's crept through the party. Some of the
00:41:13.180 things that are the excess of the left that obviously you've discussed at great length. For us
00:41:18.600 on the right, it is, it is codified all the way through the Senate, all the way up to Donald
00:41:23.940 Trump, who holds the nuclear football. So I was trying to figure out, well, why does this threat
00:41:27.680 from the left that to me is, is, is much less, but just as dangerous. And there's a lot of canaries
00:41:33.480 and coal mines going, Hey man, pay attention. That's bad news. The same way that we were about Trump.
00:41:38.960 Why is it being given kind of this, um, equal, if not more, weighting to the, to what I felt was the
00:41:47.320 clear and present danger of the excesses of the Trump administration and what was happening there.
00:41:52.360 And so part of it was, is, is the respectful conversation with Trump voters or mostly I'm
00:41:59.680 sure I screwed up plenty, but you know, and, and to do a unifying message and to show them,
00:42:04.840 one of the things people don't realize is that, is that messaging messaging becomes content. If we
00:42:10.540 could get the message right, we could solidify the story and then that could change policy. And then
00:42:16.240 that can change the democratic platform. So we, so we talked a lot about that when, when you first
00:42:20.920 embarked on this venture. So, and correct me if I'm wrong about any, as far as you're concerned about
00:42:26.700 anything that I'm saying, the first thing is, is that if you produce a message, a story,
00:42:33.760 that story has an ethic, has an implicit ethic. And if the story is accepted, then the implicit
00:42:41.160 ethic is accepted. And then the implicit ethic will be made explicit across time. So a story is like,
00:42:48.660 it's like the seed ground for, for explicit policy. So you took, you, you got a hand, you got a grip on
00:42:57.900 the story. Now, one of the things that concerned me about the radical left was that because
00:43:03.740 they had a story and it's a powerful mythological story, um, benevolent nature, tyrannical culture,
00:43:11.140 um, the noble savage, that's another part of it. Um, because they had a coherent story,
00:43:19.680 they had a disproportionate effect on policy and the moderate Democrats policy. So let me just
00:43:28.500 interrupt. Okay. So if you look at, so AOC did the Green New Deal, pushed that through, which to me
00:43:34.820 was not an adult piece of legislation. It was a Trojan horse filled with everything, zero votes in the
00:43:39.980 Senate, zero votes from the Democrats. Every measure. If you look at HR 1, it's anti-corruption, it was
00:43:46.680 prescription drugs. The actual policies and body of who the Democrats are is much
00:43:53.760 more moderate and pro-capitalist, um, than, than, than it is in policy. So what's being parroted
00:43:59.540 loudly is not, in fact, Democratic policy. In my estimation, the flaw or the fault in the Democratic
00:44:06.920 party is their failure to stand up and keep the elements of the party in their proper places
00:44:13.120 to state who they are and to draw a line for what they're opposed to. And I think that that act of them
00:44:18.580 being like, well, we can't really, we're concerned to criticize, you know, defund the police or we're
00:44:23.640 too concerned. Yeah. Part of what I would say to them was, look, if you're scared of AOC's Twitter
00:44:28.660 following, Americans are not going to deem you to be worthy of carrying the nuclear football. Like
00:44:33.940 that's just a very low checksum analysis. If you can't just clearly say that defunding the police,
00:44:40.080 whether that means other things, which it does, is a slogan that makes no sense and terrifies
00:44:44.740 the vast majority of Americans rightly so. And a ton of immigrants rightly so like, or, or, or,
00:44:51.060 you know, people of Hispanic origin, if you can't understand and state that clearly, cause you're
00:44:55.440 afraid of the blowback, you're not going to be trusted to lead. And that's, and so it's a problem
00:45:02.140 of, of, of degree. And I think that's one of the biggest topics of friction you and I have had for a
00:45:07.980 long time, not, not negative friction, but just where we're, we've been hammering away at that,
00:45:12.780 where I keep saying to you, the radical left is not the kind of threat in America relative to the
00:45:18.220 threat that's posed by Donald Trump as it represents in Canada. Um, and we can disagree
00:45:23.680 about that because I think that the threats represented by the radical left represent an
00:45:28.720 equally dystopic. And I can tell you, like, I don't really understand it myself to some degree is
00:45:34.620 that I've been more, and this is surprised to me, I would say I've been more reactive to the
00:45:41.960 threat posed by the radical leftists. And I, I think it's, it's possible that it's because I'm in
00:45:48.460 academia. And so I'm — can I tell you why? I have a theory about why that is too. Okay. Yeah.
00:45:54.660 My theory is, is that the right, the right comes in the front door. They're like, here we are. We
00:46:02.180 want more money. We want more power. We don't like government. We're going to shrink it. Even
00:46:06.020 the, the, the let's call it racism or anything that goes down from that pole to in-group
00:46:13.000 favoritism, right? Like normal in-group favoritism. There's plenty of people who are like, look, I grew
00:46:17.280 up in rural North Carolina. I'm fine with having a black president. I'm fine with doing whatever.
00:46:22.940 This is my culture. I don't want to be asked to celebrate another culture all the time in every
00:46:28.720 way, or I'll be called racist. That doesn't make sense to me. Whereas the left comes in and
00:46:33.320 they say, well, we like all of our stuff and we like our whole situation. Like, like the examples
00:46:38.560 I was giving earlier, we're going to say defund the police when we're rich enough to not be in a
00:46:42.620 neighborhood that that will have an effect on. We're, we're above the pale that if everything
00:46:47.860 is moved through that lens, we're successful enough that we have money and we have resources
00:46:52.600 anyways. And we're going to wrap ourselves in a, in sanctimony, right? We're going to wrap
00:46:58.820 ourselves in sanctimony, want to maintain the status quo as much as you do, but pretend it's
00:47:03.900 because we're morally superior and you're morally inferior. And that's, that's a, that's shame
00:47:09.420 inducing. That's like a maternal scolding instinct that elicits, I think, rage. And so that's a,
00:47:17.720 that's a big difference in the two. And I think that accounts for why some people are like, Hey,
00:47:22.520 look, it's, it's, it's just, it's complicated. There's this element of moral superiority. And one
00:47:28.020 of the things that, you know, I did so much work with the evangelical community and they've been
00:47:32.160 great, like making good faith arguments to reach out and talking about, you know, the values and
00:47:37.720 attributes of Christ and trying to talk to voters. And there were some voters who we were
00:47:42.600 very successful in talking to. I want to say Obama got 26% of the evangelical vote. Hillary didn't go after
00:47:48.060 it at all; she got 13 and lost. And Biden was back up at 22, 23. We thought that was a very
00:47:52.740 important community to talk to. What people don't realize is if you look at Trump, if you look at
00:47:56.940 anybody from a Christian worldview, you can dislike everything about him. You can
00:48:03.220 dislike the policies. You can dislike a lot of these politicians. But the deeming of somebody as morally
00:48:08.160 inferior, right? Whether it's followers of Trump, whether it's, you know, voters: no one can do
00:48:15.040 that but God. You're not allowed to do that. You don't know where someone is on their journey.
00:48:19.460 You don't know if he's a sinner at the nadir of his existence and is going to turn around. And
00:48:23.340 there's a bigger, weird moral frame that gets put on it. Of course there are aspects of that
00:48:30.680 that will come in from the right, with homosexuality, with the more racist element.
00:48:35.640 But if I'm arguing that the right has been more infected
00:48:43.360 up the power structure by the worst authoritarian excesses of the right, I think that the narrative
00:48:50.360 of moral judgment has infected a wider swath of the left, if that makes sense.
00:48:56.160 Well, it's worth thinking about anyways. I mean, it's a real mystery to me, because,
00:48:59.420 particularly because I'm Canadian, and that puts me
00:49:05.120 culturally to the left of the typical American, let's say, I suspect if I read through a list of
00:49:12.480 policy decisions made by the Democrats and made by the Republicans over the last 20 years, and I was
00:49:18.560 blind to the party that supported them, I would end up supporting more Democrat legislation than Republican
00:49:25.600 legislation. But there's still something about the radicals on the left that disturbs
00:49:32.980 me in a way that... Look, so here's another way to look at it, right? We've all seen
00:49:40.520 Cape Fear, right? In Cape Fear, a guy gets out of prison and he goes after his defense attorney.
00:49:47.020 Escaped criminals who do go after counsel go after the defense attorney and not after the
00:49:52.380 prosecutor, percentage-wise. And I believe that's true. I've heard that,
00:49:57.880 I haven't sourced it, but let's pretend it's true for the sake of this, uh, parallel. I think a lot
00:50:03.300 of what has happened is people figure the prosecutor is doing their job, but if your defense attorney,
00:50:08.820 who's supposed to be looking out for you, doesn't, there's a different kind of anger. I think that's
00:50:14.300 what the Democrats represent. So it's a betrayal. Yes. I think that that's interesting. Here's
00:50:19.400 the uber-statistic of everything for me, the one where, when I arrived at it, I felt like the scales
00:50:24.940 fell from my eyes: over the last 40 or 50 years, $50 trillion, with a T, have moved from the bottom
00:50:33.300 90% of Americans to the top 1%. That's not through innovation, competition, and pure free-market
00:50:42.220 capitalism. It's just not. It's corporate giveaways. It's lobbyists writing bills. There's a whole
00:50:48.740 structure. It's the weird inevitability of the Pareto distribution, right? The idea, the
00:50:55.320 age-old law, that the rich get richer and the poor get poorer. It's unbelievably,
00:50:59.660 unbelievably difficult to keep that under control. And it is definitely something that destabilizes
00:51:06.400 societies. And that's the thing about that $50 trillion: the effects of globalization
00:51:12.540 certainly play a role in it, but it's not like all of a sudden everybody who's a CEO got that
00:51:18.640 much more brilliant. So this is the largest transfer of wealth, I believe, in history. And it didn't go
00:51:25.440 in a socialist direction. So when everyone's pulling their hair out about socialism, I just look at it
00:51:29.740 and go, $50 trillion. So if you're white working class, well, that happened under Obama, that happened
00:51:35.600 under Clinton. Yeah. Well, but see, that's the peculiar thing: it's not self-evident
00:51:40.420 that policy can stop that. Like, one of the things I've been terrified about since really learning
00:51:46.680 about the Pareto distribution is its implacability. You know, as you pointed out,
00:51:53.480 this distribution happened even under systems of governance, or ideologies of governance, that
00:51:59.880 hypothetically should have stopped it or at least slowed it. It'd be interesting to find out whether that
00:52:04.380 transfer took place more rapidly under Republicans than Democrats or not.
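(An illustrative aside, not part of the conversation: the "implacability" described here can be reproduced in a toy model. The sketch below is a minimal, assumed example in Python with NumPy of a random-exchange "yard-sale" economy: every agent starts with identical wealth and trades are perfectly symmetric coin flips, yet wealth still drifts toward heavy concentration over repeated exchanges, which is the Pareto-style dynamic being discussed.

    import numpy as np

    rng = np.random.default_rng(0)
    n_agents, n_trades = 1_000, 200_000
    wealth = np.ones(n_agents)                    # everyone starts with identical wealth

    for _ in range(n_trades):
        i, j = rng.choice(n_agents, size=2, replace=False)
        stake = 0.1 * min(wealth[i], wealth[j])   # wager a fraction of the poorer party's holdings
        if rng.random() < 0.5:                    # a fair coin decides who wins the exchange
            wealth[i] += stake
            wealth[j] -= stake
        else:
            wealth[i] -= stake
            wealth[j] += stake

    top_1_percent = np.sort(wealth)[-n_agents // 100:]
    print(f"Top 1% share of total wealth: {top_1_percent.sum() / wealth.sum():.0%}")

In this kind of model, adding a small flat redistribution each round slows the concentration but does not remove the skew, which is roughly why the distribution is so hard to control by policy alone.)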
00:52:08.520 Well, I think that whatever it is, if you look at that one fact, that is a failing of basically the
00:52:18.360 entire ruling class in America. And part of, you know, I realized, I had a moment
00:52:24.200 of realization. I'm going to tell you, this was really funny. So, you know, there was the TARP giveaway,
00:52:28.200 there was a bailout after 9/11; the airlines were in trouble and they were bailed out.
00:52:32.160 My statistics might be slightly wrong, but call it a $53 billion bailout. Okay.
00:52:39.240 When COVID hit, they needed it again. And we Democrats were hopeful that that would
00:52:45.320 go into the workers and everyone else. It went into stock buyback so that the stock prices rose,
00:52:50.360 right? 35% of the stock market is foreign owned. It was a straight corporate giveaway. That's a
00:52:55.260 transfer of wealth. It didn't go back to the workers. That pissed me off. When COVID started,
00:53:00.520 they asked for another bailout and it was like the exact same number, call it $52 billion.
00:53:05.500 I got all mad. I'm like, I'm going to call Marshall. We're going to do a commercial. And I
00:53:08.900 stopped for a minute and I said, you know what? I'm the asshole who's being served by that. And I
00:53:15.260 don't mean this in a self-flagellating privilege way. Like, let me take a peek into my 401(k). Guess what
00:53:21.140 stocks I'm probably holding, right? A ton of airline stock. So when there's a stock buyback,
00:53:27.580 which I can get angry about, part of the thing is to have a realization to go, look,
00:53:32.400 that's a really good example, because it, what would you say, it
00:53:39.000 sheds an interesting light on the implacability of the Pareto distribution. It's like you're part
00:53:45.140 of the problem even though you object to it ideologically, and you're part of the problem
00:53:49.240 because of where you sit in the economic structure. Right. But the thing is about this,
00:53:55.060 and you talk about regulation or policies not working, which I want to return to in a minute,
00:53:59.540 but part of what I realized was, um, that's not because I'm a good investor. That's not because
00:54:05.860 I'm smart. That's not free market. It's not because you're cruel and malevolent either.
00:54:11.360 No, it's not. But this is not my investing genius or the free market at work, right? We don't have a
00:54:17.480 transfer of wealth of that extent going the other direction. And so part of it is like, okay,
00:54:23.140 so I'm a beneficiary. So what? What passes for a solution for a lot of things is, you know,
00:54:28.720 you give money, you scream about privilege and you self-flagellate. As far as I'm concerned,
00:54:32.980 that's all a self-focused reaction, as opposed to me saying,
00:54:40.660 how do we start to address that problem? And the thing is, it has to be partially
00:54:45.940 policy, partially regulation. We can't, we can no longer,
00:54:49.980 well, it is something that we fight all the time. You know, one of the things you just said
00:54:54.560 shed light. I think again, for me on my irritation with the left end of the ideological spectrum is
00:55:02.980 that it's just too much to see people who benefit say like we're in a position like you are, or like
00:55:10.140 I am in because we're not beneficiaries of the Pareto distribution in a major way.
00:55:15.520 Now it seems to me too much for me to also expect to be admired as a, a paragon of virtue in
00:55:24.320 relationship to my attitude towards the poor, let's say, because then I'm asking for too much.
00:55:29.120 I'm asking to be a beneficiary of the system, the way it's set up now. And I'm asking to be
00:55:34.640 admired for my objection to the very system that is enriching me. And the second one of those is too
00:55:40.700 much to ask for. And this, the solution for that. And like, when I had that realization,
00:55:46.120 I was like, huh, let me get on that. Let me look at policies. Let's have an economic summit,
00:55:51.280 like the one you and I did like, and I'm not claiming I'm going to like go out and fix the
00:55:55.740 whole problem. But if all I do is sit around and go, Oh, I feel so guilty. Let me do a couple of
00:56:00.320 think pieces about it and talk about white privilege, that's inane. It's just more self-focused
00:56:05.460 bullshit for those people who think about politics four minutes a week, because they can only afford
00:56:09.940 to think about politics for four minutes a week. So what do we do? And what I do is I try to
00:56:14.660 advocate for, you know, policies that will work, even if some of those are conservative policies,
00:56:20.720 right? I mean, I have a ton of people across the, across the aisle, across the whole spectrum
00:56:24.900 who I'd reach out to go, what do you think of this? Are there libertarian answers? There's got to be
00:56:29.620 some, there's got to be some regulatory answers because it's, it's so out of control.
00:56:33.620 Well, hopefully, because the end of the Pareto game is that far too few people have far
00:56:40.180 too much of everything. And that's not even good for them. I mean, you're not rich if you have to
00:56:44.840 live in a gated community. That's right. That's a gilded cage. You know, it's not an
00:56:50.540 indication of wealth. Wealth is when you can walk around your city freely at night. That's wealth.
00:56:56.360 That's exactly it. And so that's so much of what we arrived at in the messaging. When we try to talk
00:57:02.340 to people across the aisle, country-club Republicans, let's say, there's a difference
00:57:07.040 for me for people who are at the wrong end of this system.
00:57:12.100 And it pisses me off when people get so angry about the fact that all these people are
00:57:16.980 voting in ways that hurt their own interests. Right. Yeah. I always say like, I vote in ways that
00:57:22.460 hurt my own interests. I don't just vote. Like, how do you know what their interests are? Their
00:57:25.460 interest could be moral. Their interest could be familial. Their interest could be religious.
00:57:28.780 It's not just their financial interests. First of all. Yeah. Their interest could be their
00:57:32.720 children's future rather than their current, than their own current reality. I mean, I learned a long
00:57:38.400 time ago that, that small businessmen didn't vote for socialist policies in Canada, even when they
00:57:44.240 were pro small business, because they didn't want to be small businessmen. They wanted to be big
00:57:48.480 businessmen. So they were voting their dream, not their reality. And it's not obvious that that's a
00:57:53.220 mistake. Even though, well, you could criticize it and you could point out its lacks and
00:57:58.380 its inadequacies, that doesn't mean that it's a mistake.
00:58:02.000 So let's get into our, the canary in the coal mine discussion again. So I think like for me,
00:58:07.260 it's, it's glaringly apparent. And I know lots of people, especially a good number of the people
00:58:13.000 who are your listeners will in good faith, disagree with me on that. To me, it's glaringly apparent
00:58:17.620 the difference in terms of what a Trump presidency, let's say in a Biden presidency, in terms of the
00:58:24.720 relative levels of corruption and undermining of the democratic norms. I know there's a lot of
00:58:31.200 counter arguments. I'm happy to have all of those, but for the sake of this discussion, what like,
00:58:35.860 from my perspective, it's this big slice here, like of a totem pole. The vast majority of Americans are
00:58:41.500 so far down, they're so far down below, that when they're looking up, they can't possibly distinguish
00:58:49.540 some subtle, well, you see, Trump is violating the emoluments clause and he's doing fundraisers
00:58:54.580 on the South Lawn of the White House, and that's unacceptable, but the kind of fundraising and
00:58:58.560 enrichment that, you know, the Clintons did was different in this other way. They can't
00:59:03.560 differentiate that. So for me, and so then that gets to the question of: was the vote for
00:59:08.940 Trump, like my friend who went in that booth and said, forget it. I don't care. Was that a wiser
00:59:15.400 thing? Because that's a higher disagreeable irritation structure before we get somebody
00:59:21.280 who's even more threatening from the right? Right. Yeah. That's a good question. It's certainly
00:59:26.420 possible. Be it a Trump presidency or not, there's no way now that we can move forward, I think, without having
00:59:33.980 a much more robust and angry dialogue about that $50 trillion movement from the bottom 90% to the top
00:59:40.620 1%. Right. And there are some race conversations beyond, um, I believe,
00:59:46.360 the surface stuff that we distract ourselves with all the time, like cultural
00:59:50.800 appropriation, there's all these issues and there's, I've often, I've often thought, and I'm interrupting
00:59:56.460 you partly because this is such a crucial point, that distribution of wealth problem.
01:00:02.440 I've come to believe that even though the left focuses on that as the primary problem,
01:00:12.480 they actually don't focus on it enough because the attribution of the problem is wrong. I don't
01:00:18.700 think there's any evidence whatsoever that the Pareto distribution is a secondary consequence
01:00:23.060 of the capitalist system. So what that means to me is that the left wingers aren't actually
01:00:28.860 taking the problem of relative poverty seriously enough because they've got a handful of stock
01:00:36.680 answers that have been applied with absolutely no success whatsoever. And in their more radical
01:00:44.500 guises, they're not looking at the problem with enough seriousness,
01:00:49.360 but the problem exists, and it doesn't exist just because people are malevolent or greedy,
01:00:54.680 although that might add to it. It's a much more complex problem than that, and one that's
01:01:00.940 much more difficult to solve. But some of the greediness, like the level of lobbying in America,
01:01:06.900 with lobbyists literally submitting bills and forgetting to take the lobbying firm's letterhead off the
01:01:12.560 paper, there's an aspect of that. But the other way that I look at that is to go, how badly did we
01:01:19.180 fail as capitalists? And that's me, right? Yeah. How badly did we fail that enough people went
01:01:26.000 and picked up the shiny object called socialism or democratic socialism, which is different from
01:01:31.820 socialism and think that that's a good idea. Yeah. Well, that's a good question.
01:01:35.880 That's the reverse of the question the Democrats should be asking themselves.
01:01:39.780 So the Democrats should be asking themselves, well, how did Trump become so attractive? What did we do
01:01:45.820 wrong? And the right-wingers, the more conservative types, should be asking, we haven't
01:01:51.300 solved the problem of wealth distribution well enough to stop socialism from
01:01:59.520 being attractive as an option, even though the historical record with regard to its more radical
01:02:04.780 forms is dreadful. And we've also failed to embody the core values of free-market capitalism,
01:02:11.400 innovation and competition, where we have not pulled the ladder up behind us,
01:02:17.300 where we have allowed and built a robust system of smart capitalism all the way down that is a
01:02:24.040 solid foundational base that we can stand on to win. It's so God damn difficult. It's so
01:02:30.320 God damn difficult though. Like, look, let me give you an example. So I started a company 20 years ago and
01:02:36.780 it struggled along for a long time. And then when I got better known, that solved our
01:02:43.940 marketing problem. But it's a psychological testing company, and when we first designed it,
01:02:51.260 we consciously designed a company that would require no employees, that would
01:03:00.520 have no overhead, and that would be replicable. So it was computerized, and so
01:03:09.140 it can scale without an increase in cost. And like, I'm sensitive to the problems caused by the
01:03:17.140 distribution. But when I set up that company, I set it up in a way that absolutely contributed to it
01:03:24.920 because no one in this company gets paid except the three people that own it.
01:03:30.120 That's it. And that's part of the, inexorably, I can't say that damn word, inexorability. I just
01:03:39.780 did the audio version of my book and I had to redo all the times I said inexorable because I said it
01:03:45.240 wrong. Anyways, the inexorability of the Pareto distribution is very difficult to
01:03:51.400 escape from. It'd be lovely if we could have a discussion politically where that problem became
01:03:58.220 central and everyone's attention could be focused on that, that the capitalists who we could admire,
01:04:04.520 at least in some guises could sit down and say, look, we have to figure out how to get more money
01:04:09.800 to the bottom part of the population within this, within a structure that can also generate wealth,
01:04:15.220 because of course, capitalism does that extremely well. So that problem has to be brought to the
01:04:19.860 forefront. The thing is the further I got into this, Jordan, the more that I realized that
01:04:25.540 everything foundationally is moral. That's it. You said once to me, I think shortly after
01:04:32.880 college, you said there are only moral decisions. That's it. I was like, that's so weird, though,
01:04:37.520 because sometimes there's pragmatic, sometimes there's something else. And the more you look at that,
01:04:41.600 the more it's borne out that, in fact, you pay for any shortcut. I mean,
01:04:47.960 look, we don't have to get all into Jungian synchronicity, but you know that drill.
01:04:51.420 And so one of the things that I think about a lot is that
01:04:54.280 we can argue as if we're sitting around in college, right? Drinking, and there's a libertarian
01:05:07.420 and there's a conservative and there's a liberal, or a Democrat, a conservative, and a liberal,
01:05:13.020 let's say. And we all know what we're going to say already. Nothing that is ideologically pure
01:05:17.760 will ever work or function. And the only answer to it, for me, part of what I realized was,
01:05:24.840 I'm going to be arrogant enough to try to go on an adventure that tries to tell stories,
01:05:30.420 not re-message the Democrats, where I'm lying and repackaging and putting pig on a lipstick that I
01:05:30.420 can make an argument for. I think you meant lipstick on a pig. What did I say? You said pig on a lipstick,
01:05:36.940 which gets the ratio of pig to lipstick seriously wrong. That should appeal to the Trump viewers of
01:05:46.580 this broadcast. That's horrible. That's horrible. But it's not about, it can't be about deception,
01:05:53.720 right? It has to be about making actual arguments for why the core liberal values that I believe are,
01:06:00.460 imperfectly but approximately, embodied by the Democrats can have appeal to conservatives,
01:06:07.820 right? And we've talked about all of that. So hopefully you also tilt the Democrats in that
01:06:12.280 direction, like by producing that message, right? It gives them a center around which
01:06:18.580 to align. Absolutely. A necessary thing. You need that center. But here's the, here's the complexity
01:06:25.500 that I realized is we didn't make any money at all. We said everything is pro bono, and we didn't take
01:06:32.140 any credit, and we wouldn't have done any of this. Like, the no-permission part is, we just went
01:06:36.680 off and did it. But there's a ton of money to be made in advertising. I mean, I sent you that
01:06:42.000 article, right? There's an estimate that we created, you know, and it could very well be off or overblown,
01:06:48.600 but there's an estimate that the ad structures that we put in place created a billion dollars of
01:06:52.760 advertising. And we aimed it at the swing states. We aimed it at evangelicals. We aimed it at the
01:06:57.380 Hispanic communities and the places that really mattered a lot. That's a lot of value, even if it's
01:07:03.440 off by 50%. And the thing is, is part of how we got there was people, when they do an ad buy with a
01:07:10.600 commercial, they make money on the ad buy, right? And so part of it was, we'd go off, we'd make some
01:07:17.460 commercial, we'd test it, we'd make sure that it was honest. It would be saying something that's
01:07:21.300 slightly different, but the cost of having our message conveyed in a way that might hope to be
01:07:27.820 transformational was for us to give it to them and say, here, say that you did it. And if there's
01:07:33.680 any sales or anything to do with it, you go make a bunch of money off it and just say it was you.
01:07:38.820 And so that's the price of it. Because if we said, well, we want to be cut in on the revenue streams,
01:07:43.900 then they'd have a bunch of reasons to choose their own creative over our creative, right?
01:07:48.460 Right. So what you did by taking yourself out of the fee structure, you enabled your voice.
01:07:55.000 That's right. And we allowed for other people. It's like, you don't get to have all these things,
01:07:59.980 right? That's, we don't get to have all the credit and make a ton of money and also be adored by the
01:08:06.040 democratic establishment and then also be transformative. And so I'm not saying that like
01:08:12.460 it's any great shakes morally, but that was the part of me that was like, the solution is in doing.
01:08:17.980 The solution, when I realized the airline bailout thing, that I was an inadvertent recipient
01:08:22.960 of that in a way that's kind of rigged, that it's kind of a rigged game when there's a bailout and
01:08:27.680 there's a stock buyback and I just make more money despite them failing, is to do stuff and
01:08:32.940 to try and do stuff properly. And so I think that a lot of it is we have such a failure of moral
01:08:38.360 leadership right now in corporations. I mean, I was thinking back to, like, wouldn't it be amazing if we
01:08:44.600 looked up to, you know, more leaders of industry? It's just, we're so removed
01:08:55.480 from our paragons, from our avatars of meaning, I guess is what I mean to say. Like, the fact that a
01:09:01.280 politician is supposed to be there to help you and to do good for the community is almost laughable.
01:09:05.560 Now the fact that a college or university represents the production of a Renaissance man
01:09:11.860 or woman in pure form, I mean, the lie of that I think was laid bare by that college
01:09:16.840 admissions scandal. People were so furious about it because the answer to that should be: no kid could
01:09:21.460 cheat to get into university; they'd wash out in the first month if they didn't deserve to be there,
01:09:25.980 in a way. So we've removed ourselves. Like, the money-making mechanism of business,
01:09:31.140 like great businesses and business people should be building a whole pyramid and structure of success
01:09:36.780 under them. That's how you win. And part of that, again, is a timeframe
01:09:43.680 problem, you know, in that the more fundamental, the more morally fundamental, a decision
01:09:51.840 is, the longer the timeframe over which it operates. And so you might expect people who are benefiting
01:10:01.740 from the capitalist system to set up their structures so that capitalism itself would be supported across
01:10:07.960 a long span of time. And that would mean cutting in the people at the bottom of the hierarchy,
01:10:13.480 but short-term considerations arise to make doing such things very, very difficult.
01:10:21.180 Right. And if you keep making it difficult, another eight, 10, 12 years,
01:10:25.780 then AOC is the president and the whole system is going to change. Let's say.
01:10:29.860 Right. Well, that's the risk. If the system, if the system fails enough people, then there's
01:10:34.160 enough people who are willing to, especially young men who are willing to, um, take their chances in
01:10:41.300 the revolution, you know, at least it's exciting. So here's the hardest thing that I had to figure
01:10:46.800 out, which was this. I had an okay time; I think part of it is I've always had a very diverse
01:10:52.380 background of friends because I write thrillers. I have a lot of friends in the military community,
01:10:57.180 a lot of kind of hardcore conservatives. The hardest thing for me was to try to apply the same
01:11:02.900 self-awareness of my blind spots, and, I don't want to say empathy, but sort of
01:11:08.900 seek to really understand the further elements of the left. That was the hardest thing for me.
01:11:15.580 Instead of just saying, you're idiots and you're squawking, you're doing all this stuff, to
01:11:20.300 really slow down and listen and understand a lot of these younger kids
01:11:26.380 who are coming up, who are very attracted to democratic socialism, who are way more
01:11:32.480 radical in a lot of their views than is appealing to me. When I stopped and looked at the world through
01:11:38.100 their perspective and could get over my inherent, like, you're always more angry at
01:11:43.420 your own side in some ways than you are at the other side. But man, you know, my wife's a
01:11:49.820 college professor, as you know, and she teaches at CSUN, which, which is a lot of the kids, Cal State
01:11:55.200 Northridge, a lot of kids from tough backgrounds. Like they don't have time to be political. Those are
01:11:59.700 the kids she has. They're working two jobs. They're helping support their family. They're raising
01:12:03.960 their younger sibling. Like these are working kids. So many of those kids come out of, and they
01:12:09.820 made the right choices, not drugs, not, you know, didn't wind up in prison. They went and did this.
01:12:16.040 They're holding their family together. They have like $130,000 in debt coming out, and they're in
01:12:21.860 jobs where, you know, maybe they have a master's in psychology at the end of it, and
01:12:26.560 they're making $33 an hour, you know. Okay. So that means the price to
01:12:33.320 buy into the system, at a point where you have a chance of thriving, can't get too high.
01:12:41.200 So student loan debt would certainly contribute to that, as would the entry
01:12:46.020 price of real estate. That's right. Okay. So that's a real danger. But I think
01:12:53.660 that people are, well, why wouldn't you look at other means and other systems, and why wouldn't you
01:12:57.560 just criticize the system? And for us to come along, me to come along, and say, well, that's
01:13:02.380 ridiculous, and democratic socialism and defund the police, like all these things that you're saying
01:13:06.100 about the system, are so foolish. I think, look, the other thing that you can
01:13:10.760 credit, especially the young people who are attracted to the farther-left ideals, is that
01:13:15.880 there is genuine concern about the unfairness of the economic distribution. It's like, and you know,
01:13:23.060 so there are a lot of poor people who are at zero, which is a hell of a place to be, because
01:13:28.640 you get to the point where you can't get out, because, I don't know how
01:13:36.840 to say it exactly, you can't afford a bank account. You can't afford an address. Like, you can't get the
01:13:42.700 basic necessities that would allow you to play in the system. You can't afford the $20-a-month fee to
01:13:49.360 keep your bank account open if you're under a thousand dollars. Right, right. Exactly. Exactly.
01:13:53.720 That. And so that's the cost of being stuck at zero and young people look at that and they think,
01:13:57.700 well, that's a terrible waste of human resources, which it is. And it's dreadfully unfair, which it
01:14:03.380 is. Now, the problem is that the solutions that are conjured up on the radical left don't seem
01:14:10.920 to work, or not well anyways. So I think it's reasonable to say, you can be sympathetic
01:14:17.220 to the motivations that drive the attraction to those theories. And it would be lovely if they
01:14:23.500 worked, but so many of them, when put into practice, don't produce the result that's intended.
01:14:28.280 Like, I don't see any evidence. I've looked, and it's tough to parse through, but the
01:14:34.820 evidence that left-wing governments have been better at controlling the Pareto
01:14:40.180 distribution problem than right-wing governments is very, very sparse. And that actually is
01:14:46.400 unbelievably disheartening. But I think we're getting too academic and abstract
01:14:51.780 with it, in a way. And I think part of it is to say, when they say that, we go, here's a bunch of
01:14:56.980 facts. And to me, that's the same thing as, with a MAGA voter, let's say, me coming in and going, here's
01:15:02.540 a bunch of facts about Donald Trump's corruption, rather than joining them and saying, look, you know,
01:15:08.400 you're 22, you're not supposed to have a world economic view and an understanding of the whole
01:15:14.680 breadth of history, especially from our academic systems that are failing you as you're under
01:15:19.720 crippling debt. And you're addicted to screens because companies have hired teams of addiction
01:15:24.600 specialists to, you know, throw shrapnel in your nervous system. And the biomed companies are all
01:15:29.880 over you. They're being devoured from every angle. And we're coming in with PowerPoint presentations
01:15:34.940 about why it's dumb, rather than going, God, you're right, there are a lot of problems here, and
01:15:40.760 maybe some of the language and the reasoning that you're moving towards
01:15:46.580 aren't the ones that will get you where you want to be, but boy, are you right about a lot. Let's start
01:15:51.480 from there, then. Like, you can't change somebody's opinion without seeking to understand them first.
01:15:57.360 Yeah. Motivations. Absolutely. And you shouldn't assume that all the positive motivations are on
01:16:02.540 your side. Okay. I want to ask you some other questions here. So you produced
01:16:10.120 200 commercials. We're going to show some of those interspersed in this video now.
01:16:19.920 How were they distributed? How many people watched them? That's the first question.
01:16:24.000 We did a ton of online digital, excuse me. Some of them we ran during NFL games
01:16:33.100 and in swing states on television, some of them like the Reagan city-on-a-hill one. One of the benefits was
01:16:39.020 that some of the ones we did were really innovative, in ways that are kind of fun that we could
01:16:43.720 talk about in different ways. And so we got secondary coverage. Like, with our Reagan city-on-a-hill
01:16:48.160 one, Brian Williams asked James Carville about that on a show. So there was also a secondary
01:16:53.460 sort of conversational aspect of crossover into mainstream media in different ways. Because people
01:17:00.040 would then write about it. I did a whole series of ASMR commercials. This is a fun one. What's ASMR?
01:17:04.820 Oh, it's the whispering. Do you remember those videos where people would, like, whisper and make
01:17:09.560 sounds on the microphone, that were super trendy? No, I must've been out of function. I
01:17:15.000 must've been malfunctioning during that period. But so basically, you know, we realized that the
01:17:20.880 commercials that a lot of the Democratic agencies were putting out were these, like, you know,
01:17:26.540 Trump is beholden to China, and it would have the dark, shadowy Trump and all that
01:17:31.760 stuff. And, you know, they got tons of people mashing the like button and
01:17:35.840 sharing it. But what we found in some of the testing, there's a brilliant woman we worked
01:17:39.480 with on the testing, is it moved undecided voters 10 points towards Trump. And the reason
01:17:43.880 for that is if your nervous system is put into fight or flight by the ominous score and the facts,
01:17:50.260 then you're more inclined or receptive to conservative messaging.
01:17:55.900 That's right. Jingoism, xenophobia, strongman leadership. So I did this commercial series
01:18:01.880 where we hired a wonderful actress to whisper. It's a sort of seductive
01:18:07.960 whispering into the mic, because I thought we need to talk to voters' nervous systems. That's another
01:18:12.160 part of storytelling, right? You're not talking just to their prefrontal cortex. You want to talk
01:18:16.960 to the decision-making mechanisms. We need to lower the guard, because there's so much screaming
01:18:21.720 about politics. And so that was what we did. She said, there's so much screaming, you know,
01:18:26.440 I want to, I want to tell you, this is the only way we can cut through the noise. And it's a sort
01:18:30.160 of whispered soft messaging. Hi there. It's just you and me.
01:18:39.700 And this aren't my own name. So after looting everything else, Donald J. Trump
01:18:51.720 is looting our election. He hired his rich donor buddy to slash the Postal Service so
01:19:02.160 our votes can't be counted. Voter suppression is Trump's biggest grab. Vote early.
01:19:13.560 Let's steal some shell.
01:19:24.300 Sounds, sounds. MeidasTouch is responsible for the content of this advertising.
01:19:29.620 Let's get his paws off of our eggs.
01:19:32.380 So that was effective. And then that got written up in a bunch of places secondarily.
01:19:36.380 So how many people are typically viewing these ads?
01:19:39.380 Most of the ones that we did with big launches were in the millions.
01:19:42.900 And how would that compare to, well, I could say a typical political ad, but this is atypical,
01:19:48.100 in some sense, because the technological infrastructure for doing this is so new. I don't
01:19:53.900 know what you'd compare it to, because, I mean, an ad used to be an evanescent thing, right? You'd throw
01:19:58.180 it up on a TV show, and, I mean, it could run in sequential TV shows, but the ad would run and
01:20:04.620 then people wouldn't have access to it. Now, of course they have access to everything all the
01:20:08.280 time. So I don't know. And it's different. I mean, it would be Instagram. It would be Facebook.
01:20:12.740 It's on Twitter. Then it's shared widely. Then some of them we cut down and we ran on television
01:20:17.840 stations. I mean, but we got, I think I could confidently say we got over a hundred million
01:20:22.740 views of stuff, if not more. Okay. Okay. Here's a nasty question, I suppose:
01:20:31.500 what good did you do, and what harm did you do, or what harm might you have done? Let's
01:20:38.040 start with good. What good do you think you did within the Democratic Party, let's say?
01:20:42.960 Well, we have advocated for and elevated the purest place for me in the Democratic Party, the first-
01:20:48.700 time House candidates. They're the ones I love to work with, because it's so hard; as people move up
01:20:55.640 in the structure and make all the compromises that they have to make, it gets more and more sticky,
01:21:01.260 with the fiefdoms and bailiwicks. And so the House candidates are wonderful. We supported them.
01:21:06.620 I mean, I think we had a big, look, I would say it's so hard to say, and it's so hard to want to
01:21:13.500 take credit. Yeah, I wouldn't want to remove our efforts from the midterm or
01:21:22.700 from the presidential. I wouldn't be comfortable removing them and thinking that the outcome is
01:21:28.460 the same. Now, whether that's a slice that we laid on top of, you know, tons of people who did
01:21:34.440 other work and laid other slices down, community organizers, the politicians, you know, and yes,
01:21:40.260 the Democratic institutions with fundraising, the DCCC, there is some credit that is
01:21:45.080 there too. But we targeted all of our messaging. Our theory of the case was right. We
01:21:50.940 put it straight into the swing states. We put it straight into persuasion messaging for moderate
01:21:55.980 voters. We went after a lot of Hispanic commercials. Like, our theory of the case was right. We went after
01:22:02.560 evangelicals. We did. So what do you mean? What do you mean by theory of the case?
01:22:06.380 We didn't do anything that was woke or politically correct. Everything was ambitious. It was
01:22:13.360 unifying. We had diversity in our ads. We did several sort-of Black Lives Matter ones. Um, you know,
01:22:19.880 we had the Lincoln Project, I got one to them, but they were very clean on the parameters
01:22:26.860 of what we were representing as a unified, positive vision of America. And our criticism, I think with
01:22:32.180 some of the commercials about Trump here or there, I probably failed by getting snarky
01:22:39.400 in ways that might not have been as fair or persuasive. So you did get snarky in a way that
01:22:45.880 was not as persuasive here or there. Some of those 200 ads, if I look back on them,
01:22:52.420 I'd go, yeah. But I had an instinct, and I'd send them to testing and I'd get the answer back. I was like,
01:22:58.060 I think I can try this way in, this kind of attack route. Okay. And we'd get back something
01:23:02.580 like, nope, it's turning people off again. And I was wrong. So I was wrong plenty. I mean,
01:23:06.560 I rushed into being wrong everywhere again and again. And again, we didn't understand the
01:23:11.900 permission structures. We didn't understand the etiquette. And at a certain point we, you know,
01:23:16.560 early on, it was funny. I joked with Billy. He'd send an email out to like a bunch of senators
01:23:22.040 and heads of different committees. And I'd get like a worried call from one of our political
01:23:26.500 people being like, you can't, you can't CC all these people. There's all this internal,
01:23:29.900 you know, stuff going on. And so the first reaction is kind of chagrin or embarrassment
01:23:35.160 of like, Oh, we stumbled into, you know, this not knowing anything. But then we were like,
01:23:40.580 wait a minute, that's idiotic. We're trying to win a race. If everything that you're saying,
01:23:45.440 you believe that Trump is that damaging and threatening, we don't have time for any of this
01:23:51.000 internecine bullshit. So knock it off. We're going to CC all of you. We don't have time to break that up.
01:23:55.800 And everyone kind of went, okay. Like we were so clueless in some ways that it almost benefited us
01:24:02.400 because we were breaking a lot of established norms in little ways, and if people came up
01:24:08.940 with an issue, whether it was some subtlety of language policing or hierarchical stuff
01:24:14.580 or bureaucracy, we just said, we're not doing any of that. We're here to win. If you don't want to be
01:24:18.820 involved, we'll take you right off, but we're not going to navigate any of those things.
01:24:22.120 And since we, since no one had given us permission and no one was paying us,
01:24:26.860 no one could fire us. And so it just worked. It was really weird.
01:24:31.540 Well, it is really weird that it was even possible. But it's less weird when you,
01:24:36.920 the weird thing is that you did this without any payment, that you just decided to do it.
01:24:42.400 And the second weird thing is that you actually went ahead and did it. And as a project, it worked,
01:24:49.620 even though the outcome of it might be very difficult to measure.
01:24:52.860 You alluded to harm. I did, you asked about the harm. Well, I wanted to go more
01:24:56.900 into the good first though. You talked about the, the newly elected people. Um, you didn't tie that
01:25:02.920 exactly to the good that you've done, but there obviously is a link there. So I'm, I'm just going to
01:25:07.460 ask you to make that more explicit. They're the best hope. You know, we have amazing
01:25:12.020 candidates. If you looked up Elissa Slotkin, if you looked up Dean Phillips, if you looked up Haley
01:25:17.180 Stevens, if you looked up Lucy McBath. Some people we lost this round because of bad
01:25:22.960 messaging, really bad party messaging, and an inability to draw a line against socialism and defund the
01:25:27.980 police. But we're working with candidates like Michelle De La Isla, who's the mayor of Topeka,
01:25:32.380 Kansas. Okay. So you're pleased about the effectiveness of your strategy in raising the
01:25:41.360 overall capacity for performance of a new generation or part of a new generation of
01:25:46.780 political decision makers. Who I admire, and who anybody reasonable would. If you sat down
01:25:53.320 with Elissa Slotkin, who was 20 years in the CIA, you might not agree with her on all the
01:25:58.660 politics, but it's impossible not to completely admire her as a stateswoman. She's exceptional.
01:26:04.220 So you think you were on the side of the high-quality people? Yeah. Okay. Well,
01:26:09.620 that's cool. All right. I mean, it's hard for me to see. And I got us more oriented on getting things
01:26:15.600 done and accomplished than on talking about them theoretically, in terms of, like, here's how we
01:26:21.720 get shit done. Now, it's enough with the abstract; people are hurting. People are in a bad place. And I
01:26:27.460 think also to some extent I've helped to untangle, in small ways, look, it's so hard. I think I've
01:26:34.160 done a lot in that I helped build a coalition. I played a role in stitching together, you know,
01:26:40.740 the Rick Wilsons and the David Frums and the Bill Kristols. Of course they're doing it on their
01:26:46.200 own, and a lot of people had reach, but I was the center of a particularly diverse and interesting group.
01:26:53.160 You know, Chris Halverson, who you introduced me to, right, the former chaplain of the Senate,
01:26:58.060 evangelical leader. There are people who I called and pulled into ventures in a whole host
01:27:04.880 of different ways. You know, one of my favorite candidates, who lost this time due to bad, you
01:27:09.560 know, again, messaging that was unfair, a wonderful woman, Xochitl Torres Small, she was in the most rural
01:27:14.820 district of any Democrat, and Democrats don't trend rural, in New Mexico. So I called Chris Halverson,
01:27:20.160 who you introduced me to. And I said, she's a Christian woman. She's a woman of faith.
01:27:24.860 She's bipartisan. She's in the most rural district and she has concerns about rural medicine, rural
01:27:30.580 healthcare. And that's not a predominantly Democrat issue. Who are Republicans who you love, who you
01:27:35.740 can get her in touch with, to help solve that problem for constituents? It's a win for her.
01:27:40.520 It just helps people in a rural district. It's politically agnostic. And ideally we would be
01:27:46.940 politically agnostic. Like I would love to support Republican candidates who I felt like were,
01:27:52.240 you know, good faith. Well, that's, that's the question, you know, because you could look at this
01:27:56.140 as a zero sum game and you could say, well, it's bad news for the conservatives. If better quality
01:28:02.680 liberals emerge on the political landscape, because their probability of victory is higher,
01:28:07.900 but then you could take the opposite viewpoint. You could say, no, everyone wins if the quality of
01:28:13.920 candidates on both sides of the spectrum is elevated. If the quality is elevated.
01:28:19.720 And that's the only thing that keeps us... See, I view the extremism like we're on a seesaw,
01:28:24.300 right? And so the game theory is, is that if people are in the middle and they're dealing with
01:28:28.080 each other, like, you know, Tip O'Neill and Ronald Reagan, everyone can kind of be in the middle
01:28:34.300 of the seesaw. But when one group starts to move, and it's chicken or egg, the other has to
01:28:38.860 move out to hold balance. And when you have the mainstream media divided since essentially the
01:28:44.240 Gore Vidal and William F. Buckley Jr. debates, and there's a wonderful documentary, The Best of
01:28:48.440 Enemies, and you have social media driving that way, and you have, you know, capitalism
01:28:55.720 appropriating people for dopamine hits, with the most extreme stuff online moving people that way,
01:29:01.280 everybody's moved out. And so part of it for me is, if we can bring the Democrats back
01:29:05.580 towards the middle, my hope is we can get Republicans there. And the other thing, just,
01:29:10.100 Well, okay. So that's the bet here, right? Is that...
01:29:14.600 And Jordan, just one thing that I've wanted to talk to you about for a minute,
01:29:19.600 which is, you know, we talk about moderates, and we talk about moderates and more extremes.
01:29:24.620 And the distinction I make, it's like the one in psychology between process and content, in a way.
01:29:29.220 Like, my politics are actually probably significantly more progressive than where I live, which is to negotiate
01:29:38.340 and understand that within a system of governance, no one gets everything they want. And we need
01:29:43.760 imperfect incremental progress where everybody takes less of what they want. And so I think
01:29:48.460 that there's an aspect where we can talk about moderation in process. Like, I don't care how
01:29:54.000 extreme, within reason, like AOC holding her views. I don't view that as some moral
01:30:01.060 issue, if she were more willing to engage on fairer terms about them. Like, I feel like we,
01:30:08.640 as a party, could sustain a democratic socialist weighing in once in a while from a deep-blue district
01:30:14.400 to hold us to account for ways that maybe we would get too tangled up in corruption. It's all about
01:30:20.260 the ability to engage properly. And then people's politics can be wherever they are.
01:30:24.660 Well, that's sort of, that is essentially the subordination of ideology to
01:30:28.980 the Constitution, in some sense, right? In that, regardless of your stated goal and regardless
01:30:34.660 of your ideological position, you swear to abide by the process rules, right? Yeah. Well,
01:30:41.440 and that makes you a moderate, and that's reasonable.
01:30:48.820 That makes you a kind of moderate, and it's reasonable. Okay. I wanted to ask you, too:
01:30:54.840 What? So I know that the fact that you did this and the fact that it was successful,
01:31:01.240 at least as a project, let's say, and probably because of its consequences,
01:31:07.620 but certainly as a project, because it's been going on for four years and manifested itself on a
01:31:14.500 pretty large scale. What were the consequences for you personally, psychologically? What did this
01:31:21.480 do to you? How did it change you? Wow. It was one of the best and worst things I ever went
01:31:33.460 through in my life. I feel like it was like going to college for four years. Um, I don't have a belief
01:31:40.600 that we can... So I was in large part engaged in the info wars, especially, acutely, in the last
01:31:46.080 seven months. Because, the way I view it, people who are online are largely radicalized and weaponized.
01:31:55.200 And I don't mean that flippantly. I mean that some of the playbooks that are issued by the extremes
01:32:01.320 politically are the ISIS playbook. A lot of the opinions that we believe that we hold are out of
01:32:07.300 troll farms in St. Petersburg. So here we are as a country where that $50 trillion is moving to the top
01:32:13.600 1%, and we actually believe that we're personally damaged if a black man kneels on the sidelines
01:32:20.400 of an NFL game to peacefully protest or a white girl gets dreadlocks at Yale. Like, the things that
01:32:27.160 we come to believe are these giant stakes, to me, are all distraction games for the movement of
01:32:32.960 that $50 trillion to the 1%, because we're not talking about the prison-industrial complex, right? We're not
01:32:38.620 talking about the real stuff. And as we're fighting about this, all of that keeps happening. So there is
01:32:44.480 a view of the level of corruption, intricacy, difficulty that's dizzying
01:32:54.180 being in there and online. I'm not sure that somebody can live any substantial portion of
01:33:00.520 their life online and social media and not be insane. Oh, well, really, it's like,
01:33:06.540 most of what I've encountered online, or a huge proportion of it, has been intensely positive,
01:33:13.620 but even too positive, I would say. Like, so many people comment in the comment sections,
01:33:21.100 say on YouTube, that being exposed to my work, and it's based on the ideas of other people, like,
01:33:27.440 it's not my work, you know, exactly because no one's work is their work exactly, but being exposed
01:33:33.520 to these ideas, which I've been communicating, let's say has so positively changed their lives.
01:33:39.260 But to hear that from thousands of people is just, it's overwhelming. Like that's the positive side of
01:33:45.420 it. It's too much. You get amplified too much. And the negative side is just deadly. Like on my
01:33:53.000 YouTube channel, the positive to negative comment ratio is about a hundred to one, you know, and that's
01:33:57.620 about as good as you could ever hope for, but the positive ones are overwhelming and the negative ones
01:34:03.840 are deadly. And, you know, you see this because people, people will get attacked by 20 people on
01:34:10.600 Twitter and they'll go into convulsions to apologize. And if you put yourself in the center
01:34:15.760 of this monstrosity, that's multiplied by thousands or tens of thousands or hundreds of thousands. And
01:34:21.560 it's not real. It's not real. That's what's so crazy. It's not your family and your
01:34:26.840 community. You don't know what cross-section that is. You don't know. Like, if you're at high school
01:34:31.580 and there's one table of mean kids screaming in a corner of the cafeteria, or gossiping, well, it's real,
01:34:37.220 but it's impossible to parameterize. Whereas when it's your family or your immediate
01:34:43.500 community, you know these people, and you can put some walls around it.
01:34:48.260 You can't give it the proper weight. No, that's what I mean. That's what I mean. You can't give
01:34:51.940 it the proper weight. You don't know what, well, it's because we're not adapted to this environment.
01:34:55.700 It's like, well, and so here we have our kids, you know, who I'm increasingly worried
01:35:00.020 about, who are in there. It's like trying to exist and go through puberty and
01:35:06.760 enter the academic world and the world of culture and ideas while being constantly blared at
01:35:14.540 with a stream of the most salacious, upsetting, and activating gossip that's been algorithmically
01:35:22.720 selected to target you. And that's how politics is. That's what we're dealing with. And I was in
01:35:28.320 there. And the thing is, for me, is I talked to, I talked to anybody. I mean, you know, that I'm like,
01:35:32.900 I'll talk to you for a variety of reasons. And, um, you know, one, there's all the obvious reasons
01:35:38.720 for free speech and talking to people who don't agree with you, but also it's like a, what are my
01:35:42.920 blind spots? I always want to know. That's hard. It's hard to go in. I had a, one time I had a two
01:35:48.160 hour conversation with Eric Weinstein, you know, about where he was on everything, but I was right
01:35:53.720 in the middle of it. That takes a while. You know, Eric's, Eric's incredibly bright. I'm talking to some
01:35:58.800 of the brightest, most impassioned representatives all the way around. That's all the spokes of the
01:36:05.060 wheel that I could get to. And it takes a while to come out. Yeah. And to find your center and to
01:36:11.200 accommodate and assimilate. And Eric raises a ton of stuff. He pokes at a ton of soft spots. We're,
01:36:16.480 we're, we're friendly. It's a very respectful exchange, but then to come out, incorporate the
01:36:21.880 parts that are right, get clarity on where I think he's, he's skewed for his reasons and his info
01:36:27.280 tunnel. That's different from my info tunnel. And I was doing it in every direction. And so it felt
01:36:32.880 like being torn apart to hold the center because I was just being torn every which way. Well, it makes
01:36:37.480 me, it gives me more understanding even of why people will settle into their ideological bubble.
01:36:43.580 Oh, you know, well, it's a relief. It's such a relief, you know, because you got to ask yourself
01:36:48.720 just how many questions do you want to ask yourself? Right. You know, and I've always thought that
01:36:54.960 exploration in the world of ideas is of unlimited value. Although, you know, I have a conservative
01:37:01.180 element. Um, but man, too much of it can tear you apart. We have that like really, really high
01:37:11.660 openness. Yeah. That's the danger of high openness. For about six or seven months, every day was a
01:37:18.860 deep dive into something that was toxic and skewed, to understand it, to try and come back out and put
01:37:25.940 myself together. And by the way, going into it, you know, there were plenty of times you'd cross
01:37:30.560 someone who wants to destroy you or scream at you or tell you why you're awful or make your, you make
01:37:35.800 an accidental mistake. Like when I was working, when we were discussing this a couple of years ago,
01:37:42.040 I swiped one of your videos, if you remember, and put a voiceover over it. And, you know, I thought
01:37:47.400 that was warranted. I mean, in retrospect, I did it without sufficient consideration. And I mean,
01:37:54.240 I realized that very rapidly. I did a voiceover of one of the videos that you'd produced because,
01:37:59.820 well, I thought I had something to say about it. And I was irritated about, well, I was irritated.
01:38:05.620 We'll leave it at that. And that was a big mistake. And that caused a tremendous amount of friction
01:38:10.640 between the two of us. And, you know, it was also exposed to, I don't know, a hundred thousand
01:38:15.360 people before I finally took it down. I didn't take it down exactly, but I modified it. But when
01:38:21.140 you're connected in some high-intensity fashion, your mistakes are exaggerated to a point
01:38:29.220 that's just intolerable, you know? And again, it's not something that we're adapted to understanding.
01:38:36.160 Make a mistake and a million people watch it. It's the sort of thing that can
01:38:42.880 paralyze you into inaction. It's too much.
01:38:45.620 Right. And the thing is, there's also, like I was talking about, no way to have a
01:38:50.020 perfect conversation about race, class, and gender. There was no way for me to do this. When I first
01:38:55.920 was trying to get Democrats to go on you, Rogan, you know, Dave, Ben Shapiro, right. And I had
01:39:02.700 some success. I got Stanley McChrystal; I helped get him to go talk to Ben.
01:39:08.300 And Sam Harris had Michael Bennett on. But people were really tentative and
01:39:13.620 afraid. And so the only way for me to do it was for me to go on a lot of these first.
01:39:19.180 Well, I was a novelist. I wasn't interested in all of a sudden being
01:39:22.540 put out there on the ledge in that fashion, especially for the communities that I'm
01:39:28.440 in, especially as a liberal, right? But I had to be sort of
01:39:33.900 a case study for it. And, you know, I did it, but what's funny is, if I look back at
01:39:39.000 that, there's a hundred things I would say or do differently. There's no way to do that
01:39:43.020 without... Well, nobody's an expert at it. It's like, if you want to move politically,
01:39:48.440 you're going to do it badly, especially to begin with. That's right. And this is
01:39:53.720 one of the things that's troubled me a lot. So, to answer your question,
01:39:58.900 at the end of this I was not in good shape, for reasons you and I can get into later
01:40:03.880 over bourbon, in more depth. I think I'll stick to sparkling water, given
01:40:08.700 the state of my nervous system. Sparkling water and salted meat. I'll drink enough
01:40:13.640 bourbon for both of us. But, you know, part of that process was, I don't know, it's
01:40:26.360 like, there's so much to go into, so much to dive down into, to really try to figure
01:40:34.520 it out, to really be open to what all those blind spots are and to come out feeling
01:40:40.440 like you're still intact. And I had to race to make mistakes. I mean, we spun up
01:40:47.840 this whole operation. It doesn't even make any sense if I look back on it, the amount of stuff
01:40:52.060 we got done. You're racing to make mistakes. You have that one lecture, the fool precedes the master.
01:40:57.620 And it was like, how quickly can I be a fool, on how many fronts, as rapidly as possible, to
01:41:03.840 try and just get better? I mean, we spun up an entire studio
01:41:09.400 operation, fundraising, distribution, dissemination network. It was crazy.
01:41:15.060 Well, and then what was hard on you was what? The rate,
01:41:22.680 the intensity, the exposure to all the different opinions, the consequences of making
01:41:28.480 a mistake? The consequences of making a mistake. Like, let's say my theory of the case had
01:41:33.940 been wrong and it should have been a further-left thing, and Biden wasn't the guy, and all this other
01:41:39.960 stuff, and we'd lost because I put, call it that figure, which again I'm saying is
01:41:45.520 slightly overblown. If I'd targeted a billion dollars in advertising at the wrong people,
01:41:50.160 giving the wrong message, and blown the mark, and we'd lost the election by 104,000 votes instead of
01:41:55.740 winning it by 104,000 votes, that's a lot to live with. And I didn't even consider that till I was so
01:42:00.660 far in that there kind of was no getting out. I mean, you can't consider everything as you go,
01:42:06.640 but the other thing is, I started to see with more and more clarity. I got, not too much,
01:42:13.860 but a ton of information, constantly, daily, hourly, about my blind spots
01:42:20.220 and confirmation biases, and a lot of anger. And the upside of a
01:42:28.260 lot of those things, your blind spots, your confirmation biases, your prejudices, all those
01:42:32.560 things, is they protect you from being overwhelmed. They're compression
01:42:38.900 algorithms, and they remove information, hordes of it. And a lot of that information is
01:42:43.480 valuable, but Jesus, it's like, how much information can you swallow? I mean, it was so... That's right.
01:42:49.300 And the only way that I determined to make headway in something
01:42:54.700 that complex and that corrupt, and I don't mean entirely corrupt, but I mean, you know,
01:42:59.260 politics, with that much anger and rage and outrage and frustration and pain and grief,
01:43:07.480 was to try and go forth as cleanly as possible. And
01:43:12.980 no one can do that. I tried the same thing, you know, because I was dealing with people's
01:43:17.240 psychological problems and trying to step very carefully to not make a mistake, but
01:43:22.260 there's no not failing. If you're doing that all day, every day, with those
01:43:29.680 stakes, multiple times a day, that'll eat you up. And part of what happened for me,
01:43:33.920 that was so funny. There were two things that were funny, which you'll be particularly amused
01:43:39.040 at. So I got to the end of this and I was really seeing things only in moral terms. And what
01:43:43.600 was interesting is, at the end of this, the week after the election,
01:43:48.980 I was pretty dysregulated. Let's just say most of the conversations I had were with my conservative
01:43:56.260 friends. I called my born-again Christian friends. I called my Navy SEAL buddies, because
01:44:00.980 I was seeing everything in, like, mythological good-and-evil sort of terms. And
01:44:06.900 what was so funny was that the end of this exercise for liberalism, right, for moving
01:44:13.400 towards enlightenment discourse, moving towards a Democratic Party that I thought was
01:44:18.860 significantly less imperfect than the Republican Party
01:44:23.380 as it stands under Trump, was for me to need a lot of support from my conservative
01:44:28.800 friends. And fortunately I have that.
01:44:30.180 What kind of support did you get?
01:44:34.600 And why was it necessary to seek it out from those sources?
01:44:38.760 You talk a lot about mythology, the seven deadly sins, right? If that's what you're
01:44:51.240 conscious of and seeing, right? Because at a certain point, there's so much information and you're
01:44:55.640 so open that you're just dealing constantly with your own failings, right? And you're seeing
01:45:02.300 others' failings and trying to clean them up, and it's almost like I got stripped
01:45:06.380 down to the bone. I don't have the particular courage and drive of my friends who
01:45:12.480 are SEALs, who are in the military, but this felt like it was my version of confronting
01:45:18.660 things psychologically, in the way that I could, from a position of much more relative safety,
01:45:24.500 where I was getting torn apart and trying to put myself back together, and conservatives
01:45:28.900 understood that better. And that's part of why I
01:45:35.880 have such a wheelhouse of friends. It all comes very genuinely, but it's also strategy,
01:45:41.540 you know. You need people who think... Well, that's it. Well, that's
01:45:46.980 an interesting observation, because, you know, you could make the claim that
01:45:53.480 just as the world needs the full array of political viewpoints,
01:45:59.420 barring corruption, let's say, you need to surround yourself with a full array of
01:46:07.900 personalities, from conservative to liberal, because that way the bases are covered and
01:46:14.380 different situations are going to call for different
01:46:20.160 people. And thank God there are different people. And actually that's a good way to end
01:46:24.300 this: thank God there are different people, because no one can do everything all the
01:46:30.320 time. And so we specialize, and that's true in the political realm as well. And we need to understand
01:46:35.460 that and not assume that the conservatives are right or the liberals are right, but understand that
01:46:41.380 each of them is right now and then. Well, and the thing is, that's also such an interesting
01:46:47.820 part, that you brought up that disagreement that you and I had, that was pretty intense,
01:46:54.520 but we worked out, and I knew we'd work it out and you knew we'd work it out.
01:46:58.100 Yeah. It shorted me right out. You know, I mean, partly because I was tearing myself apart about
01:47:03.080 being impulsive and also partly because of the magnitude of the mistake, the public nature of the
01:47:10.160 mistake, let's say. I feel like when we went through that,
01:47:15.220 we knew we would get through it. We just knew it would be rocky in a way. And what was interesting
01:47:20.060 is what I found: some of the people most adept at pointing out my blind spots, when I then was in
01:47:26.360 free fall as a result of being subjected to how many blind spots I had, of course they'd be the
01:47:32.600 ones who would be able to orient me. Right. Because they're the ones who could see those things.
01:47:38.000 So, right. So maybe that's part of the thing too. Part of why, in that week where
01:47:45.060 I was the most acutely flailing, let's say, of course I'm going to go to the people whose
01:47:50.000 views made me think about things the most. Because I was like, I feel lost and
01:47:55.400 spun. I'm not going to talk to people whose views feel like a home base for me, because
01:48:02.120 I'm not on my home base. I need a different kind of input, you know? And I think the same is true vice
01:48:07.400 versa. So you've pulled back. What happens now, Greg? You've graduated from
01:48:18.760 your new college, let's say. And so that's a tremendous relief, like it is when you graduate
01:48:23.820 from anything, when you accomplish something or when you finish with something, but it also
01:48:27.460 leaves a huge hole. You are obviously going to concentrate on your writing again.
01:48:33.500 You have a new book coming out in January. What's the name of that book?
01:48:37.660 Prodigal Son. And your last book?
01:48:40.980 The one that came out previously is Into the Fire. I mean, so what's
01:48:45.520 happened that's so amazing with this. I had a talk with a mutual friend of ours a while ago talking
01:48:51.440 about this. And I was like, for me, one of the things that's been so amazing is the amount
01:48:56.840 that I've learned, and the confluence with the series that I'm writing now, with Orphan
01:49:02.580 X and what he's doing. It's like bizarre levels of synchronicity. And so it's funny
01:49:08.980 because I'm coming out and going right into a draft of a new novel.
01:49:13.080 Yeah, right. Well, you know so much more. I mean, I've watched you over the
01:49:16.780 last four years. You know so much more about the way things work than you did before, and that's
01:49:23.920 got to have nothing but a beneficial effect on your ability to spin up stories.
01:49:28.980 Well, and to try to address them in that way, where it's easier. You know, this is always
01:49:33.620 why I started: I don't want to write and make propaganda, right? This was a
01:49:38.360 sort of necessity and it felt a bit like a call to duty, but I want to write and have people think
01:49:43.440 through formats that are one step removed, through science fiction, right? Through the thrillers,
01:49:47.880 through Orphan X. And it's been really gratifying, you know, as that series has built,
01:49:53.440 that some of it's in there. So I'm having a renewed love affair with that. It was a little hard
01:50:00.860 to take my hands off the steering wheel with the politics. And I'm sure you know that feeling,
01:50:05.280 when you're just... I was going so hard so long that part of it was like, well, I should be involved
01:50:09.860 with these 50 things. But I've narrowed the scope. I have a few pet projects. I want to keep working
01:50:14.500 on discussions with the evangelical community and doing across-the-aisle work. The
01:50:19.900 anti-polarization stuff is really... Yeah. Yeah, I really like... Well, we did some of that
01:50:24.600 in Washington when we were bringing together Republicans and Democrats, congressmen and senators
01:50:31.600 as well, I believe. I was really unhappy when I couldn't do that anymore, because that was so worthwhile and
01:50:37.820 the opportunity is still sitting there. And so hopefully my health will hold. I'd love to do that
01:50:44.040 again. It's such gratifying work. We should. And I think what's funny with that is, like,
01:50:49.640 you know, here you and I are, all tangled up in different political discussions in different ways,
01:50:54.140 but neither of us was particularly political, really. You know, it's like that never was the
01:51:00.880 source or drive. Well, I've had to make a decision between politics and
01:51:06.060 other routes continually in my life. And I always picked the alternative routes. Yeah, me too.
01:51:10.700 So, all right. Well, we should wrap this up, I guess. Yeah, we've covered a lot of ground.
01:51:17.100 It was really good talking to you. Great talking to you, Jordan. We'll talk soon.
01:51:22.520 We'll talk soon.