The Glenn Beck Program - September 18, 2021


Ep 117 | The Dark Horses: From Campus Villains to Political Peacemakers | Bret Weinstein & Heather Heying | The Glenn Beck Podcast


Episode Stats

Length

1 hour and 28 minutes

Words per Minute

166.2

Word Count

14,738

Sentence Count

898

Misogynist Sentences

4

Hate Speech Sentences

12


Summary

Heather Heying and Bret Weinstein are a husband-and-wife team who have become an integral part of our new normal. They first gained prominence in 2017, when they were both professors at Evergreen State College and disagreed with the college's Day of Absence, which demanded that all white students and faculty leave campus for a day.


Transcript

00:00:00.000 Let me start with some good news.
00:00:02.220 Life is great, really is.
00:00:04.640 Would you pick a time, any time in history that you would rather live than right now and where you're living?
00:00:10.960 Best time in human history.
00:00:12.600 Poverty rates are lower than ever.
00:00:14.460 Same with global hunger.
00:00:16.340 Access to clean water.
00:00:18.220 People make more, work less.
00:00:20.960 Transportation, modern medicine, access to information, women's rights, human rights.
00:00:26.740 So why does it feel like everything is collapsing?
00:00:31.180 That is a really tough question to answer, but it is what lies at the center of a book called A Hunter-Gatherer's Guide to the 21st Century: Evolution and the Challenges of Modern Life.
00:00:44.360 This couple is the reason I started this podcast, but they have never agreed to be guests until now.
00:00:53.220 It is a husband and wife evolutionary biologist team who have become an integral part of our new normal.
00:01:02.940 They first gained prominence in 2017, I think, when they were at Evergreen College.
00:01:10.500 They were both professors.
00:01:11.980 They disagreed with the college's day of absence, which demanded that all white students and faculty leave campus for a day.
00:01:19.540 They thought it was crazy.
00:01:20.860 The outrage was immediate and explosive.
00:01:25.320 They were practically forced to resign.
00:01:28.380 Well, then they went on to become founding members of the Intellectual Dark Web, a group of politically homeless rebels, mostly from the left, who violated one of the left's countless holy laws.
00:01:41.000 This is going to be a fascinating conversation.
00:01:44.840 I have no idea where it's going to end up.
00:01:47.020 Please welcome Heather Heying and Bret Weinstein.
00:01:51.440 I'm so honored to have you both here.
00:02:06.080 It's really, it's great.
00:02:07.900 I am a big fan of yours.
00:02:09.900 I have been watching you for a long time and have wanted this interview for probably over three years.
00:02:16.300 And I appreciate it.
00:02:18.900 I'm a fan of yours because of your courage.
00:02:23.960 In almost all things that I have seen, you are taking massive hits.
00:02:30.760 And you're doing it because you believe you are standing up for the truth.
00:02:36.160 I happen to believe that too, but you're standing up for the truth and damn the consequences, I guess.
00:02:42.040 So thanks for being here.
00:02:43.260 Thanks for having us.
00:02:44.320 Thank you.
00:02:45.020 How difficult has this been, this journey of yours?
00:02:49.280 It varies.
00:02:50.180 I think in many ways, it is its own educational experience.
00:02:54.780 And each time we end up in a new chapter, we learn more.
00:03:00.340 I think, you know, informally speaking, we feel that it is making us anti-fragile, which is not a fun process.
00:03:07.800 But in the end, being stronger and able to withstand the slights and challenges is important.
00:03:17.540 If we're going to fight for what matters, it is important that we have the skills to do it.
00:03:22.080 And that's a process that's learned.
00:03:23.960 How, and I want to get to the book because I think we have, we've had similar thoughts, except mine kind of resides a little bit with the Beatles.
00:03:35.160 And yours is in the intellectual space.
00:03:37.640 But, and it's important that we talk about the book because it is, I think people can feel the problem and they can name it here and here and here, but they don't really understand the scope of it.
00:03:55.380 And I think you guys are really hitting the scope of it.
00:03:58.800 But before we hit that, your lives have changed so much.
00:04:04.000 And you believed in so much, how much, two things.
00:04:11.920 I know that I believed what I believed.
00:04:14.480 And then I found out, holy cow, there's a lot of things I was absolutely certain of that are not true.
00:04:22.160 And betrayed by my own people.
00:04:25.920 You know what I mean?
00:04:26.500 I mean, how, how much of this has happened to you guys to where you're like, I would have, that's not who I thought we all were.
00:04:36.700 Yeah, very, that, that is certainly true.
00:04:39.820 That's not who I thought we all were.
00:04:41.780 Right.
00:04:42.900 But our politics, I don't think have changed.
00:04:46.540 But who we see engaging in similar thinking and truth-telling has changed.
00:04:53.860 Yeah.
00:04:54.320 Right?
00:04:55.100 Who, who...
00:04:55.840 Ten years ago, you and I would have, we would have never thought of being at the same table because we're on opposite sides, but we're really not.
00:05:03.920 We're really not.
00:05:04.720 And, I mean, even ten years ago, it was becoming clear, you know, in the, in the milieus where we were getting our news, that NPR was becoming harder to listen to, for instance, for us, even ten years ago.
00:05:18.680 But that is obviously coming to a head even more in the last one, two, three, four, five years, you know, with every...
00:05:28.300 And that, of course, is one of the points of the book, that the rate of change itself is changing so fast that the, the search for coherence is becoming ever more difficult.
00:05:38.320 I talked to Condoleezza Rice 15 years ago, and I remember during the interview, she said something, because it was biblical what she said.
00:05:46.960 I don't know if she knew it at the time, but what she said, and it stuck out to me because of what it implies.
00:05:53.900 She said, you're starting to see the birth pangs of the things to come.
00:05:59.360 And when you think of that, you realize we're giving birth to something, and I'm not sure I want to see what's coming out on the other end.
00:06:08.180 And birth pangs happen faster and faster, closer and closer together, and they get stronger.
00:06:15.240 And I think that's true.
00:06:16.400 Things are accelerating at such a rapid rate.
00:06:20.020 Why is that happening?
00:06:21.040 Why is, is the acceleration due to technology?
00:06:26.520 Is the acceleration, why is it happening like this?
00:06:30.240 Well, there are really two reasons.
00:06:32.660 And I should say this actually applies to your earlier question, too.
00:06:36.020 In some sense, the reason that our politics hasn't changed very much is that science, and in particular evolutionary biology, is our north star.
00:06:45.520 And so, what we believed and what we believe now were based on a model of the universe that hasn't changed very much.
00:06:53.420 Now, who we find ourselves interacting with has changed radically.
00:06:57.140 Wait, wait, wait.
00:06:58.080 So, explain that part.
00:06:59.780 For people who don't necessarily understand evolutionary biology, what are you talking about here?
00:07:05.760 Well, one, so, I'm sure you had something particular in mind, but one thing that I hear in what Brett just said is our understanding of the universe is based on sort of physical reality.
00:07:16.460 And it's not, you know, our politics aren't dependent on the social scene.
00:07:19.960 Yeah.
00:07:20.160 And so, who it is that we're talking with and who it is that we're finding common ground with has changed.
00:07:25.660 But that doesn't mean that what we think about underlying processes has, because that was never a social set of beliefs for us.
00:07:33.460 Right.
00:07:33.700 Okay.
00:07:34.560 So, human beings, we may differ here on what the best way to view ourselves is, but maybe less than you would imagine.
00:07:44.160 We are creatures that have to function in the world, and our nature was built in an environment that we no longer live in.
00:07:52.360 Right.
00:07:52.660 Now, we humans are better than any other creature at changing our software to adapt to new habitats, but our habitats are now changing so rapidly that we can't keep up.
00:08:04.400 And that is partially about technology, but it's also about the fact, in evolutionary biology, we have a metaphor that we like.
00:08:11.120 We talk about adaptive landscapes in which opportunities are peaks.
00:08:15.860 They're basically mountain ranges of opportunity.
00:08:17.860 And in order to get from one opportunity to another, you have to pass through what we call an adaptive valley.
00:08:23.360 And so, your trepidations about where we're headed are part and parcel of the fact that whatever else may be true, we are in transition.
00:08:30.800 Yes.
00:08:31.080 Our tools are not up to managing the problems that we now face.
00:08:35.880 So, we have to find a new way of being.
00:08:37.380 Our, not only our physical tools, but our mental and psychological tools.
00:08:43.580 Even our tools of governance, as excellent as they are.
00:08:47.460 It feels like, the government feels like it's stuck in 1950.
00:08:51.220 The world feels like it's headed towards scientific, not scientific, the technology field feels like it's in 2050.
00:09:00.020 And we're kind of standing in between going, that world doesn't work, and I don't even know what that world means.
00:09:07.880 Well, let's be, you know, frank about this.
00:09:11.700 Our governmental structures are really 18th century structures, and they're brilliant, but they couldn't possibly manage a world this far from what the founders were capable of understanding.
00:09:22.360 I mean, the founders never saw a train or a chainsaw.
00:09:25.520 So, let me ask you on that, because I would guess that we actually agree on this.
00:09:33.680 The founders believed, I mean, they thought it would last 30 years.
00:09:37.280 They did not think it was going to last, like, now.
00:09:40.240 And I think they would get here, if they could, with a time machine, they're here.
00:09:44.320 They would look at everything, and they'd say, great, how long did America last?
00:09:48.100 You know, they would not recognize what they have done.
00:09:51.020 And I don't think they would have recognized it in 1960, you know, but they did allow, through constitutional adaptation, et cetera, et cetera, that they did allow us to adapt.
00:10:03.500 They did say, we're not, we don't see everything, so just adapt and adopt, you know, constitutional amendments to get there.
00:10:09.500 It's not the framework or the mission statement, we hold these truths to be self-evident, or the framework that is adaptable.
00:10:18.220 It's that we are trying to force a bunch of stuff into it that it's not meant to do.
00:10:24.780 It's meant to reject that kind of stuff, right?
00:10:28.100 Yeah.
00:10:28.420 No, I see it very much the same way, and I think your framing of, we can't go back to 1950.
00:10:34.540 Right.
00:10:34.780 And we don't know what, and we don't want to, and those who would claim that that was a perfect world are misunderstanding what that world was.
00:10:43.480 And those who are saying, you know, full bore ahead to 2050, no matter what, because progress is always the answer.
00:10:53.300 And if it requires rejecting everything about the old, who cares?
00:10:58.060 Well, that's also a mistake.
00:10:59.140 And so, you know, progress, yes, and tradition, yes, but we're going to have to pick and choose, and we're going to have to do so with intention.
00:11:06.560 And that's why I think we're having such, I always look at right wing, left wing.
00:11:12.000 It's a bird.
00:11:13.380 And if you chop off one of the wings, it ain't going to fly.
00:11:16.820 And conservatives, I don't know how conservatives define themselves, but I look at conservative for the root of the word.
00:11:25.080 I'm conserving the things that do work and looking to the future on new things that might work better.
00:11:32.400 But let's not throw all of this out.
00:11:36.000 It's amazing how difficult this lesson is for both the right and left wing.
00:11:40.300 Each one seems to think that it has the answer, and it doesn't understand that the dynamism of our system is in the tension between these.
00:11:48.800 Yes.
00:11:49.440 Right.
00:11:49.900 Liberals always want to improve things, and they always underestimate the unintended consequences.
00:11:55.520 Yes.
00:11:55.760 And conservatives, if left to their own devices, would avoid progress because of those unintended consequences.
00:12:00.800 Right.
00:12:01.040 So what we really need to do, though, is adapt our system.
00:12:05.300 I think the founders would have expected us to change it much more radically than we have in order to keep up.
00:12:11.320 You know, we've treated certain things as sacred that shouldn't have been and that they wouldn't have expected.
00:12:15.680 But what we need to do is get a system that is not so violent in the fluctuation that balances these two forces.
00:12:25.400 In other words, you mentioned up top that there's quite a bit we agree on, and this is now well established.
00:12:31.320 The Hidden Tribes report revealed that there is what they call the exhausted middle, this vast group of us that aren't extreme, that agree on most of what we want at least.
00:12:41.680 And then we differ over how close we are to it and what might be done to go the rest of the distance.
00:12:46.860 But we are being shut out by extremists that are pushing us into viewing, you know, the other side is not quite human.
00:12:55.620 And that's a very dangerous process, unfortunately, with a lot of historical precedent that suggests we should not continue down that road.
00:13:05.020 So now this has been a book that you've wanted to write for like 10 years, right?
00:13:10.140 Yep.
00:13:10.960 Why now?
00:13:11.960 And let's start on the main message.
00:13:15.140 I mean, 21st century hunter and gatherer doesn't seem to go together.
00:13:21.360 Yeah.
00:13:22.100 So it's a hunter-gatherer's guide to the 21st century.
00:13:24.860 And as we say in it, we could have named it a number of things, you know, an agriculturalist's guide, a post-industrialist's guide, a mammal's guide.
00:13:32.980 Like all of these things are true descriptors of what the modern human condition is and are true moments in time from our evolutionary history.
00:13:43.700 And hunter-gatherer is like the moment that most people have in their head.
00:13:46.940 It's like, oh, that's what we're adapted to, right?
00:13:48.640 Like hunter-gatherer is on the African savannah.
00:13:51.440 But we've been all of these things.
00:13:53.200 And so our point in the book, using the term of art from evolution, the environment of evolutionary adaptedness for humans, is not just hunter-gatherers.
00:14:02.860 It's all of these points in our history.
00:14:04.940 And because humans are so flexible, are so much software compared to hardware, we are born with so much capability that isn't hardwired in more than any other animal on the planet, then, you know, we are adapted to being post-industrialists.
00:14:19.820 Not as well as we should be, and not maybe even possibly to be perfectly adapted, given the rate of change.
00:14:26.840 But, you know, why now?
00:14:29.000 Why now in part?
00:14:30.220 Because we were talking about writing it for, you know, for 10 years while we were teaching, while we were still at Evergreen.
00:14:37.440 And we were able to now.
00:14:40.180 You know, when Evergreen blew up, we began to have an audience.
00:14:45.600 And the words of our many generations of students were still ringing in our ears.
00:14:50.360 You know, we're still affiliated, connected with many of them, who had been saying to us for years,
00:14:54.760 please find a way to take the teaching of evolutionary biology, and specifically what the two of you have, what the two of us have been doing,
00:15:02.960 into some kind of packaged form so that it can be delivered unto other people.
00:15:08.420 My dad said before he died, he said,
00:15:12.260 I'm kind of glad I'm not going to be around to have to see and make the decisions that your generation and the one following you are going to have to make.
00:15:22.700 He said, but it's going to be fascinating one way or another.
00:15:28.740 He said, he was born in 1926, and he said, I remember as a kid, we looked up at the moon, and we never thought a man could walk on the moon.
00:15:41.380 And when we started seeing Buck Rogers and stuff, that was movie stuff.
00:15:45.160 That wasn't actually landing on the moon.
00:15:46.900 He said, halfway through my life, we're on the moon.
00:15:51.020 And look at us now.
00:15:52.460 He said, all of the technology, everything that is moving so rapidly.
00:15:56.760 And I said, I know it's really exciting.
00:15:58.680 He said, it is, but we're missing something.
00:16:02.960 And I said, what?
00:16:03.820 And I think this is what the whole point of your book is.
00:16:07.860 He said, read Plato.
00:16:11.500 We're struggling exactly the same place.
00:16:15.880 Read Jesus.
00:16:17.660 Same people, same problems.
00:16:19.900 We've had such an explosive growth here, and really nothing here in many ways.
00:16:25.840 Is that the problem, that we're growing so fast, part of the problem, that we're growing so fast, and yet we haven't worked on this along with the fact that social media, we're in tribes.
00:16:40.260 It forces us back to that tribal nature, because that's been in us for thousands and thousands of years.
00:16:48.980 Well, it forces us back in one way.
00:16:52.060 But, I mean, really what you're saying, the reason that Plato is still resonant, and Plato is resonant.
00:16:57.720 I mean, you see Plato's cave in The Matrix.
00:17:00.260 People don't necessarily know that they're revisiting that same puzzle.
00:17:03.440 But part of the problem is that these things aren't really fundamentally about those who first spotted them.
00:17:11.100 They are part of the underlying architecture.
00:17:14.080 They are failures of game theory that result in certain problems reoccurring in a new form until you figure out how to solve them.
00:17:22.040 Isn't that part of, that's one thing that I don't understand that people think, that we evolve.
00:17:30.700 Well, yes, we do, but we're all born at the same starting point.
00:17:35.800 And so we have to discover that when I first read Plato, when I was an adult, I was like, oh my gosh, this is brilliant, and it's exactly what I'm struggling with.
00:17:45.500 Because isn't that a self-discovery thing?
00:17:47.920 You can't pass that information and those answers on biologically, can you?
00:17:54.640 Well, first of all, one of the things that our field has done a terrible job of recognizing and therefore conveying is that our cultural layer, our software, is biological.
00:18:06.280 You'll very often hear people say, is it cultural or is it biological?
00:18:10.320 It's a false dichotomy.
00:18:11.980 You can say, is it genetic or is it cultural?
00:18:13.980 That's a fair division.
00:18:15.800 But culture is every bit as biological as genes.
00:18:19.580 The problem is when you do discover the answer to a puzzle.
00:18:23.560 So let's, for example, say that our rights of free expression, which the founders clearly understood very well, were in some sense a response to part of Plato's observation about what happens when the person who has escaped the cave returns and attempts to convey it.
00:18:40.580 Right?
00:18:41.060 So our speech rights are, in effect, an agreement that we have that actually we don't get to do that to the guy who comes back into the cave and says that the shadows aren't the real world.
00:18:50.640 But the problem is if you live in a world downstream of that discovery and you've always had these rights, you don't realize how important they are and what they are really protecting.
00:19:00.800 And so you uninvent the progress because you don't see the problem, which is, of course, one of the themes of our book.
00:19:06.700 We talk about Chesterton's Fence and the idea that if you see a structure and you don't know what its purpose is, you'd be foolish to think it probably doesn't have one and therefore it can be safely eliminated until you've understood what the purpose was.
00:19:20.480 At that point, then you know whether it continues to play that role or whether that role is obsolete.
00:19:27.160 And so, you know, the reinvention of problems that were solved is a next-level problem that we have to address because it will happen in every era of history where we have been successful and people suddenly see, I don't know, efficiency as dominant over the elegant solution architecture.
00:19:44.120 I remember just a few years ago, for the very first time, I thought, no, these things are not self-evident.
00:19:52.680 They're not.
00:19:54.340 They were at one point, but they've been lost.
00:19:57.440 Freedom of speech.
00:19:59.280 I mean, wow, what happened there?
00:20:00.920 I thought we all agreed on it.
00:20:02.060 What happened?
00:20:03.040 Yeah, what happened?
00:20:03.600 I mean, part of what happened, I think, is that there's a reductionism that takes over when we are asked to understand things and we are able to measure.
00:20:16.320 And so, you know, once we can measure something and we have numbers, the numbers stick in our heads and we forget the emergent whole and we forget how glorious humans are, actually, and what emergent beings we are.
00:20:26.780 And so, social media is a stand-in for social interaction and it's become even more so, of course, during COVID, but, you know, really since its inception, people mistook it for the entire thing.
00:20:40.100 They mistook social media for social interaction and, you know, the fact is we're here with you and that is a very different experience than if we were on a Zoom call with you, right?
00:20:49.860 Without being, without having any access at any conscious level to why or what exactly is being conveyed that's more because we're actually here with you, there's just more, right?
00:21:00.520 Because humans are more than even many of the things we have yet measured.
00:21:05.840 And, you know, as scientists, we hope, you know, we are seeking an objective, complete, accurate understanding of the universe, hoping to get ever, ever closer, more and more refined, knowing we won't ever get there.
00:21:18.400 And also knowing that the tendency to reduce to a single variable will tend to exclude other things that may have just as much meaning, if not more.
00:21:28.060 I don't remember which chapter it was.
00:21:29.300 Towards the beginning, you did the lineage of the human journey.
00:21:35.060 Can you just take us through that quickly?
00:21:38.680 Yeah.
00:21:39.240 I mean, as briefly as you can.
00:21:41.100 I think we can probably cut it from 3.5 billion years.
00:21:45.060 Yeah, okay.
00:21:45.560 We don't have that much time.
00:21:46.560 At least do it faster, though.
00:21:47.540 Yeah.
00:21:47.740 Yeah.
00:21:48.780 So roughly 3.5 billion years ago, life came into existence on Earth.
00:21:55.700 And 2 billion-ish, I think, if memory serves, we have things like the evolution of sex 1 to 2 billion years ago.
00:22:07.120 And it's possible that in our lineage, that has been unchanging since then.
00:22:12.740 By 500 million years or so, we had become animals, and we already had multicellularity.
00:22:20.900 We already had multicellularity.
00:22:25.000 And before that, we had—
00:22:26.260 What does that mean?
00:22:26.780 That means that we, instead of a single cell moving around on its own, we're now aggregates of cells that aren't just aggregates.
00:22:32.980 We are actually combining with those other cells and their specialization between them.
00:22:37.700 So it's, you know, it's some—early on, anyway, life evolving is sort of increasing level of specialization.
00:22:44.560 Mm-hmm.
00:23:14.560 It's very surprising the first time you realize that although we animals and plants, in some sense, are distinct evolutions of sexual reproduction,
00:23:24.060 that what we call male parts of plants and female parts of plants actually have the same biases built in them in terms of, for example,
00:23:33.940 their enthusiasm for sex with partners about whom they are not choosy, right?
00:23:40.200 Female parts of a plant are very choosy, and male parts of a plant much less so.
00:23:44.580 And the reason is exactly the same, right?
00:23:47.040 So this is why we presented in the book the model of evolution that we did, which is that once you begin to realize that, yes,
00:23:55.580 there are, you know, an indefinitely large number of particulars, and you could learn them all,
00:23:59.900 but there's also the underlying problems that are being solved, and there aren't that many solutions.
00:24:04.920 So just as Plato's cave, you know, continues to reoccur because the underlying dynamics are really what's being noticed rather than some specific instance of something,
00:24:14.340 the fact that any time, you know, if you start out with creatures that have sex but they have identical-sized gametes,
00:24:21.640 you will quickly get to a state in which they no longer have identical-sized gametes,
00:24:25.720 and the one that has large gametes will become choosy, and the one that has small mobile gametes will be much less choosy,
00:24:31.800 and it doesn't matter whether you're talking about a tree or a gorilla, the logic is the same.
00:24:38.220 So once you get that message and you say, well, okay, maybe what I should do, instead of investing in knowing all of the creatures of the Earth,
00:24:45.180 what I should do is invest in understanding what problem they're solving and what the themes are in those solutions,
00:24:50.680 which then brings us to our problem, which is humans, better than any other creature that has ever existed, by far,
00:24:59.240 are wonderful at switching from one niche to another.
00:25:03.260 When we describe virtually any other species on the planet, we are talking about a type of creature and an opportunity that it exploits.
00:25:10.680 But for us, we can't name what that opportunity is that humans exploit because we do so many different things
00:25:15.920 and have throughout our history done so many different things.
00:25:19.240 How is that possible?
00:25:20.400 Because we have a near-miraculous capacity to bootstrap new programming.
00:25:26.500 We are kind of a generalist robot, and we have the ability to write our own new software for opportunities no ancestor had ever addressed.
00:25:35.480 That capacity gives us the ability to think our way out of the problems that I think all three of us here at the table recognize are headed towards us.
00:25:44.460 But what we don't have is the ability to keep up with the rate of change,
00:25:48.860 the fact that we don't even live in the world that we were born into, right?
00:25:52.320 We cannot educate our children and say, here's the world that you're going to have to make a living in
00:25:56.400 because none of us have any idea what it's going to look like.
00:25:58.780 That's too fast.
00:26:00.160 Even our amazing rate of change, our ability to change in a reasonable and coherent way at a rapid rate,
00:26:07.440 is just far outstripped by that technological pace.
00:26:11.420 And if we don't rein it in, you know, it's—
00:26:13.840 Well, that's not going to happen.
00:26:15.580 Is it?
00:26:16.140 Reining in technology?
00:26:17.360 Yeah, I mean, Ray Kurzweil is a friend of mine, and I think he's the most fascinating guy
00:26:25.900 and the scariest dude I've ever met in my life because he doesn't—he, you know,
00:26:33.520 I said to him at one point about upgrading, you know, becoming the singularity,
00:26:38.560 and I said, well, but what about those people who don't?
00:26:42.540 And he said, well, everybody will want to, but what if you don't want to do that?
00:26:48.000 Well, why would you?
00:26:49.020 And so there's this circular thinking that everything's going to be okay
00:26:54.260 and to disregard what that actually means to Homo sapiens, you know?
00:27:00.940 And we're just moving so quickly, and there's such arrogance in there.
00:27:07.860 I mean, every piece of technology we have is the greatest experiment on the human species ever done.
00:27:15.860 Yep.
00:27:16.240 Every piece.
00:27:17.600 It just keeps going up.
00:27:18.660 But let's take your question.
00:27:19.980 Okay.
00:27:20.780 Will we rein it in?
00:27:22.280 That's a hard question to answer.
00:27:24.440 Will it be reined in?
00:27:25.920 That's an easy question.
00:27:27.380 We will either rein it in or—
00:27:29.340 It will rein us in.
00:27:30.020 It will rein us in.
00:27:30.860 The processes are already unsustainable in very simple, literal terms.
00:27:35.860 And what that means is either we figure out a comparatively gentle path to some way of existing
00:27:42.520 that does not continue down this trajectory, or the trajectory will collapse because it is unstable.
00:27:47.020 So what are the things that are fighting us, fighting the natural man, if you will, that technology is exacerbating,
00:28:01.360 that we need to recognize and rein in?
00:28:05.420 I'm not sure natural versus unnatural is the right dichotomy, because part of what we explore in the book is the idea that this is actually—this could have been predicted, right?
00:28:19.680 That this—the human condition is one of such software, of such lability, of such flexibility, that, of course, we would end up becoming not only our worst enemy.
00:28:33.400 Yeah.
00:28:33.600 But what we also need to do, then, is figure out how to be our best friend, right?
00:28:38.980 Like, to recognize that we have to share the planet with everyone else, and also that that doesn't just include non-humans, but all of the humans who are here.
00:28:48.620 And, sure, there are probably too many of us, but we have to do what we can to live well with the people who are here now and figure out, you know, our better angels, how to explore our better angels, which are—
00:29:06.180 How do you do that in a society that is just pounding each other?
00:29:10.800 You stand up and do something—I've been working to try to rescue people that—and I don't care what your sexuality—I don't care.
00:29:18.200 I don't care.
00:29:19.360 I understand, I think, how people must have felt trying to rescue Jews from Germans.
00:29:27.460 These people mean nothing to you.
00:29:29.500 Just give them to me, and we'll take them.
00:29:32.560 And it's just—
00:29:33.080 You're talking about Afghanistan?
00:29:33.980 Yeah.
00:29:34.340 Yeah.
00:29:34.840 And it's just this weird thing where it's just—it's not even human.
00:29:40.040 The emotion is—it's weird.
00:29:41.820 And I was over there, and, you know, people started saying all kinds of stuff, and my first reaction was, no good deed goes unpunished.
00:29:53.340 I mean, you can do anything, and nobody is—everybody's so cynical and everything else.
00:29:59.760 How do you become better angels when there's zero reward for it in society?
00:30:07.660 Well, in part, we haven't really understood the puzzle yet.
00:30:15.180 All evolved creatures have the same purpose, and that is a very unfortunate statement, because if you share a purpose with a malaria pathogen, it's not much of a purpose.
00:30:27.200 It's not honorable.
00:30:28.120 It's not decent.
00:30:28.800 So we humans have the most remarkable capacity to do tremendous things.
00:30:34.480 We can be compassionate.
00:30:35.860 We can develop insight.
00:30:37.540 We can innovate.
00:30:38.420 We can create beauty.
00:30:40.600 All of these things.
00:30:41.900 But those are all—they all came to be in the service of that mind-numbing objective.
00:30:46.820 But once you recognize that—this is another place that our field has not been very good.
00:30:52.540 We've given people the impression, and in fact, most evolutionists believe that creatures are trying to produce as many offspring as possible.
00:30:59.940 And really, producing more offspring is a means to an end.
00:31:03.840 The end is to lodge one's genes as deeply into the future as possible.
00:31:08.820 And so the reason that you see the pattern that you're seeing is because, in fact, people are wired, without their awareness of it, for lineage versus lineage competition.
00:31:18.840 And to the extent that two lineages are fighting over a limited resource, like a patch of territory, the desire to rid that territory of the other lineage is extremely powerful.
00:31:30.480 This accounts for the greatest tragedies of history, I think, without exception.
00:31:34.560 And the problem, therefore, is to convey the message to whatever part of the human is listening, that actually the objective, getting into the future, requires that we stop doing that now.
00:31:51.040 That's how we got here.
00:31:52.200 But we have to stop doing the thing that got all of our lineages to this point in history if we want this to continue.
00:31:58.020 And really, you know, philosophically speaking, we have the most glorious opportunity that any creature has ever had.
00:32:07.020 The earth, damaged as it is, is a beautiful place, more beautiful than any other we've seen or even have reason to know exists.
00:32:15.620 I contend that it is a powerful spirit that is alive that will kill us before we can kill it.
00:32:24.640 It will rid itself of the disease of man if we don't, if we don't, if we don't harmonize, it's bigger than us.
00:32:33.740 It will just, at least I think, we're already seeing signs of things that it's doing.
00:32:39.740 But it's going to survive, even if it means it kills us.
00:32:45.100 You don't think so?
00:32:46.040 There's a manner of speaking in which you have to be right.
00:32:48.960 But I mean, just even notice that, you know, there are 400 civilian nuclear reactors operating on planet Earth that require constant vigilance to keep them cool enough that they don't melt down and spill out all of their content.
00:33:02.400 So we've even set up nature.
00:33:04.400 So even if we were to do ourselves in, we would take a lot with us.
00:33:08.980 Yeah.
00:33:09.280 Right?
00:33:09.500 We've created an inadvertent doomsday machine for no good reason.
00:33:13.480 So the recognition that we have the most beautiful planet that we know of, and that all we have to do in order to continue this indefinitely and really to provide the opportunity of being human, which is the most glorious opportunity there is, to provide it to the maximum number of people, all we have to do is solve the sustainability puzzle.
00:33:33.340 Right?
00:33:33.560 All we have to do is figure out how to get along well enough and to not do things that undermine our opportunity.
00:33:39.000 We have to not liquidate the planet.
00:33:40.840 And that's really the puzzle we're trying to solve.
00:33:42.920 How do you do that when, I mean, this is a puzzle I've been trying to solve because I believe in, I believe in the free market.
00:33:54.320 I believe it is the fastest way to solve problems.
00:33:58.200 But I also believe that, you know, you can't just read Wealth of Nations.
00:34:05.680 You have to read Moral Sentiments.
00:34:08.360 If the people are good, the free market will create all kinds of things.
00:34:12.080 The people are not good.
00:34:13.840 It will create whatever it is they desire.
00:34:17.540 And there's something in, as I look at capitalism, you look at it and it goes great.
00:34:25.540 And then people make a lot of money, as de Tocqueville said, and they get a lot of power, and then they kick the door closed behind them, or they just get so greedy.
00:34:33.140 They just.
00:34:34.060 But the average person.
00:34:35.760 Is... I've found, I've talked to Palestinians. Without cameras, talk to Palestinians.
00:34:40.980 They sound exactly like the Jews.
00:34:42.960 I just want my family.
00:34:44.480 I just all of this crap.
00:34:46.600 That's the tired American.
00:34:48.360 How do you get that into the power structure?
00:34:52.000 Well, OK, I don't want to get us too far afield, but I do think that this is a place where left and right need to see each other.
00:35:01.760 They need to understand the part of the puzzle that the other gets and realize the part of the puzzle that they've got wrong.
00:35:07.200 So I would argue markets are the best mechanism we've got for figuring out how to do things.
00:35:17.420 They are absolutely brilliant at solving that kind of problem.
00:35:22.000 They are terrible at figuring out what problem to solve.
00:35:26.540 As you point out, they discover every defect in the human character and figure out how to exploit it.
00:35:31.240 And they turn us into monsters that we are not inherently.
00:35:33.500 Correct.
00:35:34.320 And so there are places that markets don't belong.
00:35:37.200 Places like sex and music, right?
00:35:40.700 Yes.
00:35:41.200 What you want to do is figure out how to take that amazing tool and point it at the problems that we want solved and to keep it away from the problems we don't want solved.
00:35:51.500 We don't want it finding our defects.
00:35:53.160 We want to tell it what our values are and then allow it to solve the problems of how best to reach and enhance those values.
00:36:02.320 So the problem is if you're a market fundamentalist and you think, well, this is the best tool we've got for problem solving.
00:36:08.760 Let's point it at everything.
00:36:09.840 Then you end up creating the hazard where it solves all sorts of problems you don't want solved.
00:36:13.680 And so really it's a tool.
00:36:15.760 And, you know, just like, you know, a power saw, right?
00:36:19.400 A power saw is a great tool, but you don't want it running around your shop right now.
00:36:23.280 You want it under control doing things that you need done.
00:36:26.520 And that's possible.
00:36:28.380 You know, on the left, though, what you hear is, you know, capitalism is the enemy, right?
00:36:32.400 They don't like markets at all, right?
00:36:34.340 And then on the right, you hear that markets are what we've got for solving problems, so let's get to it.
00:36:39.160 And really the point is let's figure out how to refine that tool so that it does what we need done.
00:36:45.000 There remains the problem, though, that Glenn invoked of the people who make it and kick the door closed behind them or try to.
00:36:52.820 People who, or the people, some of them with very good intention that are working to solve problems that you're like, no, don't, no, no, what are you doing?
00:37:03.980 No, no, no, that's a bad idea. But they feel it, and it's not an American thing.
00:37:08.240 It's a human thing.
00:37:09.740 There's a mountain.
00:37:10.640 I'm going to cross it.
00:37:11.840 And so they're breaking barriers that maybe we should hold off on.
00:37:18.340 Let's just all talk about that before we do it.
00:37:23.380 I do.
00:37:23.580 And I think those are two different problems.
00:37:25.900 Okay.
00:37:26.380 Right.
00:37:26.660 The first one is one of sort of a willingness to dehumanize the other.
00:37:32.040 Yes.
00:37:32.380 So that you can excuse your own bad behavior.
00:37:35.000 Correct.
00:37:35.320 And that, I think, is easier, more easily dealt with, although we haven't seen it dealt with, than the sort of tech utopianism of: if there's a problem that can be approached, then we will approach it no matter what, with no brakes.
00:37:50.960 Right.
00:37:51.200 Okay, so can you tell me, how do we begin to solve, if that is a problem, the first set of people? How can the average person begin to effect meaningful change there?
00:38:11.400 You know, we've begun to see, we know of some people and organizations who are explicitly working to bridge the divide, cross the divide.
00:38:22.120 And it's very much like what you're talking about doing in Afghanistan, but people working, for instance, within the U.S. to bring, you know, red voters and blue voters together into the same room and just have them meet and exchange names and eye contact and say, hey, I also have a kid and a dog and a mortgage.
00:38:41.400 And I'm human, and I'm human too.
00:38:42.940 Right.
00:38:43.160 And there is value for a very few in keeping us from recognizing each other's humanity.
00:38:53.400 So how, with social media, it's much harder to remind ourselves that we're all human.
00:39:01.500 And I think, you know, basically what these sorts of conversations and as much in human contact as possible, where you actually are engaging with people with kindness and generosity, but that doesn't necessarily get to the people who've already kicked the door closed behind them.
00:39:17.700 So, I think this, I want to introduce you to my Beatles thought that I had.
00:39:25.160 I was listening to, I don't remember which song it was, it might have been Revolution, and I'd been listening to some John Lennon.
00:39:34.560 And I thought, part of me said, thank God they don't have that tool in their hand.
00:39:43.600 Because the hippie movement in the 60s, I mean, it did a lot of good.
00:39:48.660 It also did some bad.
00:39:50.060 But the real hippie movement was based not in drugs or anything.
00:39:55.340 It was based in love, a new kind of consciousness.
00:39:59.220 And you did have, if you listen to the music of the 60s, much of it is very inspiring, you know.
00:40:06.840 And based in love, we don't have that now.
00:40:11.600 We don't have a John Lennon.
00:40:13.080 We don't have anybody preaching that.
00:40:15.480 The songs, the culture, you said, keep, you know, business out of music, which I'd like to explore with you.
00:40:22.160 But we don't have a soundtrack that's helping us move forward in a loving way.
00:40:31.440 Right.
00:40:31.980 And the 60s had that.
00:40:33.740 Well, all of these have been unnatural eras.
00:40:37.480 And the problem is you have to track, you know, there, in some ways, we have better music now.
00:40:45.860 Yeah.
00:40:46.080 Because there are more bands.
00:40:48.260 It's not all concentrated by the fact that you've got three stations in your area, and everybody's listening to the same, you know, the same soundtrack.
00:40:56.060 You have a lot of room.
00:40:57.640 But the point is, it means that we're not synchronized in the way that ancestral humans would have been.
00:41:02.700 And that people were artificially synchronized in the 60s.
00:41:05.360 I miss, and, you know, this is so stupid, but I miss the fact that it was, you know, must-see TV Thursday night, NBC.
00:41:14.760 And we all would come to work, and we would talk about Friends.
00:41:18.200 Right.
00:41:18.660 You know what I mean?
00:41:19.420 I don't really miss that era, but there was something that bound us together that we don't have, except for the Super Bowl now.
00:41:28.120 Even if the TV was terrible, the fact that everybody was at least thinking about the same conundrum had a value.
00:41:35.000 And so there's a question about how do you get the best of both worlds?
00:41:38.020 How do you get the increase in quality that comes from having many more bands contributing to our sonic landscape?
00:41:44.720 At the same time, you get enough synchrony that we understand each other.
00:41:49.340 And, you know, again, the problem of hypernovelty that we talk about in the book.
00:41:55.600 Explain that.
00:41:56.340 The rate of change that is outstripping our capacity to alter ourselves so that we are not being harmed, that we are not being made sick psychologically, physiologically, and socially by our environment.
00:42:08.320 That problem is actually, it's multilayered.
00:42:12.660 So not only, you know, people our age have the experience of living in a very different world than the one they grew up in, and we all remember the beginning of the email era, right?
00:42:24.880 We all made a mistake somewhere where we sent some email that we should have thought more carefully about and didn't realize that the bandwidth was going to be reduced and that somehow this was very different than a letter and it wasn't quite like a conversation.
00:42:36.760 Yes.
00:42:36.880 And the problem is, okay, we get that, but people who are young enough who have grown up online have this as their developmental environment.
00:42:46.660 And I guess part of one of the many messages in the book is that there are things that physical reality will teach you developmentally so that you understand them, whether you could say what they were or not, right?
00:42:59.000 If you walk into a bar and you behave the way people behave on Twitter, you'll get beaten up and you'll learn not to do it, right?
00:43:05.400 On Twitter, you may not learn that lesson.
00:43:07.320 In fact, you may get so many likes that you continue to do that because it feels like the right thing to do.
00:43:12.340 You may not understand, therefore, what human conflict is and why it should be reserved for very special circumstances.
00:43:20.080 So what we are doing is we are not only creating an environment that we can't keep up with.
00:43:24.560 We are creating an environment that is shaping our children in utterly arbitrary ways.
00:43:30.020 And when they become adults, they will then be living in a world that isn't even that one that we created that shaped them.
00:43:35.440 It will be some new world and their toolkit will be out of phase with where they're going to be.
00:43:39.260 And we can see this coming a mile away.
00:43:41.260 What's it going to do to them?
00:43:42.300 It's going to make them miserable and unhealthy.
00:43:45.580 And the thing that we have to realize is that it's the generating function.
00:43:49.060 We have to stop that process.
00:43:50.720 We have to slow down the rate of change enough that we can keep up.
00:43:54.160 I guess one thing I would add is that the social media environment in particular makes us likely to choose our arguments for what we hate rather than for what we agree with.
00:44:05.980 And it's not that there isn't plenty to disagree with by saying, actually, I stand for this.
00:44:11.740 And that other thing is something else that is at odds with this thing that I stand for, as opposed to, I hate that.
00:44:17.600 I don't know what I stand for, but I hate that.
00:44:19.240 And the standing primarily in opposition is de facto divisive and deadly.
00:44:29.520 I'm working on something now that I'm calling it the power of one.
00:44:39.020 And it is just you recognize that you have control, not of everything else, but you.
00:44:47.500 And just stand for it.
00:44:51.720 You know, I had a woman who saved Jews in World War II, and I had her meet with my family and tell her story.
00:45:00.300 She was 16 years old.
00:45:01.260 She's saving 100 Jews, hiding them under a bar.
00:45:03.700 I mean, it's an amazing story.
00:45:06.020 And she gave me some great advice.
00:45:08.360 And in a nutshell, it is, you don't have to be a hero.
00:45:13.880 You just have to remember what your parents taught you was right and wrong.
00:45:18.800 And just don't move from that.
00:45:20.640 It's right.
00:45:21.400 It's wrong.
00:45:22.320 You know, I'm, no, that person is still a person.
00:45:25.640 And I'm going to treat, I don't care what everybody else does.
00:45:27.880 That's what I'm going to do.
00:45:28.940 And it's hard now, however, because everything is... I say to my wife all the time,
00:45:38.220 I don't carry a phone.
00:45:39.640 And my wife does, and everybody else does.
00:45:43.000 And I'm like, you know, you just, we were fine without that.
00:45:46.540 We were fine without that.
00:45:48.160 And we can be fine again without that.
00:45:51.000 Our life may slow down.
00:45:52.240 But you're convinced that you can't live without it.
00:45:57.320 And then, because it brings a lot of great things to your life.
00:46:01.060 But then on top of it, while you're being convinced that, you're also being convinced that you should be the opposite of what your parents taught you was right and wrong.
00:46:12.780 That's absolutely right.
00:46:14.640 And I think there's part of what you're saying that's perfectly straightforward and clearly right, right?
00:46:20.180 What's right and wrong has not basically changed.
00:46:22.620 No.
00:46:23.620 But it has in society, but it hasn't in reality.
00:46:28.040 Right.
00:46:28.240 But it has in society, not because the fundamentals are different, but because the particulars are.
00:46:33.600 Explain that.
00:46:34.960 For example, our capacity to be rewarded for behavior that creates massive harm in lives that we never see, right?
00:46:44.940 Even just, even your investment portfolio, right?
00:46:48.600 You're damned if you do and damned if you don't, right?
00:46:51.080 You are in competition with other investors.
00:46:54.160 And what you need in order to stay afloat is to find the investments that pay back.
00:47:00.840 But those investments may be profitable, not because they are actually enhancing the world in some way, but because they successfully externalize harm onto someone who can't defend themselves.
00:47:11.400 And so that's obviously immoral.
00:47:13.400 But if it's, you know, if it's buried in your retirement fund, you don't know that you're harming somebody else, nor are you in a position to do anything about it.
00:47:20.500 So we have to begin to recognize that, although you can't operationalize this, but just as a thought experiment.
00:47:27.240 Hang on just a second.
00:47:27.800 Because what you sound like you're saying is ESG would be a good thing, but that is, then somebody else is making the judgments on what, do you know what ESGs are?
00:47:40.200 Environmental, social justice, and governance scores.
00:47:45.680 And all companies are, banks are now starting to make loans based on, well, are you involved in social justice?
00:47:52.440 What do you, and that's an, right now, at least for sure, it's an arbitrary number.
00:47:57.620 They decide what you are or what you're not.
00:48:00.920 That, institutionalizing that kind of stuff scares the hell out of me.
00:48:04.740 Right.
00:48:04.980 Well, this is exactly the kind of thing that conservatives correctly fear.
00:48:08.520 Because if you build that system, no matter how well-intentioned you are, it is guaranteed to be gamed.
00:48:13.100 And it's just the next landscape of warfare.
00:48:15.420 Yep.
00:48:15.560 Right?
00:48:15.900 So that's not, that's not where you go.
00:48:17.860 Well, and it includes a category error, right, like lumping in environmental with social justice causes would seem to not be aligning those of us who deeply care about the environment and see a tremendous amount of harm in modern social justice, for instance.
00:48:32.580 So I want to go back to something that we were talking about earlier, which is what happens when you gather conservatives and liberals together.
00:48:41.520 And one trick that I have learned is that the first thing you should do is figure out what it is that you actually agree on.
00:48:52.060 Because very frequently, the puzzles that we differ over, actually, there's half of them that we don't differ over.
00:48:57.820 And then, but we recognize the other person.
00:49:00.520 We sort of figure that everything that they're saying is wrong.
00:49:03.420 This isn't the case.
00:49:04.420 So, for example, if I go into a room of conservatives, I can be pretty sure that we're going to disagree over environmental sustainability and, in particular, climate change, right?
00:49:18.120 But if I ask the question, if you believed that human beings were causing substantial alteration to the climate that was going to degrade the capacity of the earth to sustain people in the future, right, would you be in favor of doing something about it?
00:49:35.620 Virtually every reasonable person will agree to that.
00:49:37.900 It's fascinating, it's fascinating, because you can't argue with global warming.
00:49:46.260 You can't argue with a thermometer, right?
00:49:48.000 Is a thermometer going up or down?
00:49:49.280 What's, what's happening?
00:49:51.080 You can't argue with the changes that are happening.
00:49:54.300 You might be able to argue, is man doing this or is this cyclical?
00:49:59.320 But we could have that argument and it's a healthy argument to have.
00:50:02.120 What's strange is, you can't have the argument on how to fix it.
00:50:09.360 That's the hardest one.
00:50:11.320 Right.
00:50:11.840 That's the one that we're presented with, destroy the planet or destroy humanity and everything that we have.
00:50:19.720 Just burn it all up at once.
00:50:21.680 That's what it seems like to conservatives.
00:50:23.660 You're like, no, I, I don't think we should do it that way.
00:50:26.900 Right.
00:50:27.100 But I'm sure there are things that we could agree on, but those things don't ever really seem to be talked about.
00:50:34.000 That everything's always pushed.
00:50:35.880 Well, that's why you have to get to the place where you've agreed that if we are damaging the planet in this way, that we all agree that we should stop.
00:50:44.380 Right.
00:50:44.620 Whether we know how to do it, whether that's even plausible, because that is a conversation that people can have and be decent to each other.
00:50:50.660 Right.
00:50:50.880 What do we do if that's where we are?
00:50:53.180 Right.
00:50:53.400 Can I just interrupt for a moment?
00:50:55.400 This reminds me very much of a framing in different domains that I've been making lately, which is, you know, two people come together and the first person says, my God, there's a problem.
00:51:05.500 And person B says, yeah, there sure is.
00:51:07.780 And person one says, therefore, the solution is X.
00:51:11.420 And person two says, well, that's a big problem, but I don't think your solution is going to do it.
00:51:15.460 And the first person, and, you know, this might be team blue and team red, or it might be reversed, but the first person says, if you don't agree with my solution, then you don't think this is a problem.
00:51:26.740 And there is just a basic logical failure here.
00:51:29.620 But it is the way that many of us are demonized and dismissed because we say that solution that you've come up with strikes us as wrong or incomplete, or at least how about there's other things on the table.
00:51:42.280 Or another scam to build an empire of money for something else.
00:51:46.620 Right.
00:51:46.840 You're like, well, wait, that's not.
00:51:48.820 But that rejection of solution says nothing about whether or not you thought there was a problem in the first place.
00:51:53.400 And, you know, I know conservatives, and this is not a blanket statement, but conservatives, especially those who live in rural areas, consider themselves great stewards of the land.
00:52:07.340 And generally they are.
00:52:08.780 My grandfather, I remember him saying all the time, he was a farmer.
00:52:12.660 These nuts are telling us how to manage Yellowstone and the forest in California.
00:52:18.900 And I remember him distinctly in the 70s saying, California will burn to the ground, burn to the ground if they do this.
00:52:26.680 He was a real environmentalist in my book, you know, but God, no, you can't be an environmentalist if you disagree with what they're doing to save the environment.
00:52:38.380 All right.
00:52:39.040 So the solution here, to the extent that there is one, is actually weirdly buried in the second to last chapter of our book.
00:52:47.700 Because what we describe is a pattern, an evolved human pattern that goes back millions of years in all likelihood.
00:52:55.500 Certainly it goes back hundreds of thousands in which human beings swap out their software program to engage new opportunities.
00:53:04.140 And the way they do that is by plugging their minds into each other and essentially engaging in what we would now call parallel processing, right?
00:53:14.760 And this wouldn't look like anything strange.
00:53:16.500 It would look like people standing around a campfire discussing what the problems are, what solutions might look like.
00:53:23.140 And with their different types of expertise, they would come up with a new solution, a prototype.
00:53:27.960 And then over time it would be refined and then it would be driven into the cultural layer and handed one generation to the next.
00:53:33.760 Now, the point, though, is this implies that our minds have two ways of running.
00:53:41.220 If we are in a situation that our ancestors' wisdom is applicable to, then we are wise to apply it, maybe refine it a little bit, but not to question it too much, right?
00:53:50.840 That is a naturally conservative impulse.
00:53:52.780 If we are in a situation that our ancestors' wisdom is inapplicable because they didn't know anything about this predicament, then we have to rise to this collective consciousness and figure out what to do about the puzzle.
00:54:05.520 And the point, we are in a battle about whether or not this is that moment.
00:54:11.960 Is this the moment to, you know, double down on the ancestors' wisdom or is this the moment to do something different, which is inherently dangerous?
00:54:19.460 It's that danger that liberals don't tend to see.
00:54:21.920 They want progress.
00:54:22.700 They want to fix problems and they don't recognize the hazard of unintended consequences.
00:54:27.600 Now, we are facing issues that are far more, in my opinion, far more dangerous than the 1950s.
00:54:34.040 That was nuclear holocaust.
00:54:36.020 This is, I mean, this is just, I mean, as Stephen Hawking said, the end of Homo sapiens by 2050.
00:54:45.360 You make the wrong, and he didn't mean that we'd be extinct.
00:54:48.040 We would be chipped.
00:54:50.200 We'd be incorporated into it.
00:54:52.520 So, I don't want that.
00:54:54.180 I don't think a lot of people want that.
00:54:56.460 You know, if we're not careful right now, and no one is talking about it.
00:55:03.180 No one's talking about the real, true peril that is on our doorstep.
00:55:10.020 Right.
00:55:10.260 The peril that is actually, should be uncontroversial, right?
00:55:14.300 Just the simple, the degree to which the power of our tools has gone up, the degree to which we are interconnected has gone up.
00:55:20.880 We are now one experiment, and your population can't make a mistake and go extinct, and everybody else will, you know, do the next thing.
00:55:29.460 We're all in it together at this point.
00:55:31.060 But the key to getting people out of their autopilot cultural mode and into their conscious, how do we solve this new problem mode, is the recognition that the old program, when you run it, creates errors.
00:55:45.080 It throws errors like a computer program that runs into some sort of an input it doesn't expect.
00:55:49.760 And what are those errors that the old programming, the conservatives need to see?
00:55:57.200 The fact that, you know, for example, that we have a situation in which the most productive and dynamic nation that has ever existed, which has become a model for the West, right?
00:56:10.800 That people have rightly adopted what the founders got right, and they've employed it across the world to good effect.
00:56:16.940 That we would endanger that by failing to recognize the humanity of other participants in that system in some sort of team sport mentality that jeopardizes the whole project is obviously insane, right?
00:56:30.220 Just the simple fact that we seem to be coming apart at the seams, despite the fact that whatever our problems are, the system is still functioning.
00:56:36.640 So let me... amazingly so, amazingly so. I thought we would be gone.
00:56:44.420 Ten years ago, I thought, we'll never... this thing is resilient, it's crazy.
00:56:50.340 But the ancient wisdom would come from George Washington, who said, don't do the two-party thing, don't, don't, because you'll go tribal, and then one person will realize they can milk it and become more popular, and then the other one will up their game.
00:57:05.840 I mean, that's, that, there's your founder's wisdom.
00:57:08.840 Well, we did it, so now how do you undo it?
00:57:11.820 First thing to do, I would say, is you need to recognize, we need to separate effectively the founders' values from their mechanism, right?
00:57:20.880 Their mechanism was great, but it's run its course.
00:57:23.580 It's, it cannot solve these problems, right?
00:57:26.260 First-past-the-post is a lethal hazard now, and it's part of what's causing this, you know, team-against-team mentality inside of our system that threatens us.
00:57:35.440 But the way you get to the conscious mind engaging the puzzle rather than the autopilot mind is you recognize the errors.
00:57:43.240 And so, if you get these puzzles really well refined, the thing that you see very often sounds like a riddle, right?
00:57:52.140 Or it sounds like a paradox.
00:57:53.440 That's the indication that there's something here that needs to be thought about carefully rather than just doing what we all learned must be done.
00:57:59.860 So, I like to say the following thing.
00:58:03.320 I'm a liberal.
00:58:04.880 In fact, I'm a reluctant radical.
00:58:07.360 I don't want to be a radical because radical change is dangerous, but I don't think we have a choice.
00:58:11.180 We have to engage radical change, as frightening as that is.
00:58:14.200 That's one riddle.
00:58:15.400 But the other side of it is I want to live in a world that is so good that I get to be a conservative.
00:58:22.380 I'm not a liberal because change is my objective, but we do need to get somewhere that the point is actually from here, change would be a mistake, right?
00:58:31.360 Change isn't worth it.
00:58:32.520 What we're upending is so successful at doing what we say we want to accomplish.
00:58:36.780 So, don't you think, though, that I could be wrong.
00:58:42.640 I think the average person would agree with you on that, both sides, that this isn't working.
00:58:50.220 But, you know, when you say, because you scare me when you say, but the founders' systems aren't working.
00:58:56.180 We're not using the founder's systems.
00:58:58.420 We haven't been for over 100 years.
00:59:01.800 We've upended all of the – because, as you're saying, you know, we have to have these conversations, us standing around a campfire.
00:59:11.820 What's happening to us is everything is global, and so only the elites get to make those decisions around their campfire at Davos.
00:59:22.760 Yep.
00:59:23.100 And we're left out.
00:59:24.900 The real change comes, real meaningful change comes.
00:59:29.000 For instance, I'm not against universal health care.
00:59:33.600 When you can show me it works and can sustain itself, otherwise, I hate this system, too, but it's the best one we got, you know?
00:59:42.980 So, let's keep doing it.
00:59:44.420 We should be 50 states instead of one country, 50 little experiments.
00:59:50.440 Try it.
00:59:51.400 Do this.
00:59:52.280 See what works.
00:59:53.040 Take the best of that, and make it work.
00:59:54.760 But you have to break them down to the family, to the community, to the state, before it goes global.
01:00:02.700 Everybody now is trying to solve the problem for the world.
01:00:06.080 And it immediately goes there.
01:00:08.540 The problem –
01:00:09.640 Immediately.
01:00:09.920 Immediately, yeah.
01:00:10.640 The problem is you've got a tension between two truths here, and, you know, the tension is a very real one.
01:00:17.560 On the one hand, the idea of the laboratory of the states is a very good one, right?
01:00:21.220 Prototype something and figure out how to get it to work before you globalize it, because that's the only sane thing to do.
01:00:26.420 On the other hand, you have game theory puzzles that will cause – you know, let's say, for example, you had a rational system of taxation, and then let's say Oklahoma decides, well, you know what?
01:00:39.280 We would like to attract some businesses, and so we're going to create a very hospitable business climate by lowering taxes in Oklahoma.
01:00:46.200 Businesses start flooding to Oklahoma.
01:00:47.700 Every other state notices, hey, our businesses are leaving, and they're going to Oklahoma.
01:00:51.680 They all lower their taxes, and now your rational across-the-board tax structure has triggered a race to the bottom.
01:00:58.020 Break it down – I understand.
01:00:59.160 Break it down even smaller, though.
01:01:00.800 I'm using the 50 states as the biggest it should be.
01:01:05.540 I'm saying all of these things – the single – and Eisenhower talked about this in his farewell address,
01:01:13.340 and it's brilliant – if we would have listened to just that one address, we would have fixed a lot of this stuff.
01:01:21.020 The man, the single scientist working in a laboratory that is doing it because he has an idea and says,
01:01:28.540 I think this might work, and might work his whole life on it.
01:01:32.120 That's pretty much gone.
01:01:33.400 It's now white coats in a laboratory working for somebody that's getting grants from somebody else.
01:01:38.360 So, we're losing the individual spark.
01:01:43.560 That's the key to Western society, is the individual spark.
01:01:48.640 Well, it's the environment that frees the individual to have the spark and to act on it.
01:01:54.140 Yes.
01:01:54.400 Right?
01:01:54.800 So, you know, it's both things.
01:01:57.180 Yeah, I was going to say just that.
01:02:00.220 This feels like a different level of the problem, which you may be right.
01:02:03.940 It may then fall into the right solution that's more global.
01:02:07.680 But, you know, we've talked a lot about the problem with modern science, which is exactly as you describe.
01:02:13.760 That at this point, individuals have a very hard time saying, I want to be a scientist.
01:02:20.380 I'm going to train to be a scientist.
01:02:22.340 I'm going to go do science.
01:02:23.740 You immediately get pulled into someone else's lab with someone else's grants, with someone else's questions that they've already asked.
01:02:31.800 And questions that you don't ask.
01:02:34.300 And lots of questions that you are precluded from asking because what kinds of questions are getting funded by NSF and NIH and DOD right now is a question of fashion.
01:02:44.220 And who's deciding what's fashionable?
01:02:46.180 It is, in fact, you know, the elites, be they in science or politics.
01:02:50.000 And it's very, very difficult.
01:02:52.160 This is one of the reasons, actually, that we were at some apparently, you know, podunk little college in the Pacific Northwest for so long was that it actually provided the opportunity to ask whatever questions we wanted.
01:03:04.000 It was one of the very rare institutions of higher ed, and it's not anymore, but it was one of the very rare institutions of higher ed in the modern world where you could actually investigate what you were interested in investigating.
01:03:16.120 That said, you couldn't do so if you were trying to do high-tech science.
01:03:20.560 Anything high-tech still requires the big grants, and the universities want that because they get a big chunk of the overhead, and that is how they are running.
01:03:29.840 So it's the business model that is the flaw.
01:03:32.700 So isn't this the exact same situation that our pilgrims, the founders, whatever, came here for?
01:03:42.440 That model is broken.
01:03:45.060 It just doesn't work because it's now so corrupt that you can't work, except there's no place to go.
01:03:53.700 I mean, that's one of the things I like about Elon Musk.
01:03:56.080 Let's get off planet before this thing explodes.
01:03:58.400 You know, there's no place to go.
01:04:01.940 We have to reboot without losing the important framework.
01:04:12.360 Right.
01:04:12.520 Well, we are, ironically enough, doing exactly the opposite of what we should be doing.
01:04:17.900 We should be recognizing that we are in the same boat and that noticing each other's humanity is the first step,
01:04:24.780 and then figuring out what we agree on is the second step, and from there we can begin to engage the stuff that we disagree over and figure out where to go.
01:04:33.580 And I would point out the solution to your puzzle, or at least the prototype for it, with respect to, you know, the laboratory of the states,
01:04:42.720 this is actually a concept that comes out of Catholicism of all places called subsidiarity, which means everything should be governed at the lowest effective level.
01:04:52.580 But it has to be the lowest effective level.
01:04:54.540 And we have some global problems, right?
01:04:56.960 Those problems have to be governed at a global level, right?
01:05:00.220 Not having the oceans governed by anybody is a recipe for disaster.
01:05:04.800 Right.
01:05:04.980 But having the oceans governed by a group of people that are just on the take and are all afraid of China or all afraid of us or whoever is also not a solution.
01:05:15.520 Right.
01:05:15.800 That's a Band-Aid that looks like it's going to heal, but it's festering underneath.
01:05:20.320 Well, you have two problems tangled together there.
01:05:23.280 So you've got the how are we to govern ourselves issue, and then what are we to do about the corruption, right?
01:05:30.040 Because we can say, look, certain things have to be governed at a global level, and other things you don't want your, you know, your local parks governed by the, you know, the global structure, right?
01:05:40.860 So we have to have that thing.
01:05:43.520 But even if you had a system that governed everything at the lowest effective level, if it's corrupt, then it may be, you know, it may be worse than the problems it's built ostensibly to solve.
01:05:54.780 So we have to solve the corruption issue as we address the let's make sure everything important is governed.
01:06:00.480 And this is one of these places where the founders missed it because they hadn't seen sufficiently dangerous technology to intuit what a global problem of this sort would look like.
01:06:11.720 There was no such thing as a manager or middle manager when they were around, you know what I mean?
01:06:17.700 They didn't.
01:06:18.580 The one thing I think they really, truly missed was they didn't see that corporations could ever be more powerful than a government.
01:06:27.200 Yeah, they missed the questions of scale, right?
01:06:29.340 And they just could not imagine it.
01:06:32.020 Right, and that's the first miss that I've seen.
01:06:36.960 And they had all kinds of little safety mechanisms all over.
01:06:40.320 That one, nothing.
01:06:42.620 And too many conservatives are like, well, it's a corporation they got.
01:06:46.200 No, no, no, no, no.
01:06:47.480 This is a different ballgame.
01:06:49.400 This is a different ballgame.
01:06:51.540 Right.
01:06:51.820 We are de facto governed by these entities, and we have no protection.
01:06:57.420 No protection.
01:06:58.000 And they are merging with the governments of the world.
01:07:01.880 And, I mean, I have always mocked, oh, you know, these dystopian movies where like, oh, yeah, well, he works for the corporation.
01:07:10.560 And it's like, oh, stop it.
01:07:13.020 Don't laugh.
01:07:13.880 That is exactly it. The left was right about that.
01:07:18.240 The right was wrong.
01:07:19.480 And now it seems like very few people even want to talk about that.
01:07:23.620 It's like, wait.
01:07:24.860 Right.
01:07:25.440 Major error.
01:07:26.440 Suddenly talking about the problems of the corporations is not something that's on the table anymore.
01:07:31.320 Right.
01:07:31.980 Right.
01:07:32.460 Yeah, which goes to a sort of overarching issue, which is, you know, the founders, of course, didn't know anything about evolution because it hadn't been described yet.
01:07:42.220 What they effectively did was they created an evolving system.
01:07:45.900 And so what we are living in is the consequences of the fact that they built a system that has all of the characteristics necessary to create adaptive evolution.
01:07:54.360 And creatures have evolved in it, or it's as if creatures have evolved in it.
01:07:58.640 And many of them are predatory and they're extremely difficult to rein in.
01:08:03.520 Because they, I mean, one of the things, again, they had an evolving system, but it was evolving in structure.
01:08:11.600 So if you want to evolve because it becomes outdated, amend it, okay, which required the council to come together.
01:08:21.060 And so you weren't getting these little teeny changes that all of a sudden, a hundred years later, amount to something that doesn't recognize anything constitutional.
01:08:32.580 You know what I mean?
01:08:33.000 No, we are effectively on a craft hurtling through a dangerous landscape and there's no one at the helm.
01:08:40.760 No one at the helm.
01:08:41.960 And nobody even, they'll give lip service to the Bill of Rights, but they don't actually stand for the Bill of Rights.
01:08:48.800 Nobody does stand for the Bill of Rights.
01:08:50.140 You're like, wait, can we just get, can you give me eight of those?
01:08:54.660 I'll get on board with you.
01:08:55.860 Can you give me eight of those rights that you're willing to stand for right now?
01:09:00.000 No, I mean, there's almost literally nobody at the helm in the U.S. at the moment, right?
01:09:05.940 We have an apparently senile commander in chief who was effectively the choice of the DNC, which is not an elected body.
01:09:15.440 And I don't know who's really actually running it.
01:09:18.580 Right.
01:09:19.040 I mean, in fact, I think if you were in on those conversations, it would be even more frightening because I don't think anything's really running it.
01:09:26.680 Right. It's, you know, it's a corrupt entity and it is doing what corrupt entities do, which is serving very narrow, short term interests.
01:09:35.340 And, you know, the consequences for us as Americans and for the planet are absolutely arbitrary.
01:09:43.940 This is very different than what humans have ever experienced before.
01:09:49.400 But it's very similar, isn't it?
01:09:54.420 In some ways.
01:09:55.620 Yeah. I mean, we, we make a call for more campfires in the book.
01:10:01.020 And as things are scaling up, there are, as you alluded to earlier, the equivalent of campfires that are happening outside of the reach of everyone else.
01:10:10.460 Yes.
01:10:10.720 Whereas used to be, you know, we have always had hierarchy, all of our groups, all of our tribal groups have always had hierarchy.
01:10:18.600 Often male hierarchies are separate from female hierarchies.
01:10:21.780 And then to some degree, some, you know, familial level hierarchies.
01:10:24.420 But there is, there was an ability, these like fission, fusion groups were able to move in between each other and say, oh, I'm here now and you're here.
01:10:34.320 And, you know, sometimes they would war, but more often they would meet and maybe sometimes exchange people, you know, in marriage, whatever.
01:10:41.360 However, there wasn't the possibility of one of those groups simply owning everyone else.
01:10:49.660 And in part, it was because people really did come together in a way that was equalizing.
01:10:55.900 Campfire. Breaking bread. I see the whites of your eyes.
01:10:59.940 And the power amplifiers.
01:11:01.560 It's the power amplifiers of modernity that are making this very much unlike anything that came before.
01:11:05.860 But it also, and I don't know which came first, the technology or the loss of this belief, but it is, we didn't just come together to come together.
01:11:15.860 We came together because in the end, I could see the whites of your eyes, but I truly believed you believed at the core, fundamentally, the same kind of things that I believed.
01:11:31.860 All men were created equal, you know, you have a right to free speech, all this stuff.
01:11:35.860 When we lost that, I don't know how to, I don't know how to get together with somebody who says, no, the Bill of Rights, no, that's occasionally you can speak.
01:11:46.540 You can speak if you have that.
01:11:47.780 How do you, because it feels like the media, the amplification, feels like half the country believes, half the country doesn't, or a third, a third, and then crazy, crazy crazies on both sides.
01:12:01.560 What is the truth?
01:12:02.760 Well, I mean, there is a long history in humanity of othering those that you, that are a threat to your own resources, right?
01:12:11.900 In so many languages, those who are not you, you, the person who is being, who is naming themselves are the chosen people, and everyone else is someone else.
01:12:20.620 So, it's not like that's new, right?
01:12:23.860 But upon being able to come together, crossing those borders between tribes, generally, there is a recognition of humanity.
01:12:31.560 And I think, A, the scale problem, right, that we just, we have power amplifiers, we have too many people, and we have too little opportunity to actually engage.
01:12:41.260 And this is where the screens are actually doing an incredible amount of damage, that everyone talks about screens, but no, we don't have the measurements for all of the sensory stuff that is being interchanged when we're actually in person with one another.
01:12:56.620 But there is value in that, that no one has named, no one has even attempted to measure, or a few people have.
01:13:03.600 And as we spend less and less time actually, like, in vivo with one another, it's going to be easier and easier to go tribal in this way that seems permanent.
01:13:13.060 I would argue in the U.S., we have to take ownership of the piece of the equation that our side got wrong.
01:13:24.320 And each of us.
01:13:25.800 Well, sure.
01:13:27.060 I mean, I went on what they called the Glenn Beck Apology Tour. You can't be human, really, truly human, if half the country says they hate you, and not go, gosh, do they have a point?
01:13:42.120 I mean, am I that person at all?
01:13:44.600 And I did apologize for things that I thought, you know, I shouldn't have said that, or I shouldn't have said it this way, or whatever.
01:13:50.900 We have to start here, and then our side, and then the world, you know what I mean?
01:14:00.140 Right.
01:14:00.640 Well, it may not even, I mean, I agree, the hallmark of an honest broker is that you go back and clean up what you've got wrong.
01:14:07.200 And that is fundamental, and there aren't enough people doing that.
01:14:10.760 But in this case, we have a tension.
01:14:17.080 People detect something is wrong.
01:14:19.100 Frankly, everybody knows the system is rigged, and for most people, the system is rigged against them.
01:14:23.720 Yeah.
01:14:23.920 They detect it.
01:14:24.520 And so the question is, is that because the fundamentals are wrong?
01:14:27.780 Is that because the society that we have described is incorrect?
01:14:33.420 Or is it because it is not working?
01:14:35.620 And what we are seeing on the left, the collapse of reason that we are seeing on the left, is really lots of people who know that things are wrong, who know that they have been taken advantage of and been mistreated, misunderstanding what the source of that is.
01:14:55.300 And not realizing that the right thing to pursue is a repair of the unfairness of the system.
01:15:01.760 It is not the dismantling of that system, because the system is the best thing that we have in terms of producing fairness.
01:15:09.840 It just, we never got there.
01:15:11.640 And this is where the right, I think, has made an error, which is that the right imagined the system worked better than it did.
01:15:18.560 I think that's changed a lot.
01:15:20.520 I think it's changed a lot.
01:15:22.340 I know 20 years ago, I was, golly, gee, I'm gray and red, white, and blue.
01:15:28.600 And now you're like, no, this, you know, but that's the problem is after you're done, if you have time, I'll take you over to our vault across the walkway here.
01:15:38.740 I collect everything that I can to preserve American history, but I collect a lot of the dark stuff about American history.
01:15:47.300 I can outdo any liberal professor on the dark side of American history.
01:15:52.160 It's important to remember that, you know, but I think there's a lot of people like I was 20 years ago that are just like, no, that's, well, we did make, what, I mean.
01:16:04.680 No, we did some horrific things, horrific things that need to be addressed and set right and just talked about and like, don't ever do that again.
01:16:14.200 That'll leave a mark.
01:16:15.060 But now, now, I think because the GOP 10 years ago so betrayed the Tea Party people, and I don't mean all the Tea Party people, I mean the people who actually believed in the Constitution, it betrayed them.
01:16:34.060 They went, ooh, that's one of the first questions I asked you, your own side.
01:16:38.440 Have you gone, oh my gosh, that's not who I thought we were.
01:16:41.800 That's what we went through, many of us.
01:16:43.700 It depends what you mean by your own side.
01:16:45.320 If you mean the Democratic Party, absolutely.
01:16:47.480 The thing is, you know, it's metastatic, right?
01:16:51.960 It is not.
01:16:53.120 It went beyond the GOP.
01:16:55.500 It was the whole system that we thought we knew what it was to trust.
01:17:02.100 And it's like, no, that's dirty, too.
01:17:04.280 It's all dirty.
01:17:05.220 I mean, Occupy on the left is to Tea Party on the right.
01:17:08.780 And I think very similar things happened for those of us who were hopeful that that could have been an answer and to see it decay both for external and internal reasons.
01:17:18.780 That it, you know, it turned out not to have any possibility of success.
01:17:23.520 That's what's sad about the Me Too movement.
01:17:26.420 That's what's sad about BLM.
01:17:28.340 Because I've talked to people who marched in Black Lives Matter here in Houston where there was a shooting.
01:17:34.840 And we all were huddling behind a car.
01:17:38.800 And it didn't matter what color you were.
01:17:40.540 And we all were like, okay, this is bad.
01:17:42.620 And we started talking.
01:17:43.960 And a lot of people were involved in that because they had real problems and real issues.
01:17:49.380 But that's not what the system was made for.
01:17:54.200 You know, that's not Black Lives Matter, Inc.
01:17:57.600 Right.
01:17:58.060 That's right.
01:17:58.540 Total difference.
01:17:59.700 Right.
01:18:00.180 And this is exactly where we landed is Black Lives Matter as a slogan, as a belief, of course.
01:18:05.680 Right.
01:18:05.980 Right.
01:18:06.300 Yeah.
01:18:06.640 Absolutely.
01:18:07.220 It's too bad that it even needs to be said.
01:18:08.920 Yeah.
01:18:09.100 On the other hand, once you look at the fine print of what the organization was interested in.
01:18:13.380 And it's like, wait a second.
01:18:14.580 And who even was running?
01:18:15.460 I mean, the global thing is like, whoa, these are a bunch of white people.
01:18:20.320 What are you doing?
01:18:21.280 What is that?
01:18:22.120 Right.
01:18:22.360 Well, and, you know, from defunding the police to attacks on the family, the point was it had nothing to do with Black Lives Mattering.
01:18:31.360 It was a very bizarre ideological agenda that was a lethal hazard to what does work about our system.
01:18:39.480 Well, and the same arguments can be made without exactly the same corporate structure behind it for Me Too, right?
01:18:45.380 Like, Me Too had the potential to be an awakening, to actually reveal to the vast majority of good men who are out there that the vast majority of women have unfortunate to really intolerable and horrible experiences as young women.
01:18:59.700 And most men don't know that because most men aren't those sorts of men, right?
01:19:03.300 Like, that's what it should have been.
01:19:04.960 Instead, it went off the rails because it was maybe because it was designed to.
01:19:09.900 I don't know.
01:19:10.420 But it went off the rails.
01:19:11.880 But we've, you know, I said at the end of the Barack Obama administration, he made me a better man.
01:19:17.740 I didn't like him at all.
01:19:19.220 I didn't like his policies at all.
01:19:20.780 But he made me a better man because he pushed me against the wall all the time.
01:19:24.320 And I had to go, wait, is that right?
01:19:26.480 Or am I wrong?
01:19:27.320 Or what's, what's, I mean, we are learning a lot.
01:19:31.540 There was a lot of good that came out of Me Too and Black Lives Matter.
01:19:34.920 Even if it wasn't directly related to their goals, a lot of people did step back.
01:19:41.200 And they didn't necessarily join in.
01:19:43.300 They were just like, you know, they have a point on that and that and that.
01:19:46.480 And that's good.
01:19:47.440 Yeah.
01:19:47.800 Some good things.
01:19:48.720 I mean, to get back to an earlier point, you know, I think liberals had so many successes in the 20th century, right?
01:19:56.580 You know, women's emancipation and civil rights and gay rights and worker protections, these became values that almost everyone holds in common.
01:20:09.300 And I think because of that momentum, it now seems to many on the left that change must always be necessary.
01:20:17.060 And as you said earlier, Brett, you know, change isn't inherently good.
01:20:20.580 Change isn't inherently bad.
01:20:22.160 Change will sometimes be necessary and change isn't always necessary.
01:20:26.520 And if we're actually moving in the right direction, upending the system that's moving in the right direction is clearly going to be a betrayal of the values that you are claiming to have.
01:20:36.620 Are you guys optimistic?
01:20:40.080 Occasionally?
01:20:40.640 Well, if I can return to the idea of an adaptive valley, things are very dark.
01:20:48.640 And that doesn't necessarily mean one shouldn't be optimistic, because in some sense, one would have to be in order for us to accomplish what we need to accomplish now.
01:20:59.140 Dark is before the dawn.
01:21:00.480 Right.
01:21:00.700 And so the problem, if you think about, you know, the adaptive landscape I was describing, the problem is if you have to move on from the opportunity that you've been exploiting and to find a new one, it is very easy to head in a direction where that opportunity does not exist.
01:21:18.400 It's very easy to pass into that valley and to not arrive somewhere.
01:21:22.260 So what you really need is very careful thinking about where that next opportunity is.
01:21:28.400 And whatever the explanation for it, if I, you know, if I could get one thought into the mind of Americans generally, it would be that something has us divided for reasons we may never know.
01:21:43.880 There may not even be reasons.
01:21:45.040 It may just be some process, but the key to us getting out of this is the recognition that most of us agree on the values to be pursued.
01:21:55.300 We agree on what a good society would look like.
01:21:57.440 We may disagree on how close we are and what might be done to get us the rest of the way, but we agree.
01:22:01.700 We don't want a system that is rigged in favor of one race and against another, for example.
01:22:06.360 We don't want a system that bars people from doing whatever job they want to do because of the sex they were born into.
01:22:11.460 We want a fair system in which opportunity is broadly distributed.
01:22:15.900 Now, once you recognize that virtually everybody you meet can agree to that much, and then you realize that we've all been led to believe that there's another team and that those people don't agree, they don't agree with you on anything, right?
01:22:29.460 That they're bad people who want bad things.
01:22:31.840 And the point is, well, all right, wouldn't the right thing to do be to recognize we know what the objective of the project is, we all understand something has gone awry, and the correct thing to do is to talk to each other about how we might get ourselves out of this and get back on track where we can fight about the details of how to get there, not where we're going.
01:22:53.140 Historically, you guys are on exactly the right path, one that wasn't necessarily taken the last time we had a horrible, horrible world war.
01:23:04.100 In my research over the years on the Holocaust, the biggest thing that happened was nobody knew Jews.
01:23:16.040 The Jews that were saved were generally saved by people who said, yeah, the Jews are like that, but not this one.
01:23:22.740 I know this one.
01:23:23.680 And when you understand that, you realize we better start talking to each other.
01:23:30.020 That's right.
01:23:30.520 Because, oh, I'll save a conservative or a liberal because I know this one, but the rest of them are like that.
01:23:37.360 No, that's not true.
01:23:39.100 That's not true.
01:23:40.040 I call it the comic bookification process.
01:23:43.060 Yes.
01:23:43.460 We believe that we have the capacity to be superheroes, and we believe that there are supervillains on the other side of the screen or out there in the world.
01:23:51.620 And it is the very, very, very rare human being who is either.
01:23:58.080 Just look within yourself and see, no matter how remarkable you are, what your weaknesses are.
01:24:06.160 Or if you're in the opposite camp and you feel that you're not doing well, find the strengths that you have and recognize that that mixture in different amounts, in different relative amounts, is going to be present in every single other human being.
01:24:19.380 You know, it stops us from wheeling gallows in front of people's houses.
01:24:25.760 That's right.
01:24:26.540 Right.
01:24:26.940 Which, of course, is returning into fashionability.
01:24:30.680 I'll tell you something funny.
01:24:33.360 We live in Portland, which is almost a cartoon of liberals.
01:24:37.680 You know, Texas exists.
01:24:39.200 Right.
01:24:39.720 And there are liberals here.
01:24:42.000 We have met them.
01:24:43.240 Yes.
01:24:44.020 But even in Portland, where you would imagine that there's just simply no reason to be accessed, a funny thing happens, for me at least, and I know for Heather, when we are open about our doubts about the conventional wisdom of the left.
01:25:00.680 Which is that people who would espouse, you know, all of the usual slogans, as soon as they hear that you're actually, you have your own doubts and that you're willing to voice them, it is amazing what people will volunteer.
01:25:15.000 So, why do you think they work so hard to silence people who have just a little bit of courage?
01:25:22.740 Yeah.
01:25:22.880 They need to silence that courage.
01:25:24.800 They need to crush it.
01:25:27.160 They need to.
01:25:27.580 Courage is contagious.
01:25:29.300 Courage is contagious.
01:25:30.680 Yes, you're absolutely right.
01:25:31.820 Yes.
01:25:32.220 And this is where this impulse to authoritarianism comes from, right?
01:25:36.020 Because they need to control the conversation in order that the doubts don't emerge and restore us to a conversation that might actually stand a chance of putting us back on track.
01:25:47.640 Yeah.
01:25:48.120 And it is remarkable how many conversations both Brett and I have had with, you know, UPS drivers, cashiers, waitresses, whatever, or just overheard while paddleboarding or, you know, sitting in a park and then sometimes talking with the people and sometimes just eavesdropping.
01:26:05.540 But, you know, it's Portland, it's mostly liberals and the number of people who are saying this thing is coming, it's authoritarian, it's dangerous, and we need to stand up is high.
01:26:19.760 It's still quiet, but it's a lot of people.
01:26:22.680 My guess is it's coming from the left faster than it's coming from the right, but it will come from the right, too.
01:26:31.280 We're passing all of the exits where you're getting to such, I mean, this is the way revolutions happen, communist revolutions happen.
01:26:41.040 You sow the seed of discontent, you overload the system, you get it so nothing's working, there's no safety anywhere, and the people will cry out, help us, and they will, and they'll crush that.
01:26:55.380 And it'll be either side that does it if we don't change our way soon.
01:27:01.280 Yeah, it's imperative, and, you know, we are prevented from doing it by being told that, you know, you will be guilty by virtue of your associations if you talk to people on the other side.
01:27:15.680 And those of us who have talked to people on the other side, you know, have paid a price for it.
01:27:20.720 On the other hand, it's quite clear that that is the road forward.
01:27:24.260 It is.
01:27:24.860 You know, anybody who can't manage it is not going to be of help.
01:27:28.660 I started this podcast three years ago, and you were the number one target to put on this show because I wanted this show to be a model where people can disagree, and we go away friends.
01:27:43.180 We can see each other going, you're a normal person, I'm a normal person.
01:27:46.400 We don't hate each other, and we don't hate the country, and we don't hate freedom.
01:27:50.140 We just disagree, right?
01:27:52.820 That's right.
01:27:53.240 Let's not conflate the ideas with the person.
01:27:55.440 I'm glad that you're both here.
01:27:56.640 I hope you come back.
01:27:57.680 Thank you.
01:27:58.300 Thank you so much.
01:27:59.080 Thanks so much.
01:28:04.840 Just a reminder, I'd love you to rate and subscribe to the podcast and pass this on to a friend so it can be discovered by other people.
01:28:12.020 We'll see you next time.