The Joe Rogan Experience - December 19, 2017


Joe Rogan Experience #1055 - Bret Weinstein


Episode Stats

Length: 2 hours and 57 minutes
Words per Minute: 158.2
Word Count: 28,048
Sentence Count: 1,927
Misogynist Sentences: 35


Summary

A standalone conversation with Bret Weinstein, the former Evergreen State College professor at the center of the 2017 campus protests. Weinstein discusses his settlement with the college, and the oddity that the college also paid a settlement to the woman who led the protests against him even though she had no clear legal case. He recounts the fraternity scandal he exposed as a freshman at Penn, and he and Rogan go on to discuss equality of opportunity versus equality of outcome, affirmative action, capitalism and Marxism, and his recent conversations with black conservatives. Weinstein and his wife Heather, also a former Evergreen professor, published a fuller account of the Evergreen story in the Washington Examiner.


Transcript

00:00:09.000 Boom, and we're live.
00:00:10.000 How are you?
00:00:11.000 What's going on, man?
00:00:12.000 I'm doing all right.
00:00:12.000 A lot of stuff is going on.
00:00:14.000 A lot of stuff.
00:00:14.000 A lot of stuff.
00:00:15.000 Since your settlement, Jamie and I were hoping that you would come in with one of those Paul Wall grills with diamonds on it and maybe some furs, pull up in a Cadillac.
00:00:24.000 Well, unfortunately, it's enough money to make a difference, but it's not enough money to...
00:00:30.000 To no longer have to think about such things.
00:00:32.000 Yeah, you could temporarily ball, though.
00:00:34.000 If you were irresponsible and you didn't have a family...
00:00:36.000 If I didn't have a family...
00:00:38.000 You'd get crazy for a couple months.
00:00:39.000 Yeah.
00:00:40.000 As it happens, it gives us room to think about how to replace our incomes.
00:00:45.000 We have about two years at our current rate of burning to keep the family...
00:00:51.000 Yeah, for people...
00:00:52.000 So this is a standalone podcast, so people can kind of...
00:00:56.000 What you need to do, if you're really interested in this, if you're really interested, is Google Bret Weinstein, and you will get the full story from beginning to end with Evergreen State College, where...
00:01:08.000 I'll just give you the short version of it.
00:01:10.000 There was a bunch of what you would kindly call socially active people.
00:01:38.000 They decided to literally take over the college for a short period of time.
00:01:43.000 The dust settled.
00:01:44.000 You just got a big fat settlement.
00:01:46.000 You're out of there.
00:01:48.000 And then the woman who was kind of at the head of it, she got a nice little chunk of change too, which I thought was quite odd.
00:01:54.000 Yeah, the college settled with her even though it was quite clear she had no legal case.
00:01:58.000 So there's a bit of a mystery about why they would have paid her to resign when in fact they could have just stood their ground.
00:02:07.000 She could have probably become the president of the college if she wanted to.
00:02:10.000 The way that guy had responded to the students when they told him to put his hands down because he was being threatening, and then he put his hands down.
00:02:18.000 He's chalking them and using hand gestures, and they're like, put your hands down!
00:02:21.000 You're threatening us with microaggressions!
00:02:24.000 He puts his hands down, and they all started laughing.
00:02:26.000 Which just tells you everything about what their intent was, what was really going on there.
00:02:32.000 But after I saw that, I'm like, God, she could have been the president.
00:02:35.000 They should have just had her be president.
00:02:36.000 The whole thing, it was a kangaroo court.
00:02:39.000 It was mockery.
00:02:40.000 The whole thing was absurd.
00:02:42.000 I will say, it's very hard to get the story right in any sort of short synopsis.
00:02:46.000 And so for anybody who really wants, it's not even complete.
00:02:50.000 But Heather and I, my wife Heather was also a professor at Evergreen.
00:02:53.000 And we wrote a more complete version of the story that allows people to see how the internal politics of the college played out into what they ultimately saw on YouTube.
00:03:04.000 So that's in the Washington Examiner last Tuesday.
00:03:09.000 I read that.
00:03:09.000 Yeah, it's deeply disturbing.
00:03:12.000 It's... but it's not, if you've been paying attention. You and I and Jordan Peterson and, boy, a bunch of people have tried to figure out what's going on today. Like, why has this movement become so aggressive, and not just aggressive but absurd? It's not logical. The way they're approaching things is from this very strange, entitled, and just oddly fanatical way.
00:03:53.000 But there is also a strategic movement under the surface which we can't listen into directly, and it is more sophisticated than we think.
00:04:08.000 In other words, it is managing to wield power in spite of the fact that its explanation for why it is entitled to wield power doesn't make any sense.
00:04:17.000 So what do you think the underlying motivation is?
00:04:20.000 Or at least their underlying plan?
00:04:22.000 I would say we have to be careful.
00:04:25.000 There are a lot of people who go along with it who I would argue are tools of the movement and they are doing its bidding without understanding the objective.
00:04:33.000 But the prime movers are quite clearly interested in taking power.
00:04:39.000 They want power.
00:04:40.000 They have a superficial rationalization for why they are entitled to power, and they are wielding weaponized stigma as a mechanism for gaining it.
00:04:49.000 So you said that Naima could have become the president of the college.
00:04:54.000 I don't know whether she could have.
00:04:57.000 It would have been a very unusual path.
00:04:59.000 But I do know that for more than a year, I watched people...
00:05:26.000 Something that a person who is cynical enough to wield stigma to get it might covet.
00:05:35.000 And so anyway, I think there's a small number of people who really do know what they're doing.
00:05:39.000 And their point is we can wipe many of the obstacles to our having power off the map by throwing accusations at them that they cannot resist.
00:05:51.000 Yeah, the big accusation is racism.
00:05:54.000 That's always the big one.
00:05:56.000 That's one that you never want to have thrown your way.
00:05:59.000 If you are deemed a racist, it's akin to being deemed a rapist.
00:06:04.000 Even if it's a false charge, like, boy, most people are going to hear the first statement before they ever look into the possibility of you being exonerated.
00:06:14.000 You being called a racist is a very, very dangerous thing in today's society.
00:06:19.000 It's a very dangerous thing.
00:06:21.000 I do think it's important that we not do their bidding by inflating that danger beyond what it actually is.
00:06:30.000 We will never know for sure what the trajectory would have been absent what happened at Evergreen.
00:06:35.000 But I do think standing up to the mob at Evergreen and just saying, frankly, no, I'm not a racist.
00:06:42.000 Well, here's what's really important for people who don't know you.
00:06:45.000 You're a very progressive guy.
00:06:46.000 You're very left-wing, very left-leaning.
00:06:50.000 You're not in any way, shape, or form a conservative.
00:06:53.000 And so this is the left eating itself.
00:06:58.000 It is the left eating itself, but I was also in the lucky position of being able to imagine what was going to happen when that accusation broke.
00:07:10.000 And knowing that there were so many people who knew that that couldn't possibly be right about me, that I was going to survive it.
00:07:17.000 And it's not as if there aren't huge numbers of people who still to this day apparently believe I'm a racist in spite of the fact that nothing has emerged.
00:07:25.000 And all of the time, with all of the incentive for somebody to bring something forward that would suggest that I have an issue with race somewhere in my history, it never emerged.
00:07:34.000 And so people do need to understand that it is possible to survive that accusation.
00:07:42.000 Well, especially with your background, though.
00:07:44.000 I mean, people don't know your background.
00:07:47.000 Was it at Yale that you had?
00:07:49.000 Penn.
00:07:49.000 Penn.
00:07:50.000 Yeah.
00:07:50.000 Tell that story if you could.
00:07:51.000 Sure, I'll do the quick and dirty version.
00:07:54.000 Cliff Notes.
00:07:55.000 I was a freshman at Penn.
00:07:58.000 A friend of mine was rushing a fraternity, and that fraternity was holding what they called a mystery event.
00:08:07.000 And I wasn't rushing a fraternity.
00:08:09.000 I wasn't interested.
00:08:10.000 But he convinced me that I didn't have anything to do that night and that I should go with him.
00:08:13.000 And I did.
00:08:15.000 And after some relatively standard fraternity shenanigans, the event turned into one in which the fraternity had hired prostitutes, black prostitutes from the local environment.
00:08:30.000 And the situation... Penn is in a pretty rough neighborhood, and this was a pretty wealthy Jewish frat.
00:08:37.000 And so there was, you know, something...
00:09:00.000 A friend and other members of the fraternity rolled into the dorm that night and told me what happened.
00:09:06.000 I was absolutely appalled because what had happened was the fraternity had enacted a mock rape of these prostitutes using cucumbers and ketchup.
00:09:22.000 The idea that there was anything acceptable about an organization that had special privileges on the campus behaving this way, I couldn't get past it.
00:09:33.000 And so anyway, I went to the paper and various things unfolded.
00:09:37.000 The paper botched the story and made it look like I was troubled by the fact that there were strippers rather than troubled by what actually had ended up happening.
00:09:46.000 And I ended up writing an editorial for the paper, and all hell broke loose.
00:09:53.000 I got death threats.
00:09:55.000 The police started tracking phone numbers on my phone to see who was threatening me.
00:10:01.000 Ultimately, there was a trial of the fraternity.
00:10:03.000 The college did not want to put the fraternity on trial, but ultimately, public pressure forced them to do it.
00:10:08.000 I testified at the trial, although since I hadn't seen the thing directly, what I could say was limited.
00:10:14.000 But while I was in the witness room with the other witnesses, people who had rushed the fraternity but had not pledged it, the fraternity brothers, including I believe the president, came into the witness room and started bullying these witnesses and coaching them on what they should say in front of this university panel.
00:10:34.000 Anyway, ultimately, the university threw the frat off campus for a year and forbade them from pledging a class.
00:10:45.000 And anyway, there's a lot more to the story.
00:10:47.000 I got an award from the National Organization for Women, actually, for...
00:10:52.000 I forgot their terminology, but basically for standing up for women who...
00:10:59.000 Needed to be defended or something like that.
00:11:00.000 You said you made the biggest mistake, one of the biggest mistakes of your life by leaving.
00:11:05.000 Meaning that if you were there you could have probably stopped it or you could at least have known exactly what was going on?
00:11:10.000 There's no way I could have stopped it.
00:11:12.000 Right.
00:11:13.000 I left because I didn't want to be party to this event.
00:11:19.000 I think journalists know this.
00:11:21.000 I was a freshman in college, so I didn't know it yet.
00:11:21.000 But journalists understand that sometimes something horrifying happens, and that the job that you are best positioned to do is documenting it so that the world can understand how such things occur and do something about it.
00:11:38.000 And if I had understood that that was probably my highest and best use at that moment, I would have stuck around and paid attention to what had happened rather than having to go through convincing the world that something had happened which I had not directly seen.
00:11:53.000 Yeah, but how could you have known?
00:11:55.000 There's no, I don't think you should be hard on yourself at all.
00:11:57.000 You're 18, right?
00:11:59.000 Exactly.
00:11:59.000 How could you have known?
00:12:00.000 I didn't.
00:12:01.000 How could you have known, even if you were 30, how could you have known what was going to happen?
00:12:05.000 It looked like prostitutes, and you're like, eh, don't want to be here.
00:12:08.000 See ya.
00:12:09.000 Exactly.
00:12:09.000 You didn't know there was going to be a mock rape.
00:12:11.000 Right.
00:12:12.000 I couldn't have known, but in any case, in retrospect, it was a mistake.
00:12:16.000 It's so crazy that they let groups of kids live in a house together and get hammered when they don't even have their frontal cortex developed yet.
00:12:24.000 I mean, they're just all living together, feeding off of each other.
00:12:27.000 You have mob mentality, all this diffusion of responsibility because you have a large group of people that's also like you and everybody's...
00:12:34.000 It's so bananas that...
00:12:36.000 As few incidents happen as they do.
00:12:39.000 I mean, you would think that those things would just be chaos the moment you open up the door to every frat house.
00:12:45.000 Yeah, and, you know, there's more chaos than we know because a lot of what takes place we don't ever find out about.
00:12:49.000 But it's a shame because one could take the thing that drives people into those organizations and one could use it to power something that was useful and interesting and, you know, really was deeply enriching.
00:13:03.000 And I know, you know, how much flak am I going to take for giving the fraternity system a hard time on your podcast?
00:13:08.000 But plenty of people will tell me how enriching their fraternity experience was.
00:13:14.000 And, you know, even at Penn, there were a couple fraternities that... well, I was, as you can imagine, hated throughout the fraternity system
00:13:23.000 at Penn after I had come forward.
00:13:25.000 But there were two fraternities that actually didn't hate me and were welcoming even in that climate.
00:13:30.000 So I don't want to portray them as a monolith.
00:13:32.000 They're not.
00:13:33.000 But it does seem like a wasted opportunity that that kind of energy that goes into fraternity life could be directed to something really amazing.
00:13:42.000 And it's a shame that it doesn't happen more often.
00:13:44.000 Well, I'm sure there's a lot of great camaraderie and it's probably a lot of fun to go through that experience together with people that are your same age and you're actually living in a house together.
00:13:52.000 But just, man, you should probably have some fucking adults in that room.
00:13:55.000 Something like that.
00:13:56.000 I mean, it just seems like, you know, just to limit your liability.
00:14:02.000 Yep.
00:14:03.000 Right.
00:14:03.000 Well, at the very least.
00:14:04.000 How about you 35 year olds running around going, hey, what are you doing over there, Mike?
00:14:09.000 You sure about that?
00:14:10.000 That's going to light the whole place on fire.
00:14:12.000 Don't do that.
00:14:12.000 You won't be able to put that out.
00:14:14.000 Yep.
00:14:14.000 Don't light that one on fire.
00:14:16.000 Yeah.
00:14:17.000 Just, I don't know, just crazy.
00:14:19.000 So that's your situation.
00:14:20.000 So for anybody who would think of you as a conservative or a racist, clearly the evidence points to the contrary.
00:14:29.000 Well, you know, this issue of conservatism is one that I would like to also get right.
00:14:35.000 And I'm not sure I ever say this in a way that people know what I'm talking about.
00:14:37.000 But I'm very progressive.
00:14:40.000 But I'm very progressive because I live in a world that's really screwed up.
00:14:44.000 And so the idea that we have to make some progress seems just transparently correct to me.
00:14:57.000 Mm-hmm.
00:15:04.000 ...and was unnecessary.
00:15:05.000 And so that would turn me into a conservative because it would be the right thing to be.
00:15:11.000 And a proper analysis would tell you this is the time to conserve the structure rather than change it.
00:15:16.000 But in this world, yeah, it turns out I'm a progressive. And people keep telling me that they're sure I'm now a closet conservative, that what I faced must have turned me against the left.
00:15:30.000 And that's not at all what happened.
00:15:33.000 Yeah, I agree with you.
00:15:34.000 I've faced that myself, that people say, oh, you know, you're going to turn more conservative with all this.
00:15:39.000 No, I'm just more resentful of this fake progressive movement that, like you said, has ulterior motives.
00:15:47.000 There's more to it.
00:15:49.000 And it's not accurate.
00:15:51.000 Like, the portrayals of humans in these movements are not accurate.
00:15:58.000 I don't think it's healthy.
00:15:59.000 I don't think it's normal.
00:16:04.000 Look, the game of capitalism is a very confusing one, and there's certainly some very evil aspects to it, right?
00:16:10.000 But the idea that the answer is Marxism seems to me to be just as poorly thought out.
00:16:20.000 Oh, it's at least as poorly thought out.
00:16:24.000 You know, Marxism, the flaw is more obvious.
00:16:27.000 I think the flaw is what we in biology would call group selection.
00:16:33.000 The belief that if we just all row in the same direction, we'll get somewhere marvelous.
00:16:37.000 And that's true that if we did all row in the same direction, we would.
00:16:40.000 But there's a very good game theoretic reason that that can't be.
00:16:44.000 That as soon as you have everybody rowing in the same direction, then the win goes to the person who figures out how not to row and gets the benefit of everybody else's rowing in that direction while they sweep in the profits.
00:16:55.000 And so that tears apart anything structured the way communism is structured.
00:17:00.000 Yeah, and that's conveniently ignored.
00:17:02.000 I think I really have always believed that competition is good, and it's because I've been involved in competition my whole life, and I think it helps you understand yourself.
00:17:11.000 You're competing against other people, but ultimately you're really competing against yourself.
00:17:16.000 Because you're trying to better yourself. And I believe that that's the argument for getting children involved in athletics or games or something that's very difficult to do, whether it's chess or pool. I think things that are hard to do are good for you. Competing is good for you because it teaches you about focus and discipline, and understanding that you can reap the rewards of hard work. And, you know, obviously this can get distorted.
00:17:43.000 And you can get these, you know, billionaire oligarchs who, you know, control vast amounts of wealth and then they have their family and everyone inherits it and you have these fucking mutants that are all inbred and they're all in the same bloodline.
00:17:56.000 I mean, that's history, right?
00:17:57.000 I mean, that has taken place.
00:17:59.000 But I think that...
00:18:01.000 We should work very hard for equality of opportunity.
00:18:06.000 I think equality of opportunity, give everybody a chance to play a game, everybody a chance to get into something and try to better themselves with some endeavor.
00:18:16.000 But whenever I hear equality of outcome, that's when I put my foot down.
00:18:21.000 I'm like, that's not real.
00:18:23.000 You can't say that because some people work harder.
00:18:26.000 And if you have true equality, you're never going to have equality of outcome.
00:18:31.000 Because true equality is, I have friends that are brilliant, that are fantastic human beings, but they're essentially beach bums.
00:18:39.000 You know, they just like kick back and relax and do the minimal amount of work, get things done, and just enjoy life.
00:18:46.000 Have a couple cocktails, go to the beach, have laughs with friends.
00:18:51.000 That's what they like to do.
00:18:52.000 And then I have other friends that want to be, you know, world martial arts champions.
00:18:57.000 You have two different kinds of lives, two different types of human beings, a style of human.
00:19:04.000 One person is going to be extremely satisfied with one life and extremely dissatisfied with the other life.
00:19:10.000 And you can interchange them back and forth.
00:19:12.000 Well, you said a bunch of things we could spend three hours unpacking what you just said.
00:19:18.000 Let's say a number of things.
00:19:19.000 One: equality of opportunity. I have yet to find a reasonable person who does not agree on this point, in principle.
00:19:28.000 Lots of people will tell you it's not worth the effort of trying to pursue it because of the danger of what happens if you do, but nobody disagrees, nobody reasonable disagrees that it would be desirable to have that.
00:19:40.000 Equality of outcome, it's impossible.
00:19:43.000 If you pursue it, you end up with a dystopia.
00:19:46.000 And even if it were possible, it would not be desirable for the reasons you point to about the benefits of what you're calling competition.
00:19:54.000 And I would want to tear competition into a couple different values.
00:20:16.000 What you are trying to accomplish is real.
00:20:20.000 It is the world telling you how successful you are at something directly rather than through some sort of social channel, some sort of reward handed to you or some compliment given to you by somebody.
00:20:32.000 And there's a tremendous danger in a socially mediated world in which those who are successful are successful because some social thing has told them that they are correct.
00:20:43.000 Because you can be dead wrong, seem correct, and move ahead in a social world, whereas if you're doing carpentry, or if you're in some sort of competition, the nature of the beast is one that will tell you when you've got it wrong, and therefore it will allow you to actually improve your insight in whatever form you have it.
00:21:02.000 So I'm a tremendous fan of the idea that even if your world is largely socially mediated, you have to make sure that some part of it isn't.
00:21:11.000 And you are confronting something real enough to tell you when you're confused so that you can learn how not to be confused.
00:21:18.000 But there are people in this world that do want to push towards an equality of outcome.
00:21:24.000 Yes.
00:21:24.000 And they make it sound as if this is not just logical but ethical and possible in the future and that you are on the wrong side of history if you think that capitalism and competition and all these things you just talked about are good.
00:21:40.000 And that really the best thing is to force people to become some sort of utopian creature that works together in unison and everybody is egalitarian and there's no need for feminism and men's rights activists because everybody looks at everyone as an equal.
00:21:57.000 Well, there are two kinds of people who will advocate for equality of outcome.
00:22:01.000 One kind of person who will is confused.
00:22:03.000 They don't understand what happens if you go down this road.
00:22:06.000 And the other one is cynical.
00:22:07.000 And they're using this as an excuse to justify something that just so happens to reward them.
00:22:14.000 But equality of opportunity isn't this way.
00:22:16.000 It solves all of those problems.
00:22:18.000 Nobody believes that you're going to have it ever realized in a perfect form.
00:22:23.000 There's always going to be bad luck that's going to reduce somebody's opportunity.
00:22:28.000 What you don't want is any systematic bias in luck.
00:22:32.000 In other words, we're all going to suffer some bad luck, and we'll all have some good luck, and some of those things will actually shape the trajectory of our lives.
00:22:39.000 What you don't want is some population that just so happens to suffer more than its share of bad luck, which is what we have now.
00:22:46.000 So there is something to pursue here, but we're so busy on this other pointless conversation about equality of outcome that we can't get back to the thing that we all agree on that's actually the right goal.
00:22:59.000 Right.
00:22:59.000 And there's also some background chatter that you'll get from the less thought-out, where people are really just upset that other people have more.
00:23:10.000 And so this is an uneducated, not very well thought out perspective on equality of outcome.
00:23:17.000 They just are upset that someone else has something and they're trying to somehow or another diminish the effect of their hard work and get something for themselves.
00:23:28.000 Well, you can imagine, though, so maybe we take a little digression here.
00:23:33.000 My experience over the last six months has done a bunch of things in my life.
00:23:38.000 One of them is it has put me in touch with quite a number of black conservatives, which I must tell you, it has changed my understanding of the world substantially, because I knew there were black conservatives, and I always thought, Are they confused?
00:23:53.000 Are they not understanding which side they should be on?
00:23:56.000 And that is not what's going on.
00:23:58.000 What is going on is that there is a dialogue, which I couldn't hear at least, which looks at the world.
00:24:05.000 You can view the unfairness in the world two different ways.
00:24:09.000 You can look at it and you can say, well, there's structural unfairness in the world and the cards that we in this community are dealt are not fair.
00:24:19.000 We're not getting our share of the good cards.
00:24:22.000 Or you can look at the world from the point of view of personal responsibility and you can say, well, it's kind of an academic question whether the cards you got were fair.
00:24:30.000 You should play them as best you can given what they are.
00:24:37.000 One has almost no ability to address the question of how the cards are dealt.
00:24:43.000 It's just not in the range of somebody.
00:24:46.000 An individual who discovers that the cards are unfairly dealt can't do very much about that fact.
00:24:50.000 But that individual can do a hell of a lot about their own position in the world by recognizing that actually, especially if the cards are unfairly dealt, you need to play them very well.
00:25:03.000 And that developing the skill to play them well has the ability...
00:25:26.000 Dealing on the personal responsibility side pays back better, which I don't think frees us, as a civilization, from addressing the question of how the cards are dealt.
00:25:34.000 I think we should be focused on it.
00:25:35.000 But anyway, it more or less solved the mystery in a way that I thought was quite fascinating.
00:25:42.000 And, you know, I'm heartened that they were willing to invite me into that conversation so I could hear it and finally figure out what was going on.
00:25:51.000 So they advocate towards discipline and personal responsibility as being core tenets that you should reinforce.
00:25:56.000 Right.
00:25:56.000 And they are very sensitive to the issue of what happens when you focus on the unfairness of how the cards are dealt, which is that what progressives typically miss is that it really does create a culture of dependency.
00:26:11.000 If you focus on the fact that the cards are unfairly dealt and that that's why you're facing a disadvantage, which is largely true,
00:26:17.000 it nonetheless demotivates you from pursuing success, because you recognize that you're starting at a disadvantage and that you're unlikely to win the game.
00:26:28.000 On the other hand, the game isn't what we think.
00:26:30.000 If you can make progress and deliver your kids a...
00:26:35.000 head start relative to where you were.
00:26:37.000 That's a win in the game.
00:26:39.000 So anyway, again, I don't want to trivialize any part of this.
00:26:42.000 I think the unfairness of the way the cards are dealt is really important and we have obligations to address it.
00:26:47.000 But from the point of view of individuals within a community trying to plot a course, being focused on the personal responsibility side makes a ton of sense.
00:26:56.000 What are your thoughts on affirmative action?
00:26:58.000 I've changed my tune.
00:26:59.000 I used to be for affirmative action because it is justified.
00:27:04.000 But now the downsides of it loom very large for me.
00:27:11.000 And so what I would say is we want to separate whether or not it is justified to engage in some kind of intentional intervention to fix a problem that has become chronic.
00:27:24.000 And then we separate that from what it is that we are advocating.
00:27:27.000 And I don't think the substitute for it is something that anybody has properly spelled out.
00:27:32.000 But let's just say I would be much more inclined to see a substantial investment in community that makes sense, rather than having it applied at the individual level.
00:27:46.000 Because I will say, before any of what happened to me at Evergreen happened, I did have the experience of having quite a number of black students in particular who suffered
00:28:02.000 a stigma that had nothing to do with them.
00:28:05.000 Students who did very well on their own merits, who lived in a world that was quick to judge them as having succeeded based on some advantage that, for all I know, they didn't even have.
00:28:16.000 I don't know that any of this...
00:28:19.000 In the role of professor, you don't necessarily know how your student ends up in front of you.
00:28:23.000 But I had no reason.
00:28:25.000 I had some very bright students who I think suffered a stigma that came from the fact that people in general imagined that affirmative action was playing some role in their world that it wasn't.
00:28:35.000 I mean, affirmative action isn't even legal in Washington.
00:28:37.000 So the fact that the stigma appends to people is preposterous in that context in particular.
00:28:44.000 That's fascinating.
00:28:45.000 I had a friend who was a fireman who told similar stories, and he was talking about the resentment of the other people on the fire force if a guy got in, even if he was qualified, if he was black, because they assumed that he wasn't as qualified and that the reason he got through was affirmative action.
00:29:03.000 He was like it's so crazy because what it is is they're trying to combat racism by fueling racism.
00:29:09.000 Inadvertently.
00:29:10.000 Yes, it creates this whole cascade of effects.
00:29:14.000 And so the question really, the civilization-wide question, is what do we do about...
00:29:20.000 I mean, we could retune the analysis for each of the populations in question.
00:29:25.000 In the case of black people in the New World, that is the Americas, most black folks in the New World are here via the route of slavery.
00:29:35.000 And the thing is, slavery was so brutal in destroying the cultures that these people had access to where they came from, and then putting them together in a synthetic culture that was built to serve the masters,
00:29:55.000 right?
00:29:56.000 In other words, denying black folks the ability to learn to read obviously limits access to the huge library of insights that happened to be housed with the population that transported these folks from Africa.
00:30:13.000 So in any case, my point would be that is hobbling.
00:30:16.000 That was intentionally hobbling during slavery.
00:30:20.000 The legacy of that hobbling is one that's hard to quantify.
00:30:25.000 We don't know what role that plays, but I can say, any population that... I'm going to speak now as I do as a biologist: I think of us as robots that have a computer on our shoulders that runs software, and our culture is the software.
00:30:43.000 If you take that robot with the computer and you delete the software package and then you install some other software package designed to make it do one particular job, that has tremendous harm built into it and it's reversible.
00:30:58.000 But it is not...
00:31:00.000 I think we know that we didn't succeed in fixing this problem.
00:31:04.000 I think emancipation did not properly deal with how much harm had been done by bringing people from different parts of Africa and pooling them in one population based on effectively skin color alone.
00:31:19.000 And then at the point that they were freed, there was no...
00:31:26.000 There was no understanding because our understanding of biology and culture wasn't sophisticated yet.
00:31:32.000 It still isn't.
00:31:33.000 And so I think we do at some point have to do an honest accounting of how much damage happened in that process.
00:31:43.000 And we also have to realize that that damage, you know, I mean, I know right now having been active in trying to make the world a better place, I know that I'm running afoul of an argument called white man's burden, right?
00:31:55.000 And so we all know that this is a narrative.
00:31:57.000 But the point is that white man's burden argument...
00:32:04.000 I think the real story of what happened in the Americas is not a nice story.
00:32:11.000 And the implications are with us to this day.
00:32:15.000 Nobody knows how deep they go because we haven't studied the question properly.
00:32:19.000 And in fact, many of the people who are...
00:32:23.000 On the left, pushing the sort of naive narrative about equality, are, I think, fearful of what will happen if we study the question. I don't share their fear. I think you're right that they're fearful. There's a lot of, like, what you were talking about before, with that woman bringing up ridiculous things in meetings and people just sort of showing their jugular: please don't attack. You get a lot of that. I think of us, in terms of,
00:32:50.000 you know, the United States or just this mass of humans, I think of us as a super organism.
00:32:56.000 And I think if you had an organism that had a broken knee, you would go, well, I've got to fix that knee.
00:33:01.000 You know, I can't just give that knee less work.
00:33:05.000 Is it possible to fix the knee?
00:33:07.000 Yeah, well, let's fix the knee.
00:33:08.000 You don't want to just give the knee less work.
00:33:10.000 You don't want to make it easier for the knee to get by.
00:33:13.000 What you want to do is, like, strengthen it.
00:33:15.000 So my thought, and I've said this as a very simplistic way of looking at it, but if you really wanted to make America great again, right, you really wanted to make America great, what you would want to do is have less losers.
00:33:27.000 So you'd want to go and find these places where people are in these economically deprived areas where there's a ton of crime and violence and they don't have like a real good sense of like a potential positive outcome from where they're at and transform that.
00:33:51.000 With a fraction of the money that we spend trying to rebuild nations and invading Afghanistan, we could invest in many of our gigantic problems that we have in inner cities and completely rebuild them.
00:34:04.000 It could be done.
00:34:05.000 And it could have radical implications on the entire country as a whole.
00:34:11.000 If you have, instead of, like, a place like Baltimore, for instance, right?
00:34:15.000 I had Michael Wood on, who was a former police officer in Baltimore, and he sort of explained all the different issues that happened in Baltimore, particularly that there were areas where they literally weren't selling homes to black people.
00:34:28.000 They would not sell homes.
00:34:29.000 Like this is like a white-only area.
00:34:32.000 Like they had systematic racism built into the system for a long time.
00:34:37.000 If someone just invested money into, not someone, the United States government, if we systematically invested money into these places and rebuilt them with community centers, places where people could go where they were safe, staff them with a ton of people that were motivated,
00:34:55.000 counselors, people that wanted to help, give them activities, give them skills and trades, and show them ways out.
00:35:04.000 Design it.
00:35:05.000 That's not nearly as impossible as trying to rebuild Afghanistan or nation building, but we're doing that.
00:35:12.000 We're doing that all over the place.
00:35:13.000 I mean, Halliburton got no-bid contracts for billions of dollars to do shit that we don't even know what the fuck they were doing over there, right?
00:35:20.000 If we could have a fraction of that money and invest it into inner cities, you could literally change entire generations of human beings that are coming out of there.
00:35:32.000 A couple things.
00:35:33.000 One, by the way, I love the analogy of the busted knee because, in fact, we used to make this mistake medically, right?
00:35:41.000 Until recently, we didn't really understand that part of the healing process was not protecting the knee but putting it through physical therapy that properly exposed it to stresses so that it rebuilt and came back strong.
00:35:54.000 And so we are making that error and we have made that error.
00:35:57.000 In terms of what to do with the stratification of society in a way that locks up opportunity in some communities and not others.
00:36:08.000 I think we should be honest with ourselves about why that happens.
00:36:11.000 So, I agree.
00:36:14.000 You could make what would be massive investments in communities for a fraction of what we spend tinkering abroad in ways that have just spent huge amounts of treasure on projects that didn't work.
00:36:27.000 Right?
00:36:28.000 So we could do that.
00:36:29.000 The reason that that doesn't happen, I don't think, has anything to do with it not being obvious that it would be a good thing to do.
00:36:37.000 I think it has to do with the same group selection issue that we were talking about with respect to communism, which is to say, if you are at the top of the system, do you want to educate somebody else's kids to compete with yours?
00:36:54.000 And so there's a reason that we...
00:36:56.000 Why do our public schools suck?
00:36:58.000 Is it because we don't know how to make a school?
00:37:00.000 I don't think so.
00:37:04.000 People know how to make a school when it's for their kids, and they're not so interested in making a school for other people's kids.
00:37:11.000 And so this is a deep, chronic problem with our sociopolitical system.
00:37:18.000 We have to confront that, and actually I think we have to come to agreement that actually it is in the long term wise to educate other people's kids, even if in the short term it's economically frightening.
00:37:33.000 Yeah, the big conspiracy about children and schools and keeping them stupid and making sure that the school system is frustrating.
00:37:44.000 I've always felt like that conspiracy was really just, there's no motivation to make it better.
00:37:50.000 And when you look at the amount of money that teachers get paid, it's just disturbing.
00:37:56.000 Think of the job of, I don't have to tell you, you're a goddamn teacher.
00:38:01.000 But you got paid.
00:38:04.000 Don't go about it that way, folks.
00:38:05.000 But what we're talking about here is the most important thing that can happen to your child in the developmental phase, right?
00:38:15.000 The education, like giving them a view of the world, explaining to them all these things that they had not known before.
00:38:24.000 First experiences with so many different subjects and topics and concepts come from your teachers.
00:38:29.000 And I got really lucky, man.
00:38:32.000 I mean, I went to a public school in high school in Newton, Massachusetts.
00:38:36.000 I went to Newton South High School, and it was a really good school.
00:38:38.000 And I still had shitty teachers.
00:38:40.000 I still had, you know, even in that really good school, comparatively really good.
00:38:44.000 Because I went to a school before that in Jamaica Plain, which was like an inner-city school, and it was scary.
00:38:50.000 Real dangerous.
00:38:51.000 Just, you know, not like the most dangerous in Boston, but very sketchy.
00:38:58.000 17-year-old kids in seventh grade that had never graduated, like violence, like a lot of weirdness.
00:39:05.000 Like I had my head down, got through that year, and then all of a sudden I was in this, we called it fast times at Hebrew High, because it was like a predominantly Jewish neighborhood.
00:39:15.000 But it was, even then, there were still some terrible teachers there.
00:39:19.000 You know, it's just...
00:39:20.000 The job is so important, and we in this country have done some weird thing where we've taken one of the most important jobs that you could ever hire someone to do, educate children,
00:39:36.000 and made it almost like it's inconsequential.
00:39:39.000 Yeah, you know, I'm hesitant.
00:39:41.000 I'm not always hesitant about conspiracy.
00:39:44.000 There are conspiracies and we don't deal well with it.
00:39:46.000 But one doesn't need conspiracy to explain this.
00:39:49.000 This effectively can evolve without anybody consciously thinking, I want to sabotage somebody else's kid's school.
00:39:57.000 But I will say my experience in school was... it was horrifying.
00:40:05.000 The school did not work for me.
00:40:07.000 And maybe, you know, every five teachers I hit one who invested and cared and took the time to scratch their head about why I wasn't succeeding.
00:40:17.000 That must have had an impact on you, though, as an educator.
00:40:20.000 You remember those people, those one out of five.
00:40:23.000 Well, this is the funny thing.
00:40:25.000 While I was teaching, I taught for 14 years at Evergreen, and I felt like maybe I wasn't the only, but I was very nearly the only person on faculty anywhere that I could think of who had not been a good student.
00:40:42.000 LAUGHTER Well, but this was a very interesting window because I, you know, I became a professor because I loved science.
00:40:50.000 And so the academy is where science happened, not because I wanted to be a teacher.
00:40:55.000 My experience in school made me want to get away from the thing as fast as possible.
00:40:59.000 But having had the experience of school completely failing and, to my own surprise, learning how to think without school... that's not where I learned it.
00:41:10.000 I learned it primarily from my grandfather and my brother and other people in my environment who weren't associated with school.
00:41:16.000 But once I got to being a professor, and I think this was only possible at Evergreen, Evergreen made no rules about what you did in the classroom.
00:41:28.000 Literally no rule about what subject you taught.
00:41:31.000 They could hire you as a biologist and you could teach dance if people showed up to take it.
00:41:35.000 So you could teach whatever you wanted.
00:41:37.000 And the key was you could teach in whatever way you wanted.
00:41:42.000 That's so crazy.
00:41:43.000 Seems like a great idea if you're super motivated.
00:41:46.000 It worked two ways.
00:41:47.000 People abused it, and they would use it to reduce their workload to next to nothing, and they wouldn't invest in their students.
00:41:53.000 And then other people looked at this, and it was glorious to have that kind of freedom.
00:41:59.000 And so I taught in a way that would have worked for me if I had been a student there, which changed a lot of things.
00:42:06.000 And it actually worked for a lot of people.
00:42:09.000 Bad students don't typically become professors, so there's almost nobody on the faculty anywhere who has a clue why bad students are the way they are, right?
00:42:17.000 They just don't intuit it because it wasn't their experience.
00:42:20.000 But if you were like me and you were a bad student, and then you ended up with a class... you know, bad students aren't uncommon.
00:42:29.000 And so suddenly somebody is speaking to them and saying, I know that being a bad student isn't synonymous with not having potential.
00:42:36.000 Right?
00:42:37.000 That's really empowering for them.
00:42:39.000 So anyway, it was an interesting experience.
00:42:41.000 And my wife actually was a tremendous student.
00:42:46.000 She loved school.
00:42:47.000 And we would actually often teach the same students either together or they would take my program and then they would bounce over to her program next.
00:42:55.000 And so they would get these kind of two different views.
00:42:59.000 And each of us, we were both...
00:43:04.000 Enlightened by our relationship with each other. Because, you know, to the extent that I might have been dismissive of the great students, right here I had one who, you know, was my closest person on earth. I got a window into how she saw the world, and she got a window into what the kids who weren't performing well in school might have been thinking. So anyway, that was a very, very useful background to have for teaching.
00:43:29.000 There's a tremendous amount of power in teaching people.
00:43:32.000 It's a weird relationship, and especially when you're teaching someone and you're giving them credit towards their degree.
00:43:41.000 I was a very poor student in high school, and I wasted a ton of time just going to college so people didn't think I was a loser.
00:43:50.000 But I taught at Boston University.
00:43:53.000 I used to teach Taekwondo.
00:43:54.000 And I taught an accredited course.
00:43:56.000 It was pass-fail-A, but it actually counted towards your GPA. So I had kids in my class, and I would tell them, it's really simple.
00:44:04.000 Just try, and you get an A. Just show up and try, and I'll give you an A. Because, like, athletics, it's not...
00:44:13.000 It's not fair.
00:44:14.000 You know, there's people that are just extreme endomorphs and their body holds on to too much fat.
00:44:19.000 They didn't have a background in any athletics.
00:44:22.000 They're not flexible.
00:44:23.000 It's very difficult for them to understand how to move their body correctly.
00:44:27.000 It's essentially like...
00:44:30.000 Trying to teach someone to be a professional speaker when one person has been speaking English their whole life and other people are just learning it for the first time.
00:44:38.000 It didn't seem fair to me.
00:44:40.000 So to judge them in terms of their actual outcome, it's almost contrary to what I said.
00:44:49.000 I was giving them equality of outcome.
00:44:49.000 But not really.
00:44:50.000 No.
00:44:52.000 I'm pretty sure I know what you were doing because I did the same thing.
00:44:55.000 So I would tell students, you show up and try and you are completely safe.
00:45:01.000 You're going to get full credit and you'll get a nice evaluation.
00:45:04.000 We wrote written evaluations of them.
00:45:06.000 But...
00:45:07.000 If you want an evaluation that raves about you, that talks about your extraordinary capacity, you're going to have to strive.
00:45:15.000 And so what I wanted to do, which sounds like what you were doing, is make them safe enough to discover what they could do.
00:45:23.000 I don't know whether this will make sense to your audience or not, but I think we are overly concerned.
00:45:29.000 We have been sold the idea that the job of a teacher is to assess how much the student has learned, that basically the job of the teacher is largely to report out to the world how qualified this person is.
00:45:42.000 Mm-hmm.
00:46:10.000 Learned.
00:46:11.000 So I steered away from the idea that my job was to tell the world how well this person had done.
00:46:17.000 Unless they'd done great.
00:46:18.000 If they'd done really well, if they'd, you know, surprised themselves and me about everything that they were capable of.
00:46:23.000 I loved saying that.
00:46:25.000 But I didn't want to run down a student who hadn't really shined in the classroom because it was a totally artificial moment to judge how much they had picked up.
00:46:35.000 Yeah, no need to run them down.
00:46:38.000 And again, everybody's starting off at a different position.
00:46:42.000 It's not like everybody's on the same line and the gun goes off and everybody runs with the first step in the same spot.
00:46:49.000 It's just not the case.
00:46:51.000 Right.
00:46:51.000 And there are some or many activities in which how you do initially doesn't necessarily predict ultimately whether you'll be unusually good.
00:47:03.000 Sure.
00:47:04.000 Yeah.
00:47:04.000 Well, one of the things that I found, particularly with athletics, is that people who are extraordinarily gifted oftentimes don't excel.
00:47:12.000 Because it seems to come too easy for them, and they never develop the proper stamina for hard, difficult work.
00:47:19.000 They shy away from that, because if things aren't easy, they...
00:47:23.000 Here's a perfect example.
00:47:25.000 People that are really gifted in one martial art, like, say, someone who's gifted as a striker.
00:47:32.000 Well, they'll enter into mixed martial arts and find that they really are not very good at grappling.
00:47:37.000 And so they don't like that feeling of being dominated in training, so they don't give it 100%.
00:47:43.000 They don't throw themselves into it.
00:47:44.000 Instead, they avoid it, and they try to find workarounds, and they're almost always defeated by grapplers.
00:47:51.000 It's like they've, because of the fact that they're gifted and talented, they've avoided the difficult, real character-building moments.
00:48:01.000 So this is interesting that you say that.
00:48:04.000 I was on a train in New York and looking at Twitter and somebody had posted an article that caused a dime to drop for me with respect to mixed martial arts and why they were suddenly a thing in my life.
00:48:19.000 Where they hadn't been.
00:48:20.000 I had assumed that it was simply the fact that I'd showed up on your podcast and obviously you're in that world.
00:48:25.000 And so a certain number of mixed martial arts people were now following me and I was seeing their tweets and things.
00:48:31.000 But this guy, I think his name is John Kerbo, posted an article that he had written about Bruce Lee and his argument.
00:48:39.000 Did you see this article?
00:48:40.000 His argument was why Bruce Lee and mixed martial arts points the direction to how to fix our political dialogue, something like that.
00:48:49.000 And so anyway, his argument was, and I'm no expert on this at all, but his argument was that Bruce Lee, effectively the innovator at the beginning of mixed martial arts, was interested in testing a martial art against all comers rather than requiring the person on the other side to be practicing the same martial art.
00:49:15.000 Yeah.
00:49:31.000 Right?
00:49:32.000 Then it's a formalism.
00:49:33.000 But if you're really good at arguing, then it shouldn't really matter.
00:49:37.000 As long as the person speaks the same language, you should be able to meet them on that playing field and hash stuff out.
00:49:42.000 So anyway, I think a number of things link up in this way.
00:49:48.000 That there is...
00:49:49.000 a kind of artificial boundary placed between things, and that those who are interested in tearing down those boundaries, even though that opens up a huge range of things that they may face, often have something to teach in their particular realm. And I was trying to think of other examples of this, and Lars Andersen,
00:50:14.000 the archer.
00:50:15.000 Again, this is a place where you're an expert and I'm not.
00:50:18.000 But what do you think of Lars Andersen?
00:50:20.000 Well, he does like a lot of weird trick shots with archery.
00:50:24.000 It's kind of fun to watch.
00:50:26.000 Very interesting stuff.
00:50:27.000 And what he's essentially done is he believes he has not reinvented, but rediscovered a method of holding arrows in your fingers.
00:50:37.000 So that you could, with practice, repeat.
00:50:40.000 Here, Jamie's got a video of this character.
00:50:43.000 That you could release a bunch of arrows in a row.
00:50:46.000 And he's capable of shooting way faster than the average person and multiple arrows.
00:50:53.000 He can throw things in the air and hit it with multiple arrows before it hits the ground.
00:50:58.000 Yep.
00:50:59.000 So my thought here is that this is...
00:51:03.000 I have a number of different examples.
00:51:06.000 So there he's throwing a...
00:51:07.000 He's throwing a bottle cap in the air and he hits it with an arrow.
00:51:12.000 That's crazy.
00:51:13.000 Even if there are lots of cuts here that we're not seeing where he misses, the fact that he can do this with enough reliability to make a video, you can tell there are enough things where he hits two things in one shot that it can't be completely fiction.
00:51:26.000 And if you look at...
00:51:29.000 So he took a lot of crap for this from people who were in the archery world who didn't like the way he violated all of the rules about what good archery form looks like.
00:51:38.000 And so he made a response video.
00:51:40.000 English isn't his first language, so it's a little hard to follow.
00:51:43.000 But...
00:51:44.000 But anyway, it's pretty clear that what he's done is he's just said, well, okay, there are bows, there are arrows.
00:51:50.000 What is the best way to think about these things from the point of view of solving these different problems?
00:51:54.000 And he's discovered a whole landscape of stuff that I don't think you would discover if you took archery and took it seriously and learned from a master who had good form.
00:52:05.000 You'd never discover it.
00:52:06.000 And I guess I might as well put the other examples on the table.
00:52:11.000 Let's see.
00:52:13.000 There's Danny MacAskill, the bicyclist.
00:52:18.000 Are you aware of this guy?
00:52:20.000 I feel like I've heard that name before.
00:52:22.000 Scottish kid.
00:52:23.000 He's very young.
00:52:24.000 I think at some point, unfortunately, he became a surprise sensation on YouTube and Red Bull figured out that he was a moneymaker and they've sort of pushed him.
00:52:34.000 So he's now had a couple of serious accidents.
00:52:36.000 But anyway, he is capable of doing things on a bicycle, especially in an urban environment, hopping from...
00:52:44.000 Oh, this guy?
00:52:45.000 Yeah, this guy.
00:52:46.000 Look at this.
00:52:47.000 Did you see that one of these guys, these daredevil YouTube guys, just died?
00:52:52.000 A guy in China.
00:52:54.000 Oh, God.
00:52:55.000 This guy's out of his mind.
00:52:57.000 He's out of his mind.
00:52:58.000 But, I mean, check that out.
00:52:59.000 For people that are listening only, this guy's on a rooftop, and he's doing these stunts where he's riding his bike...
00:53:06.000 Doing flips from one side of the building to the other.
00:53:09.000 He's riding on the edges of the building, looking down, hitting the brakes, certain death on either side.
00:53:16.000 Jesus.
00:53:17.000 Yeah, I mean, look at that.
00:53:18.000 Oh, I have a hard time looking at that, Brett.
00:53:21.000 It's pretty rough.
00:53:22.000 Look at that.
00:53:22.000 It's pretty rough.
00:53:24.000 This guy's fallen and hurt himself now?
00:53:26.000 That's not a BMX bike either.
00:53:28.000 No, it's like a regular bike.
00:53:30.000 I think it's not quite standard, but it's more mountain bike proportions.
00:53:35.000 But anyway, just...
00:53:36.000 It probably has to be to be so rugged.
00:53:38.000 Right.
00:53:39.000 Oh, yeah.
00:53:39.000 It's souped up in particular ways so that it can endure the kind of forces that he's putting on it.
00:53:44.000 Yeah, the impacts.
00:53:49.000 I mean, in fact, you know, in the era we grew up, we didn't know that these things were possible.
00:53:53.000 And, you know, what he's doing emerged from observed trials, which was a very regimented kind of competition for mountain bikers.
00:54:02.000 But anyway, he's discovered a landscape of possibilities on a bicycle.
00:54:06.000 Not unlike the landscape of possibilities that Lars Andersen has discovered in archery.
00:54:13.000 Parkour is another place where, you know, growing up, I didn't know that all that was possible.
00:54:19.000 And I thought, you know, frankly, I thought that Olympic gymnastics was pretty interesting.
00:54:24.000 And, you know, now I look at Olympic gymnastics and I think...
00:54:28.000 That's a...
00:54:29.000 I mean, I get it.
00:54:31.000 It's tame.
00:54:31.000 It's tame.
00:54:32.000 And the thing is, it has one advantage, which is that everything is so standardized that you can compare two competitors.
00:54:39.000 If you want to award a medal, maybe it has to look like Olympic gymnastics.
00:54:43.000 But these guys who can look at an urban environment and figure out how they can make use of these objects and their relationship to each other...
00:54:54.000 Are discovering something about what the human body is capable of that isn't obvious if there isn't somebody to point it out.
00:55:03.000 So there are innovators.
00:55:03.000 I don't even know if we know who the initial innovators are with something like parkour.
00:55:08.000 Probably somebody does.
00:55:09.000 Probably Russians.
00:55:10.000 Could be.
00:55:11.000 Those Russian kids, you ever see those videos of them hanging off of the side of buildings?
00:55:14.000 I really think that they were the innovators of this.
00:55:17.000 That was the first stuff that we ever saw.
00:55:19.000 Frick.
00:55:20.000 Maybe.
00:55:21.000 And you know...
00:55:22.000 Isn't parkour French?
00:55:23.000 It is.
00:55:23.000 I think so, yeah.
00:55:25.000 But in any case, one of the things, and you know, when you asked me to come on, we decided we would talk a little bit about what to do about planet Earth, because I had mentioned the first time I was on your show that that was an important...
00:55:38.000 Let's get away from parkour.
00:55:39.000 Well, but my feeling is actually this is about parkour.
00:55:44.000 And my point is, where we are with civilization...
00:55:50.000 We're stuck and we are on a trajectory that you don't have to be deeply knowledgeable to recognize that it is unstable on enough different fronts that we can't go on like this much longer.
00:56:05.000 We're playing with powerful enough tools that we're in tremendous danger of something going wrong.
00:56:09.000 And so the question is... it's very hard to imagine how you'd use normal tools.
00:56:16.000 You know, are you going to win an election and get policy through Congress that's going to change the world and suddenly make us safe?
00:56:23.000 It's almost impossible to imagine something like that happening.
00:56:26.000 And so the question is, is there a parkour kind of innovation, something that is not obvious to us that it's there?
00:56:36.000 The city was always there.
00:56:37.000 People were not always doing parkour with the objects in it, but they could have been.
00:56:41.000 And so are we missing the obvious?
00:56:44.000 Is it in front of us what we're supposed to do to take a civilization that's hurtling out of control with too many people consuming too fast and using mechanisms that are dangerous?
00:56:57.000 Is there a route to put us back on track to something reasonable that looks like one of these innovations that you don't know it existed until somebody shows you that it's there?
00:57:08.000 So what do you think those things are?
00:57:12.000 Well...
00:57:12.000 I'd assume there's more than one, right?
00:57:14.000 There's more than one...
00:57:16.000 I would say the conversation doesn't sound familiar.
00:57:21.000 And...
00:57:23.000 I don't think anybody has the answer to that question.
00:57:27.000 Before I go deeply into this question, I should probably say something a little bit self-protective, which is talking about the question of what to do with planet Earth can be an idle discussion,
00:57:42.000 in which case there's nothing to be navigated.
00:57:45.000 But if you want to do it seriously, there's a danger of triggering a kind of...
00:58:10.000 In order to have a conversation about what to do about planet Earth, obviously we're talking about very serious stuff.
00:58:18.000 And for anybody to contemplate that they might know or might be tuned into a conversation that could find its way to some new answer, we are in danger of triggering that, oh my God, that's arrogant circuit.
00:58:32.000 And so at some level, in order to have this conversation properly, I need to I need to turn off my own sensitivity to hearing that little warning bell in my own head.
00:58:43.000 And, you know, if the conversation is preposterous, fine.
00:58:50.000 That's something a reasonable person could conclude about anybody who was talking about changing the way the world functions.
00:58:55.000 Maybe it is preposterous.
00:58:56.000 And I leave that possibility open.
00:58:58.000 On the other hand, you know, I have kids.
00:59:02.000 I'm pretty sure I can do the math myself on how much danger we're in.
00:59:06.000 I may not know the full extent of it, but I can tell that we're in enough danger that we have to do something counterintuitive and different enough that it stands a chance of changing the way the place functions or my kids and your kids are in serious trouble.
00:59:22.000 So anyway, that's why I would go down this road.
00:59:27.000 But I have to do it in that kind of context where I'm not too worried about whether people hear this as, you know, me being full of myself or something like that.
00:59:36.000 Does that make sense?
00:59:38.000 Yes.
00:59:38.000 Why I would make that?
00:59:39.000 I get it.
00:59:40.000 See, I lack those self-protective instincts.
00:59:42.000 I just spout off.
00:59:44.000 All right.
00:59:44.000 Fair enough.
00:59:45.000 But go ahead.
00:59:46.000 Okay.
00:59:47.000 So...
00:59:49.000 There are a lot of...
00:59:50.000 So I should say, where does this all come from?
00:59:53.000 My initial foray into this style of thinking actually starts with Eric, who you had on your podcast.
01:00:02.000 I saw a lot of feedback about...
01:00:04.000 Him on your podcast.
01:00:05.000 He was great.
01:00:06.000 People were really jazzed about that.
01:00:07.000 You have a brilliant brother.
01:00:08.000 He's great.
01:00:09.000 I have noticed that.
01:00:10.000 He is absolutely amazing.
01:00:13.000 And there's no place to hide from him because he is so good across all levels of analysis.
01:00:20.000 So he obviously, even on your podcast, he was...
01:00:23.000 Playing around in biology space very adeptly.
01:00:26.000 I can't do that.
01:00:27.000 I can't go over into math space and do the same favor for him.
01:00:31.000 But anyway, yeah, he's a very interesting thinker and across many more levels than I think anybody else I've encountered.
01:00:40.000 But anyway, some years ago, after the financial collapse of 2008, he decided that there needed to be a proactive discussion about what had gone wrong in economic space that had allowed that catastrophe to happen.
01:00:59.000 And so he and some collaborators put together a conference called the Economic Manhattan Project.
01:01:08.000 It was at the Perimeter Institute in Canada.
01:01:11.000 And so I went, I attended this conference, and it was the first place I encountered an intentional conversation about changing a large enough piece of the puzzle to actually fix the way the world works, to prevent another financial collapse like the one that happened in 2008. And I also met people at that conference who have continued on in these conversations.
01:01:37.000 I joined Occupy.
01:01:41.000 I mean, not that there was anything really to join, but I participated in it in hopes that it would turn into something capable of changing the way we functioned.
01:01:49.000 And I ended up being very disappointed and frustrated by the quality of the conversation inside of Occupy Wall Street.
01:01:57.000 But anyway, it revealed some things to me.
01:02:01.000 And then after that, there was a group of people who gathered in something that ultimately was called Game B. And Game B is really where the thinking that I want to talk to you about
01:02:16.000 emerged most clearly.
01:02:18.000 Game B no longer exists, but a group of us from across the political spectrum, various different kinds of expertise.
01:02:27.000 We had tech people.
01:02:29.000 We had professors from various different disciplines.
01:02:32.000 We had a Buddhist.
01:02:33.000 I mean, we really had a lot of different people who were united basically by an understanding that they each arrived at: that the trajectory we were on was so dangerous that it required us to take action.
01:02:46.000 And we tried out various different ideas about what might be sufficient to avert the danger we were heading towards and give humanity more time to find a way to exist on the planet.
01:03:02.000 So I should probably say something about what Game B means, and it carries a relevance into what we might do in the present.
01:03:10.000 Game B basically proceeded from the idea that what we live in is a game-theoretic landscape, right?
01:03:19.000 That the winners in this game-theoretic landscape are individuals who have figured out where there's a niche. Some of them have figured out how to engage in something called rent-seeking, which basically means making money without producing value.
01:03:37.000 So there's a lot of stuff that goes on in our economy that is not productive and good, but nonetheless generates fortunes.
01:03:43.000 So that's rent seeking as opposed to innovation or productivity.
01:03:47.000 You talking about like hedge fund type stuff, moving money around?
01:03:50.000 Well, I want to be a little careful about this because it is quite possible for things like hedge funds to actually correct inefficiencies in the economy in a way that is productive.
01:04:02.000 That doesn't mean that that's the average thing that they do.
01:04:04.000 So what things are you referring to?
01:04:05.000 Well, I mean, you know, let's take the...
01:04:19.000 Right.
01:04:21.000 Right.
01:04:21.000 Right.
01:04:23.000 Cable company may produce some benefit.
01:04:27.000 They obviously have infrastructure that allows you to get content.
01:04:31.000 But what fraction of what you're paying for is actually about them delivering a service at some price and making some reasonable profit?
01:04:39.000 And what fraction of it is about the fact that they are an economic Goliath and that you don't have enough choice to be able to negotiate a decent price with them?
01:04:46.000 So there's some fraction of what they're producing that is productive, but then there's a large amount of profit there that isn't about productivity or innovation.
01:04:55.000 It's about the fact that they own a choke point and you can't get around it.
01:05:00.000 So we don't know what fraction of the economy is rent-seeking and what fraction of it is productive.
01:04:02.000 But especially if one is broad-minded about thinking about all the ways that one can engage in rent-seeking, one can actually be destructive of value.
01:05:20.000 If you destroy future well-being for our descendants, it may look productive in the present, but it isn't productive.
01:05:28.000 It's actually destructive.
01:05:29.000 So that's a kind of rent-seeking that we don't even typically model.
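The game-theoretic dynamic Weinstein is gesturing at can be sketched as a toy payoff matrix (all numbers invented for illustration; this is a prisoner's-dilemma-style sketch, not anything from the conversation itself): extracting value beats producing it for each individual, even though everyone producing yields the most total value.

```python
# Toy payoff matrix (invented numbers) for a "game-theoretic landscape":
# each player chooses to PRODUCE (create value) or RENT_SEEK (extract it).
# (my move, their move) -> (my payoff, their payoff)
PAYOFFS = {
    ("produce", "produce"):     (3, 3),
    ("produce", "rent_seek"):   (0, 4),
    ("rent_seek", "produce"):   (4, 0),
    ("rent_seek", "rent_seek"): (1, 1),
}

def best_response(their_move):
    """My payoff-maximizing move, given the other player's move."""
    return max(("produce", "rent_seek"),
               key=lambda mine: PAYOFFS[(mine, their_move)][0])

# Rent-seeking is the dominant strategy for the individual...
assert best_response("produce") == "rent_seek"
assert best_response("rent_seek") == "rent_seek"
# ...even though mutual production yields the larger total (3+3 vs 1+1).
```

The point of the sketch: the "winners" of such a landscape are selected for extraction, not production, unless the payoffs themselves are restructured.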
01:05:33.000 But where are we headed?
01:05:38.000 So, oh yes.
01:05:39.000 So...
01:05:41.000 We live in a game-theoretic landscape.
01:05:43.000 That's both good and bad.
01:05:44.000 As you point out, competition is a healthy thing.
01:05:48.000 And competition in markets produces a huge amount of value.
01:05:53.000 So I hear people deriding capitalism, and I always want to make the same point to them, which is: you've got two things glued together and you are challenging them as a package, but there's no reason they have to be packaged.
01:06:09.000 So we would be foolish to give up markets.
01:06:11.000 Markets are amazingly powerful engines of innovation.
01:06:14.000 They are capable of solving problems that we cannot solve deliberately even if we wanted to.
01:06:20.000 So we need markets, but we don't want markets ruling the planet and deciding that anything that spits out a profit is therefore good and that we should be exposed to whatever the market discovers can be viable.
01:06:41.000 [inaudible]
01:07:53.000 Markets are good.
01:07:54.000 Allowing markets to discover any and every mechanism for exploiting you is not good at all.
01:08:02.000 And in fact, it's a large part of why we're in the predicament that we're in, is that we let the market decide what problems to solve, and then if it solves them...
01:08:11.000 And they're very lucrative.
01:08:13.000 There's no way to say no.
01:08:16.000 But aren't they attractive to people?
01:08:18.000 I mean, that's one of the reasons why a phone is so addictive, is because it's attractive.
01:08:21.000 Because you can get access to information at the drop of a hat, so you just constantly want to...
01:08:27.000 Feed that machine.
01:08:28.000 You constantly want to like, oh, what can I do?
01:08:31.000 Can I play a game on it?
01:08:32.000 Ooh, can I take a picture of myself?
01:08:33.000 Ooh, can I give myself a dog nose?
01:08:35.000 Ooh.
01:08:36.000 Right.
01:08:36.000 But imagine, I mean, it's amazing to me that it's even hard to do this thought experiment.
01:08:41.000 But put yourself in, you know, in your own mind 15 years ago.
01:08:49.000 And present yourself the deal that the phone represents.
01:08:53.000 You know, hey Joe, check this phone out, right?
01:08:57.000 This phone is going to allow you to navigate in a place you've never been.
01:09:01.000 It's going to connect you with all sorts of people who share your interests.
01:09:05.000 You're going to be able to say a sentence that you think is clever and suddenly hundreds or thousands of people are going to be able to react to it.
01:09:12.000 I mean, all sorts of marvelous things.
01:09:15.000 And then the point is, well, here's the downside, okay?
01:09:18.000 You're going to be hooked into megacorporations that are going to study your psychology, and they are going to compete in order to keep you paying attention to their site.
01:09:29.000 And they're going to become so sophisticated that you're going to lose control over your own mind.
01:09:35.000 You are going to become addicted to it in the way that you might become addicted to nicotine, right?
01:09:40.000 Right?
01:09:41.000 So that's a pretty high cost.
01:09:43.000 What's more, you are going to surveil yourself.
01:09:48.000 You're going to surveil yourself, and your only protection from surveilling yourself are going to be end-user license agreements that you're not going to be legally sophisticated enough to understand.
01:09:57.000 And so you're going to be at the mercy of whoever has access to your phone camera, your metadata, your, you know, all of these things.
01:10:11.000 Right?
01:10:12.000 I probably wouldn't.
01:10:13.000 You wouldn't?
01:10:13.000 I'd probably be like, eh, I'll put it down if I don't like it.
01:10:16.000 Okay.
01:10:16.000 Well, I must tell you, if you told me that I was going to be bugging myself with a sophisticated device like that, then I know I can't even turn it upside down because there's a camera on both sides of the damn thing.
01:10:30.000 Right.
01:10:31.000 The cost is really high, but we signed up for it incrementally in a way that never left us the ability to say no.
01:10:38.000 And what's worse, it is now inconceivable.
01:10:42.000 If we discovered that the net cost of such a device exceeded the value of it by 10 times, we still couldn't get rid of them.
01:10:50.000 You can't pull them back.
01:10:51.000 You can't unmake them.
01:10:52.000 There's too much benefit to them, though.
01:10:54.000 You're making it seem as if it's only a negative.
01:10:57.000 But it's not only a negative.
01:10:58.000 It's also answering every single question you could ever have about anything technical, anything involving history, anything involving facts.
01:11:07.000 And obviously, in today's day and age with hashtag fake news, you're going to get a lot of bullshit facts in there as well.
01:11:13.000 But just the sheer access to information, the ability to contact each other instantaneously, there's a lot of pros to it.
01:11:19.000 Huge number.
01:11:21.000 Believe me, I'm not underrating the value of it.
01:11:23.000 I mean, you pointed to it yourself.
01:11:25.000 Having even just Wikipedia in your pocket is like, that's such a fantastic gift to have that access to information, not only on your home computer, but right there in your pocket.
01:11:37.000 That's amazing.
01:11:38.000 So I'm not saying the benefit isn't spectacular, and I've signed up for it like everybody else.
01:11:43.000 But the cost is very high and didn't have to be.
01:11:48.000 In other words, if you had set the bounds in which the market was going to solve this problem so that you, for example, prevented it from breaching our ability to protect our own privacy, you could have had the benefit of Wikipedia and instant communication and all of these things without the huge downside.
01:12:07.000 So the privacy issue being cookies or cameras?
01:12:11.000 Which one are you referring to?
01:12:13.000 Well, first of all, I think the cameras are a bit of a red herring.
01:12:15.000 I don't think anybody, first of all, there's a huge amount of data involved in video.
01:12:19.000 To the extent there's an issue, it would be more about the microphone and the fact that it can listen into conversations and basically track who's thinking what.
01:12:28.000 And there's so much power in that potentially that even if it's not being used presently, it's only a matter of time before somebody taps into that data and starts using it to shape things they are not entitled to shape.
01:12:41.000 Right.
01:12:42.000 So how do we fix it?
01:12:44.000 Well, so let me – we've gotten a little off track here.
01:12:48.000 Game A is what we live, right?
01:12:50.000 It's a market in which we decide how to behave and if we have insight, maybe we come out ahead.
01:12:55.000 If we don't have correct insight, maybe we lose.
01:13:00.000 But anyway, that's game A. It's the market as we find it.
01:13:05.000 Game B was the idea that there are ways that you could restructure the deal we have with each other so that you could compete in Game A's terms without...
01:13:22.000 losing to Game A. So the conclusion, and again, Game B is not a live organization anymore, but it was a place in which a lot of work was done that I think feeds into the conversation about what we do very clearly.
01:13:33.000 In order to change the way the world functions, most of the mechanisms that have functioned in the past are no longer viable.
01:13:43.000 It is almost inconceivable to imagine that you could have a revolution in any standard sense that would successfully capture power and then wield it wisely.
01:13:55.000 I can't even imagine it happening.
01:13:57.000 So, Game B is the idea that one needs to create an entity that is capable of competing in the market.
01:14:07.000 It is capable of competing in game A's terms and winning against game A. So game A is the way things run.
01:14:16.000 Game B is an alternative that can compete in game A's terms and win.
01:14:21.000 And that sounds, first time you hear it, it sounds preposterous because the system has so much inertia in it that you would think it is completely impervious to any challenge.
01:14:31.000 But there's a hidden factor which I think is evident in those various examples we were looking at: in parkour, in Lars Andersen and his archery innovations, Danny MacAskill, and I would also say Jane Goodall and her success at sorting out what was going on with chimps.
01:14:53.000 The point is, systems that become very difficult to dislodge, that have great inertia, are almost inevitably feeble in a particular fashion.
01:15:06.000 So this is true of academic disciplines, too.
01:15:10.000 If a discipline becomes stuck, it is very hard to get a hearing within the discipline, but the discipline...
01:15:23.000 [inaudible]
01:15:43.000 The goods to people at a low rate.
01:15:46.000 Most people are dissatisfied.
01:15:47.000 They are unhealthy.
01:15:50.000 They are not well protected from things like bad luck.
01:15:53.000 And those are all problems that can be solved by an entity that is capable of restructuring the deals between people.
01:16:03.000 In other words, let's take an obvious one like insurance.
01:16:09.000 Insurance is not well delivered by a market.
01:16:11.000 And there's a very good reason it isn't, which is that the strategy for winning in the delivery of insurance is perfectly obvious.
01:16:20.000 You want to insure people who need it very little, and you want to uninsure people who need it a lot.
01:16:25.000 That's how you win at the insurance game.
01:16:27.000 So the insurance industry is always looking to make that deal with the world.
01:16:32.000 It's always looking to figure out how to dis-insure those who are most likely to need it.
01:16:38.000 And what that means is that we can't provide a risk pool. A risk pool just means: you don't know if you're going to get a brain tumor or I'm going to get a brain tumor, so how about we both agree to pay for whoever's treatment needs it. Whoever has the bad luck wins in that deal and whoever has the good luck loses, but because we don't know who it is ahead of time, it's a win for both of us.
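The risk-pool logic described here can be made concrete with a toy calculation (all numbers invented for illustration): pooling doesn't change anyone's expected cost, it collapses the worst case from a catastrophic bill to a small fixed premium.

```python
# Toy risk pool (invented numbers): each member faces a small chance of a
# large medical bill. Pooling leaves the expected cost unchanged but
# replaces a rare catastrophic loss with a small, predictable premium.
p = 0.001        # assumed probability any one member needs the treatment
cost = 500_000   # assumed cost of the treatment

fair_premium = p * cost            # what each member pays into the pool
expected_cost_alone = p * cost     # same expectation without insurance...
worst_case_alone = cost            # ...but uninsured, you might owe it all
worst_case_pooled = fair_premium   # pooled: you owe the premium, period

print(fair_premium)       # 500.0
print(worst_case_alone)   # 500000
print(worst_case_pooled)  # 500.0
```

This is also why the "dis-insure the people who need it" strategy is so corrosive: it unravels exactly the uncertainty that makes the deal a win for both parties.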
01:16:57.000 So that structure is one that you can build inside of this competitive architecture. And what I'm getting at is that the conversation of people that has coalesced,
01:17:14.000 people who are discussing the question of how to make things function in a way that solves the problems that we all face without having to win some unimaginable electoral victory or to challenge these governments outright,
01:17:32.000 that conversation centers around a game-theoretic approach. I mentioned before that I had participated in Occupy and had been quite disappointed. And really, where I was before I ran into this conversation was, I was,
01:17:54.000 if I'm honest with myself, I was becoming a little desperate because I could see how much trouble we were in, but every mechanism that you might use to fix it seemed very unlikely to function.
01:18:05.000 When I heard a presentation that said, actually, there's a mechanism that does not go through any of the familiar historical means, but uses...
01:18:15.000 tools that we all see deployed, right, the same tools that cause Facebook to be successful can be used to repair the system, that begins to sound plausible to me.
01:18:30.000 Does that make any sense?
01:18:32.000 Sort of, but we're on a long road.
01:18:36.000 Yes.
01:18:36.000 Is there a way to boil this down?
01:18:38.000 Well, let's try an example.
01:18:39.000 Okay.
01:18:40.000 How do you feel about Bitcoin?
01:18:43.000 I think it's fascinating.
01:18:45.000 You wish you had more of it?
01:18:46.000 Well, no.
01:18:47.000 I have some of it, but it's not mine.
01:18:49.000 It's all donated towards Fight for the Forgotten.
01:18:52.000 They're building wells in the Congo.
01:18:54.000 I had Andreas Antonopoulos on, and he introduced me to Bitcoin, so he set up a Bitcoin wallet for me.
01:19:00.000 I took donations from people, but I didn't think it was right.
01:19:03.000 They just gave me...
01:19:04.000 It was very little at the time.
01:19:05.000 But I said, I'll just give this to my friend Justin, who builds wells in the Congo.
01:19:10.000 So now it's worth...
01:19:11.000 Is it worth like 70 grand or something like that?
01:19:15.000 Something like that.
01:19:17.000 He's got to figure out when he wants to cash out.
01:19:19.000 It's up to him.
01:19:20.000 But so far they've gotten at least 10,000 out of it and we've built a bunch of wells with that money.
01:19:29.000 So I think it's great in that regard.
01:19:31.000 It's served an amazing purpose for those people in the Congo.
01:19:35.000 But people were actually mad at me that I didn't buy those things with Bitcoin.
01:19:40.000 That instead, I kept the Bitcoin, but gave him the money value of the Bitcoin.
01:19:45.000 They were upset, like, why didn't you just pay for it with Bitcoin?
01:19:48.000 But I wanted to see as an experiment where the Bitcoin goes, and it turns out it was a lucky guess on my part, and now it's worth far more.
01:19:56.000 Because what was worth $5,000 at the time is now, yeah.
01:20:01.000 It's 100 grand.
01:20:02.000 Crazy.
01:20:03.000 So I like it.
01:20:04.000 Okay.
01:20:05.000 I like it, too.
01:20:06.000 And whether it's Bitcoin or not, there's something clearly happening in the blockchain cryptocurrency world.
01:20:14.000 So the world is trying to figure out which of these currencies is going to function. At the moment, its blockchain is looking the most promising.
01:20:21.000 There are obstacles to it functioning.
01:20:23.000 There are ways in which those obstacles are being addressed and, you know, it'd be pointless to get into the details of it.
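Although the speakers skip the details, the core mechanism behind blockchain currencies can be sketched as a tamper-evident hash chain (a toy illustration only; real Bitcoin adds proof-of-work, digital signatures, and peer-to-peer consensus on top of this structure):

```python
import hashlib
import json

# Minimal hash chain: each block records the hash of the previous block,
# so altering any earlier block breaks every later link.
def block_hash(block):
    """Deterministic SHA-256 hash of a block's canonical JSON form."""
    return hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()

def add_block(chain, data):
    """Append a block whose 'prev' field commits to the chain so far."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})
    return chain

chain = []
add_block(chain, "Alice pays Bob 1 coin")
add_block(chain, "Bob pays Carol 0.5 coin")

# Tampering with an earlier block is immediately detectable:
chain[0]["data"] = "Alice pays Mallory 1000 coins"
assert chain[1]["prev"] != block_hash(chain[0])
```

This tamper-evidence is what lets the ledger be trusted without asking anyone's permission, which is the property being praised in the surrounding discussion.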
01:20:30.000 I like the community.
01:20:30.000 I really like the idea behind it.
01:20:33.000 To me, I like things that don't have a whole lot of rules where people sort of figure out what's right.
01:20:40.000 Right.
01:20:41.000 Good.
01:20:42.000 So if we take...
01:20:45.000 Bitcoin as an example of something that addresses the problems of fiat currency like the dollar.
01:20:53.000 Nobody asked permission to build it.
01:20:55.000 In fact, we don't even know who innovated it.
01:20:57.000 It's a pseudonym that we have.
01:20:59.000 We don't know whose identity it is.
01:21:01.000 I heard recently someone thought it was Elon Musk, which I would not be surprised, that crazy guy.
01:21:07.000 You know, I couldn't say, and I think it doesn't much matter.
01:21:10.000 What we know is that somebody, without asking permission, found a [inaudible]
01:21:43.000 That's not hard.
01:21:44.000 Bitcoin functions inside of this realm.
01:21:48.000 It's not illegal.
01:21:49.000 It works.
01:21:50.000 It's reliable.
01:21:51.000 The problems have been solved by people because solving those problems made sense.
01:21:56.000 It enriched them for doing so.
01:21:57.000 They made their currency more functional.
01:22:00.000 So that entity is a solution that is superior to Game A. But it functions in Game A's terms.
01:22:11.000 It's winning in Game A's terms at the moment.
01:22:13.000 Now, I'm not saying it isn't going to crash.
01:22:14.000 It probably is going to crash because it's probably overinflated at the moment.
01:22:18.000 How much will it crash?
01:22:19.000 Will it come back?
01:22:20.000 Are we going to go through repeated bubbles and Bitcoin will still win out?
01:22:26.000 Clearly nobody knows.
01:22:27.000 But the point is that it is a superior solution, invented inside this other system, that is winning against the dollar at the moment.
01:22:38.000 That's a fine example.
01:22:41.000 Likewise, you could use...
01:22:43.000 Wikipedia, as a fine example.
01:22:45.000 This is an entity that is functioning inside.
01:22:48.000 It is competing with the old encyclopedias.
01:22:51.000 It is competing with for-profit services that would deliver information.
01:22:55.000 And frankly, it's winning because it's superior.
01:22:59.000 Is it perfect?
01:22:59.000 No.
01:23:00.000 But it's a demonstration that you can do things inside this space that, in fact...
01:23:07.000 have reorganized our relationship to information.
01:23:10.000 And in fact, in a way that we don't typically acknowledge is challenging the Academy.
01:23:18.000 I mean, maybe part of why the Academy was so feeble at the point that this social justice madness started to challenge it had to do with the fact that without the Academy's permission, information became free.
01:23:31.000 The fact that everybody was on a level playing field for information meant that the academy needed to figure out what it was going to deliver on top of that information, and it didn't figure it out.
01:23:41.000 And I would say there was an obvious answer which it missed.
01:23:44.000 It needed to deliver stuff that didn't scale.
01:23:46.000 It needed to teach insight and critical thinking and how to wield that information properly rather than continuing to deliver textbook level information when effectively textbooks are obsolete.
01:23:58.000 But these are examples of successful, competitively successful, innovative challenges to the model that preexisted them.
01:24:09.000 And the question is, can that set of models be systematized so that it...
01:24:18.000 without having to do the impossible, simply replaces the system as it stands because it delivers the things that the system claims to deliver more successfully than the system delivers them.
01:24:29.000 Okay, like what things we're talking about?
01:24:35.000 Insulation from bad luck.
01:24:38.000 Luck is a tremendously negative influence.
01:24:41.000 So like insurance, but something that we collectively utilize?
01:24:45.000 Correct.
01:24:46.000 Something that we collectively utilize.
01:24:48.000 So imagine that you could have wonderful insurance, but that in signing up for that insurance, you were agreeing to some sort of larger social entity.
01:24:59.000 So like your taxes would go towards life insurance, but meaning like things that go wrong in life.
01:25:06.000 Like some part of what you would spend on things would be attributed to this fund.
01:25:13.000 Yeah.
01:25:14.000 And I think we unfortunately default to thinking of everything in monetary terms.
01:25:20.000 You could also invest in such an entity.
01:25:24.000 Let's talk about the question of teachers that you were pointing out.
01:25:27.000 Why are good teachers so few and far between?
01:25:31.000 Well, of course, we pay at a level that we get exactly what we ordered.
01:25:35.000 And the few good teachers that we run into are by and large people who are doing it in spite of the fact that they're being economically penalized.
01:25:42.000 But what if your insurance, your access to excellent insurance that correctly hedged out the danger of medical bad luck came with some sort of social obligation in which,
01:25:58.000 I don't know, the three years after you had...
01:26:02.000 gone to graduate school and gotten your advanced degree in something, you spent teaching in some school that needed that.
01:26:09.000 So you didn't sideline yourself from the economy for the rest of your life teaching in some school where you were forever going to be hobbled by bad administrators, but you decided to take some period of time and invest it in your community or somebody else's community using expertise that you got that you'll be highly paid for later.
01:26:28.000 So it's almost like you have mandatory military service in a way.
01:26:33.000 Mandatory military service is, yes, it is one version of a much larger space of potential agreements that you could sign up for in exchange for benefits that you can't, most of us cannot negotiate on the open market.
01:26:48.000 So to get your education, you would agree to use that education for the good of the community for a certain amount of time?
01:26:57.000 Absolutely.
01:26:58.000 Absolutely.
01:26:59.000 So instead of walking away with massive debt that is going to hobble you in economic terms as you're trying to find your niche, that you would sign up for some agreement that was, you know...
01:27:13.000 But wouldn't that still benefit the elite?
01:27:16.000 Because what if Scrooge McDuck has a kid, and Scrooge McDuck's kid, he pays for his kid's education.
01:27:22.000 They'll say, listen, son, you're not going to do any service.
01:27:24.000 What you're going to do is use those three years to get ahead.
01:27:26.000 And those little fucks, when they get out of that service, they're going to be working for you.
01:27:30.000 Ha ha ha ha ha!
01:27:31.000 They throw the gold coins up in the air.
01:27:33.000 Right.
01:27:33.000 So this is another place.
01:27:37.000 Nice look.
01:27:39.000 This is another place where two things are fused together that we need to tease apart.
01:27:45.000 Okay.
01:27:46.000 I'm trying to remember.
01:27:47.000 I think Eric may have actually said this on your podcast.
01:27:50.000 If he didn't, he said it elsewhere.
01:27:52.000 But elites is not a good category.
01:27:56.000 Right.
01:27:56.000 Elites takes two things that don't belong together and it decides that they are one.
01:28:01.000 And so the Scrooge McDuck thing becomes...
01:28:05.000 It blocks the other thing.
01:28:07.000 We want people to innovate.
01:28:10.000 Right.
01:28:11.000 And...
01:28:12.000 One of the reasons that equality of outcome is absolutely not desirable, even if you could arrange it, is that the inequality of outcome is the incentive that drives people to achieve amazing stuff that we want them to achieve.
01:28:25.000 Right, of course.
01:28:26.000 So how far ahead of everybody else should you end up?
01:28:31.000 Well, this is a difficult problem because you, to the extent that somebody...
01:28:35.000 earns a fortune because they have innovated in an important way, but then they have gone on to be a rent seeker.
01:28:45.000 We don't want them rewarded for their rent seeking, but we do want them rewarded for their innovation.
01:28:51.000 So, the problem is, the elite is two different things, and even worse, individuals who are members of the elite are composites of both things, where you get into the elite because you innovated something amazing,
01:29:09.000 but the degree to which you have been rewarded is, I don't know, 60-70% the result of rent-seeking rather than what you innovated.
01:29:18.000 And again, for people just tuning into us now, rent-seeking meaning doing things by which you extract money from the system without any real benefit to the people that are around you.
01:29:27.000 Yeah, it's any time that you get paid without producing something of value, either innovation or productivity itself or something like that.
01:29:36.000 But how would you regulate that?
01:29:37.000 How would you figure out a way to regulate the amount of profit that someone could...
01:29:42.000 Like, if you have a system, right?
01:29:44.000 If you build something and then you get, you know, some sort of a residual benefit from that system because you built it.
01:29:50.000 And so you're not really doing anything, but you're just constantly collecting money from this thing.
01:29:54.000 How would you stop that or regulate that?
01:29:57.000 Is that a good example of it, the way I just described it?
01:30:00.000 Yeah, it's a pretty good example.
01:30:05.000 It's a very tough conversation because one has to be very careful that you don't remove an incentive to do something valuable, even though the value of it may be very subtle.
01:30:17.000 So the example of investment, like playing the stock market...
01:30:23.000 might look unproductive, but to the extent that you are correcting the fact that certain things are undervalued and other things are overvalued, you are actually doing a kind of service that is not as obvious on the outside unless you've spent time thinking about the logic of why you want the market to be efficient.
01:30:40.000 So I don't want to declare that certain things are in and of themselves bad because we can't see the obvious value of them.
01:30:49.000 On the other hand, there's an awful lot of stuff that is either totally valueless for which people are very handsomely paid or worse.
01:30:59.000 Counterproductive, destructive of value, right?
01:31:01.000 If you take waste and you get paid to dispose of it, and you dispose of it in a way that creates cancers where you can't detect that they've been created by what you've done, it's not that you solved the problem of that waste.
01:31:14.000 You just caused cancers in random homes that won't be able to trace their misfortune to your action.
01:31:20.000 That's not... Not only is that unproductive, but it's counterproductive.
01:31:24.000 It's harmful.
01:31:25.000 So how do you address these questions?
01:31:29.000 Well, A, this is a much harder problem if you imagine that what you want to do is fix the landscape that you're walking into and say, you're a rent seeker and you're productive and you're 30% productive but 70% a rent seeker.
01:31:46.000 Nobody believes you can do that.
01:31:48.000 What you can do is restructure things so that going forward, what is rewarded is actual productivity that is not harmful or actual innovation that is not harmful.
01:32:00.000 And what is penalized and what you really want, if the system is to function...
01:32:06.000 what you want is a disincentive to do anything that hurts other people, that has a net negative impact on the system.
01:32:15.000 Like the BP oil spill, for instance.
01:32:17.000 BP oil spill, the Aliso Canyon leak, the Three Mile Island, Fukushima to the 10th degree.
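The incentive restructuring being described, rewarding net-positive activity and penalizing harm, can be sketched as an externality charge in the spirit of a Pigouvian tax (the function, its parameters, and all numbers here are invented for illustration):

```python
# Toy externality charge: price the harm an activity imposes on others,
# so the activity only stays profitable when its net impact is positive.
def net_reward(private_profit, external_harm, harm_price=1.0):
    """Reward after the external harm is charged back to the actor."""
    return private_profit - harm_price * external_harm

# A genuinely useful widget with no spillover keeps its full reward;
# profitable-but-harmful disposal (the toxic-waste example) goes negative.
clean_widget = net_reward(private_profit=100, external_harm=0)     # 100.0
toxic_dumping = net_reward(private_profit=100, external_harm=250)  # -150.0

assert clean_widget > 0 > toxic_dumping
```

The hard part, as the conversation immediately acknowledges, is measuring `external_harm` at all, which is exactly why undetectable harms like untraceable waste are the worst case for any such scheme.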
01:32:26.000 So how do you...
01:32:27.000 I'm still confused as to what's different.
01:32:29.000 Like, what's going on here?
01:32:32.000 What is going on is that if you...
01:32:37.000 If you array incentives so that, at the point you have solved a problem that is good to solve, the well-being starts flowing in your direction.
01:32:48.000 You made a perfect widget.
01:32:49.000 Everybody goes, oh my god, this fixes my life.
01:32:52.000 Right.
01:32:52.000 I love this widget.
01:32:53.000 I'm going to buy a bunch of them.
01:32:54.000 Then you start balling.
01:32:56.000 You get a Paul Wall grill.
01:32:57.000 There you go.
01:32:58.000 Oh, yeah.
01:32:59.000 Exactly.
01:32:59.000 Big house.
01:33:00.000 Exactly.
01:33:01.000 But as you start moving in the direction of doing something that interferes with other people's well-being, but nonetheless they can't help themselves, if you're innovating how to addict people to their phone, then actually you're hurting people.
01:33:14.000 And we don't want you to do that.
01:33:16.000 Now, it's very hard.
01:33:16.000 Are you going to tell Facebook what it's allowed to study?
01:33:22.000 They have good intentions, but it turns out... like, Facebook is a perfect example.
01:33:27.000 One of the executives from Facebook, I was just reading on Digg the other day, there was an article where he was sorry for what they've done.
01:33:35.000 It was one of the original guys from Facebook.
01:33:37.000 It's like, I really think that we've done a terrible thing with Facebook, and we've made people addicted to social media.
01:33:42.000 Facebook seems to me to be particularly addictive, and I'm not exactly sure what they've done different than anybody else, but so much so that I kind of avoid Facebook.
01:33:51.000 You know, it's funny how many conversations I've had in the last month in which somebody has said that they're avoiding Facebook, including me.
01:33:57.000 I can't go there anymore.
01:33:59.000 I like Instagram, because I just see pictures, and they look pretty, and I'm simple.
01:34:04.000 Well, yeah, I mean, each of these things has their value.
01:34:08.000 I'm not on Instagram.
01:34:09.000 I'm on Twitter.
01:34:11.000 I'm on Twitter, too.
01:34:12.000 For me, what's good about it... is this it?
01:34:14.000 "You're being programmed," former Facebook executive warns.
01:34:17.000 For me, for my business, it's very important to let people, hey, Brett Weinstein is going to be on today.
01:34:23.000 Stein, man.
01:34:24.000 It matters this month.
01:34:25.000 Sorry, Harvey fucked everything up.
01:34:26.000 Yes, exactly.
01:34:27.000 He did.
01:34:27.000 Stein.
01:34:27.000 I said it right earlier.
01:34:29.000 Brett Weinstein is going to be on today.
01:34:31.000 People tune in and now there's people that are listening right now because of social media.
01:34:36.000 And then comedy shows that I have coming up.
01:34:39.000 It works for me in a lot of those ways.
01:34:41.000 But as time has gone on, I've pushed it away in most other ways.
01:34:48.000 Well, and the thing, the interview that you just referenced, I saw it too and was quite blown away by it.
01:34:54.000 I've been tuned into that because Tristan Harris is a friend of mine, and Tristan Harris is sort of the Paul Revere on this issue who has pointed out how much danger we are actually in.
01:35:07.000 And I must say, he's a very interesting guy because his other area of expertise is magic, and so he's very interested in illusion.
01:35:15.000 Ah, sleight of hand.
01:35:16.000 Yeah, exactly.
01:35:17.000 Interesting.
01:35:17.000 And so anyway, he's watched as an insider as these economic Goliaths have conspired to not let us go and to turn their product from a facilitator of social interaction into a cigarette, which is, you know, or a slot machine or something like that.
01:35:34.000 But the...
01:35:37.000 How would you penalize them then?
01:35:40.000 Like, say, in this new system, how would that work?
01:35:42.000 Say if you came up with this new widget, and this new widget does amazing things, but turns out it also makes you addicted to widgets.
01:35:50.000 Okay, so I'm speaking only for myself here.
01:35:53.000 I would not penalize Facebook or Twitter, but what I would want to see is somebody generate the alternative that has the benefit of not doing that to you.
01:36:05.000 Okay, so a new Facebook that doesn't work with likes and all these things, you're not constantly checking on likes.
01:36:12.000 But it wouldn't be, here's the thing, it wouldn't be as successful.
01:36:15.000 People love likes.
01:36:16.000 That's why girls stick their butt out in those pictures.
01:36:18.000 They want to get those likes.
01:36:19.000 That's what that's all about.
01:36:20.000 Is that what that's all about?
01:36:21.000 It is.
01:36:22.000 That's what it is.
01:36:23.000 That's interesting, yeah.
01:36:24.000 They've said, first of all, that people that do that...
01:36:28.000 You know what those things are called?
01:36:29.000 Which things?
01:36:30.000 When girls stick their butt out and they have thirst traps.
01:36:34.000 Thirst traps?
01:36:35.000 Yeah.
01:36:35.000 You don't know about that?
01:36:36.000 I don't know.
01:36:36.000 You teach kids you don't know about thirst traps?
01:36:37.000 I don't know about thirst traps.
01:36:39.000 I'm going to tell you.
01:36:40.000 All right.
01:36:40.000 Thirst traps are you look for people that are thirsty.
01:36:43.000 They're like, ooh, girl, you look good.
01:36:45.000 Damn, you look good.
01:36:46.000 People who are like extra thirsty.
01:36:49.000 You know what thirsty means?
01:36:50.000 I'm getting it, but go ahead.
01:36:51.000 You never got that before?
01:36:52.000 No.
01:36:52.000 You ever heard of thirsty?
01:36:53.000 Uh-uh.
01:36:53.000 Okay.
01:36:53.000 You're married.
01:36:54.000 You've been around...
01:36:56.000 Intellectuals trapped up in progressive Pacific Northwest.
01:36:59.000 You're not going to embarrass me.
01:37:00.000 I'm not!
01:37:01.000 You're my friend.
01:37:02.000 No, don't worry.
01:37:02.000 Thirsty is people who are trying too hard.
01:37:08.000 Like, you're not the type of person, if you saw someone who's a beautiful girl who's in a bikini, you would say, wow, that is a beautiful girl in a bikini.
01:37:16.000 What an incredible body she has.
01:37:17.000 And you would move on.
01:37:18.000 You wouldn't be like, damn, girl, you look so fine.
01:37:21.000 How can I get with you?
01:37:22.000 If you did, you would be super thirsty.
01:37:26.000 That would be thirsty.
01:37:27.000 You're trying too hard.
01:37:29.000 But the internet is filled with thirsty people.
01:37:33.000 And so a lot of these girls become famous.
01:37:37.000 There's girls that you've never heard of them.
01:37:40.000 I had a bit about it in my last special.
01:37:44.000 There was a girl, all she does is take pictures of her butt.
01:37:47.000 And at the time, she had like 7 million followers.
01:37:50.000 Now I bet she's got 100 million or something.
01:37:52.000 I don't know.
01:37:52.000 But that these people become these places where everybody goes to stare at their butt.
01:38:01.000 And these pictures of them and their bodies and all these different things are traps for all these weird people that lack normal social skills.
01:38:12.000 And they're uber-thirsty.
01:38:15.000 That's a thirst trap.
01:38:17.000 So that's the whole reason why these people use things like Instagram.
01:38:22.000 Thirst trap!
01:38:24.000 How to thirst trap on Instagram with Cardi B. See?
01:38:27.000 Uh-huh.
01:38:28.000 See, I'm talking about the kids today.
01:38:30.000 Okay.
01:38:31.000 This girl.
01:38:31.000 I'm beginning to think maybe we can't save the world, but...
01:38:34.000 Listen, we can save the world, but the problem is we have to be cognizant of normal human desires and, like, the traps.
01:38:44.000 Thirst traps.
01:38:45.000 Okay.
01:38:45.000 So this is actually the perfect place to go, then, because...
01:38:49.000 One of the biggest obstacles to fixing the world is that although a huge fraction of the population is actually aware that things are off and they would like it to be better, there's so much low-level stuff that keeps us trapped in these unproductive kinds of cycles.
01:39:09.000 And one of the things that I keep running into now, I'm now being included in all of these conversations with folks who...
01:39:16.000 Do aspire to something better.
01:39:19.000 But everybody, and I mean really just about everybody, has stuff that to them is sacred and they want to take it off the table, right?
01:39:29.000 They're very interested in the conversation about how we might fix the world.
01:39:32.000 But, you know, if they're a libertarian, the point is, you can't even finish the word "regulate."
01:39:38.000 And they're just like, oh, well, sorry, you know, who watches the watchers?
01:39:43.000 And the point, and it's a bitter pill for just about everybody who's got some sacred thing that they're holding on to, is that if we deploy something that functions well and is capable of replacing the system we have, without some gigantic catastrophe being necessary in order to get over the transition,
01:40:08.000 The whole point is to everyone's net benefit.
01:40:14.000 If liberty is your thing, and I'm virtually sure liberty is your thing as it is my thing, then you will get more liberty.
01:40:24.000 Net liberty will go up in a system that functions well.
01:40:28.000 Many of the things that cause us not to be free have nothing to do with governmental regulation.
01:40:34.000 They have to do with expectations that have been created by a market that does not have our interests at heart.
01:40:41.000 And so if you're tracking net liberty...
01:40:59.000 To wrap their minds around the possibility of regulation that they wouldn't hate.
01:41:04.000 We are all so experienced now living in a world of malignant government where government action almost can't be useful and so it is natural to rebel against it and say I don't want any more of that.
01:41:21.000 The less the better, because the actions tend to be predatory. But that is not the inherent nature of regulation. And so constructing a set of incentives that cause the market to deliver you the good parts of what a phone does, without secretly addicting you to something that, we now know...
01:41:42.000 I don't know how many people in Silicon Valley have now issued a note of caution.
01:41:48.000 Many have switched to flip phones themselves.
01:41:51.000 They have.
01:41:51.000 I mean, this is nature's way of telling you that these algorithms have escaped our control.
01:41:58.000 The fact that the people who are in a position to make a phone call and know more or less what the algorithm does can't even protect themselves...
01:42:06.000 That ought to set off warning bells for us.
01:42:09.000 Those people are in the best position to protect themselves.
01:42:11.000 And the fact that they are bending over backwards, they are externalizing decision-making power, they're having their secretaries tell them when they can interact with certain sites in order to keep them from getting into habits that they can't manage.
01:42:25.000 This is the only warning we're going to get.
01:42:28.000 This is bad.
01:42:29.000 We could still do something about it.
01:42:31.000 But this is only getting more sophisticated.
01:42:33.000 And so if we do want to restructure things, and I would argue that even though market fundamentalists will hear their little sacred thing being challenged, what we really want to do is free markets to do what the brochure says that they do.
01:42:52.000 While eliminating what the brochure never mentions.
01:42:55.000 The brochure doesn't mention the fact that a totally free market produces predators and parasites at a huge rate, right?
01:43:04.000 It doesn't have to.
01:43:05.000 We can structure things such that a predator is not viable, so that a predator has nothing to eat.
01:43:11.000 And if the predator has nothing to eat, the habitat won't have them, right?
01:43:15.000 So that is the perspective.
01:43:15.000 But how would we do that?
01:43:22.000 These are abstract ideas, right?
01:43:24.000 How would we eliminate predators?
01:43:27.000 How would we eliminate predators?
01:43:29.000 You would eliminate predators by disincentive...
01:43:37.000 Your question actually has a hidden assumption built into it.
01:43:42.000 You've seen the market as a mature entity with lots of full-grown predators.
01:43:47.000 But just as it is with biology, all of those predators started with something simple.
01:43:53.000 And what happened was they tapped into a niche.
01:43:58.000 And because that niche was allowed to exist, the predators grow and they get more and more sophisticated at doing what they are doing.
01:44:07.000 If you don't want to see the predators, you eliminate the niche for predation.
01:44:12.000 This sounds like it would be functional if there was like 100 people.
01:44:16.000 Well, first of all, this is one of the primary questions in the various conversations where people are trying to figure out how to bootstrap such a thing, is that we have what's called Dunbar's number.
01:44:28.000 And Dunbar's number is basically a limit in the low hundreds of how many people you can keep in your head.
01:44:34.000 Right.
01:44:35.000 And so the point is, we are adapted to that.
01:44:37.000 And that number is probably an indicator of something like the number of people that you can adaptively interact with.
01:44:45.000 I mean, in other words, if you had, I mean, this is really the motivation for your question.
01:44:50.000 If you had 150 people and somebody was a bad actor, their reputation would precede them and you would detect, I should be careful interacting with that person.
01:44:59.000 So the structure would be set up for tribes, which is essentially how we evolved, right?
01:45:04.000 I mean, that's what Dunbar's number essentially reveals, is that we evolved growing up in groups of 50 to a couple hundred people, and those are the amount of people that you can keep in your mind.
01:45:16.000 We have hard drive space, essentially.
01:45:18.000 We do, but we also, human beings are very good at taking a technological solution and kludging or hacking a remedy for a problem like that.
01:45:33.000 You on board, in your mind, have the ability to track something like 150 individuals with respect to their reputation so that you know how much to trust in any given interaction.
01:45:44.000 In a group of, you know, 1,500, you don't.
01:45:48.000 On the other hand, reputation can accumulate in some way that you can check it through people who you know.
01:45:54.000 Right.
01:45:56.000 People who are directly known to you are capable of giving you a reference. And in fact, you know that this works, because interpersonally, if somebody you trusted to a great extent gave you a recommendation of somebody else, you would know how to evaluate it. So we are,
01:46:14.000 unfortunately, for both better and worse, living in a technological landscape that doesn't look like anything that our ancestors faced.
01:46:24.000 That provides mechanisms for building solutions to problems that in an ancestral environment would not have been possible.
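The reputation mechanism described above can be sketched as code. This is a hypothetical illustration of transitive, trust-weighted reputation, with made-up names and scores, not anything proposed in the conversation itself:

```python
# A minimal sketch of the "reputation through people you know" idea:
# beyond Dunbar's number you can't track everyone directly, but you can
# ask trusted intermediaries and discount their opinions by how much you
# trust their judgment. All names and numbers here are made up.

def estimated_reputation(me, target, trust, reputation):
    """trust[a][b]: how much a trusts b's judgment (0..1).
    reputation[a][b]: a's direct opinion of b (0..1), if any.
    Returns a 0..1 estimate, or None if no information is reachable."""
    # Direct knowledge wins when it exists.
    if target in reputation.get(me, {}):
        return reputation[me][target]
    # Otherwise, poll the people I trust and weight their opinions.
    weighted, total = 0.0, 0.0
    for friend, w in trust.get(me, {}).items():
        if target in reputation.get(friend, {}):
            weighted += w * reputation[friend][target]
            total += w
    return weighted / total if total else None

# Hypothetical example: "joe" has no direct opinion of "tristan", so
# the estimate blends the opinions of two trusted contacts.
trust = {"joe": {"bret": 0.9, "jamie": 0.8}}
reputation = {"bret": {"tristan": 0.95}, "jamie": {"tristan": 0.75}}
print(estimated_reputation("joe", "tristan", trust, reputation))
```

The design choice mirrors the point in the dialogue: a recommendation is only as good as your trust in the recommender, so each report is weighted before averaging, and the function admits ignorance (None) rather than guessing when no trusted path exists.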
01:46:35.000 And this is not new.
01:46:36.000 I mean...
01:46:37.000 The library at Alexandria is a technological solution to the problem of information having expanded to a level that a human mind couldn't hold it.
01:46:47.000 And ironically, burned down by ideologues.
01:46:50.000 Well, sure.
01:46:51.000 I mean, that's, you know, yes.
01:46:53.000 So you have to build a structure that is robust to challenge.
01:46:57.000 And of all of the things that would have to be true for a replacement system for planet Earth, the key is understanding that there are certain values that all reasonable people agree on.
01:47:09.000 In fact, you can use to diagnose who's reasonable.
01:47:12.000 So assuming that we don't all start at the same starting point, right, whether it's from our cognitive ability or education or opportunities, how would you stop predators?
01:47:23.000 How would you stop people from preying on the weak?
01:47:26.000 How would you stop, like...
01:47:28.000 Because there are predators who they themselves are unfortunate mentally.
01:47:34.000 They themselves have a deficit of thinking.
01:47:38.000 And we're dealing with such numbers when we're dealing with 300 and whatever million people we have in this country alone.
01:47:44.000 There's plenty of dummies that you could prey on.
01:47:47.000 Well, the first thing that you would want to do is you would want to build.
01:47:50.000 So if you came to me and you said, I have a social network and it provides 85% of the functionality of Facebook, but it insulates you completely from dopamine traps being used to addict you,
01:48:07.000 I'd sign up in a second.
01:48:09.000 Okay, right.
01:48:10.000 But you are an intellectual and you are rare in that regard that you're worried about this.
01:48:16.000 These people that are in Silicon Valley that are talking about the dangers of the things they've created themselves, they're rare.
01:48:22.000 Most people are like, look at all the likes!
01:48:24.000 Right, sure.
01:48:25.000 Look at all these likes!
01:48:26.000 But, well, but, okay, so there's some...
01:48:29.000 Plus they're bored at work, right?
01:48:30.000 They want to check their likes?
01:48:31.000 Yes, but I mean...
01:48:32.000 Look at my likes for my butt pic.
01:48:34.000 That's a lot of it.
01:48:36.000 You must have a nice butt.
01:48:37.000 My butt's not bad.
01:48:38.000 Okay.
01:48:39.000 But the...
01:48:41.000 First of all, things spread in waves.
01:48:46.000 And, you know...
01:48:49.000 It is possible to just look at the landscape and say, well, yeah, only intellectuals are going to get why they should want such a thing.
01:48:55.000 But it's really, it's not accurate.
01:48:56.000 I mean, for one thing, the world is talking about blockchain currency.
01:49:00.000 The world is not talking about blockchain currency.
01:49:02.000 A very small percentage of the world is talking about blockchain currency.
01:49:05.000 The same amount that are talking about the earth being flat.
01:49:09.000 I don't think that's right.
01:49:11.000 I bet it is.
01:49:12.000 But I would say, I grant your point, that it's not a huge percentage of people who are talking about blockchain currency.
01:49:18.000 But it's enough that on the network news, people are talking about blockchain.
01:49:21.000 Is it a bubble?
01:49:23.000 So it is beginning to penetrate the public consciousness.
01:49:29.000 Penetrating it because people have to navigate lives in which economic fluctuation jeopardizes them.
01:49:36.000 And so there's an incentive.
01:49:37.000 If there's something over there, blockchain is not a joke anymore.
01:49:41.000 Do you think that blockchain, the people that are involved, interested, and comprehensively understand what blockchain is, are there more or less of them than people that are in cults?
01:49:55.000 There are vastly more paying attention.
01:49:59.000 Than people in cults.
01:50:00.000 Yeah.
01:50:01.000 Well, I mean, you'd have to define a cult.
01:50:02.000 Yeah, you'd have to define a cult.
01:50:03.000 See, because I think that's way off, because I think there's like a billion Catholics.
01:50:07.000 Well, if you're going to call Catholicism a cult, then.
01:50:09.000 It's a cult.
01:50:09.000 I grew up in it.
01:50:11.000 Okay.
01:50:11.000 It's 100%.
01:50:12.000 It's just a cult with a billion people.
01:50:14.000 Whenever you've got a guy who dresses like a wizard, he's sitting on a golden throne.
01:50:18.000 Oh, boy.
01:50:19.000 We're here, aren't we?
01:50:20.000 Yeah.
01:50:21.000 But that is why this is a problem, is that there's many people that live their lives by these...
01:50:30.000 Ridiculous ideologies that are illogical.
01:50:33.000 Okay, so I'm gonna challenge you on this.
01:50:36.000 So how would you protect them from predators and Facebook likes?
01:50:39.000 I'm gonna challenge you on this.
01:50:41.000 Okay, please do.
01:50:41.000 Catholicism is not a cult.
01:50:43.000 What is the difference between a cult?
01:50:45.000 I had a bit for my act.
01:50:47.000 A cult is bullshit, and it's created by one person, and he knows it's bullshit.
01:50:53.000 In a religion, that guy's dead.
01:50:56.000 Okay.
01:50:58.000 So let me...
01:50:59.000 Boy, this is rough territory.
01:51:01.000 For you.
01:51:02.000 Okay, let's say...
01:51:03.000 For me, this is every day.
01:51:04.000 Let's say, yeah.
01:51:05.000 All right.
01:51:06.000 Let's say Moses comes down the mountain with the tablets.
01:51:08.000 Oh, that dude.
01:51:09.000 Okay.
01:51:09.000 I don't know if Moses did come down the mountain with the tablets.
01:51:11.000 I don't think he did.
01:51:12.000 Let's say he did.
01:51:12.000 But okay.
01:51:13.000 Did he know he was bullshitting people?
01:53:15.000 Well, do you know what religious scholars in Jerusalem actually believe that was all about, the whole burning bush?
01:51:21.000 Tell me.
01:53:21.000 They believe it was the acacia bush, which is rich in DMT, and they think the metaphor of the burning bush was actually a psychedelic experience, and that Moses, during this psychedelic DMT experience, came back from the other dimension that you go into in the DMT trance with all these really standard messages that I myself have gotten from these psychedelic experiences: that you have to treat each other as if we're all one, that our separations are all illusions, that you are literally living
01:51:52.000 a life such that if I was born in your body and I had your genetics, I would be you and you would be me, because we are all the same, and our differences are really what the illusion is. We are these temporary beings. And negative thinking and negative feelings and all these things manifest themselves in negative actions and negative thoughts.
01:52:11.000 And you can change that.
01:52:13.000 You can change the frequency in which you exist in this world.
01:52:15.000 Okay.
01:52:15.000 Like, this is essentially what the 3,000-year-old version of Moses, or more than that, Moses' tablets were.
01:52:22.000 That they were...
01:52:24.000 God was the burning bush.
01:52:26.000 Perfect.
01:52:27.000 Perfect.
01:52:28.000 So let me ask you a question.
01:52:29.000 You've done some hallucinogens.
01:52:31.000 Oh, yeah.
01:52:31.000 And you've had some insight.
01:52:32.000 Yeah.
01:52:33.000 And that insight had something to do with treating people well.
01:52:36.000 Yes, it definitely did.
01:52:37.000 Okay.
01:52:37.000 Was it true?
01:52:39.000 Well, it definitely has benefited me, true or false.
01:52:43.000 If you ask me, is it true that you have had positive experiences from psychedelic drugs where you have interpreted those experiences and improved your life?
01:52:53.000 Yes, that's true.
01:52:54.000 Okay.
01:52:54.000 So if you now go and you convey that thing to somebody who hasn't had the experience themselves, is it bullshit?
01:53:02.000 Well, here's the thing about psychedelics as opposed to all the other ideologies is that they're very repeatable.
01:53:07.000 You don't have to believe in DMT. If you smoke it, you're going to experience it whether you believe in it or not.
01:53:11.000 Right.
01:53:12.000 And actually, I think this is, I mean, I am a cautious fan.
01:53:17.000 I say cautious because I don't think, I'm not a fan of the idea that these substances should be used recreationally.
01:53:24.000 I think that's a mistake.
01:53:25.000 I agree with you.
01:53:25.000 I think it's fine to have a great time, but these things are so powerful...
01:53:25.000 The only thing that I like about doing it recreationally is it's going to get more people to do it.
01:53:40.000 And that if you think it's recreational and then you do it, there's going to be a certain percentage of those people that go, what was that?
01:53:46.000 That's not what I thought it was.
01:53:47.000 I thought it was going in there to have a good time, and I just communicated with God.
01:53:52.000 Yep.
01:53:53.000 Air quotes God.
01:53:54.000 Yeah, air quotes.
01:53:55.000 Well, but air quotes God is marvelous because air quotes God is the real deal.
01:54:00.000 Yeah.
01:54:01.000 Right?
01:54:01.000 In other words, it's not a dude on a cloud.
01:54:03.000 Right.
01:54:04.000 It's something, it is a metaphor for something lodged very deep in the mind where you can't find it directly.
01:54:10.000 And this is a hack that many cultures have used to access that layer.
01:54:15.000 Right.
01:54:15.000 And it's all very familiar to us.
01:54:16.000 And when you talk to people that have studied DMT in particular, one of the reasons why I think it's so familiar to us, when you have this experience, one of the first things that happens is you feel like you've been there before.
01:54:28.000 And they believe that this is because during REM sleep, your brain produces DMT. It's very difficult to monitor, but they have been able to, through the Cottonwood Research Foundation, which all started from the work of Dr. Rick Strassman out of the University of New Mexico, who wrote a book called DMT: The Spirit Molecule,
01:54:46.000 Which was one of the very first times where the DEA allowed them to do clinical studies on people with intravenous dimethyltryptamine, which is like fucking serious shit.
01:54:57.000 So instead of like this 10 to 15 minute trip, you're gone for a long time, half hour plus and deep, deep experiences that a lot of these people mirrored.
01:55:07.000 They had like super similar experiences.
01:55:12.000 But through the Cottonwood Research Foundation, they found that live rats are producing DMT in their pineal gland.
01:55:18.000 This has been proven now, which was really just speculation.
01:55:22.000 There was anecdotal evidence, but now they know that rats produce this.
01:55:26.000 So they don't know exactly when people produce it, because they would have to do the same thing they do to rats.
01:55:30.000 They'd have to open your brain up until they develop some sort of sophisticated detection methods.
01:55:34.000 It's just speculation as to when the brain's producing this incredibly potent psychedelic drug.
01:55:41.000 It's there.
01:55:42.000 We know it's producing it.
01:55:43.000 We know it's produced in the liver.
01:55:44.000 It's produced in the lungs.
01:55:46.000 It's endogenous to the human system.
01:55:48.000 And we don't know why.
01:55:49.000 Yeah.
01:55:50.000 Well, I'm pretty sure I have a good insight into why.
01:55:52.000 Okay.
01:55:53.000 But for the moment, let's pursue the issue of what the implications are.
01:55:57.000 Okay, I'll hold on to why.
01:55:58.000 I'll cross my fingers.
01:55:59.000 Yeah, cross your fingers on that one.
01:56:01.000 The story you've just told is perfectly plausible, whether it's 100% accurate or not, and we don't know.
01:56:08.000 Right.
01:56:24.000 It's all very interesting.
01:56:26.000 You're not tripping in your sleeping mind for no reason at all.
01:56:30.000 You're doing it for productive reasons.
01:56:46.000 Right?
01:56:46.000 And that means that we have carried this with us from an ancestral state into the modern state and we now have molecules that we can trigger it when we want to.
01:56:56.000 That gives you access to a style of thinking that you're telling me has altered your understanding of your relationship to other people and that it metaphorically lines up with what you often hear delivered in religious terms.
01:57:11.000 Right?
01:57:13.000 Abstractly, yes.
01:57:14.000 Right.
01:57:15.000 And so, when you say Catholicism is a cult, I don't agree, because Catholicism historically must have been delivering messages that caused people to correct their thinking in ways that made them collaborate more effectively,
01:57:33.000 that made them better able to find the opportunities in their environment.
01:57:37.000 I'm not advocating that we should sign up for belief systems that are at odds with our modern environment.
01:57:47.000 But one thing we can say I believe for sure is that religions that have stood the test of time did so because their value to the people who believed in them was so great that those that disbelieved were outcompeted.
01:58:04.000 Now, so we get into trouble in the modern circumstance because we can look at many of the teachings of any of these ancient religions and we can compare them to what we learn scientifically and detect that there's something not right.
01:58:18.000 Can I stop you there?
01:58:18.000 Sure.
01:58:19.000 Scientology, is that a cult?
01:58:21.000 Too early to tell.
01:58:25.000 Okay.
01:58:25.000 But let's stop.
01:58:26.000 Because we know the guy who created it.
01:58:28.000 Well, we do.
01:58:29.000 So let me drag you back.
01:58:32.000 By the way, I'm very uncomfortable with Scientology and what it does.
01:58:35.000 But the problem is, as Scientology itself points out, if you looked at the inception of something like the Catholic Church, you might be equally troubled.
01:58:45.000 Sure.
01:58:46.000 Which is why I think they're both cults.
01:58:48.000 Well, but let's be careful about that.
01:58:50.000 Okay.
01:58:50.000 What do you think a cult is?
01:58:52.000 What do you think a cult is?
01:58:53.000 Well, I think a cult is the predatory version.
01:58:56.000 It is tapping into people's natural tendency to believe in what I call metaphorical truths, and it is using it very often to extract resources from them.
01:59:08.000 Like the Catholic Church.
01:59:10.000 No, not like the Catholic Church.
01:59:12.000 The Catholic Church is so long-standing and the population that has – I guess what I would say is if a population succeeds by believing in these things, then cult is not the correct – I think you and I have different terms.
01:59:27.000 We're using different definitions of the word cult then.
01:59:30.000 But let me take – there's a very interesting comparison.
01:59:34.000 So Joseph Smith, who started the Mormon Church, had a competitor.
01:59:38.000 Right.
01:59:39.000 He had a competitor at the time.
01:59:42.000 There's a book called The Kingdom of Matthias, right, about his competitor at the time.
01:59:47.000 And to me, the two looked equally plausible, the story that they were selling.
01:59:51.000 Now, Matthias never had more than 30 followers, and his religion died out. Joseph Smith won, and the Mormon Church is obviously a real thing.
02:00:02.000 But these sets of beliefs are advanced by somebody, whether those somebodies are cynical when they do it or whether they are earnest.
02:00:13.000 I think many of these, the ones that we have...
02:00:13.000 Right.
02:00:34.000 Okay.
02:00:35.000 Yeah.
02:00:36.000 So the origins of the Catholic Church most likely came from some desire for order and a scaffolding of how to behave and to give people rules and structure for how to get through this life with the most amount of positivity and love.
02:00:53.000 And by disciplining them and having these...
02:00:57.000 grave punishments being held over their heads, burning in the fires of the pits of hell if they decide to have sex with another man, or wear two different types of cloth, or whatever the other silly things that were in the Old Testament. By doing this, what they've essentially tried to do was offer people structure.
02:00:57.000 No,
02:01:16.000 it's not really structure.
02:01:18.000 It is a... I think I'm going to go...
02:01:42.000 I think it was smoke.
02:01:44.000 Oh, he smoked it?
02:01:44.000 That's why the whole thing, the burning bush.
02:01:46.000 Let's say he smoked it.
02:01:47.000 Let's say he was super savvy and farsighted.
02:01:50.000 And, you know, obviously he didn't know anything about molecules.
02:01:53.000 Right.
02:01:54.000 But suppose that he had some ancient model of something.
02:01:58.000 There must have been something in that plant that caused...
02:02:01.000 Crazy things to happen.
02:02:02.000 Let's suppose he didn't believe that he had contacted God or that God had contacted him.
02:02:07.000 But he woke up from the thing and was like, wait a minute, I know what these people are doing wrong.
02:02:13.000 I'm going to write it down and tell them it came from God.
02:02:16.000 Right.
02:02:17.000 Is Judaism now a cult?
02:02:19.000 I think they're all cults.
02:02:21.000 I think all ideologies are cults.
02:02:24.000 I just don't know.
02:02:25.000 I don't think there is any one person... First of all, when you're dealing with Christianity, you're dealing with translations, right?
02:02:36.000 Or Judaism.
02:02:37.000 You're dealing with ancient translations of languages that aren't even spoken anymore.
02:02:42.000 And you're also dealing with an oral tradition of who knows how many years before it was ever bothered to be written down.
02:02:49.000 Right.
02:02:50.000 But written down by people.
02:02:52.000 And then people decide what stays in and what doesn't.
02:02:56.000 They change the rules.
02:02:57.000 Like the priests used to be able to marry.
02:02:59.000 Priests in the Catholic Church.
02:03:00.000 The Pope used to have wives.
02:03:02.000 Sure.
02:03:02.000 They ran armies.
02:03:04.000 I mean, this is clearly something that human beings have a hand in manipulating and changing, and they do it for the benefit of the structure itself.
02:03:12.000 They're not doing it for the benefit of the human beings that are a part of it.
02:03:15.000 They're doing it for the benefit of the structure itself.
02:03:17.000 Oh, that I don't agree.
02:03:19.000 In what case?
02:03:20.000 With like the money and the amount of power?
02:03:24.000 I'm not saying you don't have corruption in all of these structures.
02:03:27.000 You do.
02:03:28.000 But I am saying that there is a – I mean this is exactly parallel to what we were talking about before.
02:03:34.000 There is the predatory version, which I would call a cult.
02:03:38.000 And there is the earnest version, which I would not call a cult.
02:03:42.000 Now, I'm very uncomfortable with any of these things governing policy in the present, because none of them have a literal relationship with reality that allows them to deal with the fact that we've got all these new problems for which there's no religious wisdom.
02:03:56.000 Maybe there's a problem in the word cult.
02:03:59.000 Maybe we should just say...
02:04:05.000 A structure created by human beings, which is basically all structures, all models of behavior where you have to adhere to certain things.
02:04:18.000 But the problem with religion, and even with a lot of cults, is the supposed grave consequences for deviating.
02:04:26.000 Well, so, alright, let's pick up Catholicism because it's easy, because so much of the structure is visible to us.
02:04:34.000 I would argue that what's true of Catholicism is going to be true of all of Christianity, it's going to be true of Judaism, but Catholicism is...
02:04:43.000 It is easy to see how it would facilitate collaboration, that effectively it would recreate in some sense the insight that you're talking about from DMT, and that it would instantiate it in the population in a useful way that would facilitate collaboration and disrupt processes that cause infighting.
02:05:06.000 Yeah, like what I was thinking before, it might be a structure.
02:05:09.000 It's the scaffolding for human behavior and ethics.
02:05:11.000 Right.
02:05:12.000 But imagine, I mean, we can see it in Catholicism.
02:05:14.000 Sure.
02:05:14.000 We know where it is.
02:05:15.000 So every week, you have to go and confess the shit you're doing wrong to the dude in the box.
02:05:22.000 Okay?
02:05:23.000 That's how they used to spy on you.
02:05:24.000 I mean, that's what that was for.
02:05:25.000 Well, but why are they spying on you?
02:05:26.000 Because they want to make sure that they don't get overthrown.
02:05:30.000 I think you're too cynical.
02:05:32.000 Really?
02:05:32.000 Yeah, because...
02:05:33.000 What do you think they're doing?
02:05:33.000 Well, so first of all, I'm a biologist.
02:05:36.000 Okay.
02:05:36.000 Okay.
02:05:37.000 How are these priests who don't marry passing on their genes?
02:05:41.000 They're not.
02:05:42.000 Oh, they are.
02:05:43.000 Oh!
02:05:43.000 What are they doing?
02:05:44.000 They're sneaky?
02:05:45.000 No.
02:05:45.000 They're facilitating the stability of the lineage that they are in charge of.
02:05:51.000 So the point is, they don't pass on their genes directly.
02:05:53.000 They're passing on their genes indirectly through the population.
02:05:57.000 Their interests are synonymized with the population that they are preaching to.
02:06:05.000 And so you tell the dude in the box what you've been doing wrong.
02:06:08.000 Imagine it's adultery, right?
02:06:09.000 Okay.
02:06:10.000 Oh, God.
02:06:11.000 If I die before next week and I haven't confessed my adultery, I'm going to go to hell.
02:06:15.000 That's bad.
02:06:16.000 Hell is a very unpleasant place.
02:06:18.000 Tell that guy who doesn't get to have sex.
02:06:19.000 I'm going to tell him.
02:06:20.000 So now the person I've been committing adultery with is thinking, oh, shit, well, I was going to keep this a secret.
02:06:24.000 But now the priest is going to know...
02:06:26.000 that I've been committing adultery, because he's going to hear from the other person.
02:06:29.000 And so I better confess, too.
02:06:31.000 So now the priest has a sense of like, oh, there's an adultery problem in the congregation.
02:06:36.000 And the priest is, you know, flipping through the book and thinking, oh, this week, what should we talk about?
02:06:41.000 And so, you know, the priest then gets up at the pulpit and, you know, turn your book to Psalm, whatever, and starts going on about...
02:07:10.000 He's taken a vow of poverty.
02:07:11.000 He's living in the church.
02:07:13.000 He does well when the town does well.
02:07:16.000 He's not having babies of his own.
02:07:19.000 So he can't really get ahead by himself.
02:07:22.000 He gets ahead, genetically speaking, when the town does well.
02:07:26.000 And he's in a position to spot what's going wrong in the town.
02:07:29.000 And he doesn't have a dog in that fight because he's not involved in business.
02:07:32.000 He's not involved in mating and dating.
02:07:35.000 You know, this is all relatively recent when it comes to the Catholic Church in the last couple hundred years.
02:07:39.000 Yes, and I would argue if you look back at any of these traditions, any of the ones that worked will successfully have addressed the question of how you prevent corruption from emerging.
02:07:49.000 Well, apparently it was sexual corruption.
02:07:51.000 These guys were rock stars.
02:07:53.000 Like, the priests were essentially the guys who had the direct line to God, and just like professors have been known to do, not you.
02:08:00.000 But, you know, some have sex with their students, you know?
02:08:03.000 Because their students, like, look at them like, oh my god, I can't believe the professor's sitting down here with my work.
02:08:09.000 I mean, that's a very minor connection in comparison to the connection to God.
02:08:14.000 Anybody who's going to have power is in danger of abusing it.
02:08:19.000 I would say it's interesting that in the Catholic Church and in other traditions where marriage and making money are impossible, these appear to be evolved...
02:08:35.000 What's the word?
02:08:37.000 It's like...
02:08:51.000 Yeah, I think.
02:08:57.000 And if you're, you know, hanging around with cute girls, people are going to look at you funny because you're not supposed to be having sex.
02:09:03.000 So this limits their ability to get away with stuff.
02:09:08.000 I'm not saying it's zero.
02:09:10.000 Obviously, it's not.
02:09:11.000 But the idea is that there will be protections in each tradition for this: that religions that don't successfully protect against abuses of power will succumb in competition to religions that do it effectively.
02:09:26.000 And so you're absolutely correct that all of the changes in religious texts that people believe in, that's all human beings making decisions about what to keep and what to throw out.
02:09:37.000 I'm not arguing anything else.
02:09:39.000 What I am arguing is: those that have been insightful about what to keep, from the point of view of the particular problems faced by the population that they are in, will have a competitive advantage, because they will function more cohesively than a population of atheists that doesn't have somebody looking out to prevent outbreaks of competition inside the lineage or outbreaks of infighting.
02:10:04.000 So if you were looking at it in an objective way, like say if you were an alien from another planet that didn't understand the language and you're just observing the structure, you would see this error correction.
02:10:16.000 The structure would say, oh, they realized there was an issue here with power, and so they error corrected by making these priests be celibate, and then they figured out a way to keep them from having money, and that'll keep them from being invested in, you know, income.
02:10:28.000 Okay.
02:10:29.000 Yeah.
02:10:29.000 And I think it's very misleading to people who analyze things the way you and I would because there's so much hocus pocus associated with these structures that, you know, it's like constantly putting a finger in the eye of an analytical person.
02:10:45.000 Of course.
02:10:46.000 I mean, just the way they dress, right?
02:10:47.000 I mean, just wearing the wizard costume and holding the staff and sitting on the golden throne and all of it.
02:10:54.000 I mean, all the pageantry to it.
02:10:55.000 It's all preposterous.
02:10:56.000 I mean...
02:10:57.000 I agree.
02:10:58.000 Looking at it as a modern person, it looks preposterous to me.
02:11:01.000 On the other hand, there is no way, I'm telling you, I mean, most of my colleagues I'm sure would disagree with this, but I hope to show them to be wrong on this front.
02:11:10.000 There is no way that the huge amount of effort and resource that is invested in these structures was an error.
02:11:20.000 It cannot have been an evolutionary error, because if it was, the huge investment that populations put into these structures is an opportunity for some population that behaves in exactly the same way, except it doesn't make that error, to win.
02:11:34.000 Right, but when you get power, and then you have the momentum of that power overcoming the population, and then you have positions where the behavior patterns are extremely restrictive and you have to behave inside those patterns or there's grave consequences.
02:11:53.000 I mean, you could conceivably run that for a thousand years without any error correction.
02:11:58.000 And that's what you've got with Islam, right?
02:12:01.000 You've got a very ancient form of religion. Michael Shermer wrote a piece about it that's pretty interesting, where he was talking about how it's the only religion that didn't go through the Enlightenment.
02:12:12.000 And he makes these comparisons to, like, the corrections that have been had with other religions as time has gone on that haven't happened there.
02:12:21.000 And you could equate it to resource management.
02:12:23.000 You could equate it to the part of the world in which they live.
02:12:26.000 There's a lot of different ways that you could try to figure out why this happened.
02:12:30.000 But the reality is, That thing has not changed.
02:12:34.000 That structure has not changed for a long time.
02:12:37.000 I think you've hit on exactly the right point.
02:12:42.000 But Islam's mechanism for preventing outbreaks of parasitism was to hard-code the thing.
02:12:52.000 And this is a tragedy because what it means is that where Islam needs to update, it doesn't have the mechanism for doing it.
02:13:02.000 But also has a built-in way to keep people aboard.
02:13:07.000 Yes.
02:13:08.000 Like, if you become an apostate, they can kill you.
02:13:11.000 Right.
02:13:11.000 You can kill apostates.
02:13:12.000 Right.
02:13:12.000 Which is, there's no other religion that we have right now that operates like that.
02:13:15.000 It has draconian punishments.
02:13:17.000 Imagine if Scientology did that.
02:13:20.000 Right.
02:13:21.000 Well, I mean, Scientology obviously has a huge number of completely unacceptable mechanisms to keep people from leaving.
02:13:26.000 But how is that not a cult?
02:13:27.000 Since it was created by a science fiction writer, literally out of nowhere...
02:13:33.000 I would be horrified if I said it wasn't a cult.
02:13:36.000 Right.
02:13:36.000 But you said it's too early to tell.
02:13:38.000 Right.
02:13:38.000 But how is that possible?
02:13:40.000 Okay.
02:13:40.000 So there could be some benefits...
02:13:42.000 In the future, if Scientology continues to evolve and self-correct, it could get to the point where it's a behavior pattern that could be complementary and perhaps even beneficial to people.
02:13:53.000 I do not believe that this is where it is headed.
02:13:55.000 But if a thousand years from now it had flourished and the population that believed these things was successfully growing, you'd have to say, well, there's something in that set of beliefs that's working for these folks.
02:14:11.000 Right.
02:14:12.000 So anyway, I'm not arguing that it isn't a cult.
02:14:14.000 It has lots of hallmarks that make me think so; I'm troubled by it.
02:14:19.000 Well, including much like Mormonism, you know the guy who made it.
02:14:22.000 But with this, unlike Mormonism, which was in the 1820s, we have video.
02:14:27.000 We see L. Ron Hubbard's bad teeth and his fucking captain's outfit on with the medals that he gave himself.
02:14:33.000 You're like, hey...
02:14:34.000 What is this?
02:14:35.000 Right.
02:14:36.000 Oh, you get a planet when you die or something like that?
02:14:39.000 It sounds like a cult to me.
02:14:40.000 On the other hand, you know, I mean, we've got...
02:14:42.000 Unless the Mormons get a planet when you die.
02:14:43.000 Sorry.
02:14:44.000 I'm confused.
02:14:46.000 Scientologists don't get a planet?
02:14:48.000 No, no, the Mormons do.
02:14:49.000 And it was in one of the Osmond brothers' albums.
02:14:53.000 They all went to these planets in the album.
02:14:56.000 I forget what the album is.
02:14:58.000 It's a hilarious album, but if you open it up, it's all planets.
02:15:01.000 And the album, the name of the album is this concept that you get a planet when you die.
02:15:09.000 This seems stingy to me because the universe is so big.
02:15:13.000 We can each afford to get a planet.
02:15:15.000 There it is.
02:15:15.000 The plan.
02:15:16.000 That's it.
02:15:17.000 See?
02:15:18.000 That is some Joseph Smith shit.
02:15:20.000 By the way, it's a wonderful book.
02:15:23.000 If you read the actual origins of Mormonism, people that are listening, they go, wait a minute, what do they believe?
02:15:28.000 Wow, did they believe some wacky stuff.
02:15:31.000 But he had a golden, or a seer stone rather, that allowed him to read the golden tablets.
02:15:38.000 The seer stone was a special magic rock that allowed him to read these golden tablets that contained the lost work of Jesus.
02:15:45.000 Was that the inside of the album with all the planets?
02:15:47.000 So I believe he was actually illiterate.
02:15:50.000 Yeah, he was a con man.
02:15:51.000 He was murdered.
02:15:52.000 He put a sheet over his head and dictated what he was supposedly reading to somebody else.
02:15:58.000 That's my understanding of it.
02:15:59.000 Well, he was a charismatic person, like many people have created cults.
02:16:04.000 He was.
02:16:07.000 I think what they did right...
02:16:08.000 Mormons are some of the nicest people I've ever met.
02:16:11.000 They have an amazing sense of community.
02:16:13.000 They're super nice.
02:16:14.000 And I've always said that if I was going to join a cult, I think I'd join the Mormons.
02:16:17.000 But where they fucked up is the regulations came in.
02:16:21.000 They started restricting the market.
02:16:24.000 The regulators came in and stopped these people from having nine wives.
02:16:28.000 And then they're not going to be so nice anymore.
02:16:30.000 The reason why they were so nice is because they had these crazy relationships.
02:16:33.000 They were having orgies every night.
02:16:35.000 They had nine people living in the house with them they allowed to have sex with.
02:16:38.000 There's a problem with your "they," because for everybody who had nine wives, there were eight dudes who had none.
02:16:44.000 Those guys got to get their shit together and start their own cult.
02:16:47.000 I guess.
02:16:48.000 Where, you know, well, maybe they're gay.
02:16:50.000 I don't know.
02:16:52.000 What about the women, too, right?
02:16:53.000 How about a woman who has ten husbands, you know?
02:16:56.000 This is bullshit for her.
02:16:57.000 That doesn't happen.
02:16:58.000 Yeah, it should, right?
02:17:00.000 No.
02:17:00.000 It should be possible.
02:17:01.000 No, no, see, there's a very good biological reason.
02:17:03.000 Oh, I understand biologically.
02:17:03.000 It doesn't make any sense, but...
02:17:05.000 You already dig deep into your biological...
02:17:08.000 Oh, man, don't, don't...
02:17:09.000 Don't go polyandrous on me.
02:17:10.000 We went way far away from our original point, which was that you could figure out a way to restructure with this...
02:17:20.000 I don't want to call it Plan B because that's the abortion pill.
02:17:22.000 Game B. Game B. Game B. Well, I want to say there is no Game B at the moment.
02:17:27.000 What there is is a conversation that emerged from this.
02:17:30.000 And the basic point is that one can structure a superior deal to what people are able to work out for themselves.
02:17:38.000 And it can function inside of the system and it can gain...
02:17:42.000 Adherence for the very same reason that people are buying Bitcoin, for the same reason that people have decided to get smartphones, for the same reason people have signed up for Facebook.
02:17:53.000 Those are, with the exception of Bitcoin, game A examples, and they are predatory.
02:18:02.000 But you could provide a version that was not predatory, that would function in a superior way with respect to how it enhanced your ability to function in the world, and rational people.
02:18:15.000 I mean, you know, you make a good point.
02:18:17.000 It's not that we have billions of people who are sophisticated enough to know that they should be looking for that alternative and who will jump on it.
02:18:25.000 But to the extent that you have people who are sophisticated, who are aware that their lives are being disrupted by forces that they are incapable of managing, like these dopamine traps and the like, who will embrace these technologies because they themselves are looking for a mechanism to insulate themselves from the predatory things that have emerged in the market...
02:18:49.000 What you will get is the spread of these technologies.
02:18:52.000 And if there is one sort of key message here about all of the objections that one might raise about the difficulty of creating such alternatives and getting them to be adopted...
02:19:06.000 Right.
02:19:21.000 You pick up some new technology or agreement because it enhances your life, it solves some problem for you in some way that's good, and that causes the thing to be adopted out of self-interest.
02:19:33.000 And that is what the mechanism for change is going to have to look like, is self-interest causing people to embrace a shift in the opportunities and obligations that they are signed up for.
02:19:48.000 Now, I think one of the problems that I have with this is that I always assume that this is going to be like on January the 1st, we're switching over to the new system, but it's not, right?
02:19:59.000 Can't be.
02:20:00.000 Can't be.
02:20:01.000 It's got to be almost like a natural chain of events.
02:20:07.000 It has to be like water flowing downhill.
02:20:11.000 So...
02:20:12.000 I can speak scientifically about my own field because, you know, I did enough training that I know where the bodies are buried.
02:20:21.000 When you're in a field and that field is stuck, it is impossible to move the field.
02:20:28.000 You can't get a hearing that will allow you to change the way the field thinks.
02:20:33.000 But you can step outside the field.
02:20:36.000 You can...
02:20:38.000 Leave the reservation as it were, and you can proceed by other means.
02:20:43.000 And what happens is the same thing that, you know, Lars Anderson discovered with archery, is that once you're no longer signed up for what good form looks like, there are all kinds of ways to accomplish things that are not documented.
02:20:59.000 And so finding those alternatives.
02:21:04.000 Game A is not serving people's needs.
02:21:06.000 We are all unhealthy.
02:21:09.000 You know, even if we find a way to be physically healthy, we are all overwhelmed by so much social noise and so much choice that is meaningless that we end up wasting a huge fraction of our time, spending a tremendous amount of our mental effort on puzzles that aren't interesting or worthwhile or productive.
02:21:29.000 And so each of us has a giant opportunity to upgrade our lives by simply removing a bunch of the noise, by getting the systems that are supposed to function in our interest to do so more effectively.
02:21:44.000 And therefore we are, I hate to use the word consumers, but we would be willing consumers for a better alternative were it to show up for us.
02:21:53.000 And presenting that alternative so that people find it, they experiment with it, and upon discovering that actually, you know what, I am better off when I participate in this way than that way, that that causes adoption.
02:22:06.000 And it doesn't take very much, you know, Bitcoin obviously started with, you know, an ambitious person who set the thing in motion.
02:22:14.000 What's a currency with only one person using it?
02:22:17.000 Right.
02:22:18.000 Right?
02:22:18.000 But now it's not a currency with one person using it.
02:22:21.000 It's a currency.
02:22:22.000 You've used it.
02:22:23.000 I've used it.
02:22:24.000 So anyway, it is possible to get adoption based on the fact that the thing solves problems that people are otherwise stuck with.
02:22:31.000 Do you think that because things are moving so quickly today, it seems to me that new ideas are implemented so fast, new concepts are accepted so quickly, that something like this, where it might have taken several decades, a few decades ago, would only take a few years today?
02:22:48.000 Well, there are two things.
02:22:52.000 One, there is the possibility that things will change very rapidly.
02:22:56.000 And then there is the fact that they must, because the trajectory we're on, we are playing with such powerful technologies and being operated at such a high rate, with such high throughput, that those of us who have started...
02:23:12.000 I think it's a mistake to look at the problems of the world as individual problems.
02:23:17.000 It's much more effective to look at them as symptoms of problems that don't have names.
02:23:23.000 So we have an economic system that generates technologies that create great benefits in the short term at some massive level.
02:23:52.000 And the rate at which we are liquidating the well-being of the planet at present is so fast that effectively we need to change quickly.
02:24:00.000 And it doesn't mean, I guess that's the other thing that I haven't said yet, is nobody, including me, thinks that we're going to be able to spell out an answer in the present that is correct.
02:24:12.000 But what we can do is navigate in the direction of the answer that is correct and we can discover what that answer looks like.
02:24:20.000 In other words, the right model is not the writing down of the new rules of the world and the embrace of them because you get benefits.
02:24:30.000 It is prototyping.
02:24:32.000 what the new structure would be, and then you institute the prototype in some group of people who have signed up for it, and then you discover what you didn't know about it that you needed to know, and you correct those problems, so that what you get is effectively evolution building an elegant solution rather than what progressives often accidentally invite,
02:24:54.000 which is good intentions that produce horrifying outcomes because you didn't know what they were actually going to do once you set them in motion, right?
02:25:03.000 Communism being a great example.
02:25:05.000 You think communism is going to solve the problems of the world.
02:25:08.000 In fact, it creates, you know, massive harm and kills millions, because you didn't understand what it would do in motion.
02:25:18.000 So we don't want to ever face that again.
02:25:21.000 We don't want to be utopians because anytime you engage in utopianism, if you set it in motion, you're going to create a dystopia.
02:25:28.000 It's virtually guaranteed.
02:25:30.000 The way to avoid that is not to imagine that you know the answer.
02:25:33.000 It's to define what objective you want the system to reach and then navigate towards it.
02:25:44.000 How do you think this could be implemented?
02:25:47.000 Well, it requires...
02:25:50.000 Capital, frankly.
02:25:51.000 And it requires capital and people who understand what the problems are.
02:25:58.000 Most importantly, it requires people who understand game theory.
02:26:03.000 Because what we keep doing is setting systems in motion that have the characteristics that guarantee evolution.
02:26:13.000 There are only four of them.
02:26:15.000 And if you set a system in motion that has those four characteristics, you will get adaptive evolution.
02:26:20.000 And what it will produce, you have no control over.
02:26:22.000 It will produce whatever the niche space allows.
02:26:26.000 So you need people who are aware of the game theoretic parameters of the space and who understand, essentially, game theory...
02:26:53.000 use it so you can effectively harness the power of evolution to build a functional system, rather than build a system and then suffer the consequences of evolution that you didn't anticipate.
02:27:04.000 That's what we keep doing.
02:27:06.000 We've built an economy and a political system that evolve out from under us and they create monstrous phenomena that we didn't anticipate because they weren't in the plan.
02:27:16.000 Do you think that it's possible that, like you see the radical change, the social change that's happened just over the last couple months since the Harvey Weinstein thing?
02:27:27.000 Do you see what's happened?
02:27:28.000 I mean, it literally has probably stopped sexual harassment dead in its tracks.
02:27:33.000 For women who work in these workplaces that had to deal with the consequences of these guys, the number of those sexual harassment episodes has probably been radically reduced, just like that.
02:27:44.000 Yeah, is it temporary though?
02:27:45.000 I don't know.
02:27:46.000 It's a good question.
02:27:47.000 It's a good question.
02:27:48.000 I think... because my only concern is that with false accusations, or overreactions, or people that are just not treating this like the incredibly powerful medium that it is,
02:28:06.000 the medium for change, and instead using it to their own benefit, people could get greedy and corrupt this, right?
02:28:12.000 But I think...
02:28:13.000 It's already happening.
02:28:14.000 Sure.
02:28:15.000 I'm sure.
02:28:16.000 You know any examples?
02:28:17.000 Yeah, a couple of them.
02:28:18.000 I mean, I don't want to...
02:28:20.000 Like every other man, I am hesitant to put my weight on the ice with anybody because who knows what you don't know.
02:28:27.000 Right.
02:28:27.000 But I've seen several stories now that I find very disturbing.
02:28:32.000 The first one, and I must say, you know, this was a topic of conversation with me and my friends, as I'm guessing it would have been with you and your friends, and...
02:28:43.000 I feel weird saying this on your podcast, but here it comes.
02:28:49.000 My friends and I who discussed this had a kind of reaction which was, you know, it turns out to have been a really good decision never to grope anybody who wasn't into it, right?
02:29:04.000 Right, of course.
02:29:05.000 That wasn't the reason that we didn't grope anybody.
02:29:08.000 But nonetheless, it turns out to have been a benefit that we're not worried about what's going to emerge.
02:29:15.000 Yes.
02:29:16.000 So that's what I was saying.
02:29:17.000 And then I saw this Garrison Keillor story.
02:29:19.000 Yeah.
02:29:20.000 And suddenly I didn't feel so safe anymore.
02:29:21.000 The Garrison Keillor one is the most disturbing one.
02:29:24.000 It's not, actually.
02:29:26.000 But it's bad.
02:29:27.000 No.
02:29:27.000 There's one more?
02:29:29.000 There's a worse one.
02:29:30.000 Tell everybody the Garrison Keillor story.
02:29:31.000 The Garrison Keillor story.
02:29:33.000 So, by the way, the only thing we have...
02:29:34.000 So he was fired from Minnesota Public Radio.
02:29:38.000 His radio program was not only renamed, but the old program...
02:29:45.000 Right.
02:29:48.000 Right.
02:29:48.000 Right.
02:29:54.000 Right.
02:30:11.000 He had been comforting a woman, and he had leaned in, I guess, to hug her, and put his hand on her back. He said her shirt was open and he touched her back, and his hand went up about six inches, and she recoiled.
02:30:27.000 He apologized in the moment.
02:30:30.000 He then sent her an email apologizing.
02:30:33.000 She said, don't worry about it.
02:30:34.000 It's not a big deal.
02:30:35.000 And he said that he and she were friendly until he got a contact from her lawyers.
02:30:44.000 And how long after the event was this?
02:30:46.000 He didn't say.
02:30:47.000 But my feeling is, let's take the worst possible interpretation, the least generous-to-Keillor interpretation of this event.
02:30:54.000 That he groped her a little bit.
02:30:55.000 Yeah.
02:30:55.000 Maybe he lost his place.
02:30:57.000 Yeah.
02:30:57.000 Maybe he just was like...
02:31:24.000 It's just a back.
02:31:28.000 If there's more, there's more.
02:31:29.000 But if there ain't more, that's really disturbing.
02:31:31.000 And this would not have happened if it wasn't for, like, Harvey Weinstein.
02:31:35.000 Harvey Weinstein, whose...
02:31:37.000 His actions were so disgusting and so egregious and so numerous.
02:31:43.000 So despicable.
02:31:44.000 That it went so far this way that anything even remotely gross got shifted into that category.
02:31:53.000 Right.
02:31:53.000 Like into the Kevin Spacey category.
02:31:55.000 Kevin Spacey's grabbing dicks and acting like a psychopath.
02:31:59.000 Or the Matt Lauer story came out on the same day as the Garrison Keillor story.
02:32:03.000 And that one is truly disturbing, too.
02:32:05.000 I mean, really just, you know...
02:32:16.000 I thought he just had affairs.
02:32:19.000 No, no, no.
02:32:20.000 There's really disturbing stuff, including him...
02:32:22.000 I mean, I want to be a little cautious about this business, about him having a button under his...
02:32:27.000 No, they all have buttons.
02:32:28.000 That's a really common thing at NBC. Oh, it is?
02:32:30.000 Yeah, yeah, yeah.
02:32:31.000 Okay, well.
02:32:32.000 Yeah, see, that's the problem with these...
02:32:33.000 Problem with the story.
02:32:34.000 And also, it's not something you really want to go into and start reading.
02:32:37.000 You feel like you're gross.
02:32:38.000 Right, it is gross.
02:32:39.000 I will say, there was one story...
02:32:42.000 Whether that button is a commonplace thing that's been misinterpreted, I don't know.
02:32:45.000 But there is some story in which some woman came into his office, he had her bend over his desk, he had sex with her, she fainted, and he had his assistant take her to the hospital, which...
02:33:00.000 You know, anyway, I'm uncomfortable now because I can't establish that any of this stuff is true.
02:33:07.000 But I will say the Garrison Keillor story doesn't sound like the Lauer story.
02:33:10.000 It doesn't sound like the Harvey Weinstein story or Kevin Spacey.
02:33:15.000 Or Al Franken.
02:33:15.000 The Al Franken one, one of the women, said that he grabbed her waist.
02:33:19.000 He squeezed the fat around her waist, and she was disturbed because even her husband is not allowed to touch her like that in public.
02:33:28.000 Well, I want to come back to Al Franken separately.
02:33:30.000 I believe we need a very separate category for Franken.
02:33:33.000 But I was going to tell you what was worse than the Keillor story.
02:33:37.000 Go ahead.
02:33:37.000 So the Keillor story is disturbing.
02:33:39.000 When I heard that one, suddenly my feeling of safety, based on the fact that I haven't groped anybody who didn't want to be groped, vanished because suddenly it was open season.
02:33:52.000 But the story that disturbs me even more is the Matt Taibbi story.
02:33:58.000 I don't know who that is.
02:33:59.000 I know who Matt Taibbi is.
02:34:01.000 So Matt Taibbi, Rolling Stone reporter, who's been really excellent at confronting power, especially in the financial sector.
02:34:08.000 And, you know, he's been very consistent on this.
02:34:11.000 Apparently, as a young man, he was in Russia and he was publishing a satirical magazine, I guess it was.
02:34:21.000 And the satirical magazine...
02:34:25.000 So there's the story that came out, which was that he had written all of these things about assaulting women and mocking them.
02:34:37.000 Turned out he didn't do the writing.
02:34:39.000 It was his partner.
02:34:39.000 And that he and his partner were engaged in producing the satirical publication that was in fact mocking the culture of Americans who had gone over to Russia and were snorting tons of coke and living it up as they were corrupting the society that had recently been freed from communism.
02:35:01.000 And so the point is, this is a case in which the evidence against Taibbi was ironic, because it was really Taibbi critiquing this bad behavior amongst other men.
02:35:12.000 And so the reason that this, you know, I wouldn't know what to make of that story except that the person, the journalist who started sorting this out, interviewed both Matt Taibbi's girlfriend who worked at the publication and other women in the office soliciting stories about what had happened in the office.
02:35:29.000 And the women in the office universally reported the opposite.
02:35:29.000 Right.
02:35:45.000 Right.
02:35:56.000 I was, A, going to be pretty surprised if he was behaving this way, but okay, I've been surprised by a few of these.
02:36:01.000 And it wasn't even necessarily behaving, it was writing.
02:36:04.000 It was writing.
02:36:05.000 But the fact that there's no there there, when you pursue the story, the women who were in a position to say, yeah, actually he was kind of a dick, said, oh, the opposite.
02:36:13.000 And he didn't even actually write it.
02:36:14.000 Right.
02:36:14.000 He didn't write it.
02:36:15.000 It was designed to lampoon people who were behaving badly in this context in exactly the way that the Me Too movement should applaud.
02:36:24.000 You retweeted and quoted a woman who wrote something, and I retweeted it as well when you did it, which she said, here's an unpopular opinion.
02:36:33.000 I'm actually not at all concerned about men who are falsely accused of sexual assault slash harassment.
02:36:41.000 And you said, rethink this.
02:36:43.000 There's the idea that honorable men who are on your side could get caught up in this, and you're not even remotely concerned.
02:36:50.000 I don't know if you followed the thread before she made her page private, but one of the more hilarious things is they turned it on her saying, what about men of color who are falsely accused?
02:37:00.000 No, not men of color.
02:37:02.000 She felt the racism coming her way and immediately acquiesced.
02:37:05.000 It was fascinating because I'm watching this social dance, this weird peacocking of morals, and it's just so odd.
02:37:14.000 But I think that it's like we're talking about with other things.
02:37:17.000 Systems correct themselves.
02:37:19.000 You find this one terrible example, and then everything sort of like gets washed out because of this one terrible example, and then I feel like it'll settle.
02:37:31.000 I don't know.
02:37:33.000 I'm just guessing.
02:37:33.000 You're the biologist.
02:37:36.000 I'm disturbed.
02:37:38.000 That particular tweet you're talking about, I don't care if some innocent men go down, is based on one relatively easy to understand conclusion, but it misses the more important one.
02:37:49.000 So what she's effectively saying is there's been a ton of carnage.
02:37:53.000 Lots of women have suffered awful stuff at the hands of men who weren't accountable.
02:37:56.000 And so a few men who suffer some bad stuff is tiny in comparison.
02:38:00.000 We all get that.
02:38:01.000 But that's a terrible idea because it's a team thing then.
02:38:04.000 It's the worst idea.
02:38:05.000 It's us versus them.
02:38:06.000 It's the worst idea because what you want is a system in which men are honorable.
02:38:11.000 And if you allow men who are honorable to be skewered simply because some person, often cynical, decides to go after them, A, you're going to eliminate all of the courageous men from the system because all those people have enemies.
02:38:27.000 And so the point is, anybody with an enemy suddenly has to fear an accusation that has no truth in it that's going to be reflexively believed.
02:38:33.000 So if you want the system to work, the last thing you want to do is just decide it's fine for innocent people to go down with the ship.
02:38:41.000 Yeah, and to not have any respect for due process is just crazy.
02:38:45.000 You're going to go back to the McCarthy era.
02:38:47.000 You're going to go to the Salem witch trials.
02:38:48.000 It is that.
02:38:49.000 It is that.
02:38:49.000 It is that.
02:38:52.000 I just hope it keeps our daughters and wives and girlfriends and moms from being groped at work.
02:39:00.000 I mean, it might.
02:39:01.000 It might.
02:39:02.000 Look, I've always said this about the environment of an office.
02:39:06.000 It's so fucking entirely unnatural that it takes incredible restraint just to keep people from behaving in the way that they would if they were surrounded by these people on a regular basis.
02:39:16.000 And like we were talking about before with professors, the relationship that a professor has with a student, but it's even more so with a boss and an employee, right?
02:39:26.000 A secretary, someone who makes a fraction of what you make, or someone who's below you in the office food chain, and you kind of can dictate whether or not they do well in life, whether or not they advance.
02:39:41.000 Your input can change the course of their career, how much money they make, whether they'll be able to go on vacations, whether they can live comfortably, pursue their dreams.
02:39:50.000 I mean, it's a crazy environment.
02:39:53.000 The office environment is a very bizarre environment.
02:39:55.000 It's super dangerous.
02:39:57.000 And what I keep waiting for, maybe somebody's written it and I haven't read it yet, is the deep question.
02:40:05.000 So we have to deal with the issue of men behaving this way.
02:40:08.000 Obviously, it's completely unacceptable.
02:40:10.000 And, you know, we have two dangers.
02:40:12.000 We've got...
02:40:42.000 Where the power of men like Harvey Weinstein and Matt Lauer comes from a very unnatural concentration of opportunity.
02:40:53.000 So women have been compromised because if you're in the news biz, was it NBC, and Matt Lauer tells you to open your blouse, suddenly you're staring at a major career decision, right?
02:41:07.000 Do you say no or do you say yes?
02:41:10.000 And One person is not supposed to have that much power.
02:41:16.000 You should be able to walk out of NBC and say, I'm not working there anymore because Matt Lauer is an ass, and go somewhere else.
02:41:24.000 But if Matt Lauer not only has the power to make a...
02:41:28.000 I mean...
02:41:54.000 He is a monster.
02:41:55.000 But even though he was her monster, in her words, she said that there were two Harvey Weinsteins, and you didn't know which one you were going to get.
02:42:08.000 So she actually...
02:42:09.000 She dealt with the part of him that wasn't this way, apparently, honorably enough to say that when nobody else was talking about it.
02:42:15.000 So I don't wish to defend anything Harvey Weinstein did here.
02:42:19.000 I think the guy is getting what he deserves.
02:42:20.000 And even though due process is super important, in this case there are so many stories that it's impossible to imagine they're all false.
02:42:26.000 Well, it's not just stories.
02:42:27.000 It's actually written into his contract.
02:42:29.000 Yeah, and the settlements and...
02:42:31.000 But the sexual harassment clauses in his contract were the craziest fucking things I've ever seen in my life.
02:42:36.000 Yeah.
02:42:36.000 Like, they literally said, if you have one infraction, it's this amount of money.
02:42:39.000 Two infractions, it's half a million dollars.
02:42:41.000 Three infractions, it's a million.
02:42:42.000 Right.
02:42:42.000 He just monetized...
02:42:44.000 It's insane.
02:42:44.000 And the point is, well, there's something unhealthy about a system that creates somebody so powerful that they are capable of just treating their own sexual assault of other people as a cost of doing business by writing it into the contract.
02:42:58.000 So...
02:43:05.000 Not only do we have to fix it so that if you're a man and you behave this way, you're not safe.
02:43:05.000 We also have to fix the system that let this go on so long by concentrating opportunity so that any person who decided to say, hey, actually, things are not healthy here would have been, A, not listened to because everybody else had stuff to lose if Harvey Weinstein didn't like you.
02:43:23.000 And so anyway...
02:43:26.000 Curing the concentration of opportunity problem is part and parcel of solving the sexual assault problem.
02:43:34.000 Yeah, and it's also this giant enterprise, right?
02:43:39.000 And when you have one person who's the king, which is essentially what he was, they behave like kings have classically.
02:43:48.000 I mean, that's what kings do.
02:43:51.000 They want everyone they want.
02:43:53.000 I mean, you hear horrible stories throughout history.
02:43:56.000 Of things that men do when they have ultimate power.
02:43:59.000 The old phrase, power corrupts, you know, or absolute power corrupts absolutely.
02:44:05.000 And it's just, there's no way around it, it seems like, unless there's full disclosure.
02:44:11.000 I firmly believe that all of this is, there's two things going on right now.
02:44:18.000 That we are in sort of an adolescent stage of human evolution in terms of our culture.
02:44:24.000 And that we're working out how we interact and behave with each other.
02:44:28.000 That information and the ability to exchange information is highlighting all these flaws in these natural systems that we have, these alpha chimps that are running these giant groups.
02:44:38.000 But then I think this technology that's exposing us is going to give way to something that's even scarier.
02:44:45.000 And my number one fear over the last year has been artificial intelligence.
02:44:50.000 Oh, I'm tremendously troubled by it, too.
02:44:53.000 I'm more scared of it every day.
02:44:54.000 Every day I wake up thinking, are we just sleeping while this thing is about to go live and we literally are in a fucking Terminator movie?
02:45:04.000 I think it's both better and worse than that.
02:45:08.000 It's happening already.
02:45:10.000 And because it's not robots, we don't see it.
02:45:13.000 Right, it's happening in your phone.
02:45:14.000 It's your phone.
02:45:15.000 And the thing is, it's AI, it's not AGI. Right.
02:45:20.000 The G being...
02:45:22.000 General.
02:45:23.000 Okay.
02:45:23.000 So the fear, the one that we have...
02:45:25.000 Sentient.
02:45:26.000 Yeah, that the thing is smart enough to start thinking on its own and that it might come up with its own objectives.
02:45:33.000 Might actually be creative.
02:45:34.000 It might be creative. The paperclip problem is you tell it to make as many paperclips as possible, and it sees your attempt to turn it off or reprogram it as an obstacle to making paperclips, and it just starts liquidating the universe.
02:45:48.000 So that's not a malevolent AI, that's a confused AI. Malevolent AI is also possible, but what we've got is a baby version.
02:46:00.000 The algorithms causing us to become addicted to our phones and to do damage to our lives are AI. These are algorithms that are developing.
02:46:10.000 And frankly, it doesn't even matter whether people are reprogramming them or whether they are reprogramming themselves.
02:46:15.000 And it is undoubtedly a mixture.
02:46:17.000 What we have is an evolutionary system in which the winner will be the site that manages to capture your attention in the face of competitors who are trying to do the same thing, and they will build anything and everything into that algorithm to get you to do it, which means there's nothing in your life that's sacred,
02:46:34.000 right?
02:46:34.000 Your life can be liquidated.
02:46:38.000 What we have now is a case where the algorithms aren't so good that we can't have this conversation, recognize that this is a danger we've created for ourselves, and address it.
02:46:49.000 We could address it now.
02:46:50.000 But we have to realize that the phone algorithm addiction problem is the warning shot.
02:46:58.000 This is the place where we get a chance to recognize where we're headed and deal with it before it's AGI. If we don't do that, I don't...
02:47:08.000 Everybody who's serious and has thought about this question has had the same oh-shit realization or conclusion, which is that we...
02:47:17.000 There's no stopping this.
02:47:18.000 We can rationally debate how far off it is.
02:47:22.000 But once it gets going, there's essentially no good way out.
02:47:28.000 And there's no one that's going to agree to stop right now.
02:47:31.000 Right.
02:47:31.000 There's just too much competition involved in terms of the...
02:47:35.000 There's so many resources that are on the line.
02:47:38.000 There's so many resources.
02:47:39.000 There's so much power.
02:47:41.000 There's so much money on the line to see who...
02:47:44.000 Can come up with the best version of this and do it the quickest.
02:47:48.000 And it's this mad race towards the edge of a cliff and no one exactly knows whether or not we're going to be able to use the brakes.
02:47:53.000 Right.
02:47:54.000 And so in some ways, I think this is the ultimate demonstration of where we are and what we must contemplate in order to...
02:48:22.000 Well, suppose you took 99% of the AI projects and managed to correctly build in some algorithm that prevented them from going rogue in one way or the other, but somebody else decides not to.
02:48:35.000 So we have to confront this at the level of what is allowed.
02:48:41.000 I don't like hearing myself say that because I hear people turning off on the other end when they hear, oh, he doesn't want to allow us to innovate.
02:48:50.000 For the survival of the species, I think it's imperative.
02:48:52.000 You have to think about it that way.
02:48:54.000 We have no choice.
02:48:55.000 I'm so scared that we're caught up in all this other nonsense, and we're thinking about so much stupid shit in our life, like whether or not Kim and Kanye stay together.
02:49:03.000 And in the middle of all this...
02:49:05.000 There's a lab right now, and they're connecting these wires and connecting these dots and reprogramming and accelerating the evolution of this thing, and it's not going to turn off.
02:49:15.000 And that life as we know it, we only have a few years left of this.
02:49:20.000 I really feel like we're in a fucking science fiction movie, and we're at the beginning of the movie where everything's great.
02:49:25.000 Right.
02:49:26.000 We've got problems.
02:49:27.000 We've got a lot of dudes out there pinching butts, and some dudes are pretty rapey.
02:49:33.000 Kevin Spacey got taken out of that movie.
02:49:34.000 We're doing pretty good.
02:49:35.000 And then, in the meanwhile, there are fucking robots being built by Boston Dynamics that do backflips, and they're going to be able to think for themselves, and they're going to have machine guns for hands, and their bodies are going to be filled with bullets.
02:49:47.000 What could go wrong?
02:49:48.000 It's fucking crazy because it's happening at such an accelerated rate that it's literally, I'll be sitting around, I'll be playing with my kids, I'll be hanging out, and I'll be like, fuck, are they making robots right now?
02:50:01.000 Are we like a year away from this being a real problem?
02:50:07.000 This is kind of what I'm getting at, is that the AI problem, in some ways, maybe it's good, because it's causing us to be able to focus on one of these hazards to us that you can't, once you've seen why this is a risk, once you've really made eye contact with it,
02:50:24.000 you can't talk yourself into why it's safe.
02:50:26.000 Because it can't possibly be.
02:50:28.000 Even if 99% of them are safe, 1% is enough to create the problem.
02:50:32.000 So once you've got that information, you can then begin to extrapolate to all of the other problems that don't have the same...
02:50:42.000 The same intrigue around them.
02:50:44.000 Sam Harris makes the point that as you talk about the AI apocalypse, it is simultaneously horrifying but kind of fun to get into because it is sort of sci-fi and all.
02:50:59.000 But the whole landscape that we have built, right?
02:51:03.000 Our socio-political landscape was built by people who had never heard of Darwinism.
02:51:09.000 There was no Darwinism.
02:51:10.000 They didn't know.
02:51:11.000 And so they built a bunch of ecosystems in which stuff evolves and they didn't know to worry about it.
02:51:19.000 So the AI problem is a very concentrated version of a very general problem, which is if you build a habitat and you install into it those features which cause evolution to occur.
02:51:30.000 It will occur.
02:51:31.000 And what it will create is entirely a question of what the niches look like that you have left open.
02:51:36.000 So we are living that.
02:51:37.000 We are suffering the consequence of...
02:51:40.000 An economic environment that is evolutionary, that creates giant rent-seeking monsters who are liquidating our well-being and putting it into their bank accounts.
02:51:51.000 We've created a political apparatus that has the same characteristics, and now we're facing a robot version that actually has the potential to very rapidly push us over the cliff.
02:52:03.000 My real issue with this, and this is something that I've been battling again for a couple years now, is that this is what we do.
02:52:10.000 And it's one of the reasons why we're so fascinated with technological innovation.
02:52:15.000 I mean, literally, we are the caterpillar.
02:52:18.000 That is becoming the electronic butterfly, and we don't realize it while we're doing it.
02:52:22.000 And we're so obsessed with the newest, latest, greatest phone that really doesn't change your life at all.
02:52:26.000 Like, I got this new iPhone.
02:52:28.000 It's pretty.
02:52:28.000 It's great.
02:52:29.000 You do the same shit I did on my old iPhone, but I was so pumped to get it, man.
02:52:33.000 But you're, in some ways, facilitating the evolution of this technology that'll ultimately lead, if you follow it, to an end point.
02:52:41.000 It's going to lead to something very, very bizarre.
02:52:44.000 I mean, if you're Ray Kurzweil, you think it's going to be wonderful.
02:52:46.000 You're going to be able to download your brain to a computer.
02:52:48.000 But if you're a lot of other people, you think it's an absolutely terrifying thing that's going to eventually lead to us being irrelevant.
02:52:55.000 And this is unnecessary.
02:52:57.000 So likely we won't have time to go very far into it.
02:53:02.000 But we have an alternative.
02:53:04.000 And I think it does look a little bit like Lars Anderson with his fancy archery, you know, which is we are just on the cusp of understanding enough about what a human being is and how it functions for us to actually take control of our structure and to turn it to basically creating a stable,
02:53:28.000 non-utopian, abundant system.
02:53:31.000 In other words, the stuff that we are all pursuing.
02:53:34.000 Right?
02:53:35.000 You got your first smartphone and it changed everything and it was marvelous.
02:53:40.000 And then your second one didn't change that much and your third one was no big deal and your fourth one is barely a blip.
02:53:44.000 Right?
02:53:46.000 That thing that we are pursuing, the well-being that we felt the burst of when we got our first smartphones, that thing can be made to...
02:53:57.000 The system can be made such that we are constantly getting the signals of well-being and the liberty to do things that are of consequence.
02:54:07.000 In other words, it is not beyond our current understanding to build a system that instead of getting you to innovate something, having a huge burst of dopamine, and then it wears out and you're constantly looking for the next one, it is possible to build a system that...
02:54:52.000 Architecting a system that understands what a human being is, that understands we are not built to be happy and therefore pursuing happiness as if we were built to be happy is a hazard.
02:55:08.000 We should be pursuing something else and we should recognize that happiness is a carrot on a stick that evolution built into us in order to get us to pursue objectives which were not stable well-being.
02:55:21.000 They were actually the spread of our genomes.
02:55:24.000 Which you can attach to technology.
02:55:25.000 This pursuit of happiness is the pursuit of material possessions.
02:55:29.000 And we facilitate that with these advertisements that make every new thing look like this is the one that's going to take me over the top.
02:55:36.000 I'm going to finally reach the promised land.
02:55:38.000 Well, technology is one thing.
02:55:41.000 Really, we are built to pursue something that economists call growth.
02:55:49.000 But I would say human beings, like all creatures, are built to detect opportunities that they can capitalize on.
02:55:57.000 And those opportunities can look like various different things.
02:55:59.000 It can look like a bunch of...
02:56:01.000 For some creature foraging, it can be some food that it happens onto.
02:56:06.000 For a population of humans, it can be a new continent.
02:56:10.000 That's a huge opportunity.
02:56:11.000 For human beings, it can also be a new technology that takes whatever opportunity you have and allows you to do more with it.
02:56:18.000 But we are wired to search for those things.
02:56:22.000 And the pursuit of those things has produced a great many marvelous opportunities.
02:56:28.000 Innovations and discoveries.
02:56:30.000 But the fact that we do not understand that we are mindlessly pursuing these things, even when they are not available, causes us to do all sorts of harm to ourselves.
02:56:41.000 So understanding these things as we finally are beginning to, we could build a civilization that did not leave us on the hedonic treadmill pursuing happiness, which cannot possibly be...
02:56:57.000 Thank you.
02:57:03.000 Bret Weinstein, on that note, let's wrap this bitch up.
02:57:06.000 Thank you, sir.
02:57:07.000 I really appreciate it.
02:57:08.000 It was a lot of fun.
02:57:09.000 I hope someone can actually do this.
02:57:12.000 I hope it can be done.
02:57:13.000 Thank you.
02:57:14.000 Thank you very much.
02:57:15.000 Bret Weinstein on Twitter, and you don't have those other things.
02:57:19.000 You don't use those other terrible technologies.
02:57:21.000 I'm not.