The Joe Rogan Experience - June 01, 2016


Joe Rogan Experience #804 - Sam Harris


Episode Stats

Length

4 hours and 27 minutes

Words per Minute

152.0209

Word Count

40,734

Sentence Count

2,977

Misogynist Sentences

64

Hate Speech Sentences

37


Summary

Sam Harris is back, nine months into giving up meat. He and Joe talk about why he stopped, whether it has actually been good for his health, and what it takes to do a vegetarian diet well: B12 and D3 supplementation, and getting enough dietary fat and cholesterol for hormone and testosterone production. From there the conversation covers the ethics of factory farming, ag-gag laws, and the egg industry; cultured meat and Uma Valeti's Memphis Meats; the Cafe Gratitude owners who went back to eating their own cattle; where suffering begins in the animal kingdom, from lobsters and bivalves down to plants; plant intelligence, psychedelics, DMT, and the pineal gland; and the fragility of memory, eyewitness testimony, false confessions, and the prospect of reliable mind-reading machines. Timestamps (approximate, first hour): 0:12 - Sam's nine months without meat and whether it's been healthy | 2:45 - Dietary fat, cholesterol, and supplementation | 5:48 - The hidden costs of crop farming | 6:57 - Cultured meat and Memphis Meats | 10:20 - The falling cost of genome sequencing | 11:44 - The vegan restaurant owners who went back to meat | 17:55 - Factory farming, ag-gag laws, and the egg industry | 23:32 - Lobsters, bivalves, and where suffering begins | 28:36 - Plant intelligence and consciousness | 38:46 - DMT, the pineal gland, and meditation | 43:01 - Eyewitness testimony and the fragility of memory | 45:21 - fMRI, EEG, and the prospect of mind-reading machines | 54:44 - False confessions and suggestibility


Transcript

00:00:02.000 The infamous Jamie Double Finger Gun.
00:00:04.000 And we're live.
00:00:06.000 How are you, sir?
00:00:06.000 Good to see you, man.
00:00:07.000 I'm good.
00:00:07.000 Good to be back.
00:00:08.000 What's cracking?
00:00:08.000 Looking healthy.
00:00:09.000 Looking fresh.
00:00:10.000 I'm glad to hear it.
00:00:11.000 Look refreshed.
00:00:12.000 Well, I stopped eating meat since I last saw you.
00:00:14.000 I heard about that.
00:00:15.000 I want to talk to you about that.
00:00:16.000 And I don't know that it's correlating with health.
00:00:19.000 Every time I worry about this out loud...
00:00:22.000 I get hate mail from vegans and vegetarians who say, stop, stop putting your bullshit on us.
00:00:27.000 Well, they don't like when you associate any negative consequences whatsoever with only eating vegetables.
00:00:34.000 Because it's essentially a cult.
00:00:36.000 It's a wonderful cult of people that want to take care of animals and be nice to animals, but they're very tribal.
00:00:43.000 Very tribal, very cult-like, and if you say anything that's negative against vegans, they gang up.
00:00:47.000 I go to forums and I read the things they say.
00:00:50.000 They organize like little troll attacks and they make YouTube videos.
00:00:54.000 It's kind of hilarious, like from a psychological standpoint.
00:00:58.000 Yeah, I mean, you know, obviously I don't want it to become a new religion or my one religion, but there is a moral high ground to the position that I find very attractive because I felt like a hypocrite as a meat eater.
00:01:12.000 Now, and I don't think this necessarily extends to someone like you who hunts and feels okay about hunting.
00:01:18.000 I don't have an argument against hunting the way I do against factory farming or, more or less, any of the ways we get meat.
00:01:28.000 You know, the environmental implications of it.
00:01:31.000 But it's...
00:01:33.000 So it's very captivating as a position.
00:01:36.000 And you feel like an asshole.
00:01:38.000 Once you go far enough into the inquiry, you feel like an asshole not being sensitive to these concerns and just ignoring how you're getting your food three times a day.
00:01:49.000 But...
00:01:52.000 For me, I'm sure there's individual variation, and I'm not the smartest vegetarian in the world in terms of how I prepare my food and how attentive I am to it.
00:02:01.000 So the onus is somewhat on me, but I'm not totally sure it's the healthiest thing for me yet.
00:02:11.000 So you said you look healthy.
00:02:13.000 I feel like my health is somewhat withering under this.
00:02:17.000 It's been about nine months.
00:02:18.000 Yeah.
00:02:18.000 Withering, are you getting your blood checked?
00:02:21.000 Are you doing B12 supplementation?
00:02:24.000 Yeah.
00:02:24.000 That's essential.
00:02:25.000 B12 is essential.
00:02:26.000 D3 is essential as well.
00:02:28.000 Most people are just not going to get enough from the sun.
00:02:31.000 And are you monitoring your intake of fatty acids and things along those lines?
00:02:37.000 Not really.
00:02:38.000 Beyond the supplementation, I'm just...
00:02:40.000 Just trying to get the food in.
00:02:42.000 Yeah, well, it's all good stuff, but it's really important.
00:02:45.000 Like, there's one of the things that people rage against, unfortunately.
00:02:50.000 It's the stigma, and it's fats.
00:02:53.000 It's dietary cholesterol.
00:02:55.000 Dietary cholesterol and saturated fats, which are critical for hormone production.
00:02:59.000 And it's one of the reasons why people, when they get an all-vegetable diet, if they're not really careful with coconut oil, you gotta eat a lot of coconut oil.
00:03:05.000 I'm a big fan of avocados.
00:03:07.000 I eat a lot of avocados, a lot of avocado oil.
00:03:09.000 A lot of coconut oil as well, though.
00:03:11.000 Almond butter, nuts, things along those lines.
00:03:14.000 You really need to get those essential fats.
00:03:16.000 Well, actually, I'm not vegan, so I would like to be vegan, but I'm still eating dairy.
00:03:21.000 Well, how could you say you would like to be?
00:03:22.000 No one's holding you down.
00:03:24.000 It's like, God damn it, Sam, you need to have some milk.
00:03:27.000 No, I think I would screw it up.
00:03:30.000 I mean, again, I'm going to reap the whirlwind from the vegans.
00:03:34.000 I should have brought you some eggs.
00:03:36.000 I eat eggs and I eat dairy, but you can only eat so many eggs and so much dairy.
00:03:42.000 I eat a lot of eggs.
00:03:44.000 Eggs are very good for you, too.
00:03:45.000 There's another thing.
00:03:46.000 Dietary cholesterol.
00:03:48.000 People are always concerned with dietary cholesterol.
00:03:51.000 Well, that was a big myth for the longest time.
00:03:53.000 As a matter of fact, dietary cholesterol barely moves the needle on blood lipids.
00:03:58.000 A lot of when people have cholesterol issues, it's sedentary lifestyle, there's genetics, there's all sorts of other variables.
00:04:05.000 But people with healthy lifestyles, it doesn't seem to be that dietary cholesterol is bad for you.
00:04:11.000 Also, it's essential for testosterone production.
00:04:15.000 Yeah, yeah.
00:04:16.000 So no, I get a ton of, or not a ton, but I'm not trying to avoid saturated fat.
00:04:23.000 So I get a fair amount of...
00:04:26.000 Do you have a yard?
00:04:28.000 Yeah.
00:04:28.000 Could you raise some chickens?
00:04:30.000 No.
00:04:30.000 No?
00:04:31.000 Not enough of a yard.
00:04:34.000 Come over to my house after this.
00:04:34.000 I'll take you to my house and I'll show you the chicken setup.
00:04:36.000 We have, we just got five new ones and we have 18 now.
00:04:41.000 One of them died.
00:04:42.000 They just fucking die sometimes.
00:04:43.000 Right, and you don't know why.
00:04:45.000 No, they just go in the chicken coop, one of them's dead.
00:04:47.000 You're like, alright.
00:04:47.000 Father Time got in there with his scythe.
00:04:49.000 They're getting kind of old now, some of them.
00:04:51.000 I don't know how old a chicken lasts, honestly.
00:04:54.000 But they're like pets that give you food.
00:04:56.000 Like, there's no negativity.
00:04:57.000 I open the door, they come up to me, they run around, I feed them.
00:05:01.000 Like, there's no, they're not trapped.
00:05:03.000 As a matter of fact, they go into their pen at night.
00:05:06.000 Like, you leave it open.
00:05:08.000 And they wander around my, I have a big yard, they wander around my yard, they eat a bunch of bugs and stuff.
00:05:12.000 And then they go back inside when they want to.
00:05:15.000 Right.
00:05:15.000 So there's no, like, animal captivity.
00:05:17.000 There's no cruelty.
00:05:18.000 There's nothing weird going on.
00:05:20.000 And so those eggs, they're pretty much karma-free.
00:05:24.000 Oh, yeah.
00:05:24.000 Yeah.
00:05:25.000 No, I don't doubt that.
00:05:26.000 It's just that when you read the details of how our dairy and eggs are gotten...
00:05:32.000 Arguably as bad, if not worse, than much of the meat production.
00:05:36.000 So moving from eating meat to being a vegetarian in some ways is a symbolic move ethically if you really wanted to not participate in the machinery.
00:05:48.000 There's an issue with vegetables.
00:05:50.000 In vegetarianism, there is an issue with how they gather food.
00:05:55.000 I mean, there's a giant issue that people don't want to take into consideration.
00:05:58.000 It's like, how are they growing all this food?
00:06:01.000 How are they growing all these plants?
00:06:02.000 Well, one of the things they're doing is they're displacing wildlife.
00:06:05.000 They're chewing up this ground and these combines, if you eat grain in particular, combines indiscriminately just chew up all that stuff and they get deer fawns, mice, rabbits, rats, rodents, untold amount of bugs if you want to get really deep.
00:06:20.000 Right.
00:06:21.000 I mean, there's no, like, being a vegan and being a vegetarian is most certainly less cruel and less harmful overall, but it's not karma-free.
00:06:31.000 It can't be, unless you're growing your own stuff.
00:06:34.000 If you can grow all your own vegetables and you essentially live on a small farm...
00:06:39.000 Yeah, you could do it and really feel good.
00:06:42.000 But if you're buying it in a store, you're participating in factory farming whether you like it or not.
00:06:49.000 You're just participating in vegetable farming.
00:06:51.000 But there's still issues.
00:06:54.000 Oh, yeah, yeah, yeah.
00:06:56.000 Just fewer.
00:06:57.000 I had this guy on my podcast, Uma Valeti, who's running this company called Memphis Meats, which is cultured meat.
00:07:05.000 It's a startup in Silicon Valley.
00:07:06.000 Oh, okay.
00:07:07.000 Did you try that?
00:07:09.000 I haven't tried it.
00:07:09.000 No, I want to try it.
00:07:11.000 But it was a fascinating conversation because he – basically, what's fascinating to me on two levels is – one, it's fascinating that we're on the cusp of being able to produce actual, biologically identical meat that is totally cruelty-free.
00:07:27.000 There's no implication of cruelty at all in it, right?
00:07:32.000 And you would just grow this in a vat, the way you brew beer, essentially.
00:07:39.000 So that seems like the future.
00:07:42.000 But what's interesting, psychologically, is that people have this reaction: I'm telling you, I can take the misery and death out of the process.
00:07:56.000 I can take the suffering animal out of it.
00:07:58.000 I can take the chaos of the slaughterhouse out of it.
00:08:00.000 There's no cow that has been mistreated for its whole life, stumbling in blood and feces on the way to the killing floor.
00:08:11.000 And somehow removing all of that makes it creepy for people, right?
00:08:15.000 They want that.
00:08:16.000 I mean, that's the natural way to get meat.
00:08:19.000 And if I told you this is grown in a vat by a guy in a white lab coat and has no xenoviruses and no bacteria, no antibiotics were used to plump this thing up, and it's just the cells you want, people start to...
00:08:34.000 There's kind of an ick feeling that I think we're going to get over, but it's interesting psychologically that it's there in the first place.
00:08:40.000 Did you get that ick feeling, or are you just talking to the people that cultivated it?
00:08:44.000 No, I understand it.
00:08:45.000 I mean, I'm past it, but I understand it.
00:08:48.000 Who do you know that got it, that feeling?
00:08:49.000 I just see the reaction.
00:08:51.000 I actually polled this on Twitter, and 25% of people, like 15,000 people answered the poll.
00:08:57.000 So it's not a scientifically valid poll of the general population.
00:09:01.000 It's just whoever got it on my feed.
00:09:04.000 But it was interesting.
00:09:06.000 I asked...
00:09:08.000 You know, of the people who wouldn't switch, and so I asked, would you switch?
00:09:12.000 And something like 80% said they would switch if this was affordable and available and safe.
00:09:18.000 But of the people who wouldn't switch, 25% wouldn't switch because it's just creepy.
00:09:24.000 And 25% assumed that it was not healthy, I think.
00:09:30.000 I forgot the breakdown.
00:09:32.000 Another 25% were already vegan or vegetarian and didn't want to eat meat.
00:09:37.000 But there was an ick factor for at least 25% of the people who wouldn't do it.
00:09:41.000 I understand that.
00:09:42.000 I wonder how many people would go back to eating meat...
00:09:47.000 If they could raise it this way, like how many people who had gone vegan would go back to eating this scientifically created, lab-created beef?
00:09:56.000 I think enough for a serious market.
00:09:59.000 Oh, for sure.
00:10:00.000 Well, there'd be a serious market for it, for sure.
00:10:02.000 If they could get it cost-effective, because last time I saw it was like a quarter million bucks for a cheeseburger.
00:10:08.000 I think it's down to $18,000 for a meatball.
00:10:13.000 I mean, it'll eventually be like, you know, the Apollo computers now fit in your pocket.
00:10:18.000 Much stronger, in fact, than the Apollo computers.
00:10:20.000 Well, sequencing the genome, this I just noticed, which is fascinating.
00:10:25.000 Sequencing the genome 15 years ago cost $3 billion.
00:10:29.000 It's now $3,000.
00:10:31.000 Whoa!
00:10:32.000 What?
00:10:33.000 It's a million-fold reduction in cost in 15 years.
00:10:36.000 That's insane!
00:10:38.000 Things tend to scale that way.
00:10:41.000 Yeah, that's a giant scale, though.
00:10:43.000 When you talk about human history, good lord, imagine if you're a guy who spent $3 billion of it 15 years ago, and you're like, God, if I just fucking waited, I would have saved so much money.
00:10:55.000 We don't anticipate that when we think of how difficult it is to solve certain problems.
00:11:01.000 This is Ray Kurzweil's point.
00:11:03.000 Ray Kurzweil, who I think is a bit of a cult leader and a carnival barker on many topics, this point he makes again and again I think is quite valid, which is...
00:11:23.000 Yeah.
00:11:25.000 Yeah.
00:11:26.000 Yeah.
00:11:38.000 That's where I am as a vegetarian.
00:11:40.000 They get fucking mad at you.
00:11:43.000 They get mad.
00:11:44.000 Did you see that family, a couple that runs a bunch of vegan restaurants, and they decided to start eating meat again, even though they run vegan restaurants?
00:11:54.000 They have their own farm.
00:11:56.000 They raise their own cattle, and they decided to eat their own cattle.
00:11:58.000 It was real weird, too, because there was a lot of Jesus in their message.
00:12:03.000 There was a lot of, like, Jesus said that, you know, we're supposed to take care of the animal.
00:12:06.000 Like, biblical quotes, you know, like, really obscure biblical quotes about food.
00:12:11.000 Like, oh, okay.
00:12:12.000 Like, what are you doing here?
00:12:13.000 But they have, uh, here it is.
00:12:16.000 Vegans revolted against owners of famous L.A. vegan restaurants after meat-eating outed.
00:12:22.000 Well, I think, I don't think you could say they were outed, because I'm pretty sure they put it on their Facebook page.
00:12:28.000 The guy was, like, first cheeseburger in 15 years.
00:12:31.000 Is this...
00:12:32.000 This is Cafe Gratitude?
00:12:33.000 Yes.
00:12:34.000 Oh, yeah.
00:12:34.000 So the other thing that's hilarious about that restaurant, which I like, the food is good, but have you been there?
00:12:41.000 No.
00:12:41.000 So everyone, now forgive me if this is no longer true, but at one point every employee there did the Landmark Forum, you know, the successor to Est.
00:12:51.000 Oh, that's right.
00:12:52.000 They got sued for that.
00:12:54.000 Who got sued?
00:12:54.000 This restaurant.
00:12:56.000 It's one of the reasons why they closed one of their restaurants.
00:12:59.000 Explain that, though, because the Landmark...
00:13:03.000 I've never done this, so again, I'm speaking outside the cult walls.
00:13:09.000 But Werner Erhard was a 60s human potential figure who...
00:13:54.000 Actually, the classic case of this, I think this wasn't Est.
00:13:58.000 This was the Forum, which is now the successor to Est.
00:14:01.000 They were at one point hired to do coaching of various companies, and I think they were hired by the FAA. I wrote about this in one of my books in a footnote.
00:14:12.000 It was the FAA that hired the Forum to coach their administrators.
00:14:21.000 And one of the exercises they forced these guys to do, and they certainly were mostly guys, they chained the boss to his secretary for the whole day, and they had to go to the bathroom together.
00:14:36.000 This sort of ego-annihilating experience.
00:14:39.000 Anyway, this is the recipe, or one of the recipes that Est has pioneered.
00:14:47.000 This is not to say that people don't go to the forum and get a lot out of it.
00:14:50.000 I've actually met those people.
00:14:51.000 But every employee of this restaurant apparently has gone or used to do the forum.
00:15:00.000 So it's a very, you walk into the restaurant and your interaction with people in the restaurant is unlike most restaurants.
00:15:07.000 People are just very, you know, lots of eye contact and it's just an intense restaurant.
00:15:14.000 And also, the stuff on the menu, this was just so lacerating, I could never comply, but the name of everything on the menu is like, I am humble, I am magical, I am self-assured.
00:15:27.000 So you have to, you're meant to order it that way, like I am humble.
00:15:32.000 Oh, God.
00:15:33.000 I'll have an I am humble?
00:15:34.000 Yeah, so when it's given to you, it's you are humble.
00:15:37.000 So that becomes, a little of that goes a long way in the restaurant.
00:15:42.000 Phew!
00:15:43.000 Okay.
00:15:44.000 Well, they're retarded.
00:15:45.000 That's what's going on.
00:15:46.000 The food is good, though.
00:15:49.000 Don't get me the wrong way.
00:15:50.000 The food is good.
00:15:50.000 Well, it makes sense, then, that they would go all Jesus-y, religious-y when they were trying to justify their meat consumption.
00:15:57.000 Jamie, see if you can pull up the quotes of them justifying their meat consumption, because it was real weird.
00:16:03.000 I was like, how weird is it?
00:16:04.000 These guys are like using Jesus and religion to justify eating cows when they've been a vegan for all those years.
00:16:10.000 Like, did you not listen to Jesus all those other years?
00:16:13.000 Like, you're like, fuck you, Jesus.
00:16:14.000 I'm not eating meat.
00:16:15.000 Like, what was the turnaround there?
00:16:18.000 It doesn't make any sense.
00:16:20.000 Well, I don't think you can pull veganism out of the Bible unless...
00:16:24.000 Well, I guess if you go back to the garden, there's nothing about them eating meat, right?
00:16:28.000 It was all provided from the trees.
00:16:30.000 There's no slaughterhouse in the garden.
00:16:32.000 Well, vegetarianism is fairly old.
00:16:34.000 I mean, vegetarianism has been around in Hindu cultures and so many different cultures forever.
00:16:39.000 But not veganism, right?
00:16:41.000 I mean, veganism is really fairly recent.
00:16:44.000 It's pretty impractical.
00:16:46.000 In deep history.
00:16:48.000 Okay, herding ruminants is our best tool to restore fertility to the earth, keep the earth covered and reverse desertification and climate change, he wrote.
00:16:57.000 We need cows to keep the earth alive.
00:16:59.000 Cows make an extreme sacrifice for humanity, but that's their position in God's plan as food for the predators.
00:17:05.000 Whoa.
00:17:07.000 Huh.
00:17:07.000 That's a strange quote.
00:17:08.000 But I think there was more of them, but that's good enough.
00:17:11.000 But there was more of that kind of stuff, like all of a sudden he's a predator.
00:17:17.000 Predators go after animals, they chase them down, and they kill them.
00:17:22.000 I guess we're kind of predators in a way, but we're some new thing.
00:17:27.000 We're some completely new thing.
00:17:28.000 We use weapons or we corral them.
00:17:31.000 And if you've got them corralled and you're just sticking that No Country for Old Men thing in their head and killing them with it, I mean, that's what they're doing, right?
00:17:39.000 If you're doing that, I don't know if you're allowed to call yourself a predator.
00:17:42.000 Well, historically we're predators.
00:17:44.000 Yes, historically.
00:17:45.000 And chimps are predators.
00:17:47.000 Not entirely, but they are that too.
00:17:50.000 We're certainly a kind of a predator, but we're so much different now.
00:17:54.000 And this is what we're talking about.
00:17:55.000 We're talking about factory farming and these weird businesses where they slam all these animals into these entirely too small places and they live in their own feces and urine.
00:18:07.000 And I'm sure you've seen that drone footage from the pig farm.
00:18:11.000 I've seen a lot of pig farm footage, but I don't know if I've seen drone footage.
00:18:14.000 I don't think I've seen drone footage.
00:18:15.000 Lakes of urine and feces.
00:18:17.000 It's disgusting.
00:18:18.000 It's unbelievable.
00:18:19.000 This guy flew this thing.
00:18:21.000 I mean, they have these ag-gag laws, which are evil.
00:18:24.000 Oh, yeah.
00:18:24.000 And if you don't know what that means, ag-gag laws are laws that make it a federal crime to show all the abuse of these animals, to show factory farming, because it'll affect the business so drastically and so radically when people are exposed to the truth that they've made it illegal.
00:18:44.000 Those laws should be illegal.
00:18:46.000 Those laws are scary.
00:18:47.000 That's a scary aspect of human beings.
00:18:49.000 I've never seen this.
00:18:50.000 Was it some toxic lake?
00:18:51.000 That's a lake of pee and poo.
00:18:53.000 And all those things are stuffed to the gills with pigs.
00:18:57.000 There's a book...
00:19:00.000 Eating Animals, Jonathan Safran Foer's book, which is worth reading if you think you're immune to the details.
00:19:10.000 There's two aspects to it.
00:19:11.000 There's the cruelty aspect, which is actually three aspects.
00:19:15.000 There's the cruelty aspect, which is horrific.
00:19:18.000 There's the environmental energy use issue, which is also just totally untenable.
00:19:25.000 And then there's just the health aspect: if you have any concern about your own health and the contamination, I mean, just getting all these antibiotics that weren't prescribed to you that you don't want, that are still getting into you through this food chain, and,
00:19:42.000 I mean, just the stuff that they...
00:19:45.000 Like the chickens, I mean, the details about chicken farming is almost the most horrible.
00:19:51.000 But, I mean, they're covered in just the most, the foulest, no pun intended, just the most disgusting material.
00:20:02.000 They go into like a broth that's just like pure bacteria.
00:20:05.000 And, I mean, they have to be washed with, you know, just...
00:20:12.000 They really wash them with ammonia?
00:20:26.000 The chickens actually, I think largely because they're so small and more of the process of killing them is automated that they almost get the worst of it because they're, I mean, they're just, they're like, you know, they're getting singed before they're stunned, before they're, I mean, it's just not...
00:20:42.000 I mean, at least with a cow, you've got a single person interacting with a single cow, however briefly, and there's less chaos in the machinery.
00:20:50.000 But chickens just get pulverized, and arguably one of the greatest pain points ethically comes around just the egg industry, because fully half the chickens,
00:21:06.000 the male chicks, just immediately get thrown into literally like a meat grinder because they're not the same chicken that is a broiler chicken.
00:21:17.000 I mean, genetically they're not the same.
00:21:18.000 They don't grow into the same kind of chicken that would be useful.
00:21:22.000 So they're useless.
00:21:23.000 And so they don't lay eggs and you don't eat them.
00:21:26.000 And so they just get literally fed into like a wood chipper alive.
00:21:34.000 And again, this is in some ways an artifact of them being so small that it would be just too much of a hassle to stun them appropriately, right?
00:21:44.000 Imagine if they made a law where you had to bury them all and put little crosses in the ground.
00:21:48.000 Be labor intensive.
00:21:50.000 Jesus.
00:21:52.000 I mean, I was on the highway and there was a chicken truck that was passing me.
00:21:57.000 One of those trucks that's containing live chickens.
00:22:00.000 And they're just stacked, just stacked in cages on top of each other.
00:22:04.000 Caged on top of a cage and just shitting on each other.
00:22:06.000 And I'm watching this and I'm like, it's so weird that we're allowed to do that with some animals.
00:22:11.000 Like, if you were doing that with horses, people would lose their fucking minds.
00:22:14.000 If you had dogs in boxes like that, stacked in the open air on the highway and you're driving down the road with them, people would freak out.
00:22:21.000 But no one bats an eye at these chickens.
00:22:24.000 This chicken truck, chickens are a weird thing.
00:22:27.000 We have a hierarchy of animals that we love, and we're not really big into reptiles, not really big into birds.
00:22:33.000 It has something to do, I think, with...
00:23:08.000 And so something, obviously, like an ape or something that's cute.
00:23:11.000 I mean, it's amazing what a fluffy tail will get you.
00:23:14.000 The difference between a squirrel and a rat, right?
00:23:16.000 Yeah.
00:23:16.000 It's crazy.
00:23:18.000 It is crazy.
00:23:19.000 Squirrels could just hang out with everybody.
00:23:21.000 We're cool with them.
00:23:22.000 They look so close to rats.
00:23:24.000 They're like, God, I'm so close.
00:23:25.000 It's like a hot girl's sister.
00:23:27.000 It's like, what the fuck?
00:23:29.000 So much.
00:23:31.000 It's so close.
00:23:32.000 There was an article today about some woman who had rescued a lobster from a restaurant and dropped it off in the ocean and the journey of this all and how you should think of this lobster as something with a cute face.
00:23:45.000 And if you did, then you would appreciate her efforts and understand that this lobster, even though they're not even capable of feeling pain in lobsters, they don't have enough nervous system.
00:23:54.000 Their nervous system is not strong enough for them to feel pain.
00:23:56.000 They don't have the same sort of sensors.
00:23:59.000 Yeah, I don't...
00:24:00.000 I've threatened to do this.
00:24:02.000 Actually, when I decided to become a vegetarian, I said at some point, maybe I will just do a taxonomy of the kind of a comparative neuroanatomy across species just to see where we could plausibly say, you know, the suffering really begins to matter.
00:24:18.000 Like clams?
00:24:19.000 Do clams feel anything?
00:24:20.000 No.
00:24:21.000 Actually, there are what are called bivalve vegans who eat clams and oysters and mussels.
00:24:25.000 Oh.
00:24:26.000 I could get into that.
00:24:27.000 Because they think that there's no...
00:24:28.000 I mean, you can make an argument that there's no basis for suffering there.
00:24:32.000 Yeah.
00:24:33.000 Well, if there's no feeling, there's no suffering.
00:24:35.000 My argument against that, though, is lobsters are clearly not happy when you throw them in boiling water.
00:24:40.000 Yeah.
00:24:40.000 So what's that reaction?
00:24:41.000 I think anything that can behave, that can move, right, and move away from a stimulus, the evolutionary rationale for it to experience pain, the question of consciousness is difficult, you know, where consciousness emerges, and I think there are clearly unconscious pain mechanisms.
00:25:01.000 I mean, the same mechanisms that give us pain at a certain level, we can be unconscious and yet they can be just as effective.
00:25:09.000 I mean, all of our reflexes are like that.
00:25:10.000 So, you know, if you touch a hot stove, you're pulling your hand away actually before you consciously register.
00:25:15.000 Right.
00:25:16.000 And that's as it should be because you're faster that way.
00:25:20.000 So it's possible to have unconscious pain, but anything that can move very quickly is going to evolve an ability to move away from noxious stimuli.
00:25:35.000 There's every reason to make that, in evolutionary terms, as salient and as urgent as possible.
00:25:43.000 Our pain response is that for us.
00:25:46.000 There's no reason to withhold that from any other animal that's clearly avoiding stuff with all of its power.
00:25:52.000 Right.
00:25:54.000 Yeah, it seems to me that there's got to be all the way down to mushrooms, because everybody eats mushrooms, even vegans.
00:26:02.000 But mushrooms breathe in oxygen, and they breathe out carbon dioxide.
00:26:06.000 They're way closer to humans than they really are to plants.
00:26:09.000 They're weird.
00:26:11.000 They're a strange sort of an organism.
00:26:13.000 I didn't know that about mushrooms.
00:26:14.000 Yeah, they're not necessarily...
00:26:16.000 Like a plant.
00:26:17.000 I mean, we think of them as a plant because they grow, but they're a fungus.
00:26:21.000 It's a type of life form.
00:26:22.000 And they interlink through the mycelium.
00:26:25.000 Yeah, yeah.
00:26:26.000 Well, Terrence McKenna, if you got him talking on mushrooms long enough, he would...
00:26:31.000 There's some very spooky stuff that he thought about them.
00:26:34.000 But there's no nervous system there.
00:26:35.000 I think the crucial...
00:26:39.000 Variable is the complexity of a nervous system.
00:26:44.000 So suffering and pain and emotions.
00:26:49.000 Suffering is one component of it, but there's just the question of what sort of experience can this creature be deprived of?
00:26:58.000 So when you ask, why is it a tragedy, or why would it be a tragedy to kill someone...
00:27:11.000 Right.
00:27:12.000 Right.
00:27:27.000 Ethically speaking, the only problem there, and it's a huge one, is that it forecloses all of the possible happy futures most of us or all of us or at least some of us were going to have.
00:27:39.000 So all of the good things that we could have done over the next million years aren't going to get done.
00:27:45.000 All of the beauty, all of the creativity, all of the joy, all of that just gets canceled.
00:27:50.000 And so leaving the painfulness of pain aside...
00:27:56.000 Why is it wrong to deprive any given animal of life?
00:28:01.000 Well, insofar as that life has any intrinsic value, insofar as the being that animal is better than being nothing, right?
00:28:11.000 Then you're also just canceling all of that good stuff.
00:28:14.000 And for that, for any good stuff, you need...
00:28:18.000 A nervous system.
00:28:19.000 And for any pain, you need a nervous system, as far as we understand this.
00:28:24.000 Though there are people who say...
00:28:25.000 When you say any good stuff...
00:28:26.000 Like experience?
00:28:28.000 Isn't that just movement, though?
00:28:30.000 Right?
00:28:31.000 Like, are we being prejudiced about the experiences that plants are having by being immobile?
00:28:36.000 Have you ever paid attention to some of the more recent research on plant intelligence and weird stuff that's going on with them?
00:28:43.000 Calculations?
00:28:46.000 Exuding, expressing some scent that, when they're being eaten, causes other plants downwind to change their flavor?
00:28:54.000 Yeah.
00:28:55.000 To avoid predation.
00:28:56.000 There was a New Yorker article on this.
00:28:58.000 I forget when this came out.
00:29:00.000 It's weird.
00:29:01.000 There's weird stuff that plants do, which—and I remember the details of that article aren't so clear to me.
00:29:07.000 I remember not knowing what to think about some of it, but some of it clearly can be explained in evolutionary terms that don't imply any experience.
00:29:16.000 It could all be dark.
00:29:18.000 It's all blind mechanism, but it still has an evolutionary logic.
00:29:23.000 The consciousness as we know it is what's most valuable to you.
00:29:27.000 Well, I think it's the only thing that's valuable to anything.
00:29:31.000 If we're going to talk about value...
00:29:33.000 If this cup has no experience, if my trading places with it, insofar as you can make sense of that concept, is synonymous with just canceling my experience...
00:29:47.000 Well, then this cup isn't conscious.
00:29:50.000 There's nothing that it's like to be the cup.
00:29:52.000 When I break it, I haven't created suffering.
00:29:55.000 I haven't done anything unethical to the cup.
00:30:01.000 I have no ethical responsibilities toward the cup.
00:30:03.000 But the moment you give me something that can be made happy or be made miserable, depending on how I behave around it or toward it...
00:30:12.000 Well, then I'm ethically entangled with it.
00:30:14.000 And that begins to scale, I think, in a fairly linear way with just how complex the thing is.
00:30:24.000 This is maybe something we even talked about on a previous podcast.
00:30:28.000 If I'm driving home today and a bug hits my windshield...
00:30:33.000 You know, that has whatever ethical implication it has, but it's given what I believe about bugs and given how small they are and given how little they do and given how primitive their nervous systems are, you know, I'm not going to lose sleep over,
00:30:48.000 you know, killing a bug.
00:30:50.000 If I hit a squirrel, I'm going to feel worse.
00:30:53.000 If I hit a dog, I'm going to feel worse.
00:30:55.000 If I hit someone's kid, obviously, I may never get over it, even if I live to be a thousand, right?
00:31:01.000 So the scaling, and granted there are cultural accretions there, so you're like, can I justify the way I feel about a dog as opposed to a deer?
00:31:11.000 You know, there's a difference.
00:31:13.000 But the difference is one of richness of experience insofar as we understand what other species experience.
00:31:23.000 You could make that argument to justify eating animals as opposed to being a cannibal, right?
00:31:28.000 You could say, well, what kind of experience does a deer have?
00:31:30.000 They're just running around the woods, trying not to get eaten, they eat grass, they mate.
00:31:35.000 It's very simple.
00:31:36.000 It's a very simple experience in comparison to your average...
00:31:41.000 Person that lives in Los Angeles that reads books, you know?
00:31:44.000 I mean, someone who goes on a lot of trips, someone who has a lot of loved ones, someone who has a great career, someone who's deeply invested in their work.
00:31:51.000 There's much more complexity.
00:31:52.000 Yeah, there's much more complexity.
00:31:54.000 And the deer behave just as deer all over the place, and it's a very primitive sort of a life form.
00:31:59.000 Well, if you go further and further back, it seems like...
00:32:03.000 You can keep going with that.
00:32:04.000 And one of the things that concerns me the most about plants, not concerns me, but puzzles me the most about plants, is whether or not the way I look at them, me personally, my prejudices about them, just not thinking at all of them as being conscious.
00:32:19.000 What if we think about things in terms of the complexity of their experiences just because we're prejudiced about things that move?
00:32:28.000 I mean, it's entirely possible that, like, it's going to sound really stupid, but...
00:32:32.000 I've said a lot of stupid shit.
00:32:33.000 I went into a grow room once, like a pot grow room.
00:32:37.000 Right.
00:32:37.000 This guy had this Mac Daddy grow room.
00:32:39.000 And I walked in, I was like, this is like being in a room full of people.
00:32:43.000 This feels weird as fuck.
00:32:46.000 It felt very weird.
00:32:48.000 It felt like there was a tangible, like, vibe of life in there.
00:32:54.000 And there's no other way to describe that without sounding like a complete moron.
00:32:57.000 Yeah.
00:32:58.000 But it was my experience.
00:33:00.000 I know what it's like to be that moron.
00:33:02.000 But you know what I mean?
00:33:03.000 You take enough acid, you know what you're talking about.
00:33:06.000 I'm not saying that we shouldn't eat plants.
00:33:08.000 People are ready, up in arms with their Twitter fingers, ready to get off.
00:33:12.000 But what I am saying is, it's entirely possible that all things that are alive have some sort of a way of being conscious.
00:33:21.000 May not be mobile, may not be as expressive, But there might be the stillness of you without language when you're in a place of complete peace, when you're in a Zen meditative state.
00:33:34.000 What about that stillness is really truly associated with being a human or really truly associated with being an English-speaking person in North America?
00:33:43.000 Almost nothing.
00:33:44.000 It's just an existence, right?
00:33:45.000 And then everything else sort of branches out from that.
00:33:48.000 And then humans, we all make the agreement that, of course, it branches out far further and wider than any other animal.
00:33:53.000 But how do we know that these plants aren't branching out like that, too?
00:33:56.000 How do we know that if they're having some communication with each other, if they're responding to predation, if they're literally changing their flavor, they're doing all these calculations and all these strange things that they're finding out that plants are capable of doing?
00:34:11.000 What is going on there?
00:34:13.000 We don't know, necessarily.
00:34:15.000 Yeah, well, I'm agnostic on the question of how far down consciousness goes.
00:34:20.000 And I agree that there's very likely a condition of something like pure consciousness that really is separable from the details of any given species.
00:34:31.000 I mean, this is something that I've experienced myself.
00:34:33.000 It feels like you can certainly have this experience.
00:34:37.000 What its implications are, I don't know.
00:34:40.000 But you can have the experience of Just consciousness.
00:34:44.000 And it doesn't have any personal or even human reference point.
00:34:49.000 It doesn't even have a reference point in one of the human sense channels.
00:34:56.000 So you're not seeing, you're not hearing, you're not smelling, and you're not thinking, and yet you are.
00:35:04.000 So there is still just open conscious experience.
00:35:09.000 And whether that is what it's like to be a plant, I don't know.
00:35:13.000 Because I don't know what the relationship between consciousness and information processing in the brain actually is.
00:35:20.000 Though it's totally plausible, in fact...
00:35:24.000 I think it's probably the most plausible thesis that there is some direct connection between information processing and integrated information processing and consciousness and that there is nothing that it's like to be this cup and atoms are not conscious.
00:35:42.000 But the thesis that consciousness goes all the way down...
00:36:02.000 You can't rule that out.
00:36:04.000 I mean, there's nothing we know about the world to rule that out.
00:36:07.000 What I think you can rule out is the richness of the contents of consciousness in these species.
00:36:18.000 So plants are not having conversations like this, right?
00:36:21.000 So plants don't understand what we're doing.
00:36:23.000 There's no way they would.
00:36:25.000 They don't have nervous systems.
00:36:27.000 They're not...
00:36:28.000 They can't be processing information in a way that would give them what we know as a rich experience.
00:36:34.000 But your point about the time scale and movement is totally valid.
00:36:39.000 If every time you walked into a room, your fern just turned and looked at you, just oriented toward you and followed you around the room with its leading branch, you would feel very different about the possibility that it's conscious,
00:36:55.000 right?
00:36:55.000 You'd be a good person to ask this.
00:36:57.000 Is that an urban myth that if you sing to your plants, you play your plants music, that they grow better?
00:37:03.000 I haven't looked into it.
00:37:05.000 We need to find out.
00:37:06.000 I would bet that it is, yes.
00:37:08.000 It sounds like one of those things that chicks who like crystals tell you.
00:37:11.000 Yeah, I would bet that it is.
00:37:14.000 I sing to my flowers and they grow so beautiful.
00:37:17.000 Yeah, it seems like one of those things that people say.
00:37:19.000 I'm definitely not insinuating that plants would have as rich an experience as human beings, but I don't think a deer has as rich an experience as a human being either.
00:37:26.000 And it's just, to me, my...
00:37:31.000 My curiosity lies in the future of understanding plant intelligence.
00:37:35.000 Like, wouldn't it be fascinating if we found out that they...
00:37:38.000 Like, one of the reasons why psychedelic drugs puzzle me so much is that they exist in, like, there's a lot of plants.
00:37:44.000 You could just eat them.
00:37:45.000 And you have...
00:37:46.000 Your brain already has this place it'll go if you eat these plants.
00:37:49.000 Like, if you eat peyote, if you...
00:37:52.000 If you try the San Pedro cactus out this spring, you can have these really powerful psychedelic experiences just from a plant.
00:38:01.000 Why does the human mind interact with these plants like that?
00:38:05.000 Especially fungus.
00:38:06.000 When you have major league mushroom trips, it's a very strange sort of feeling like you're in communication with another.
00:38:21.000 Mm-hmm.
00:38:31.000 According to the hippiest of hippies is plant intelligence.
00:38:35.000 It's mother Gaia.
00:38:37.000 It's the earth itself.
00:38:38.000 It's all life.
00:38:39.000 It's love.
00:38:40.000 It's God.
00:38:40.000 It all exists inside the intelligence that's intertwined in nature.
00:38:46.000 It's one of the things that's most puzzling about the most potent of all psychedelics, which is dimethyltryptamine, that it's in so many different plants.
00:38:56.000 Making dimethyltryptamine-containing plants illegal would be hilarious, because there'd be like hundreds and hundreds of plants they'd have to make illegal, including like Phalaris grass, which is really rich in 5-methoxy-dimethyltryptamine, which is the most potent form of it.
00:39:11.000 Well, also our own brains.
00:39:13.000 Yeah, our own brains make it.
00:39:15.000 Every brain makes it.
00:39:17.000 The weird little lizards that have retinas and lenses where their pineal gland is, they're making it in their little screwy little lizard brains.
00:39:25.000 We don't even know what the hell it's for.
00:39:27.000 The questions about it are just so much...
00:39:30.000 There's so many more questions than there are answers.
00:39:33.000 You've paid attention to...
00:39:35.000 Rick Strassman stuff.
00:39:37.000 Did you know that the Cottonwood Research Foundation actually found mice brains producing dimethyltryptamine?
00:39:45.000 So it's been proven that it's actually growing in the pineal gland, the gland that they had always suspected.
00:39:52.000 Yeah, I assumed that.
00:39:54.000 I don't know if I got that from the book or not.
00:39:56.000 That's crazy.
00:39:59.000 As a neuroscientist, doesn't that kind of freak you out that those Egyptians had those third eyes and all the Eastern mysticism had that pineal gland highlighted?
00:40:08.000 It was like on the end of shafts or staffs, they would put those pine cones.
00:40:13.000 How the hell did they know all that?
00:40:15.000 Well, there is a...
00:40:16.000 I mean, the third eye metaphor, or...
00:40:20.000 I mean, it's more than a metaphor, you know, anatomically, but it's a...
00:40:26.000 It correlates with the kind of experience you can have.
00:40:29.000 I don't actually know if the experience people have that's, you know, this chakra, if you talk in yogic terms...
00:40:39.000 I don't know if that has anything to do with pineal gland.
00:40:44.000 I don't think anyone's done this neuroimaging experiment where you can get people who can reliably produce a third eye opening sort of experience and scan their brains while they do it.
00:40:55.000 In fact, I'm almost positive that hasn't been done.
00:40:59.000 But there is a phenomenology here of people having a kind of inner opening of...
00:41:06.000 It's almost certainly largely a matter of visual cortex getting stimulated, but you can meditate in such a way as to produce this experience.
00:41:18.000 It's an experience that you have more or less on different psychedelics.
00:41:23.000 Some psychedelics are much more visual than others at certain doses, in particular, like mushrooms and DMT, which I've never taken, which you can say better than I, but it's reported to be quite visual.
00:41:40.000 So most people, when they close their eyes, unless they're having hypnagogic images before sleep or they just happen to be super good visualizers of imagery, you close your eyes and you just basically have darkness there, right?
00:41:55.000 Now, if you close your eyes, and if you're listening to this, and you close your eyes and you look into the darkness of your closed eyes...
00:42:02.000 That is as much your visual field as it is when your eyes are open.
00:42:09.000 It's not like your visual field hasn't gone away when you close your eyes.
00:42:14.000 There's not much detail for you to notice, again, unless you are in some unusual state.
00:42:20.000 But that, you know, based on different techniques of meditation, and this happens spontaneously, again, with hypnagogic images or with psychedelics, that space can open up into just a massive world of visual display,
00:42:37.000 right?
00:42:37.000 So you can just see a full-blown, you know, 3D movie in there.
00:42:42.000 And it's a...
00:42:46.000 But most of us just take it for granted that when you close your eyes, you're functionally blind, you can't see anything, and we're not interested in that space.
00:42:55.000 But you can actually train yourself to look deeply into that space as a technique of meditation.
00:43:01.000 I don't want to interrupt you, but does that have implications in people having eyewitness testimony and eyewitness experiences that turned out to not be true at all?
00:43:10.000 Because if you think about the human mind and the imagination being able to create imagery once the eyes are closed, like you can in sensory deprivation tanks.
00:43:20.000 Sensory deprivation tanks are...
00:43:22.000 A lot of people's experiences are very visual, even though it's in complete darkness.
00:43:26.000 Now, you know how people see things and they thought they saw something and it turns out to not be true at all, whether it's Bigfoot or whether it's a robbery or a suspect, and they get the details completely all wrong.
00:43:37.000 But isn't it possible that under fear and when your pulse is jacked up and your adrenaline's running and you're worried about all these possibilities and your imagination starts formulating predetermined possibilities you should be looking out for?
00:43:53.000 Like, what if it's Bigfoot?
00:43:54.000 What if it's a robber?
00:43:55.000 What if it's an alien?
00:43:56.000 What if it's a this?
00:43:57.000 And then these people that swear they saw these things that everybody knows they didn't, like maybe there was video footage of it or whatever it was, Is it possible that your brain can do that to you and can literally show you things that aren't real?
00:44:12.000 Oh yeah, it can do that, although I think the unreliability of witness testimony, and it's shockingly unreliable, is more a matter of the corruption of memory and the way memories are recalled.
00:44:28.000 They're especially vulnerable when you're recalling them.
00:44:32.000 They can be revised in the act of recall.
00:44:35.000 And it's very easy to tamper with people's memory, albeit inadvertently.
00:44:40.000 I mean, you can do this on purpose, too, but people just do it with bad interrogation techniques.
00:44:45.000 So, you know, the cop will ask you...
00:44:48.000 So when did you see the woman at the crosswalk?
00:44:51.000 And he's just put a woman at the crosswalk into your memory, right?
00:44:55.000 Because you can't help but visualize a woman at the crosswalk.
00:44:58.000 And memory is very fragile.
00:45:02.000 And so whenever you're given an account of an experience, even if it's an experience that happened half a second ago, now we're in the domain of memory.
00:45:14.000 Now we're in the domain of just what you can report.
00:45:18.000 It's not a matter of what you're consciously experiencing.
00:45:21.000 Now, I know there was a case in India where I believe it was a woman was convicted of murder through the use of an fMRI, or a functional magnetic resonance imaging machine.
00:45:34.000 And through this fMRI, they determined in some strange way that she had functional knowledge of the crime scene.
00:45:44.000 And the argument against that, I believe, was that she could have developed functional memory of the crime scene by being told you're being prosecuted for a crime.
00:45:55.000 You might go to jail for murder.
00:45:56.000 Here's the crime scene.
00:45:58.000 Or just being unlucky enough to be familiar.
00:46:06.000 Normally when this gets done, and there are people who do it in the States, but they don't use fMRI as their modality, but they do interrogate people's familiarity and they use EEG as a way of monitoring people.
00:46:24.000 They'll show them, you know, if you are shown evidence from the crime scene that only the perpetrator could have seen, you know, hopefully it's really something that only the perpetrator could have seen.
00:46:38.000 But if, you know, if they show you the picture and, you know, you see that, oh, yeah, you know, I have that IKEA end table and, you know, I have that dress from Banana Republic or whatever...
00:46:48.000 Just by dint of bad luck, you're familiar with something that you're being shown from the crime scene.
00:46:52.000 And especially if it's a murder, you're talking about someone who she was probably intimate with.
00:46:56.000 She probably knew them at least, so she's probably been to their house.
00:47:00.000 Yeah, that would be obviously a case where you really couldn't do it at all.
00:47:04.000 How does it work?
00:47:06.000 Well, no one in the States, as far as I know, unless this has changed in the last year or so since I've paid attention to this, none of this is admissible in court.
00:47:17.000 Right.
00:47:17.000 Only in India.
00:47:18.000 Yes.
00:47:18.000 Only the one case that I ever heard of.
00:47:20.000 Yeah.
00:47:20.000 I mean, it's a crazy, premature use of the technology.
00:47:23.000 But how does the technology work?
00:47:25.000 What is it actually seeing?
00:47:26.000 Well, you have...
00:47:28.000 With EEG, I think this has worked out more.
00:47:32.000 You can have a kind of a canonical familiarity response to a stimuli.
00:47:39.000 If you've seen something for the first time...
00:47:43.000 It would be a novelty response.
00:47:45.000 Seeing something for the third, fourth, fifth time would be a different response.
00:47:50.000 And you can...
00:47:51.000 I mean, this has been...
00:47:52.000 Again, it's been a long time since I've looked at this particular research, and I don't know how...
00:47:56.000 I don't know what they're calling these waveforms now.
00:48:00.000 I mean, there was a P300 waveform at one point, and there are waveforms that come from certain areas of the brain at certain timing intervals based on...
00:48:23.000 Yeah, so, I mean, it's not...
00:48:30.000 There's no question that at a certain point we will have reliable mind-reading machines.
00:48:35.000 I think it's really just a matter of time.
00:48:38.000 I think there's also no question that we don't have them now, at least not in a way that we can send someone to prison on the basis of what their brain did in an experiment.
00:48:48.000 But, I mean, just as you...
00:48:53.000 A lot of the most interesting stuff is unconscious, but anything you're consciously aware of having seen before, right?
00:49:01.000 So if you were to show me this cup, right, and then five seconds later say, is this the cup I showed you?
00:49:09.000 You know, I have a very clear sense of, yeah, that's the cup, right?
00:49:15.000 And if you show me a completely different cup, I'm going to have a very clear internal sense of, no, no, that's not the cup, right?
00:49:21.000 If you're having that experience, that is absolutely something about the state of your brain that can be discriminated by a person outside your brain running the appropriate experiment.
00:49:37.000 It's just our tools are still sufficiently coarse that it's not like...
00:50:04.000 It'll just be clear.
00:50:05.000 I know your thoughts, right?
00:50:06.000 So think of how much faith you would have in this technology if you could open your computer and read any file, a file of your choosing that I have never seen, right?
00:50:17.000 So the contents of which I'm completely blind to.
00:50:23.000 And I'm scanning your brain while you're reading this journal entry or a newspaper article or whatever.
00:50:28.000 And at the end of that, I can say, well, based on this report, you clearly read a story about Donald Trump.
00:50:38.000 And you actually don't like Donald Trump.
00:50:40.000 And I could tell you in detail about what you were consciously thinking about.
00:50:47.000 If you could do that 100% of the time...
00:50:52.000 At a certain point, the basis to doubt the validity of the mind-reading machine would just go away.
00:50:57.000 It would be like, are you really hearing my voice right now?
00:51:01.000 You certainly seem to.
00:51:03.000 It would become just a background assumption that the technology works at a certain point.
00:51:10.000 Just as the meat is scary, the fake meat, that idea is terrifying.
00:51:14.000 The idea of losing the...
00:51:18.000 Losing the privacy of your thoughts?
00:51:20.000 Yeah, the ownership, like losing a thought becoming non-autonomous, like the idea of everybody sharing thoughts, it seems almost inevitable.
00:51:29.000 Well, this, you know, I don't know that we would decide to do this. Certainly, I don't think we would do this all the time, right?
00:51:36.000 I mean, it might be fun to do it sometime.
00:51:38.000 But someday, there might be no option.
00:51:39.000 We might be compelled to do it.
00:51:41.000 I mean, if you're at a murder trial, right?
00:51:43.000 If you're someone who may have killed somebody, you...
00:51:46.000 I mean, the Fifth Amendment becomes interesting in that case.
00:51:50.000 But...
00:51:52.000 We have worked that out for DNA evidence, right?
00:51:54.000 So you have a right to take the fifth, but you don't have a right to withhold your blood from the proceeding, right?
00:51:59.000 So I get to look at your blood or your saliva.
00:52:02.000 So you don't have to talk, but I can read your mind.
00:52:04.000 Yeah, and I think that's a...
00:52:06.000 Well, and this is like the iPhone case.
00:52:10.000 We've thus far decided that we can't compel Apple to unlock an iPhone.
00:52:18.000 This is just the ultimate...
00:52:20.000 I think, obviously,
00:52:48.000 you're going to want... I don't know.
00:53:18.000 That I see in society.
00:53:21.000 It's just the amount of human misery born of not being able to demonstrate that someone is lying reliably.
00:53:30.000 It's the biggest lever that I think we could pull.
00:53:35.000 Certainly when you have prisons that are filled with a lot of people that are probably innocent.
00:53:39.000 Oh, yeah.
00:53:39.000 People have gone to the gallows, no doubt.
00:53:43.000 We know of innocent people who have been executed. So one way to deal with that is just to be against the death penalty, right?
00:53:50.000 So there's always a chance to find their innocence.
00:53:52.000 But imagine being in prison on death row for 30 years for a rape you didn't commit.
00:53:59.000 It's so horrible.
00:54:01.000 It's horrifying.
00:54:02.000 So yeah, that would be the major argument.
00:54:04.000 It would be such a simple solution, really.
00:54:30.000 Yeah.
00:54:44.000 This doesn't get publicized very much, but it's a very common experience of police officers or police departments to hear from people in the community who confess to a crime they didn't commit.
00:54:56.000 They just come in and they say, I did it, and they give all this bogus account of what happened.
00:55:01.000 And this is a sign of mental illness, or these are people seeking attention in some morbid way.
00:55:09.000 But there are clearly people who are so suggestible that they can either lead themselves to believe, or be led by others to believe, that they've done things they didn't do, in just shocking detail.
00:55:26.000 There was another New Yorker article, I think it was written by William Langewiesche, this was years ago, on the satanic panic case where a guy got accused of running a satanic cult by,
00:55:46.000 I think, his daughter, who was in hypnosis recovery therapy, right?
00:55:52.000 So she had been led down the primrose path by her therapist, and so she obviously was fairly suggestible.
00:56:00.000 And she recalled the most...
00:56:05.000 Lurid, just insane, Rosemary's Baby-style cult imaginable going on in her town, where the friends of Dad were coming over and raping everyone, and there was a human sacrifice of infants, and the infants were buried by the barn.
00:56:22.000 She recalled all this.
00:56:23.000 The Dad was so suggestible... Oh, my God.
00:56:45.000 At the end, this guy's now in prison, right?
00:56:50.000 The process is completed.
00:56:51.000 Justice has been done.
00:56:53.000 The daughter's convinced that her dad is a satanic monster, as is he, and he's now in prison.
00:57:01.000 And they went in and interviewed him.
00:57:04.000 asking follow-up questions with just details that they made up, right?
00:57:08.000 Because they began to suspect that he was just this kind of suggestibility machine, right?
00:57:12.000 Who just would cop to anything.
00:57:14.000 And so they went in and they just made up stuff.
00:57:17.000 Like, oh, you know, there's a few more details we want to iron out.
00:57:20.000 Your daughter said that there was a time where, you know, you brought in a horse and then you were riding on the horse and then you killed the horse.
00:57:29.000 I might be making these details up because I don't remember, but it was something that they just concocted, right?
00:57:33.000 And he said, oh yeah, yeah, yeah.
00:57:36.000 And he just copped to it.
00:57:37.000 It became like a Twilight Zone.
00:57:39.000 It's like a perfect Twilight Zone episode where now you realize that this guy has been put away, is just saying yes to everything, right?
00:57:47.000 How they resolved it,
00:57:50.000 I don't know. Again, I don't know if he ever wrote any follow-up on this, because as I recall, and this is like a 15-year-old story, they ended with the Twilight Zone moment, where now you realize this guy is innocent and just saying yes to everything.
00:58:06.000 And his daughter's crazy, because she shares his genes, probably.
00:58:10.000 I don't recall what the daughter did with that, but...
00:58:15.000 I mean, the story, and perhaps there's more to the story, but the story on its face was totally exculpatory.
00:58:21.000 The reader experience was, you've got to let this guy out of prison tomorrow.
00:58:27.000 Wow, but he's fucked.
00:58:29.000 I mean, even if he gets out, his mind is probably so screwed up by this whole experience.
00:58:33.000 And if he really does believe that he ran these satanic rituals, just the guilt and shame of it all.
00:58:39.000 I mean, we're assuming that his mind works.
00:58:42.000 I mean, this is the thing that I wonder how many people are like functionally, deeply, deeply damaged, but they're functional.
00:58:49.000 Like they're going to the same schools that you go to, they work where you work, but they're barely a person.
00:58:56.000 They're like, all their connections are all fucked up.
00:58:59.000 Like if you went to the back wiring of their head, if you were like an appliance repair person, like, so your TV's not working, huh?
00:59:04.000 Let me go back here.
00:59:05.000 What the fuck is all this?
00:59:07.000 I mean, how many people are like that, that are just sort of kind of functional?
00:59:10.000 My question is, if we do get to a point where you could read minds, what if you go into their minds and you find out, well, this is what they really think.
00:59:19.000 Like, this is not a liar.
00:59:21.000 This is a person who's seeing things that aren't there.
00:59:24.000 Like, a person who's completely delusional, like people that have these hallucinating, hallucinogenic visions.
00:59:33.000 Some people have really deeply troubling visual images that they see.
00:59:39.000 Imagine if these poor fucking people really are seeing that.
00:59:42.000 And if you could read their mind, you would literally be inside the mind of a person whose mind isn't functioning.
00:59:49.000 And we can get sort of an understanding about what that would be like.
00:59:52.000 Yeah, well, I mean, this has been done in a very simple way, where with schizophrenics, who mostly have auditory hallucinations, you can now detect auditory cortex...
01:00:03.000 So mishaps, misinterpretations you can detect?
01:00:06.000 We just know that their auditory cortices are active in the same way that when you're hearing my voice, it's going to be active.
01:00:12.000 When they're hearing internal voices, it's active.
01:00:15.000 Whoa, that's crazy!
01:00:17.000 Which is what you would expect.
01:00:18.000 I mean, the surprise would be the other way.
01:00:21.000 If nothing was going on in the auditory cortex and they were hearing voices, then that would be more surprising to me.
01:00:28.000 That's fascinating.
01:00:29.000 You can watch the hallucinations take place inside the mind of the person.
01:00:35.000 Yeah.
01:00:35.000 Whoa!
01:00:36.000 And you can stimulate hallucinations in people.
01:00:39.000 You can stimulate out-of-body experiences in people with transcranial magnetic stimulation.
01:00:45.000 Yeah.
01:00:46.000 How do they do that?
01:00:46.000 They've been able to do that recently and to try to...
01:00:50.000 Give people the same sort of experience that they claim to have had on the operating table.
01:00:55.000 That's what people have a lot, right?
01:00:56.000 When they're almost dead.
01:00:58.000 What's going on?
01:00:58.000 What is that?
01:01:00.000 Well, that is, there's an area, I believe this is at the temporal parietal junction, which is sort of here.
01:01:10.000 He's pointing to above his ear.
01:01:12.000 Yeah, yeah, yeah, sorry.
01:01:13.000 A lot of people are listening.
01:01:14.000 I'm talking to you, not talking to the millions.
01:01:16.000 But it's, you know, where the temporal lobe and the parietal lobes intersect, and I think it was first discovered in surgery on an epileptic, or in any kind of resection of the brain where people are awake, because there are no pain receptors in the brain,
01:01:38.000 so you can stay awake while you're getting brain surgery.
01:01:41.000 And they tend to keep you awake if they're going to be removing areas of the brain, let's say a tumor or the focus of an epileptic seizure, and they don't want to remove...
01:01:54.000 Working parts, especially, you know, language parts.
01:01:57.000 So they're keeping people awake and they're probing those areas of the cortex to see what it's correlated with in the person's experience.
01:02:09.000 So they're having them talk, they're having them answer questions, and they're putting a little bit of current in that area, which would be disruptive of normal function.
01:02:20.000 And, you know, they're mapping.
01:02:24.000 Almost entirely mapping language cortex when they do this.
01:02:28.000 But there have been experiences where a neurosurgeon will put a little current in an area near this region of the brain, and people will have this out-of-body experience where they're up in the corner of the room looking down on their bodies or...
01:02:48.000 The classic astral projection experience or the near-death experience where people have risen out of their body or seem to have risen out of their body.
01:03:00.000 And consciousness now seems to be located elsewhere.
01:03:04.000 And that's a...
01:03:05.000 That region of the brain is...
01:03:16.000 Virtually every region of the cortex does many, many things.
01:03:19.000 There's no one region of the brain that does one thing.
01:03:22.000 There are a couple of exceptions to this.
01:03:25.000 So the whole brain is participating in much of what we do, and it's just greater or lesser degrees of activity.
01:03:33.000 But in terms of mapping your body in space, the parietal lobe has got a lot to do with that.
01:03:41.000 And when that gets disturbed, you can have weird experiences.
01:03:47.000 You can have the experience of not recognizing your body or parts of your body, like alien hand syndrome, where this left arm seems like another person's arm, and people try to disown half their body.
01:04:06.000 And you can trick people with visual changes of display.
01:04:15.000 You can wear headgear where you can make me feel like...
01:04:19.000 It's called the body-swapping illusion.
01:04:21.000 I can feel like I am located...
01:04:23.000 My consciousness is located in your body looking back at me.
01:04:27.000 There's a clever experiment that they did where there's the ultimate...
01:04:34.000 extension of what has long been called the rubber hand illusion, where you can set up an experiment, say my two hands are on the table now, in which one of my real hands is hidden and a rubber hand is put in its place, and that rubber hand is touched with a brush. So I'm
01:05:05.000 seeing the rubber hand get touched with a brush, and I can feel like my hand is being touched.
01:05:14.000 If my real hand, which is elsewhere under the table, is being touched with a brush at the same time, I can feel like my hand is now the rubber hand.
01:05:23.000 So I can feel like my hand is in place of the rubber hand based on the visual and tactile simultaneity, you know, of seeing the rubber hand get touched with a brush and feeling my hand, which is now under the table, being touched with a brush.
01:05:38.000 I'm not explaining that setup great, but people can look it up.
01:05:42.000 But you can do the same thing to the ultimate degree with this video goggle display where I'm getting input, visual input, from where you're standing.
01:05:54.000 So like if you come up to shake my hand, I'm seeing you come up to me and shake my hand, but I'm seeing it from your point of view.
01:06:02.000 I'm getting visual input from...
01:06:05.000 I now feel like I'm walking up to me, shaking my hand.
01:06:09.000 And you can just kind of feel like your consciousness is over there, outside your body.
01:06:18.000 And it's just to say that our sense of self, our sense of being located where we are in our heads is largely, and in some cases almost entirely, a matter of vision.
01:06:34.000 The fact that you feel you're over there is because that's where your eyes are.
01:06:39.000 You're behind your eyes.
01:06:40.000 You feel like you're behind your eyes.
01:06:42.000 And tricks of vision can seem to dislodge that sense of being located there.
01:06:51.000 So stimulating one area of the brain electrically has been shown, like even just transdermally, I guess, what do they do?
01:07:00.000 They put little, what are those things called?
01:07:02.000 Those little glue-on things?
01:07:05.000 Electrodes, is that what it is?
01:07:07.000 You're talking about EEG there.
01:07:08.000 That's reading from the brain rather than...
01:07:11.000 No, I'm actually talking about some new experiments that they've been doing that they were talking about on Radiolab.
01:07:17.000 And apparently there's a bunch of do-it-yourselfers.
01:07:19.000 Have you ever listened to Radiolab?
01:07:21.000 Yeah.
01:07:22.000 It's been a while.
01:07:23.000 Amazing podcast.
01:07:24.000 One of the best ever.
01:07:25.000 But it had this one episode that dealt with people learning certain skills while the outside of their brain was being stimulated with a little electrode.
01:07:35.000 And this woman who was one of the reporters went to a sniper training thing where they set up the scenario and they give you like a fake gun.
01:07:43.000 You point at the screen and you try to hit the targets as all these things are happening.
01:07:46.000 She did it once.
01:07:47.000 Yeah, Nine Volt Nirvana is the name of the episode.
01:07:49.000 Oh, right, right.
01:07:50.000 It is an outstanding episode.
01:07:52.000 It's so fascinating.
01:07:53.000 Anyway, she goes through one.
01:07:55.000 She's terrible at it.
01:07:56.000 She just sucks terribly.
01:07:57.000 Then they hook her up to this machine.
01:07:59.000 They attach this electrode to a certain area of her brain, stimulate one area of her brain, and she goes through it like a fucking sniper.
01:08:07.000 Time slows down.
01:08:09.000 She gets 20 out of 20. So she goes from being a complete failure to being awesome at it in some weird flow state that she described.
01:08:17.000 And they're talking about all the US government's using it.
01:08:19.000 They're trying to train soldiers and snipers and people to understand this mindset and try to achieve this mindset, and that's what they're trying to do.
01:08:27.000 And there's certain companies that are experimenting with it at least by stimulating the outside of your head.
01:08:33.000 So I don't know this particular story.
01:08:37.000 I remember hearing that title, though.
01:08:40.000 So transcranial magnetic stimulation is magnetic energy, which is the flip side of electrical energy.
01:08:48.000 So if you apply a big magnet to the side of your head, you are changing the electrical properties of your cortex.
01:08:56.000 Is that what they're doing?
01:08:57.000 Well, no, I'm wondering if they've just made a transcranial magnetic device so small that it's...
01:09:05.000 I don't think it's magnetic, though.
01:09:06.000 So it's direct current, yeah.
01:09:08.000 Okay, direct current stimulation.
01:09:10.000 Transcranial direct current stimulation.
01:09:12.000 Yeah, that's what it's called, tDCS. Yeah, I'm unaware of the specifics of this, but it's been true for a long time that with...
01:09:35.000 On various areas of the cortex.
01:09:38.000 So bizarre.
01:09:39.000 Yeah.
01:09:40.000 Because what you're doing, I mean, you are disrupting neural firing, but you can disrupt areas that are inhibiting other areas. So it's not always synonymous with a degradation of performance.
01:10:00.000 You could increase performance on a certain task by taking one region of the brain offline or more or less offline.
01:10:11.000 But I'm not, you know, I'm not aware of how far they've taken it in terms of doing anything that seems useful in terms of, you know, performing something.
01:10:18.000 I mean, the research I'm more aware of is just using this to figure out what various regions of the brain are doing.
01:10:26.000 I mean, kind of mapping function, because you want to see, if I disrupt an area here, how does that show up in an experiment?
01:10:34.000 And that gives you some clue as to what that region is doing, at least in that task.
01:10:39.000 As much as we know about the mind and being able to do things like this, like overall, if you had to really try to map out the exact functions of the mind and how everything works, how far do you think we are along to understanding that?
01:10:50.000 Are we halfway?
01:10:51.000 Do you think we understand half of how the brain works?
01:10:56.000 No.
01:10:56.000 Well, I mean, I wouldn't even know how to quantify it at this point.
01:10:59.000 It's just we know...
01:11:02.000 A ton, right?
01:11:04.000 We know a lot about where language is and where facial recognition is.
01:11:11.000 Your visual cortex has been really well mapped, and we know a lot.
01:11:17.000 And for the last 150 years, based on just neurological injury, and then in the last decades, based on imaging technology, we know regions of the brain that...
01:11:33.000 Absolutely govern language and regions of the brain that have basically nothing to do with language, you know, to take one example.
01:11:40.000 And we know a lot about memory, and we know a lot about the different kinds of memory.
01:11:45.000 But there's, you know, I think there's much more we don't know.
01:11:54.000 What's even more...
01:11:59.000 The greatest friction in the system is that there's often not a lot to do with what we know.
01:12:05.000 Knowing is not enough for certain things.
01:12:07.000 To intervene...
01:12:10.000 Is another part of the process where there are no guarantees.
01:12:15.000 The way we can intervene in the functioning of a brain is incredibly crude, pharmacologically or with surgery or with a device like that.
01:12:26.000 So to get from a place of really refined knowledge to a place of being able to do something we want to do with that knowledge, that's another step.
01:12:41.000 There's no reason to think that we're not going to take it at some point, but it's an additional complexity to get inside the head safely and help people or improve function, even if you know a lot about what those areas of the brain do.
01:12:56.000 But we don't know.
01:12:58.000 We haven't cracked the neural code.
01:13:00.000 We don't know how consciousness is arising in the brain.
01:13:05.000 We wouldn't know how to build a computer that does what we do, to say nothing of experiencing the world as we experience it, yet.
01:13:18.000 And we may be going down paths where...
01:13:24.000 We will build it more by happenstance.
01:13:26.000 We might build it and not quite know how it's doing what it's doing, but it's seeming to do more or less what we do.
01:13:34.000 We'll be doing it very differently.
01:13:36.000 So there are two paths, or at least two distinct paths in...
01:13:40.000 Artificial intelligence, and one path could try to emulate what the brain is doing, and that obviously requires a real detailed understanding of what the brain is doing.
01:13:51.000 Another path would be to just ignore the brain, right?
01:13:55.000 So there's no reason why artificially intelligent machines, even machines that are superhuman in their capacities, need to do anything that is similar to what we do with our brains,
01:14:11.000 you know, with neurochemical circuits.
01:14:13.000 So because they're going to be organized differently and, you know, could be organized quite differently and obviously made of totally different stuff.
01:14:24.000 So, whether you want to go down the path of emulating the brain on the basis of a detailed understanding of it, or you just want to go down the path of maximizing intelligent behavior in machines, or some combination of the two,
01:14:41.000 they're distinct, and one doesn't entail really knowing much about the brain, necessarily.
01:14:51.000 So there's really two different ways they can go about it.
01:14:53.000 Either they could try to reproduce a brain.
01:14:55.000 Yeah, and people are doing that.
01:14:57.000 If they can make fake meat, why can't they make fake brain tissue?
01:15:01.000 It seems like they could.
01:15:02.000 I mean, I know a woman got her bladder replaced with stem cells.
01:15:06.000 Did you hear about that?
01:15:07.000 They took stem cells and recreated her own bladder.
01:15:10.000 She had bladder cancer.
01:15:12.000 So they built her a new bladder in a laboratory and put her own bladder back in her body.
01:15:18.000 That is insanely fascinating.
01:15:21.000 Now, if they can figure out how to extract a little bit of brain tissue...
01:15:24.000 The thing is, the brain is famously the most complicated object in the universe.
01:15:29.000 A bladder is essentially a bag.
01:15:31.000 Yeah.
01:15:32.000 So it's a...
01:15:34.000 Big leap.
01:15:34.000 It's a big leap, but it's not...
01:15:39.000 I think it's a leap we will take.
01:15:43.000 It may be a leap we take in this stepwise way where we build machines down a path that is not at all analogous to recreating brains, which then allows us to understand the brain, you know,
01:15:59.000 totally, in the Ray Kurzweil sense, where we can, you know, upload ourselves, if that makes any sense.
01:16:08.000 But it's a...
01:16:10.000 I mean, I think information...
01:16:17.000 Processing is at bottom what intelligence is.
01:16:22.000 I think that is not really up for dispute at this point.
01:16:27.000 That any intelligent system is processing information, and our brains are doing that.
01:16:33.000 And any machine that is going to exhibit the kind of general intelligence that we exhibit and surpass us will be doing, by dint of its...
01:16:46.000 Hardware and software, something deeply analogous to what our brains are doing.
01:16:51.000 But again, we may not get there based on directly emulating what our brains are doing.
01:16:59.000 And we may get there before we actually understand our brains in a way that would allow us to emulate it.
01:17:08.000 Very interesting to me how it seems to be there's always pushes and pulls in life.
01:17:15.000 And when you have things that are as horrific as factory farming and people are exposed to it, then there's this rebound and where people are trying to find a solution.
01:17:26.000 And I always wonder, like, will that be the first artificial life that we create, like zombie cows?
01:17:33.000 Like, maybe if we figured out that meat in the lab is not good because it has to actually be moving around for it to be good for you.
01:17:39.000 Maybe they'll come up with some idea to just...
01:17:42.000 Look, we're gonna make zombies.
01:17:44.000 We're gonna make livestock that essentially can just move forward and consume food.
01:17:49.000 There's no thought whatsoever.
01:17:50.000 These are zombies.
01:17:51.000 You can go right up to them, you wave your hand in front of them, they don't even move.
01:17:54.000 Is it okay to kill those?
01:17:55.000 And then go from that to making artificial people.
01:17:59.000 Because it seems to me that artificial people, it's gonna happen.
01:18:03.000 I mean, it's just a matter of how much time.
01:18:04.000 If they're making bladders, and then they're gonna start making all sorts of different tissues with stem cells to try to replace body parts and organs, and they're gonna work their way through an actual human body.
01:18:15.000 It's going to happen.
01:18:16.000 Do you mean a brainless person that would be like spare parts for you?
01:18:21.000 That, for sure.
01:18:22.000 That's an option.
01:18:23.000 But I think also an artificial human.
01:18:26.000 I mean, it might take a thousand years, but I think if we stay alive, if human beings, rather, if human beings continue to evolve technologically...
01:18:35.000 Within the next thousand years, we're going to have artificial people that are completely...
01:18:38.000 So you mean they're people, so we're going to build a biological person.
01:18:42.000 100%.
01:18:42.000 So it's a synthesized person, right?
01:18:45.000 So you're not talking about the perfect robot.
01:18:47.000 You're talking about an actual person built up cell by cell.
01:18:52.000 Yeah.
01:18:53.000 You know, there's no reason why that's not possible.
01:18:59.000 Whether or not we would do it is another question.
01:19:02.000 Somebody would.
01:19:04.000 China.
01:19:05.000 Yeah, well, presumably there are limits even in China.
01:19:09.000 Aren't they doing experiments already with human embryos?
01:19:12.000 I don't know what they're doing in China.
01:19:14.000 I believe they are.
01:19:15.000 I believe they got the go-ahead.
01:19:16.000 I just know they have that dog festival that every year I see people tweet about.
01:19:20.000 China's a big place, though.
01:19:21.000 Can't rein them all in there.
01:19:22.000 But, you know, what is that gene splicing software?
01:19:26.000 CRISPR. CRISPR, yeah.
01:19:26.000 They're using CRISPR on human fetuses or human embryos.
01:19:31.000 Right.
01:19:32.000 So good luck.
01:19:33.000 Good luck, world.
01:19:34.000 China's creating super-athletes.
01:19:36.000 There are many issues there, but when you're talking about changing the genome, and especially when you're talking about changing the germline, then it gets passed on to future generations.
01:19:49.000 That has big implications.
01:19:53.000 I don't see why...
01:19:54.000 I mean, this goes to...
01:19:55.000 It's like the artificial meat conversation.
01:19:58.000 So to grow meat in a vat is ethically the same thing as...
01:20:06.000 At least in my view, it would be the same thing as producing a brainless cow, right?
01:20:11.000 So you have the whole cow that you could slaughter, but it has no brain.
01:20:15.000 So...
01:20:16.000 Presumably there's no experience in this animal, but it is the fully functioning animal, right?
01:20:23.000 So let's say you could produce that and you would produce healthy meat.
01:20:28.000 It's just a messier...
01:20:30.000 Presumably you have to feed this thing, right?
01:20:34.000 I don't know how you get it to eat, but let's say you feed it intravenously.
01:20:37.000 It all begins to look weirder and weirder, but there's no suffering there because there's no brain.
01:20:45.000 I think we would...
01:20:46.000 And I think we have decided to bypass that vision and just go straight to the vat and just build it up cell by cell and build up only what we need, which is the meatball or the steak.
01:20:58.000 So why have the fur and the organs that you don't want and the mess?
01:21:04.000 The kind of energy-intensive aspects of producing a whole animal.
01:21:07.000 And I think with like spare parts for humans, rather than create a clone of yourself that has no brain that you just keep in a vat somewhere in your garage where you can get spare kidneys when you need them.
01:21:22.000 We would just be able to, you know, print the kidneys.
01:21:26.000 Because that gets around a lot of the weirdness, right?
01:21:30.000 It'd be weird to have a copy of yourself that's just, you know, just spare parts.
01:21:35.000 Whereas it wouldn't be weird, or at least in my view, it wouldn't be weird.
01:21:38.000 It would be fantastic to be able to go into a hospital when your kidneys are failing and they just take a cell and print you a new kidney.
01:21:46.000 Yeah.
01:21:48.000 I think that can be expected.
01:21:50.000 I mean, if it's possible.
01:21:52.000 It seems like it's going to be possible.
01:21:54.000 The bladder example you just gave is what's happening there.
01:21:58.000 It's amazing.
01:21:59.000 But I always want to extrapolate things to some bizarre place a thousand years from now for some reason.
01:22:05.000 Because I've been...
01:22:08.000 Since I got into Dan Carlin's Hardcore History, it really fucked my mind up about how I think about the past, in this way where I look at a thousand years ago in comparison to today.
01:22:21.000 And I try to think, well, how much different will people be a thousand years from now?
01:22:26.000 And probably way more different.
01:22:28.000 Yeah, you know, I mean, the fascination that we have with ancient history is, one of the things obviously is we want to know where we came from, but also we can kind of see people today doing similar shit if they were allowed to. Like if everything went horribly wrong, people at their base level are kind of similar today to how they were a thousand years ago.
01:22:48.000 Yeah, well, one of them might be running for president. We can talk about that. And when I think about the future a thousand years from now, with the way technology is accelerating, and just the capacity that we have and ability to change things,
01:23:06.000 to change the world, to change physical structures, to change bodies.
01:23:10.000 To dig into the ground and extract resources.
01:23:14.000 We're getting better and better at changing things and manipulating things, extracting power from the sun and extracting salt from the water.
01:23:23.000 There's all this bizarre change technology that's consistently and constantly going on with people.
01:23:28.000 And it continues to get better.
01:23:30.000 When I think about a thousand years from now and artificial people and this concept of being able to read each other's minds and being able to map out imagery and pass it back and forth from mind to mind in a clear spreadsheet form.
01:23:43.000 What is that movie?
01:23:44.000 The Tom Cruise movie?
01:23:45.000 Minority Report.
01:23:46.000 Minority Report.
01:23:46.000 Thank you.
01:23:47.000 I mean, that's on the way.
01:23:49.000 It's on the way.
01:23:50.000 It's going to be incredibly strange to be a person.
01:23:53.000 Yeah, whether we will be people in a thousand years, I think you would...
01:24:00.000 Unless we have done something terrible and knocked ourselves back a thousand years, I think we will decide to change ourselves in that time in ways that will make us...
01:24:10.000 There may be many different species.
01:24:13.000 It's like tattoos.
01:24:15.000 You have a bunch of tattoos.
01:24:16.000 I have none.
01:24:17.000 You could take that a lot further if you can just begin really tinkering with everything.
01:24:24.000 Oh yeah, if you want to get nutty and put bolts in your head and shit.
01:24:26.000 Or just give yourself just fundamentally different genetic abilities, right?
01:24:34.000 I mean, like, you could just go, you could become a different species if you took it far enough.
01:24:40.000 Maybe that's what the aliens are.
01:24:42.000 Maybe that's why they all look the same.
01:24:43.000 Maybe they figured it out.
01:24:45.000 You gotta look alike.
01:24:47.000 Otherwise you can't appreciate each other.
01:24:50.000 One guy's weird looking, one guy short, one guy's got a big nose.
01:24:54.000 Everybody's so confusing.
01:24:55.000 Too many people I can hate.
01:24:57.000 Everybody look exactly the same.
01:24:58.000 So the government gets together with all the people of their planet and they go, look, we have a problem with this good looking thing.
01:25:03.000 It's bullshit.
01:25:04.000 It's holding us back.
01:25:05.000 A lot of people, they're stupid.
01:25:06.000 They're getting ahead.
01:25:07.000 We gotta all look the same.
01:25:08.000 Blank, emotionless, and big giant eyes.
01:25:10.000 That's it.
01:25:11.000 They all just went in.
01:25:13.000 With a passion for molesting cattle.
01:25:16.000 Allegedly.
01:25:16.000 I think that's people.
01:25:18.000 People are blaming that on the aliens.
01:25:20.000 I had this guy on my podcast, David Deutsch.
01:25:23.000 Have you ever heard of him?
01:25:24.000 He's a physicist at Oxford.
01:25:25.000 Yes, I have.
01:25:26.000 Why have I heard of him?
01:25:27.000 He wrote a book.
01:25:27.000 He gave at least one TED Talk, and he's written two very good books.
01:25:32.000 The first came out about 10 years ago, The Fabric of Reality, and the more recent one is The Beginning of Infinity.
01:25:42.000 Extremely smart guy and very nice guy.
01:25:45.000 He has this thesis, and he and I don't totally agree about the implications going forward for AI, but he's convinced me of his basic thesis, which is fascinating, which is the role that knowledge plays in our universe,
01:26:07.000 or the potential role that it plays.
01:26:09.000 And his argument is that in any corner of the universe, anything that is compatible with the laws of physics can be done with the requisite knowledge.
01:26:22.000 So he has this argument about how deep knowledge goes and therefore how valuable it is in the end.
01:26:29.000 So I'm cueing off your notion of building an artificial person, literally cell by cell or atom by atom.
01:26:41.000 There's every reason to believe that's compatible with the laws of physics.
01:26:44.000 I mean, we exist, right?
01:26:45.000 So we got built by the happenstance of biology.
01:26:50.000 If we had what he calls a universal constructor, you know, the smallest machine that could assemble any other machine atom by atom, we could build anything atom by atom, right?
01:27:05.000 And so he has this vision of it.
01:27:08.000 You could literally go into an area of deep space that is as close to a vacuum as possible and begin sweeping up stray hydrogen atoms and fuse them together and generate heavier elements.
01:27:27.000 So you could start with nothing but hydrogen, right?
01:27:30.000 And with the requisite knowledge, build your own little fusion reactor, create heavier elements, and based on those elements, create the smallest machine that can then assemble anything else atom by atom,
01:27:47.000 including more of itself, right?
01:27:49.000 And you could start this process of building anything from a person to something far more advanced than a person.
01:27:58.000 To a planet?
01:28:00.000 Anything is made of atoms, right?
01:28:01.000 It's all just organized atoms.
01:28:03.000 And so the limiting factor in that case is always the knowledge, right?
01:28:10.000 So the limiting factor is either the laws of physics, either this can't be done because it's physically impossible, or the knowledge is what you're lacking.
01:28:19.000 And given that human beings are physically possible, there should be some knowledge path whereby you could assemble one atom by atom.
01:28:28.000 There's no deep physical reason why that wouldn't be the case.
01:28:34.000 The reason is we don't know how to do it.
01:28:37.000 But presumably it would be possible for us to acquire that knowledge.
01:28:45.000 And so the horizon of knowledge just extends functionally without limit.
01:28:53.000 We're nowhere near the place where we know everything that's knowable.
01:28:59.000 Yeah.
01:29:12.000 I mean, with the frontiers of knowledge explored, you know, 10,000 years beyond where we are now, we would be unrecognizable to ourselves.
01:29:23.000 Everything would be equivalent to magic, you know, if we could see it now.
01:29:28.000 And most of human history is not like that.
01:29:30.000 And most of human history, if you dropped into any period of human history...
01:29:34.000 It was, for all intents and purposes, identical to the way it was 500 years before and 500 years before that.
01:29:42.000 It's only very recently where you would drop in and be surprised by the technology and by the culture and by what is being done with language and the consequences of language.
01:30:33.000 Yeah, yeah.
01:30:46.000 The planet is not what's complicated.
01:30:50.000 It's the...
01:30:50.000 Ecosystem.
01:30:52.000 It's the life on the planet.
01:30:53.000 Right.
01:30:54.000 And our own brains being the ultimate example of that complexity.
01:30:59.000 But presumably, intelligent systems can become much more complex than that.
01:31:05.000 There's no reason to think that we are near the summit of possible intelligence, biological or otherwise.
01:31:13.000 And...
01:31:17.000 Yeah.
01:31:18.000 Once you begin thinking about building things atom by atom, then the future begins to look very weird.
01:31:29.000 And automating that process, right?
01:31:33.000 This is the promise of nanotechnology, where you have tiny machines that can both build more of themselves and more of anything else that would be made of tiny machines, or assemble anything atom by atom,
01:31:51.000 or treat your own body like the machine that it is and deal with it atom by atom.
01:31:56.000 I mean, the possibilities of intervention in the human body are then virtually limitless.
01:32:03.000 So it's a... Yeah, I mean, that's where the physical world begins to look just totally fungible.
01:32:15.000 You know, when you're not talking about surgery, where you're cutting into someone's head and hoping, you know, in very coarse ways, hoping you're not taking out areas of brain that they need, but you're talking about actually repairing...
01:32:28.000 I mean, if you're talking...
01:32:30.000 If you can tinker with atoms in a way that you understand, then you're talking about repairing anything.
01:32:36.000 You know, then...
01:32:37.000 And creating anything.
01:32:39.000 Like literally, Dr. Manhattan style, build ice condos on Mars.
01:32:43.000 I mean, you could create art with it.
01:32:45.000 You could do anything with it.
01:32:47.000 It would just somehow or another have to be programmed with whatever pattern you were trying to create.
01:32:51.000 So you could essentially make an earth somewhere else with all the biological diversity, water, even intelligent life.
01:33:01.000 It all could be done through some sort of a machine, ultimately one day.
01:33:05.000 It is at bottom, again, just the information.
01:33:08.000 It's the knowledge of how it's organized and how to implement it.
01:33:13.000 So it's analogous to what has happened.
01:33:17.000 In films now, where, like, the animation in film has gotten good suddenly, right?
01:33:23.000 Where you can see, like, they can animate, you know, waving hair and flowing water, and it looks pretty damn good.
01:33:30.000 It looked terrible 30 years ago, right?
01:33:32.000 And we just acclimated to it looking terrible.
01:33:35.000 But now you can really trick the eye.
01:33:39.000 You can build up scenes of nature...
01:33:42.000 Where they're not actually using any photography of nature to build it up.
01:33:48.000 It's all a confection of ones and zeros, right?
01:33:53.000 They've just built it in software.
01:33:56.000 Like the Revenant.
01:33:57.000 Perfect example.
01:33:58.000 That giant bear that attacked Leonardo DiCaprio.
01:34:01.000 Right.
01:34:01.000 See, I don't know how that was done, but I had heard that that was all just pure CGI, right?
01:34:06.000 Yeah, well, there was a dude in a costume that acted it out with him, but essentially it was just all CGI. Right.
01:34:11.000 Yeah, so the fact that that's beginning to look good, obviously that's just all surface.
01:34:19.000 That has no implication for building a rendering of a bear on film.
01:34:25.000 It's not the same thing as building a bear, but the fact that we can move so far into modeling that kind of complexity visually... Just imagine what a super-intelligent mind could do with a thousand years to work at it.
01:34:48.000 And we're on the cusp of, and when I say cusp, I don't mean five years, but let's say a century.
01:34:55.000 We're on the cusp of producing the kind of technology that would allow for that.
01:34:59.000 And if we put it into perspective, photography, I don't believe, was even invented until the early 1800s, right?
01:35:06.000 Is that when it was first?
01:35:08.000 Yeah, it sounds about right.
01:35:09.000 Is there photographs of Lincoln?
01:35:10.000 There are, though.
01:35:11.000 Yeah, oh yeah.
01:35:11.000 So was that the early 1800s when he, when was Lincoln killed?
01:35:16.000 I don't know.
01:35:19.000 1865. 1865. So somewhere after that, or somewhere before that, rather, they invented photography.
01:35:25.000 1826, 1827. Yeah.
01:35:27.000 So essentially, give or take 10 years, 200 years ago.
01:35:32.000 That's really amazing.
01:35:34.000 Oh, yeah.
01:35:34.000 If you start thinking about The Revenant from the vantage of 200 years ago.
01:35:37.000 We couldn't figure out how to get sound in our movies until 100 years ago.
01:35:42.000 That thought right there just actually just freaked me out.
01:35:45.000 What thought?
01:35:45.000 It's a 200-year thought.
01:35:47.000 Like how little 200 years is in the Mongol days.
01:35:50.000 And now 200 years ago, they didn't even have cameras, and now they do.
01:35:55.000 And now they have this insane ability to recreate things like Game of Thrones, wolves, and dragons.
01:36:01.000 And she's riding around on a dragon.
01:36:03.000 I'm like, yeah, it looks like she's on a dragon.
01:36:04.000 It doesn't look like old-school King Kong movies.
01:36:07.000 You ever try to watch those?
01:36:09.000 Ray Harryhausen?
01:36:10.000 Yeah.
01:36:12.000 There's a pleasure in that.
01:36:13.000 They're awesome.
01:36:13.000 They're awesome to just get into for fun, but as far as visual effects, what they can do now, and the idea that it's all been done over 200 years is just spectacular.
01:36:24.000 Not just capturing the image, but then recreating an artificial version and projecting it, which is a thousand times more difficult.
01:36:32.000 But there's another feature here of the compounding power of knowledge and technology, where there's certain gains that are truly incremental, where everything is hard won, everything is just 1% better than its predecessor.
01:36:51.000 But then there are other gains where you have created an ability that seems like a quantum leap beyond where you were and where you go from just fundamentally not being able to do anything in that domain and then all of a sudden the domain opens up totally.
01:37:12.000 Flight is an example.
01:37:15.000 For the longest time, people couldn't fly, and it was obvious that you can't fly.
01:37:20.000 You're heavier than air, and you don't have feathers, and there's no way to flap your arms fast enough.
01:37:28.000 We're never going to fly, right?
01:37:30.000 And then, at a certain point, flight is possible and opens this whole domain of innovation.
01:37:36.000 But the difference between not being able to fly...
01:37:39.000 There's no progress you can make on the ground
01:37:44.000 that doesn't really avail itself of the principles of flight, as we now know them, that's going to get you closer.
01:37:52.000 You can't jump a little bit higher, and so it doesn't matter what you do with your shoes.
01:38:00.000 There are kind of fundamental gains that open up. You know, DNA sequencing is a more recent example, where understanding and having access to the genome means you go from the only way to influence your descendants being to, you know,
01:38:23.000 basically make a good choice in wife, right, or husband, to being able to just create a new species in a test tube if you wanted to, right?
01:38:34.000 And that's a kind of compounding power of understanding the way things work.
01:38:43.000 I think we're at the beginning of a process that could look very, very strange very, very quickly.
01:38:51.000 I think, obviously, both in good and bad ways, but I don't think there's any brake to pull on this train.
01:39:02.000 Knowledge and intelligence are the most valuable things we have, right?
01:39:08.000 So we're going to grab more insofar as we possibly can, as quickly as we can.
01:39:15.000 And the moments of us deciding not to know things and not to learn how to do things, I mean, those are so few and far between as to be almost impossible to reference, right?
01:39:25.000 I mean, there are moments where people try to pull the brakes, and they hold a conference and they say, you know, should we be doing any of this?
01:39:33.000 But then, you know, China does it or threatens to do it.
01:39:36.000 And we wind up finding some way to do it that we consider ethical.
01:39:45.000 So there are things like, you know, germline tinkering that we, as far as I know, don't do and have decided for good reason we're not doing.
01:39:54.000 But...
01:39:57.000 Is that going to stop people from doing this?
01:40:00.000 I don't think there's any way.
01:40:02.000 They're more worried about actual real diseases than they are man-made diseases.
01:40:07.000 When we went to the CDC, is that disease control?
01:40:11.000 CDC? In Galveston, I guess that's what it is.
01:40:14.000 I guess the name of the organization.
01:40:16.000 But it's a building where they house some of the most horrendous viruses and diseases known to man.
01:40:22.000 Like they had anthrax in there.
01:40:24.000 The CDC has a lot of that, yeah.
01:40:27.000 Yeah, and they have these crazy walls, like thick, thick walls, and vacuums in the ceilings, and everyone's wearing suits.
01:40:33.000 And they wanted us to get in the suits, and I went, fuck you.
01:40:37.000 There's no way I'm going in that room.
01:40:39.000 You did this for one of your shows?
01:40:40.000 Yeah, it was for a TV show with...
01:40:43.000 With Duncan Trussell, the Questions Everything Show.
01:40:45.000 We were talking about weaponized diseases.
01:40:48.000 And the CDC was like, forget all that.
01:40:51.000 Like, the real diseases that are constantly morphing we have to stay on top of.
01:40:56.000 Like, that's what we should be terrified of.
01:40:58.000 Actual real diseases.
01:40:59.000 Like, no one's shown any ability to create this stuff that's more fucked up than what we already have.
01:41:04.000 But weaponized anthrax and things along those lines, like...
01:41:08.000 These Russian guys we talked to, they were talking about how they had vats of this stuff.
01:41:14.000 They had all kinds of crazy diseases that they had created just in case we had gotten into some insane, mutually assured destruction, you know, disease-spreading thing.
01:41:26.000 Like, they were down for that.
01:41:27.000 They were like, well, we have to be prepared in case the United States does that.
01:41:31.000 Whoa!
01:41:33.000 What their concern is, the Center for Disease Control guys, they were concerned with things like Ebola, things morphing, things becoming airborne, natural things, new strains of the flu that become impossible to deal with, MRSA. MRSA is a terrifying one.
01:41:47.000 MRSA is one that has a lot of people scared, a lot of doctors scared.
01:41:50.000 It's a medication-resistant staph infection that kills people.
01:41:55.000 I mean, it can absolutely kill people if you don't jump on it quick and take the most potent antibiotics we have, and even then it takes a long time.
01:42:03.000 Yeah, yeah.
01:42:04.000 Well, it's a—I actually just tweeted this recently.
01:42:08.000 I think I said, would some billionaire, would some 0.1 percenter develop some new antibiotics?
01:42:15.000 Because clearly the government and the market can't figure out how to do it.
01:42:19.000 And it really is falling through the cracks in the government market paradigm.
01:42:25.000 It's like either the government will do it or the market will do it, but neither are doing it.
01:42:33.000 There's no rationale for developing antibiotics, because it's so costly and you take them, you know, with any luck, you take them once every 10 years for 10 days and that's it.
01:42:44.000 I mean, that's not like Viagra or any antidepressant or any drug that you're going to take regularly for the rest of your life.
01:42:55.000 So there's no real market incentive to do it, or at least not enough of one, to spend a billion dollars developing an antibiotic.
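To make that incentive gap concrete, here is a rough back-of-the-envelope sketch based on the dosing schedules mentioned above; the 40-year horizon and the comparison to a daily chronic drug are hypothetical illustrations, not market data.

```python
# Back-of-the-envelope comparison of lifetime doses per patient.
# The dosing schedules follow the transcript's rough figures;
# the 40-year horizon is an assumed, illustrative number.
YEARS = 40                                  # hypothetical treatment horizon per patient

daily_drug_doses = 365 * YEARS              # e.g. an antidepressant taken every day
antibiotic_doses = (YEARS // 10) * 10       # one 10-day course roughly every 10 years

print(f"Daily chronic drug: {daily_drug_doses:,} doses per patient")
print(f"Antibiotic:         {antibiotic_doses:,} doses per patient")
print(f"Ratio:              {daily_drug_doses / antibiotic_doses:.0f}x more doses "
      f"for the chronic drug")
```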
01:43:03.000 And the government apparently is not doing it.
01:43:06.000 And we're running out of antibiotics.
01:43:07.000 This has been in the news a lot recently.
01:43:10.000 But we're close to being in a world where it's as though we don't have antibiotics.
01:43:16.000 We're a superbug away from being in that world.
01:43:21.000 You're freaking me out, Sam Harris.
01:43:23.000 I don't like to get freaked out.
01:43:26.000 Super bugs are the big concern, right?
01:43:29.000 Do you think about that in jujitsu?
01:43:31.000 Yes, I've gotten staph.
01:43:33.000 My friend Ari had it and we were playing pool and he was limping.
01:43:38.000 And I go, what's going on with your leg, man?
01:43:40.000 He goes, I got a spider bite.
01:43:41.000 I go, let me see.
01:43:42.000 He pulls up his pants.
01:43:43.000 You got staph.
01:43:44.000 You got to get a doctor.
01:43:46.000 And he hasn't even done jujitsu in years.
01:43:48.000 And I think he got staph again.
01:43:51.000 And I think it's one of those things where, I guess everyone has it on their body, and when you get an infection, then it spreads and grows, and apparently it can be a recurring thing.
01:44:02.000 So people who get it, particularly MRSA, apparently they can get it again, and it can get pretty bad.
01:44:09.000 There was that one fighter who just died, I don't think related to that, but he had these...
01:44:16.000 Oh, Kevin Randleman.
01:44:17.000 Kevin Randleman, yeah, yeah, yeah.
01:44:18.000 That was MRSA, right?
01:44:21.000 Oh, 100%, yeah.
01:44:22.000 Yeah, he was in the hospital.
01:44:24.000 Well, it was staph.
01:44:25.000 I don't know if it was MRSA.
01:44:26.000 It could have been just staph that he ignored.
01:44:28.000 For long periods of time, but it was 100% staph.
01:44:30.000 And he died.
01:44:33.000 I don't know what exactly was the cause of his death, but I can't think that that helped him any.
01:44:39.000 I mean, it had gotten so bad that it had eaten through his body.
01:44:43.000 If you're interested in this.
01:44:46.000 For people online that are listening, Google Kevin Randleman staph infection.
01:44:51.000 And these are horrific photos that look like something took a bite out of him.
01:44:56.000 Like under his armpit.
01:44:57.000 I mean, like a fist-sized chunk of meat was missing.
01:45:02.000 Yeah, it's really, really scary stuff.
01:45:06.000 So yeah, skin infection.
01:45:07.000 There it is right there.
01:45:08.000 You can look deep in and see his muscle tissue.
01:45:11.000 Yeah, that is hardcore.
01:45:13.000 No exaggeration.
01:45:14.000 It's like a baseball-sized hole.
01:45:16.000 And he's got two of them.
01:45:18.000 He's got another one lower down on his back, which is eating its way through his body.
01:45:21.000 But that was years before he died, right?
01:45:23.000 Or some years.
01:45:24.000 Yeah, it was years before he died.
01:45:26.000 But who knows how devastating that might have been.
01:45:29.000 That's a really, really bad infection.
01:45:33.000 And I think that once your body's that compromised, I mean, you're like really close to death.
01:45:38.000 I mean, he was a really tough, strong, healthy guy.
01:45:42.000 He was a super athlete.
01:45:43.000 Oh, yeah.
01:45:44.000 So I'm assuming his body probably fought it off pretty well.
01:45:48.000 So it's probably one of the reasons why he let it go so long.
01:45:51.000 Probably didn't, maybe didn't even understand, like, how dangerous it was.
01:45:54.000 Or who knows?
01:45:55.000 Maybe it just jumped on him really quick.
01:45:57.000 My dad's girlfriend just got it on her face, and she was in the hospital for two weeks, and they were afraid it was going to spread to her brain, and it almost did.
01:46:04.000 And she's not 100% out of the woods yet, but she's back home now.
01:46:08.000 She just got a little scratch on her face, and it spread into her cheek, and then from her cheek she just got a little red swelling, and then she couldn't see, and then she had to go in the hospital.
01:46:17.000 You got staph.
01:46:19.000 Chill out.
01:46:20.000 Heavy antibiotics right away.
01:46:23.000 Yeah, this is one area that worries me.
01:46:28.000 There are the bad things we do, and obviously there's a lot to be worried about there.
01:46:35.000 The stupid wars and the things that it's just obvious that if we could stop creating needless pain for ourselves or needless conflict, that would lead to a much nicer life.
01:46:50.000 But then there are the good things we neglect to do.
01:47:14.000 It's just unthinkable to me.
01:47:16.000 And yet, it's simply we're just hamstrung by the fact that we have a political and economic system where there's no incentive.
01:47:26.000 We don't want to raise taxes.
01:47:28.000 We're so overcommitted in the ways we spend money.
01:47:56.000 Something that the Bill and Melinda Gates Foundation could do.
01:47:59.000 Maybe they're actually doing this and I just don't know about it.
01:48:02.000 They're doing a lot of medical work, obviously.
01:48:05.000 We're talking about some billions of dollars to just get a laser focus on this problem.
01:48:13.000 But it's such an obvious problem.
01:48:15.000 And that's really the only thing that's holding it back?
01:48:17.000 That's what's going on?
01:48:18.000 It's not a research issue, it's just a financial issue?
01:48:21.000 Well, I'm sure the research has to be done because, you know, if it was totally obvious how to build the next generation of antibiotics that would not be vulnerable to having their efficacy canceled in three years by...
01:48:38.000 Just the natural selection among the microbes.
01:48:44.000 Someone would do it very, very cheaply.
01:48:46.000 So I'll admit that it's probably not easy to do, but it's got to be doable, and it's super important to do.
01:48:56.000 I mean, when you look at just what cesspools hospitals have become, where people come, and something like 200,000 people a year die in the U.S. based on essentially getting killed by the machinery of the hospital.
01:49:14.000 They're getting killed by their doctors and nurses.
01:49:17.000 Some of this is drug overdoses or incompetence in dosing or giving someone the wrong medication or whatever.
01:49:29.000 200,000 people?
01:49:31.000 Yeah, 200,000 people a year, right?
01:49:32.000 But a lot of it is just doctors not washing their hands, right?
01:49:36.000 But some of this is also superbugs, where the burden on hand washing in a hospital is higher and higher because hospitals just are...
01:49:47.000 Just covered in super germs, right?
01:49:51.000 It's like a haunted house.
01:49:53.000 It's like where you're trying to fix people, and around the house are a bunch of demons that are trying to kill people you're trying to fix.
01:50:00.000 Look, obviously it's not, but if you were a person who was inclined to believe things back in the day before they figured out microscopes, I mean, what else is that other than a demon?
01:50:11.000 You've got a hospital that's filled with superbugs.
01:50:13.000 Where else are they?
01:50:14.000 There's nowhere else.
01:50:15.000 They're just in the hospitals.
01:50:16.000 Yeah, and gyms.
01:50:18.000 And some MMA gyms where they're getting them probably from the hospitals.
01:50:21.000 But, I mean, it's just like a haunted house.
01:50:24.000 It's like a haunted house that's trying to kill the people that live in the house.
01:50:28.000 It is...
01:50:29.000 Well, and it's also just ironic.
01:50:31.000 You go to the hospital to save your life, right?
01:50:35.000 You go when you are, by definition, at your most vulnerable, and you are...
01:50:41.000 And you're laying your body open to the intrusions of the place because that, I mean, they have to get...
01:50:50.000 They have to get into your body to help you, right?
01:50:53.000 And yet...
01:50:54.000 That is the very mechanism whereby you're getting all this misery and death imposed on you.
01:51:02.000 And it is as simple as hand-washing, though, in many of these cases, right?
01:51:06.000 It's just doctors and nurses not washing their hands.
01:51:08.000 That is insane.
01:51:10.000 It's so insane to think that that is a gigantic issue, that we have these bugs that try to get into your body and kill you.
01:51:18.000 So there's one area, I don't know if you ever...
01:51:21.000 I had the bad luck to be associated with a NICU, a neonatal ICU. But our first daughter, who's totally fine, was born five weeks early and had to be in the NICU for a week.
01:51:33.000 And there are people who are in the NICU for months.
01:51:36.000 There are babies born at 23 weeks or so.
01:51:40.000 And it's just totally harrowing.
01:51:44.000 But also just these incredibly compassionate, just amazing places where you just...
01:52:13.000 You are going to wash your hands in as complete a way as 21st century science understands how to do that.
01:52:20.000 And it's almost like the decontamination zone of Silkwood or the nuclear reactor.
01:52:29.000 Wow.
01:52:29.000 We should get hosed down.
01:52:30.000 Yeah.
01:52:31.000 But it's hand washing.
01:52:35.000 The fact that we can't even do that perfectly is pretty impressive.
01:52:39.000 Now, is it a fact that MRSA was created by medications, or is that a belief, or has that been proven, that it was created by a resistance to medications that got stronger?
01:52:50.000 Yeah, well, no, yes, it's a fact that...
01:52:54.000 All of these bugs are evolving, and just by dint of happenstance, they are producing changes in their genome that leave them no longer vulnerable to antibiotic X,
01:53:11.000 right?
01:53:12.000 Right.
01:53:13.000 So whether it's methicillin or any of its related antibiotics.
01:53:19.000 And so this is what antibiotic resistance is.
01:53:23.000 These bacteria, their genomes mutate.
01:53:28.000 And unfortunately, with bacteria, they also can swap genes laterally, across bacterial species.
01:53:37.000 So it's not like only their descendants in that line can inherit these genetic changes.
01:53:43.000 They can transfer genetic changes across bacteria.
01:53:47.000 So it just optimizes the process.
01:53:51.000 And again, this is all blind.
01:53:52.000 It's not like bacteria want to become drug resistant, but some percentage of them, in any given generation, will become drug resistant.
01:54:03.000 And then in the presence of the drug, they will be selected for.
01:54:08.000 If you keep bombarding people with penicillin, you will be selecting for the bacteria that aren't sensitive to penicillin in those people.
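The selection dynamic described here, rare random mutations plus repeated drug exposure enriching whatever happens to survive the drug, can be sketched as a toy simulation. The sketch below is only an illustration of that logic, not anything from the conversation; the population size, mutation rate, kill rate, and growth factor are made-up numbers chosen for readability.

import random

def simulate(generations=15, carrying_capacity=100_000,
             mutation_rate=1e-4, drug_kill_rate=0.99, growth=2.0):
    # Illustrative toy model: start with an entirely drug-susceptible population.
    susceptible, resistant = float(carrying_capacity), 0.0
    for gen in range(generations):
        # Rare, random mutations flip a few susceptible cells to resistant.
        mutants = sum(random.random() < mutation_rate for _ in range(int(susceptible)))
        susceptible -= mutants
        resistant += mutants
        # The antibiotic kills most susceptible cells and spares the resistant ones.
        susceptible *= (1.0 - drug_kill_rate)
        # Survivors regrow, capped at the carrying capacity.
        total = susceptible + resistant
        if total > 0:
            scale = min(growth, carrying_capacity / total)
            susceptible *= scale
            resistant *= scale
            print(f"generation {gen:2d}: resistant fraction = {resistant / (susceptible + resistant):.3f}")

simulate()

Run as-is, the resistant fraction jumps from roughly zero to nearly one within a few simulated generations, which is the point being made: the drug removes the resistant strain's competition and does the selecting.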
01:54:22.000 So yeah, the overuse of antibiotics, and the overuse of antibiotics in our food chain, is also part of this picture, right?
01:54:27.000 So it's the fact that we are...
01:54:29.000 I don't know what the percentage is, but it's more antibiotic use, certainly, in cattle and pigs than in people.
01:54:40.000 And the same evolutionary principles are happening there, too.
01:54:45.000 So you don't know what...
01:54:50.000 We're doing to ourselves.
01:54:51.000 I mean, it can't be good to be using antibiotics everywhere in agriculture and then kind of waiting to see what happens.
01:54:59.000 Well, we know for a fact that we get diseases.
01:55:02.000 I mean, swine flu, avian flu, like the zoonotic viruses.
01:55:05.000 Yeah, and a lot of these are coming out of these...
01:55:08.000 Facilities where they're processing cattle, processing pigs.
01:55:12.000 That's another element to it.
01:55:13.000 It's just that most infectious disease over the ages has been born of proximity to animals, and that's the result of agriculture.
01:55:25.000 So the fact that you have people in bird markets in China...
01:55:32.000 They're dealing with chickens and ducks endlessly in confinement.
01:55:39.000 And then you've got wild birds flying overhead, dropping their droppings into that space.
01:55:46.000 And you have viruses that jump species from birds to pigs and back again.
01:55:54.000 Some of this stuff only stays in those animals and doesn't become active in people.
01:56:01.000 But again, evolution is just this endless lottery wheel where you've just got change and change and change upon change.
01:56:10.000 And something...
01:56:34.000 Isn't airborne and is difficult to contract, right?
01:56:37.000 Well, then it's a fairly well-behaved, you know, it could be scary, but it's not going to become a global pandemic.
01:56:43.000 But then suddenly you get a mutation in that virus or that bacteria that allows it to be, you know, aspirated, become airborne in a cough and inhaled, and, well, then you have the possibility of a pandemic.
01:57:02.000 And also the time course of an illness is relevant.
01:57:07.000 So if you have something which kills you very quickly and horribly, well, then that's the kind of thing that is going to be harder to spread because people become suddenly so sick.
01:57:20.000 They're not getting on airplanes.
01:57:21.000 They're not going to conferences.
01:57:48.000 And again, these mutations are just happening spontaneously.
01:57:51.000 It really is a matter of good and bad luck.
01:57:58.000 They all have one function.
01:58:01.000 They kill things.
01:58:03.000 They devour.
01:58:04.000 They overcome.
01:58:05.000 They overcome bodies with their life form.
01:58:07.000 They spread.
01:58:08.000 Well, there are other functions.
01:58:12.000 I mean, obviously, this is all blind and there's no intention or purpose behind it, but there are viruses.
01:58:21.000 And other infectious diseases that produce functions which are behaviorally relevant.
01:58:30.000 I mean, so that there's a spread...
01:58:31.000 I forgot the...
01:58:33.000 Is it toxoplasmosis that makes...
01:58:39.000 Mice less fearful in the presence of cats?
01:58:42.000 Yeah.
01:58:42.000 So their behavioral changes, right?
01:58:45.000 It makes them attracted to urine.
01:58:46.000 Right.
01:58:46.000 They get erect around cat urine.
01:58:49.000 Yeah.
01:58:50.000 It leads them to their demise, so it spreads to cats.
01:58:52.000 Yeah.
01:58:53.000 And there are theses that there are various infectious diseases that change human behavior.
01:58:58.000 That depression is the result of infectious illness that we're not aware of.
01:59:10.000 Yeah, I mean, so there's a lot that could be going wrong with us that we haven't attributed to viruses and bacteria, which in fact is at bottom a matter of viruses and bacteria.
01:59:21.000 Actually, Alzheimer's, there was recently a report that suggested that Alzheimer's is the result of a brain's immune response to infection,
01:59:37.000 infectious illness.
01:59:38.000 I think it was bacterial.
01:59:42.000 This was just in the last week or so.
01:59:45.000 The plaques associated with Alzheimer's that you see throughout the brain might, in fact, be the remnants of an immune response to something having invaded across the blood-brain barrier.
02:00:04.000 So if Alzheimer's is the result of infectious disease, score that as a major problem that would be nice to solve with the right antibiotic regime.
02:00:18.000 Could you imagine?
02:00:19.000 I mean, that would have been fascinating if that existed during Reagan's time.
02:00:22.000 They could just clear him up.
02:00:24.000 Because you remember when someone publicly starts to go like that, and it's a guy like Ronald Reagan who is an actor and president, and you see him starting to lose his grip on his memory, and you hear all the reports about it.
02:00:40.000 That it's particularly disturbing because it's so publicly exhibited.
02:00:44.000 I mean, that's the head guy, you know, to think that that was just a disease.
02:00:48.000 That's crazy.
02:00:49.000 Yeah.
02:00:50.000 Yeah.
02:00:51.000 I don't remember when that became at all obvious.
02:00:55.000 I know people were trying to do a kind of retrospective analysis of it, but I don't remember when anyone started to talk about the possibility that he was not all there, and I don't remember it happening actually during his presidency,
02:01:13.000 but we were both young at that point.
02:01:16.000 I do remember comedians doing jokes about it.
02:01:20.000 That's like one of my main points of references.
02:01:22.000 They'd be like, Will!
02:01:23.000 They'd have this like weird sort of out-of-it grandpa type character that they would do.
02:01:29.000 And that was towards his second run, you know?
02:01:33.000 I don't know if that was years before they were referencing.
02:01:37.000 When was it all going bad for them?
02:01:42.000 Neurologists can spot neurological illness really well.
02:01:47.000 They're walking around seeing neurological illness all over the place.
02:01:51.000 I'm sure there were neurologists who were talking about him long before anyone else was in those terms.
02:01:59.000 Well, there was this old joke that Jimmy Tingle did.
02:02:02.000 Jimmy Tingle is a hilarious political comedian from Boston.
02:02:05.000 He had this joke about Ronald Reagan's trial where he couldn't remember if he sold arms to Iran.
02:02:11.000 He goes, Mr. President, next time you sell arms to Iran, jot it down.
02:02:18.000 Right.
02:02:19.000 Make a note!
02:02:20.000 Put it on the refrigerator!
02:02:23.000 But it was just, that was his excuse when he was in trial.
02:02:29.000 And everybody thought that this is bullshit.
02:02:31.000 This is his defense.
02:02:32.000 He doesn't remember.
02:02:33.000 And then it started coming out that he had, and then the conspiracy theory was, he was always fine.
02:02:37.000 He was like, what was that guy's name?
02:02:39.000 Vincent "The Chin" Gigante, who'd walk around in a bathrobe and pretend to be crazy.
02:02:43.000 Remember that guy, the mob guy?
02:02:45.000 No.
02:02:45.000 There's a famous mob guy who was running the mob, but pretended to be a crazy old man.
02:02:50.000 So he would walk around with people.
02:02:52.000 I forget how they busted him, but he had it nailed.
02:02:55.000 He'd walk around in a bathrobe and talk to himself, and he would put on an act.
02:02:58.000 Like, go out on the street and act like a crazy person, and then he would go on walks with, like, these capos and tell them, oh, kill this fucking guy and get me a million bucks and all that kind of crazy shit.
02:03:08.000 But all the while, he had an act.
02:03:09.000 Someone wore a wire, yeah.
02:03:10.000 Somehow or another they caught him.
02:03:12.000 I don't remember exactly how they caught him, but everyone knew, like they kind of knew he was doing that.
02:03:17.000 And so some people were thinking that's what Reagan was doing.
02:03:19.000 He was getting told, I don't remember what I did with Iran.
02:03:21.000 As a matter of fact, I don't remember shit.
02:03:23.000 I think I got a disease.
02:03:24.000 I can't remember anything.
02:03:25.000 And just started pretending and just was out of it.
02:03:29.000 Both could be true.
02:03:30.000 I mean, he clearly did have Alzheimer's in the end, but he could have also been lying.
02:03:34.000 What if he didn't?
02:03:35.000 What if it was a big conspiracy?
02:03:36.000 That would be a great movie.
02:03:38.000 Like, how good of an actor was Ronald Reagan?
02:03:40.000 He was such a good actor.
02:03:41.000 In the latter days of his years, he avoided interviews by pretending to have Alzheimer's.
02:03:45.000 He felt like that was the only way out.
02:03:47.000 So just him and Nancy, they prepped their lines.
02:03:50.000 And when, you know, you go out there to the public, he just started acting like he couldn't remember anything.
02:03:55.000 Yeah.
02:03:56.000 Alzheimer's, unfortunately, is going to be a bigger and bigger story.
02:03:59.000 The baby boomer moment is really coming, and it's just going to...
02:04:03.000 This is something we need a full court press on, too.
02:04:06.000 Yeah.
02:04:07.000 If you can find that that's actually a disease, and you can cure that disease, that's insane.
02:04:12.000 An infectious disease, yeah.
02:04:15.000 Have you ever seen the Sapolsky stuff on the Toxoplasma?
02:04:19.000 Robert Sapolsky, the guy from Stanford?
02:04:22.000 He's the guy that's one of the forefront researchers and one of the guys who's really vocal about it.
02:04:29.000 They were also talking about a direct relationship between motorcycle crashes and people testing positive for toxoplasmosis.
02:04:40.000 Oh, I didn't know that.
02:04:40.000 And they felt that it might have either hindered reaction time or loosened inhibitions, the same way it sort of triggers these mice to go near cats.
02:04:51.000 Yeah, that's what I was referencing before.
02:04:54.000 So, I think it's a lot of speculation, but there's a strong correlation, apparently, to motorcycle crashes.
02:05:02.000 Hmm.
02:05:03.000 I guess one of his professors had told him that when he was younger, and he had remembered it while they were dealing with some guy who came into the ER as the victim of a motorcycle crash.
02:05:13.000 But it kind of makes sense.
02:05:16.000 Yeah, well, you know, the underlying biology of, you know, risk avoidance and risk seeking, I mean, that's fairly well conserved in mammals.
02:05:30.000 It's not like, I mean, there's a reason why we do most of our research in things like mice.
02:05:37.000 I mean, it's not a...
02:05:39.000 A totally analogous brain, but mice are similar enough to us that doing research on dopamine receptors in mice allows us to extrapolate to humans.
02:05:51.000 And, yeah, so it's a...
02:05:56.000 It wouldn't be a surprise that it's having an effect on people.
02:06:00.000 Was it Steve...
02:06:01.000 I remember I was supposed to bring this up to you before, when you were talking about plants and plants having some sort of consciousness.
02:06:10.000 Was it Steven Pinker, see if you could find this, who gave a speech where he talked about how some plants, you can actually use sedatives on them, and that some of them actually produced certain neurochemicals, like dopamine, if that makes any sense.
02:06:27.000 I think it's Steven Pinker.
02:06:28.000 No, it didn't make any sense, right?
02:06:29.000 Well, it would surprise me if Pinker said anything about this, but I haven't heard anything about it.
02:06:34.000 I think it was a speech he was giving about something, but I think it's controversial.
02:06:38.000 It's one of the reasons why I wanted to bring it up to you.
02:06:40.000 Because I had never heard that before, that a plant could produce...
02:06:44.000 It was either dopamine or serotonin, and that somehow or another sedatives would be...
02:06:50.000 Would be effective on a plant.
02:06:53.000 That doesn't even make any sense.
02:06:55.000 I don't know what effective means.
02:06:56.000 I don't know either.
02:06:57.000 That's why I waited for you.
02:06:59.000 The plant is pretty well sedated as far as I'm concerned.
02:07:03.000 I remember as soon as I saw this, I'm like, I must get this to Sam Harris.
02:07:06.000 Please decipher.
02:07:08.000 But actually, almost everything I remember about, or that I learned about, plant biology, I've forgotten.
02:07:15.000 So I don't know.
02:07:16.000 But I'm pretty sure that the plant does not have a brain.
02:07:20.000 I see stuff with Steven Pinker and plants and consciousness, but nothing specifically with sedatives coming up.
02:07:27.000 Maybe you combined two different articles.
02:07:31.000 Maybe I did.
02:07:33.000 The Pinker one I know was...
02:07:34.000 What did Pinker say about plants and consciousness?
02:07:37.000 Oh, he was just talking about the surprising amount of calculations.
02:07:41.000 I think that was one of the, we'd have to read the entire piece, but I think it was just, they were just highlighting what we know so far.
02:07:47.000 I've reached the limits of a human bladder.
02:07:49.000 Oh, you, how dare you?
02:07:50.000 This is early.
02:07:52.000 You're too healthy, man.
02:07:53.000 Coffee and water.
02:07:54.000 You're right.
02:07:55.000 So what have you found on it, Jamie?
02:07:57.000 Nothing really.
02:07:58.000 To be honest with you, nothing specific about it.
02:08:00.000 But I did have something earlier that was kind of interesting.
02:08:03.000 I'll show it to you here.
02:08:04.000 When you guys were talking about AI, I pulled up something on Minority Report and it pulled me to this article, which Microsoft has an app that can...
02:08:12.000 It's actually developed by Hitachi.
02:08:14.000 It's called Predictive Crime Analytics.
02:08:19.000 They can predict crimes with up to 91% accuracy.
02:08:23.000 It's also already being used in Maryland and Pennsylvania as of 2013. They have crime prediction software that can find out if an inmate who's about to be released is going to commit another crime.
02:08:38.000 And so they're using that to follow them.
02:08:41.000 And there's some civil rights people that are saying, like, you can't Do that, obviously.
02:08:45.000 That's not good.
02:08:46.000 Whoa!
02:08:46.000 Hold on.
02:08:46.000 Scroll down just a little bit.
02:08:48.000 What is it?
02:08:48.000 Professor Berk says his algorithm could be used to help set bail amounts and also decide sentences in the future.
02:08:56.000 And then I got down to this part in Chicago, they're doing something, and they have, it's called a heat list in Chicago.
02:09:02.000 They have 400 residents that are listed as potential victims and subjects with the greatest propensity of violence, and they go and knock on their door and tell them that they're being watched.
02:09:11.000 And I, like, I've clicked on this thing, and it's an actual, like, Chicago directive from the police.org.
02:09:17.000 It's a pilot program about going and telling people that they're being watched for, someone might be after you, or some shit like that.
02:09:25.000 Yeah.
02:09:25.000 Whoa.
02:09:26.000 Really crazy.
02:09:27.000 I didn't want to interrupt you guys to tell you about this, but.
02:09:29.000 Custom notification under the Violence Reduction Initiative in partnership with the John Jay College of Criminal Justice Community Team, who will serve as outreach partners within the social service and community partners.
02:09:42.000 Show him this.
02:09:43.000 This is crazy.
02:09:44.000 While we were doing this, Microsoft has these programs.
02:09:51.000 Scroll up to the top, Jamie.
02:09:54.000 They revealed an application, they think, that can predict crimes in the future and decide if inmates get parole.
02:10:01.000 It uses all sorts of data, public data from Twitter, closed-circuit camera feeds, public Wi-Fi signals.
02:10:09.000 In LA, there's all sorts of microphones all over the place listening for gunshots and whatnot.
02:10:14.000 There's those new light poles I told you about that are adding 4G connectivity to the city.
02:10:18.000 Yeah.
02:10:19.000 That obviously can be added to this, probably, if they needed it.
02:10:21.000 I'm sure.
02:10:23.000 Yeah, that doesn't surprise me at all.
02:10:24.000 I think that is a—all of that's coming.
02:10:29.000 I mean, just look at just consumer behavior.
02:10:31.000 I mean, just look at how much someone can understand about you based on your zip code and your last three Netflix movies you watched to the end.
02:10:42.000 And just a few other data points, right?
02:10:44.000 And then we basically know, we can predict, you know, with some horrendous accuracy, what you're going to like, given the menu of options; we can advertise to you with immense precision.
02:10:59.000 Facebook, obviously, is at the forefront of this, but when you add everything else that's coming, the more intrusive technology of the sort we've been talking about, it's...
02:11:17.000 None of that's surprising, right?
02:11:19.000 Nothing surprising anymore.
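Mechanically, the prediction being described is close to nearest-neighbor guessing: find people whose handful of known data points resemble yours and assume you will like what they liked. The toy sketch below illustrates only that idea; the users, zip prefixes, titles, and labels are invented for the example and have no connection to any real recommender system.

from collections import Counter

# Invented example users: (zip_prefix, titles_finished) -> a known preference label.
known_users = [
    (("902", {"Chef's Table", "Planet Earth", "The Crown"}), "prestige"),
    (("100", {"Planet Earth", "The Crown", "Broadchurch"}), "prestige"),
    (("750", {"Duck Dynasty", "Top Gear", "Narcos"}), "action"),
    (("731", {"Top Gear", "Narcos", "Breaking Bad"}), "action"),
]

def similarity(a, b):
    # Crude similarity: shared zip prefix plus overlap in finished titles.
    (zip_a, titles_a), (zip_b, titles_b) = a, b
    return (zip_a == zip_b) + len(titles_a & titles_b)

def predict(new_user, k=3):
    # Guess a preference label from the k most similar known users.
    ranked = sorted(known_users, key=lambda u: similarity(new_user, u[0]), reverse=True)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

print(predict(("902", {"The Crown", "Chef's Table", "Narcos"})))  # prints "prestige"

Real systems use far more features and far better models, but the underlying point, that a few data points narrow a person down a lot, already shows up at this tiny scale.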
02:11:20.000 If you had read that 30 years ago, it would have looked like an article in The Onion, right?
02:11:26.000 You'd be like, what?
02:11:27.000 This is Judge Dredd.
02:11:28.000 This is crazy.
02:11:29.000 Well, here's The Onion version, which did just happen, right?
02:11:32.000 I think this was Microsoft, where they put out what they were calling an AI bot on Twitter.
02:11:38.000 There was an AI Twitter account that just became a Hitler-loving sex bot.
02:11:44.000 Because it was being tuned up by its interaction with people trolling it on Twitter.
02:11:49.000 Did you see the guy who got arrested for falling asleep inside his Tesla when it was on auto drive?
02:11:54.000 No, no, no.
02:11:56.000 He got busted!
02:11:57.000 I don't know if he got arrested.
02:11:58.000 He got busted, though.
02:11:59.000 They have camera photos of him rolling through an intersection completely unconscious on his way to work.
02:12:05.000 Look at this guy.
02:12:06.000 Look.
02:12:07.000 Look at this.
02:12:09.000 The car's driving him, and he's asleep while it's on autopilot.
02:12:13.000 This is insanity, man.
02:12:14.000 This is going to be a great moment to see, and I'm sure this is coming, where...
02:12:20.000 Once self-driving cars become just obviously the only truly safe alternative, then you'll be arrested for the opposite violation.
02:12:31.000 You'll be arrested if your hands are on the wheel, if you are driving a car that is ape-piloted as opposed to robot-driven.
02:12:41.000 And that's a, I mean, we've got 30,000 people dying every year based on ape driving.
02:12:49.000 Yeah.
02:12:50.000 So the moment we crack that, which we're very close to doing, it's just going to seem...
02:12:57.000 I mean, you've got your old muscle cars or whatever you're into.
02:13:01.000 It's going to be like...
02:13:03.000 That's going to be the equivalent of celebratory gunfire.
02:13:06.000 You're going to reserve the right to, at your wedding, shoot your AR-15 into the air and not care where the bullet lands.
02:13:14.000 I'm going to have to get a license to operate them.
02:13:17.000 You're going to have to move out of a big city.
02:13:18.000 Take them somewhere.
02:13:19.000 You've got to take them to the hills and unload them out of the back of a truck and drive them for a very short distance.
02:13:27.000 That's right.
02:13:27.000 Probably monitor how much...
02:13:28.000 Put a governor on them so they can only go five miles an hour.
02:13:31.000 Yeah.
02:13:33.000 Yeah, I mean, if you look at it in terms of safety, for sure.
02:13:36.000 And it seems to be the thing that's the recurring theme, right?
02:13:39.000 You give up your privacy for safety.
02:13:42.000 You give up your ability to drive a car wherever you want, whenever you want, however you want it.
02:13:48.000 You give that up, too.
02:13:49.000 You give that up for safety.
02:13:52.000 And people are really reluctant to give up fun shit, like lying and driving their car fast.
02:13:59.000 Like those two things.
02:14:00.000 People are going to have a hard time with you actually getting into their mind, seeing their actual mind, and being able to do that so we can know without a doubt whether or not someone's guilty or innocent.
02:14:10.000 But my question to you is, if you could get inside someone's mind, and it was like that really super...
02:14:18.000 Suggestive guy that you were talking about earlier, who just confessed to all the horrific demonic possession stuff and eating babies.
02:14:25.000 What if it's like getting to that guy's mind?
02:14:27.000 What if you can't tell?
02:14:29.000 Well, the case that worries me, and this is perhaps an inept segue to politics, but we're in people's minds.
02:14:45.000 You get someone talking long enough, you know their mind.
02:14:51.000 They can only conceal what they're about so well.
02:14:56.000 Maybe you can, because you're a super smart wizard type dude.
02:14:59.000 But Jamie, I don't know if he's a mind reader.
02:15:02.000 But the question is, will people care?
02:15:05.000 We don't even need a lie detector.
02:15:08.000 If you have someone who's openly lying, who just gets caught in lies again and again...
02:15:14.000 You can see it, too.
02:15:15.000 You feel it, right?
02:15:16.000 But people don't seem to care, right?
02:15:18.000 At least in a political process.
02:15:21.000 I'm thinking in this case of Trump, where you have someone who...
02:15:25.000 Is, in some cases, lying or just changing his mind in such an incoherent way that it's the functional equivalent of lying.
02:15:36.000 I mean, it's someone who becomes totally unpredictable.
02:15:38.000 He has a stance that is A on Tuesday and is B on Wednesday.
02:15:43.000 And when the discrepancy is pointed out, he tells you to go fuck yourself, right?
02:15:47.000 So there's just no...
02:15:49.000 There is no accountability to his own states of consciousness that he's going to be held to.
02:15:56.000 And the people who love him don't seem to care.
02:16:00.000 As far as I can tell, and I don't know that many of these people personally, but based on social media and the few articles where someone has explained why they love Trump, people view this as a kind of...
02:16:15.000 This dishonesty, what is, in my view, both dishonesty and a kind of theatrical hucksterism, a person who's pretending to be many things that he probably isn't, they view it as...
02:16:49.000 This is a new form of integrity.
02:16:52.000 It's amazing to watch.
02:16:55.000 I'm someone who, actually, I remember on my own podcast, I think I was talking to Paul Bloom, this Yale psychologist who's great, and we got into politics at least a year ago, but at that point I said, there's no way we're going to be talking about Trump in a year.
02:17:11.000 This is going to completely flame out.
02:17:13.000 This is a—I don't tend to make predictions, but this was a clear moment that I remember of making a prediction, which is now obviously false.
02:17:21.000 But I just couldn't imagine that this was—people were going to find this compelling enough for him to be on the cusp of getting elected.
02:17:31.000 It's— It is terrifying.
02:17:33.000 Have you talked this issue to death on your podcast?
02:17:36.000 I guess we kind of have.
02:17:38.000 I think this is how everybody feels.
02:17:40.000 Everybody feels like you're supposed to be with their person, whether it's Bernie or whether it's for Hillary or whether you're a Trump supporter, whatever it is.
02:17:48.000 You have to be all...
02:17:51.000 If you look at the choices that were given, none of these could really be described as ideal.
02:17:56.000 No, no.
02:17:57.000 Like, Hillary Clinton, you could want a woman in the White House, and you want to show everyone that a woman can do that job just as well as a man, and she's got the most experience, and she certainly has the most experience dealing with foreign governments, and she certainly has the most experience in politics.
02:18:11.000 But she's also involved in two criminal investigations.
02:18:15.000 She had a server in her bathroom.
02:18:18.000 Yeah, yeah.
02:18:19.000 There's all this squirrely stuff going on.
02:18:22.000 She's terrible in many ways.
02:18:24.000 She was anti-gay marriage until like 2013. And then wouldn't admit the change of mind either.
02:18:30.000 Yeah, she's a politician.
02:18:33.000 She's probably a brilliant woman, but she's also set in her ways and a politician.
02:18:37.000 And a politician to the end.
02:18:39.000 And part of being a politician is being a fucking huckster.
02:18:42.000 You gotta be able to get those people to...
02:18:44.000 Let's see your side.
02:18:46.000 And the way you do that is to talk like this.
02:18:50.000 You can't talk like a normal person.
02:18:52.000 She needs a speech coach.
02:18:53.000 They all do.
02:18:54.000 He's terrible, too.
02:18:55.000 Trump's not even good at it, and he kicks their asses.
02:18:57.000 No, but her voice, she has a kind of, to use the sexist trope, she has a shrill voice.
02:19:04.000 How dare you?
02:19:05.000 When you get her in front of a mic, and there's a crowd, and she thinks she's talking over the crowd, which she doesn't have to do because she's in front of a mic, the sound you get is just...
02:19:15.000 She's yelling when she doesn't need to yell.
02:19:18.000 Someone has to teach her how to dial that back.
02:19:21.000 What you just did is called mansplaining.
02:19:23.000 And it's a terrible thing.
02:19:25.000 I'm explaining to the men in her crew who should...
02:19:28.000 Talk some sense into her, but she's a bad candidate, right?
02:19:33.000 I have no doubt that she's very smart, and she's well-informed, and she's qualified, and she is absolutely who I will vote for, given the choices.
02:19:43.000 But...
02:19:44.000 I totally understand people's reservations with her.
02:19:47.000 She's a liar.
02:19:48.000 She's an opportunist.
02:19:49.000 She's just almost preternaturally inauthentic.
02:19:54.000 I mean, she's just like, she will just focus group every third sentence, and you feel that from her, right?
02:20:00.000 And...
02:20:01.000 And this is all true, and yet I also believe the people who say, I've never met her, but people who know her and have met her say that behind closed doors, one-on-one, she's incredibly impressive and great.
02:20:13.000 But that doesn't translate into her candidacy.
02:20:15.000 She probably thinks she has to do it old school.
02:20:18.000 You know?
02:20:18.000 I mean, the way she's doing it.
02:20:20.000 But she...
02:20:21.000 I mean, the thing is, she's...
02:20:22.000 When you look at the...
02:20:24.000 What worries me is...
02:20:26.000 I went out on Facebook the other day, and I've said very little about this, but I've made enough noises of the sort that I just made that people understand that I'm for Clinton, despite all my reservations about her.
02:20:40.000 And...
02:20:42.000 What I got on my own Facebook page, which you have to assume is filtered by the people who are following me on Facebook and already like me in some sense, was just like a thousand comments of pure pain.
02:20:54.000 I mean, no one loves Hillary.
02:20:55.000 No one said, oh, thank God someone smart is for Hillary.
02:20:59.000 It was all just Bernie people and Trump people flaming me for the most tepid possible endorsement of Clinton.
02:21:08.000 All I said was, Listen, I understand Clinton's a liar, and she's an opportunist, and I completely get your reservations about her, but at least she's a grown-up, right?
02:21:22.000 And she's going to be the candidate.
02:21:25.000 It's not going to be Sanders.
02:21:27.000 Now's the moment to put your political idealism behind you if you're a Sanders person.
02:21:32.000 And recognize that there is a vast difference between Clinton and Trump.
02:21:36.000 And no, she's not going to change the system, but she's also not going to run civilization off a cliff.
02:21:41.000 And I forget how I said it on Facebook, but it really was a lesser of two evils argument.
02:21:50.000 And it's amazing to see how energized and passionate people are in defense of Trump and Sanders.
02:21:57.000 And there's almost none of that for Clinton.
02:22:00.000 It's like people are just sheepishly saying, just divulging that they will vote for Clinton.
02:22:06.000 But they are, maybe somewhere that I haven't noticed, someone absolutely loves Clinton.
02:22:12.000 But it's just, she does not have her defenders the way these guys do.
02:22:15.000 Have you seen the Man Enough to Vote for Her campaign?
02:22:18.000 No.
02:22:19.000 It's with, like, hipster dudes with tattoos and beards that are going to vote for Hillary?
02:22:24.000 No.
02:22:24.000 I hope it's fake.
02:22:25.000 Because it's so brilliant, I hope it's not real.
02:22:29.000 Is it fake?
02:22:30.000 I think so.
02:22:30.000 Thank God.
02:22:31.000 Is it?
02:22:31.000 It's okay.
02:22:32.000 Thank God.
02:22:33.000 It's so good, though.
02:22:35.000 Because it's not that fake.
02:22:37.000 It's pretty good.
02:22:38.000 Like, you could almost see.
02:22:42.000 So, wait a minute.
02:22:44.000 I mean, this is not bad for her, right?
02:22:46.000 No, no, no.
02:22:48.000 It's not bad for her.
02:22:48.000 It's just funny that someone would, like, make a joke, political ad, you know, that you have to be man enough to vote for Hillary.
02:22:57.000 Like, there's guys out there that would buy that.
02:22:59.000 They would be like, I'm man enough, bro.
02:23:02.000 They'd do it.
02:23:04.000 It's a scary time because it doesn't seem like anybody that you would want to be president wants to be president.
02:23:10.000 And so we're left with, all right, what do you pick?
02:23:13.000 It's like as if we're going to play the Super Bowl with three of the shittiest teams we could find.
02:23:18.000 We're just going to go get some drunk high school kids.
02:23:20.000 We're going to get some inmates with club feet.
02:23:24.000 We're going to have the worst game ever.
02:23:25.000 And that's what this game is.
02:23:27.000 This is not a good game.
02:23:28.000 This is not a game where you've got like a John F. Kennedy versus a Lyndon Johnson.
02:23:34.000 It's not like powerful characters.
02:23:39.000 Trump, I guess, is a really powerful character, but in more ways like a showman character.
02:23:46.000 What he's doing is he's putting on a great show, and he's going to win, probably, because he's putting on such a great show, and people like a great show.
02:23:56.000 I do think I'm now among the people who think something new, we're witnessing something new with Trump.
02:24:04.000 It's not just the same old thing where the process is so onerous that it's selecting for the kind of narcissist or thick-skinned person who is willing to submit to the process, and then many,
02:24:20.000 most of the good people just aren't going to put up with this.
02:24:23.000 I mean, yes, there's that too, but There's something...
02:24:28.000 It's a moment among the electorate where...
02:24:35.000 There's enough of an anti-establishment...
02:24:40.000 Mood and vote now.
02:24:42.000 This is happening with Sanders, too, where people just want to jam a stick in the wheel of the system just to see what happens.
02:24:52.000 The main gripe against Hillary, really, is that she's politics as usual.
02:24:59.000 She's not going to change the system.
02:25:01.000 People want to change the system, but they're not really thinking about the implications of radically changing the system.
02:25:07.000 In the case of Trump, here is someone who is advertising his lack of qualifications for the office in every way that he can.
02:25:23.000 I'm not even bothered by his racism or his misogyny or his demagoguery or his bullying.
02:25:31.000 All of that, I'm willing to guess...
02:25:42.000 I don't know why I would think that's plausible, but I have a hunch that he's far more liberal than he seems, and is just pandering.
02:25:55.000 But the thing that...
02:25:59.000 What can't be true is that he's actually brilliant and well-informed about all the issues and is saying the things he's saying.
02:26:09.000 He's not pretending to be as uninformed and as incoherent and as irresponsible as he's seeming.
02:26:17.000 Because you wouldn't withhold information.
02:26:19.000 It would make you look like a better leader.
02:26:21.000 Well, it's just the vacuousness of his speech.
02:26:26.000 He'll say the same thing three times in a row, and it was meaningless the first time.
02:26:32.000 He'll say, it's going to be amazing.
02:26:34.000 It's going to be very, very amazing.
02:26:36.000 Trust me, it's going to be so amazing.
02:26:37.000 And he does this with everything.
02:26:39.000 If you look at the transcripts of his speeches, and the fact that he can't...
02:26:44.000 He has never, so far as I've...
02:26:48.000 He has never once strung together a string of sentences that was even interesting.
02:27:00.000 There's never a moment where I say, oh, this guy is smarter and better informed than I realized.
02:27:07.000 That moment never comes.
02:27:08.000 I keep expecting to see that happen.
02:27:12.000 And it's a little bit like...
02:27:14.000 I mean, I have this image of...
02:27:15.000 Like, imagine you have an urn, right?
02:27:17.000 And you just keep pulling things out of it.
02:27:19.000 And all you pull out of it is junk, right?
02:27:21.000 Like, you pull, you know, chicken bones and broken marbles and gum.
02:27:25.000 And it's still possible that if you root around in that urn long enough, you're going to find the Hope Diamond.
02:27:32.000 I mean, in each round...
02:27:34.000 That you pull something out, that really has no logical implication for the next thing you might pull out of the urn.
02:27:40.000 But minds aren't like that.
02:27:42.000 When I see what this guy says, he does not say anything that a well-informed, intelligent person would say.
02:27:49.000 And ideas are connected, right?
02:27:54.000 You can't fake this stuff.
02:27:56.000 You can't fake being this uninformed, and you can't fake being really well-informed.
02:28:03.000 Look at one policy that he wants.
02:28:06.000 The rounding up of illegal aliens.
02:28:10.000 Round up 11 million illegal aliens.
02:28:12.000 Now, this gets stated as, yeah, we're going to round them up and send them back to Mexico.
02:28:18.000 And what worries me is no one seems to care that if you just look at the implications of doing this, this one policy claim alone is so impractical and unethical.
02:28:32.000 It's just...
02:28:34.000 What are we talking about here?
02:28:35.000 Your gardener, your housekeeper, the person who works at the car wash, the person who picks the vegetables that you buy in the market is going to get a knock on the door in the middle of the night by the Gestapo and get sent back to...
02:28:50.000 The vast majority of these people are law-abiding people who are just working at jobs that Americans, by and large, don't want to do.
02:29:16.000 Even held in isolation from all of the other things he said, the crazy things like climate change is a hoax concocted by the Chinese to destroy our manufacturing base and the fact that he likes Putin.
02:29:28.000 I mean, everything else he said, right?
02:29:29.000 This one policy claim alone should be enough to disqualify a person's candidacy.
02:29:35.000 It's so crazy the moment you look at it.
02:29:39.000 And yet no one seems to care.
02:29:41.000 In fact, it's just more energizing to the people who already like him.
02:29:46.000 I know that he said that he wanted to build a wall, but I didn't know that he said that he wanted to get rid of the illegal aliens.
02:29:51.000 Round them up.
02:29:52.000 And round them up.
02:29:53.000 And do what with them.
02:29:55.000 Send them back to their country.
02:29:56.000 Oh, that is so crazy.
02:29:58.000 That's such a crazy idea, and it's so brutal.
02:30:00.000 The idea that, I mean...
02:30:02.000 It's a subhuman thing.
02:30:04.000 The only reason why people would come to America is because they felt like it would make their life better.
02:30:09.000 So people take a big risk.
02:30:10.000 There's not an easy way to do it if you're poor, you don't have any qualifications for any unusual job, and you're trying to get across from Mexico.
02:30:19.000 But everybody who does it does it because they want to improve their life.
02:30:22.000 And the idea that one group of people shouldn't be able to do it, and another group should, just because they were born on the right side of some country, some...
02:30:28.000 Strange line that is only a couple hundred years old.
02:30:32.000 But actually, I'll go further in meeting him in the middle.
02:30:36.000 So I think we should be able to defend our borders, right?
02:30:40.000 I don't have a good argument for having a porous border that we can't figure out how to defend and we don't know who's coming into the country.
02:30:48.000 I think building the wall is almost certainly a stupid idea among his many stupid ideas, but I think it would be great to know who's coming in the country and have a purely legal process by which that happened.
02:31:02.000 Ultimately, that's got to be the goal, right?
02:31:04.000 And we're imperfectly doing that.
02:31:10.000 So I don't have an argument for open borders or porous borders, but the question is, what do you do with 11 or 12 million people who are already here doing jobs we want them to do that help our society?
02:31:23.000 And the vast majority of them are law-abiding people who, as you say, are just trying to have better lives.
02:31:30.000 The idea that you're going to break up families and send people back by the millions and the idea that you're going to devote your law enforcement resources to doing this when you have real terrorism and real crime to deal with is just pure insanity and also totally unethical.
02:31:50.000 And yet he doesn't get any points docked for this aspiration.
02:31:56.000 It's one of the things around which people are rallying.
02:32:00.000 But the climate change thing is also insane and dangerous.
02:32:03.000 Well he was a birther.
02:32:05.000 Right, yeah, right.
02:32:06.000 He was one of the original birthers.
02:32:07.000 He was saying that Obama's birth certificate was bullshit.
02:32:11.000 He was born in Kenya, right?
02:32:12.000 Wasn't he one of those guys?
02:32:14.000 Oh, yeah.
02:32:14.000 He was self-funding that for a while.
02:32:18.000 I would love it if he got into office and just said, listen, folks, I am nothing like this person I pretended to be to win the presidency.
02:32:26.000 I just wanted to show you that you could be manipulated and get it together.
02:32:30.000 Yeah.
02:32:32.000 I'm going to hire some people who actually know how to run things.
02:32:37.000 The smart people who are voting for him think, and this is, I think, a crazy position, but they think that...
02:32:46.000 He is just pandering to the idiots who he needs to pander to to get into office.
02:32:53.000 So he's not disavowing the white supremacist vote with the alacrity that you would if you were a decent human being and you found out that David Duke supported you.
02:33:05.000 Because he needs those votes and he knows that most of the people in his base aren't going to care and he can just kind of move on in the news cycle.
02:33:16.000 And he's doing this on all these issues where smart people see that he looks like a buffoon and the people who don't like him are treating him as a comic figure who...
02:33:31.000 He can't really believe that stuff.
02:33:33.000 He's too sophisticated to really believe that stuff, so he's just pandering.
02:33:38.000 One is that people aren't seeing, if that's true, just how unethical and weird that is.
02:33:44.000 The guy has no compunction about lying and demonizing people.
02:33:48.000 Let's say he thinks that Clinton really isn't guilty, that Bill Clinton isn't really guilty of a rape, right?
02:33:57.000 And now he's calling him a rapist, right?
02:33:58.000 Now at the time, he was saying he wasn't a rapist and he's just being defamed and this is outrageous.
02:34:03.000 He was taking the side of a friend who he invited to his wedding.
02:34:07.000 But now he's calling him a rapist, right?
02:34:11.000 A sexual predator who harmed women's rights more than anyone.
02:34:15.000 So which is true, right?
02:34:17.000 So there's no version of the truth here that makes Trump look at all acceptable as a person.
02:34:24.000 It's like either he knew he was a rapist and was defending him because he was just cozying up to power at that point, right?
02:34:31.000 Didn't care that he's a rapist.
02:34:33.000 Or he's still the guy who thinks he wasn't a rapist, but now he's calling him one for purely opportunistic reasons.
02:35:00.000 But I think people think that he's got to be much more sophisticated than he is, and that if he got into office, he would just be a totally sober and presidential person.
02:35:13.000 There's just no reason to believe that.
02:35:15.000 I mean, if he thinks climate change is a hoax, and that we should pull out of the Paris Accords, and we should ramp up coal production, and we're going to bring back the coal jobs, I mean, this is what he's saying, right?
02:35:26.000 There's no reason to think he doesn't believe this at this point.
02:35:31.000 It is a disastrous thing for a president to think.
02:35:36.000 The only fascinating versions of this that I've been hearing from people that I respect are the idea that he is like...
02:35:48.000 The political version of the asteroid that killed the dinosaurs.
02:35:50.000 He's going to come down and smash it, and it's going to be so chaotic that they're going to be forced to reform the system, and people are going to respond in turn.
02:35:59.000 The way people are responding against factory farming and more people are going vegan, that kind of a thing.
02:36:03.000 They're going to see it, and they're going to respond in turn.
02:36:06.000 That is such a...
02:36:07.000 So he's going to toss the apple cart up in the air.
02:36:10.000 He's just going to fuck this whole goofy system up and then we'll be able to rebuild after Trump has dismantled all the different special interest groups and lobbyists and all the people that we really would like to get out of the system.
02:36:21.000 We really don't like the fact that there's such insane amounts of influence that big corporations and lobbyists have had on the way laws get passed.
02:36:31.000 This might be the way to do it.
02:36:32.000 You have some wild man.
02:36:34.000 Everyone's fired!
02:36:35.000 You're fired!
02:36:36.000 You're fired, Jetson!
02:36:37.000 It's like a character!
02:36:39.000 Like, he's coming in, his hair's plastic, he's all fired up.
02:36:42.000 He's a billionaire, made all his own money, sort of.
02:36:44.000 Dad gave him some money, but he turned it into a lot of money.
02:36:47.000 Point being, he doesn't need anybody's money.
02:36:49.000 The truth is, he's probably lying about the amount of money he has, too.
02:36:51.000 He's a baller, for sure, though, right?
02:36:54.000 At the very least.
02:36:55.000 He's gotta be worth some cash.
02:36:57.000 Yeah.
02:36:58.000 I mean, there could be a big difference between what he's claiming and what is, in fact, true.
02:37:01.000 But he's a...
02:37:03.000 I mean, there are many pieces here.
02:37:05.000 I mean, people assume that because he's a successful businessman, he must understand the economy, right?
02:37:10.000 Right.
02:37:10.000 Which there's no necessary connection there, right?
02:37:13.000 There's a lot of rich people who are totally confused about economics.
02:37:18.000 And, you know, most economists don't have a lot of money.
02:37:21.000 So there's no real connection there.
02:37:23.000 But the...
02:37:28.000 So what you're describing is a kind of just random...
02:37:31.000 Let's just smash the window and then see what happens, right?
02:37:37.000 We're going to light a fire to this place and see what happens.
02:37:40.000 And that's...
02:37:45.000 Almost any process by which you would change the system is more intelligent than that.
02:37:50.000 And it's also not valuing how much harm one bad president could do.
02:37:57.000 I haven't tested this, but I'm imagining that even Trump supporters would answer this question the way I would hope, which is, if I had a crystal ball, and it can't tell you who's going to be president, but it tells you how it works out for the next president.
02:38:15.000 If I look in this crystal ball and it says the next president of the United States is a disaster.
02:38:21.000 It's like the worst president we've ever had.
02:38:24.000 Just think of failures of governance and the toxic influence of narcissism and hubris that comes along just like once every thousand years.
02:38:37.000 Just a disaster.
02:38:40.000 I think you know, even if you're a Trump supporter, which candidate that was.
02:38:45.000 Only Trump is likely to screw things up that badly.
02:38:50.000 Clinton is going to be almost perfectly predictable.
02:38:55.000 She's going to be a politician.
02:38:56.000 She's going to be basically centrist on foreign policy and domestic policy.
02:39:02.000 She's going to be liberal on social issues.
02:39:06.000 She is not going to...
02:39:09.000 To try to dismantle NATO and get into a war with North Korea or get into an alliance with Putin.
02:39:19.000 She's not going to do something insane.
02:39:23.000 An alliance with Putin?
02:39:26.000 He's said basically only favorable things about Putin.
02:39:29.000 They're homies.
02:39:30.000 They're tight.
02:39:33.000 Hopefully we'll see pictures with both of them on horseback, shirtless.
02:39:38.000 I think Donald is probably going to keep his shirt on.
02:39:40.000 He'll probably keep his shirt on.
02:39:41.000 I don't see him as being the shirtless kind of guy.
02:39:46.000 Yeah.
02:39:48.000 When you just look at the landscape between Bernie and Hillary and him, to me it looks like the last gasps of a dying system.
02:40:00.000 Okay, but that's scary, right?
02:40:02.000 Representative government system.
02:40:04.000 Yeah, it is scary.
02:40:04.000 A lot of people are saying that Things like that, but they're not Hearing just how nihilistic that is, if true.
02:40:17.000 There's so much stuff we have to get right.
02:40:19.000 And the only tool to get it right is having your mind actually understand what's going on in the world and how to manipulate the world in the direction you want it to go.
02:40:34.000 So you have to understand whether or not climate change is true. Your beliefs about it have to be representative of that truth.
02:40:43.000 Let's say I'm mistaken and there is no human cause.
02:40:49.000 Climate change is not a problem.
02:40:51.000 And every moment spent thinking about it, worrying about it, correcting for it is just a waste of time that's just throwing out the wealth of the world.
02:41:02.000 That would be a terrible problem.
02:41:04.000 So it really matters who's right about that.
02:41:07.000 And the fact that we have a president or a candidate who is coming in saying this is all bullshit, in defiance of all of the science... and it's the same on every other point.
02:41:24.000 He doesn't know anything about...
02:41:26.000 I guarantee you he doesn't know the difference between Sunni and Shia Islam or which countries are Sunni predominantly and which are Shia predominantly.
02:41:35.000 And I mean, I'm sure he's going to do...
02:41:37.000 I don't know when he's going to cram for this final exam.
02:41:39.000 I'm sure before one of those debates he's going to get...
02:41:41.000 Someone's going to sit down with him and give him some bullet points he's got to have in his head.
02:41:45.000 But...
02:41:47.000 I don't know.
02:42:07.000 Character flaws of this guy who is just obviously going to...
02:42:11.000 I mean, he's...
02:42:13.000 But are we attached too much to this idea of one person being the figurehead?
02:42:18.000 It's not a figurehead.
02:42:19.000 Someone has to...
02:42:19.000 Someone's the decider.
02:42:21.000 If we all woke up today, if everybody woke up and there was just no government, there was nothing...
02:42:26.000 We're all just, what happened?
02:42:27.000 I don't know, but we've got to figure out how to run this thing.
02:42:30.000 We had no previous understanding of government.
02:42:32.000 Would you think anybody would say, we need one dude to just run this whole giant continent filled with 300 million people?
02:42:38.000 Most likely, if we woke up and we had technology like we have today, and we had the ability to communicate like we have today with social media and whatever, we would probably say we need to, like, figure this out amongst each other and find the people that are the most qualified for each one of these positions and start running our government that way.
02:42:56.000 Well, that's what we're attempting to do, but it's just...
02:42:59.000 And I totally agree with you that it is astonishing that out of a nation of 300 million people, these are the choices.
02:43:06.000 You would think, starting from your zero set point of just, you know, now we're going to reboot civilization.
02:43:16.000 You would think that if you had this kind of process, each candidate would be more impressive than the next.
02:43:23.000 I mean, you'd be like, I can't believe...
02:43:25.000 Each person who came to the podium would be so impressive.
02:43:29.000 Like LeBron James.
02:43:29.000 Oh, yeah.
02:43:30.000 It'd be like the dunk contest for the NBA. It'd be like, oh, my God.
02:43:34.000 Just when you thought you saw the best dunk in your life, the next guy comes along.
02:43:38.000 Exactly.
02:43:39.000 And it would be that on every topic.
02:43:43.000 Right.
02:43:43.000 Right?
02:43:43.000 It'd be like, you'd be talking about the science of climate change, you'd be talking about the actual dynamics of the war on terror.
02:43:51.000 So topics that seem to have no relationship, where you'd have to be, you'd be amazed that anyone could be an expert in all of them, you would find someone who is an expert, a functional expert in all of them.
02:44:03.000 A Jeopardy winner, dude.
02:44:04.000 Yeah, but someone who's also ethically wise, who wasn't obviously an asshole, and who had a mature relationship to changing his or her mind,
02:44:23.000 right?
02:44:23.000 So this whole bit about flip-flopping and not...
02:44:29.000 Someone who could honestly represent changes of mind across a political career, right?
02:44:36.000 It's nowhere written that it's a good thing to believe today what you believed 20 years ago.
02:44:41.000 In fact, if you do that on every topic, it means basically you haven't been in dialogue with the world.
02:44:46.000 But there's something...
02:44:48.000 It's so taboo to change your mind that either you have to lie about it or you have to pretend it was always that way or it's just a...
02:44:58.000 I mean, the system is broken in that respect, but given the choices, you know, when you have a choice between someone who has, for all her flaws...
02:45:11.000 Been in the game for long enough to be really well informed and capable of compromise, and capable of not just breaking things.
02:45:26.000 Yeah.
02:45:47.000 It's an amazing situation.
02:45:49.000 Well, he's a product of attention because they realize that there's a heated race, right?
02:45:54.000 The heated race.
02:45:55.000 This guy was really famous.
02:45:56.000 And in a heated race, this guy would say some crazy stuff.
02:45:59.000 And so they would tune into him.
02:46:00.000 So everybody had to tune into him.
02:46:02.000 So because of him saying crazy stuff, he accelerated the amount they were talking about him.
02:46:06.000 So they were constantly talking about him and barely talking about other people.
02:46:10.000 But he's created a wormhole in our political process now where there's nothing so crazy that could disqualify him among the people who like him now.
02:46:18.000 So he can just keep dropping nuclear bombs of craziness that the press can't ignore, and every time they think, okay, this is the crazy thing he said that's going to harm his candidacy, so let's shine a light on it, it just helps him.
02:46:34.000 He could get on Twitter right now and say, you know who I'd like to fuck?
02:46:38.000 I'd like to fuck Nicki Minaj.
02:46:43.000 And it would work for him.
02:46:45.000 It would work for him.
02:46:46.000 You would see a tweet storm of a billion people who say, I'd like to fuck Nicki Minaj too.
02:46:51.000 Go get her.
02:46:52.000 And it's insanity.
02:46:55.000 That's where we are.
02:46:58.000 But in a sense, we do admit that this is a fucked up system.
02:47:02.000 It's not ideal.
02:47:03.000 It should definitely be reworked.
02:47:05.000 And it's so hard to rework.
02:47:06.000 Wouldn't the best way to rework it?
02:47:08.000 A Trump asteroid just slams right into the White House.
02:47:12.000 Boom!
02:47:13.000 Blows the whole thing sky high.
02:47:15.000 Who knows what terrible things have to happen, but...
02:47:19.000 Maybe that would be enough.
02:47:20.000 The thing is, those asteroids are coming anyway.
02:47:23.000 So when you look at 9-11 was an asteroid, right?
02:47:27.000 Or a superbug that becomes a pandemic.
02:47:32.000 These are things that are coming, and we need people who are in touch with reality to deal with them.
02:47:42.000 So the moment someone...
02:47:44.000 Advertises not only their ignorance, but the fact that they don't care that they're ignorant.
02:47:50.000 And they do this again and again.
02:47:52.000 They keep doubling down.
02:47:53.000 If you put that person at the helm, what you have done is basically put chaos at the helm.
02:48:02.000 This person's going to believe whatever he believes, regardless of the information coming in and regardless of the consequences.
02:48:11.000 It's worse than having no one in charge.
02:48:13.000 Because you've put chaos.
02:48:35.000 That's what you're doing if you're hiring someone like this. I mean, yeah, in the best case, what you stated earlier would in fact be true, which is he'll get into the Oval Office, and even he will be scared of the prospect that he's now running the better part of human civilization, and he will hire the best people, or some semblance of the best people he can get access to, and say,
02:49:03.000 tell me how to not screw this up.
02:49:07.000 And then it'll essentially be business as usual, right?
02:49:11.000 Insofar as you've hired the best people, they will be people who are...
02:49:14.000 deeply in this game already, right?
02:49:17.000 You know, he'll defer to the generals when it comes time to make war.
02:49:20.000 Being really pragmatic about how they pick politicians and how they push certain people and decide not to push others, do you think that something like Trump completely changes how they move forward now?
02:49:32.000 They realize that this can happen?
02:49:34.000 Like, now that you see that people are so goofy, we're so WWE'd out that you can get this guy, you know?
02:49:40.000 I mean, this is where we're at.
02:49:42.000 We've got a guy...
02:49:44.000 I told them the wall just got ten foot higher!
02:49:46.000 Yeah!
02:49:47.000 Everybody gets crazy like how could you say that?
02:49:50.000 Once they realize that that's possible, how long before you get like some motivational speaker type dudes?
02:49:56.000 How long before they start jumping in there?
02:49:58.000 Tony Robbins for president.
02:49:59.000 That's what we have.
02:50:00.000 This is way worse than Tony Robbins for president.
02:50:04.000 Oh yeah, way, way worse.
02:50:04.000 Tony Robbins is a positive dude, but I'd vote for him.
02:50:08.000 Very positive guy.
02:50:09.000 That's not what I mean.
02:50:11.000 But that sort of ability to excite people.
02:50:15.000 We're going to get one of those motivational speaker dudes, one of those guys who wears a lot of yoga pants, and he's going to be the next president.
02:50:21.000 He's going to get us in shape.
02:50:23.000 It's going to be a reality show.
02:50:26.000 America's a reality show.
02:50:27.000 We're there.
02:50:28.000 We're awesome.
02:50:29.000 We're the best.
02:50:32.000 Did you see this press conference he held?
02:50:34.000 I think it was yesterday.
02:50:35.000 No, I did not.
02:50:36.000 It was a very funny moment where there was one journalist.
02:50:40.000 I didn't recognize who it was.
02:50:44.000 So Trump was being very combative with the press pool, and he was basically shouting them down, not answering any of the questions.
02:50:52.000 And one journalist, just aghast, said, is this what it's going to be like when you're president?
02:50:57.000 Is this what it's going to be like to be in the White House press corps and deal with you?
02:51:03.000 And he said, yes, this is exactly what it's going to be like.
02:51:07.000 But you could just see that the journalists, they turn the camera on in the room of journalists, and they are astonished by what is happening here.
02:51:19.000 They don't know.
02:51:19.000 They're participating in this process.
02:51:21.000 In some sense, they have created this process.
02:51:25.000 But...
02:51:27.000 No, not all of you, just many of you.
02:51:29.000 All right, fine.
02:51:30.000 Enough of us.
02:51:31.000 Is this what it's going to be like covering you if you're a president?
02:51:35.000 Yeah, it is.
02:51:36.000 Let me tell you something.
02:51:37.000 We're going to have this kind of competition in the press room?
02:51:38.000 Okay, yeah, it is going to be like this, David.
02:51:41.000 If the press writes false stories, like they did with this.
02:51:44.000 Because, you know, half of you are amazed that I raised all of this money.
02:51:48.000 If the press writes false stories like they did where I wanted to keep a low profile.
02:51:53.000 I didn't want the credit for raising all this money for the vets.
02:51:55.000 I wasn't looking for the credit.
02:51:57.000 And by the way, more money is coming in.
02:51:59.000 I wasn't looking for the credit.
02:52:01.000 But I had no choice but to do this because the press was saying I didn't raise any money for them.
02:52:05.000 Not only did I raise it, much of it was given a long time ago.
02:52:08.000 And there is a vetting process, and I think you understand that.
02:52:11.000 But when I raise almost six million dollars, and probably in the end we'll raise more than six because more is going to come in and is coming in.
02:52:18.000 But when I raise 5.6 million as of today, more is coming in.
02:52:22.000 And this is going to phenomenal groups, and I have many of these people vetting the people that are getting the money and working hard...
02:52:30.000 You played the moment I was referring to.
02:52:34.000 I mean, so here is a case where he's probably almost certainly lying about his history of giving to Veterans Affairs.
02:52:41.000 And he gave money very recently after people started fishing around to see if he actually had given the money that he claimed to have given to veterans.
02:52:51.000 But, I mean, this is...
02:52:55.000 What's difficult about this is that yes, the press is highly imperfect and also partisan and there are false stories and there are exaggerations and they screw people over, yes.
02:53:10.000 And there are reasons to not trust the press from time to time.
02:53:20.000 But in this case, there is no amount of fact-checking and disconfirmation of his statements that forces him to ever acknowledge anything that he's done wrong,
02:53:36.000 and he pays no price for that lack of acknowledgement among the people who like him.
02:53:42.000 And so the press is powerless.
02:53:46.000 But the net result of a press conference like this, if you're a Trump follower, is...
02:53:52.000 He just showed how biased and petty the press pool is.
02:53:59.000 And the press do need to just be beaten up by a strong man who's not going to stand for their bullshit.
02:54:04.000 But it's a...
02:54:09.000 It's unbecoming, at the very least.
02:54:11.000 That kind of communication.
02:54:12.000 It's kind of unbecoming of the person that we expect.
02:54:15.000 And that's pretty mild.
02:54:16.000 No, that's mild compared to his parodying of the disabled reporter.
02:54:22.000 You saw that bit where he did a cerebral palsy imitation at one of his speeches.
02:54:30.000 No.
02:54:31.000 He was interviewed by, I don't happen to know who the reporter was.
02:54:35.000 And it was about a specific person?
02:54:36.000 He was making fun of somebody with cerebral palsy.
02:54:39.000 I mean, he's done so many things that you would think would be fundamentally canceling of a person's political aspirations.
02:54:49.000 Like, you caught Marco Rubio pretending, just goofing on someone's cerebral palsy?
02:54:56.000 At one of his campaign events?
02:54:58.000 Oh, it's so true.
02:54:59.000 That would just be the end, right?
02:55:00.000 Well, one of the things that sunk Ted Cruz was just that video of him with his family, the outtakes, where they were like...
02:55:06.000 Oh, I didn't see that.
02:55:07.000 Oh, you didn't see it?
02:55:08.000 No.
02:55:08.000 It's a gem.
02:55:09.000 It's spectacular.
02:55:10.000 It's him with his mom, and he's like, my mom prays for me, often for hours every day, and she's like...
02:55:17.000 She looks at him like, what the fuck are you talking about?
02:55:20.000 Hours every day.
02:55:21.000 No, I don't.
02:55:22.000 You can't even say that.
02:55:23.000 And so they have all these really awkward moments like, okay, I'm going to go in for a hug.
02:55:27.000 I'm going to say I love you.
02:55:29.000 It's all like weirdly mapped out.
02:55:31.000 And that got online and people were like, oh, Christ.
02:55:35.000 Okay.
02:55:35.000 See, this is a bad game.
02:55:38.000 Like, you're not even good at this game.
02:55:40.000 You're terrible at this game.
02:55:42.000 Yeah, he was objectively terrible at the game.
02:55:45.000 Well, that's Trump's competition.
02:55:48.000 Well, the thing about Cruz that never even got out, which was the reason to be scared about a Cruz presidency, was his level of religious craziness.
02:55:56.000 I mean, no one was even pushing on that because there was just enough to push on before he even got to that door.
02:56:00.000 Yeah, you have to hold on to those weapons.
02:56:02.000 Yeah.
02:56:03.000 But I mean, had Cruz been the nominee, it would have been all about religion.
02:56:09.000 What's odd is that that's not a handicap in 2016, that you can have that and people consider it an asset.
02:56:16.000 Well, the one thing that's surprising and actually hopeful in Trump's candidacy...
02:56:23.000 is the fact that he has dissected out the religious, social, conservative component of the Republican Party.
02:56:33.000 Evangelicals, for the most part, were going for Trump over Cruz when it was pretty clear to them that Trump was just pretending to be religious.
02:56:42.000 So Trump gave one speech at, I think, Liberty University where he spoke.
02:56:47.000 He said, you know, Corinthians 2, and that's not the way any Bible reader would speak about 2 Corinthians.
02:56:56.000 How would you say it?
02:56:57.000 2 Corinthians.
02:56:58.000 That's how you would say it?
02:56:59.000 Yeah.
02:56:59.000 Yeah, and so he said, Corinthians 2, as though this is something he just opened every night before he went to sleep.
02:57:07.000 And so it was clear to them that he is just miming the language, you know, or...
02:57:19.000 It's impersonating a person of faith, but they don't care, really, as long as he does it.
02:57:27.000 And that is, if you're going to look for a silver lining to this, it shows that it's not—they just want a space where their religious convictions are not under attack, and they don't really care that the person in charge shares them.
02:57:45.000 If you pretend to share them, that's good enough.
02:57:48.000 And that's better than actually caring that this person really believes in the rapture or anything else that is quite obviously crazy.
02:57:59.000 So I don't think any Christian who's voting for Trump thinks...
02:58:03.000 I mean, they'll say...
02:58:05.000 I'm not going to judge another man's faith.
02:58:07.000 Who am I to say what's really in his heart?
02:58:10.000 They'll say that, but if you've been paying attention to who he's been, and if you just look at how he talks about these things, I don't think he's fooling any Christian.
02:58:24.000 So I think they're willing to vote for someone.
02:58:26.000 Now, for other reasons that are fairly depressing in their own right, they're willing to vote for someone who doesn't really play the game the way they do.
02:58:40.000 You have to believe in God to be president in 2016, right?
02:58:44.000 Wouldn't you say that that's...
02:58:45.000 You have to pretend to believe in God.
02:58:46.000 You have to.
02:58:47.000 But I think with Trump, I think the pretense is...
02:58:53.000 It's obvious enough that I don't think he's fooling the better part of the people who are voting for him, who would say they care about a person of faith being in the White House.
02:59:03.000 So if anything, he might be—one thing he might be breaking is the barrier on having an atheist president, because I think he—you know, it's just— Nobody thinks he is a person of faith.
02:59:18.000 I don't think anyone really thinks that.
02:59:20.000 So he might be our first atheist president.
02:59:23.000 He would help us in that regard as well.
02:59:25.000 Another Trump meteor right into the White House.
02:59:28.000 I'm starting to sound like a Trump supporter.
02:59:29.000 Occasionally an asteroid does something good.
02:59:31.000 Who would be the ideal president?
02:59:33.000 I mean, like, what kind of a person?
02:59:35.000 I mean, it would probably be a person who doesn't seek attention.
02:59:38.000 Probably a person that...
02:59:40.000 Well, I don't think it could be that.
02:59:41.000 I mean...
02:59:43.000 The process is...
02:59:46.000 Even an optimized process will require enough sacrifice of what ordinary people want most of the time that it will be an unusual personality who has to get promoted.
03:00:03.000 I mean, you will on some metric be...
03:00:07.000 I mean, it's almost by definition narcissistic to think that you should be in this role, right?
03:00:11.000 Who are you to think that you should be running civilization at this moment in human history?
03:00:17.000 And for you to honestly stand at the podium and say, I'm the guy, you know, or the woman, right?
03:00:25.000 I am the most qualified.
03:00:26.000 I should be doing this, right?
03:00:28.000 I can help.
03:00:31.000 You know, if you're going to scrutinize the kind of personality that could give rise to those opinions, it's not...
03:00:37.000 Yeah, there are some dials you would probably want to change, tweak, if you had to be married to this person; it's not an optimal personality.
03:00:46.000 So there's going to be...
03:00:47.000 There's a kind of pathology of...
03:00:49.000 Of power seeking that might be just intrinsic to it, but you want someone who is actually wise ethically.
03:00:59.000 I mean, just try to map that onto Trump, right?
03:01:03.000 Imagine someone saying, the thing I like about Trump is that he is so deeply ethical and wise, right?
03:01:11.000 It's just, it does not, I mean, it's like saying it's because his hair looks so natural.
03:01:19.000 I mean, there's just no—it's the antithesis of what he is, right?
03:01:25.000 The thing I like about Trump is that he is so well-informed about the way the world works.
03:01:32.000 And where he's not informed—
03:01:57.000 You'd want to be able to say that about a president.
03:01:59.000 You could not begin to say that about Trump, right?
03:02:03.000 You could probably say that—honestly, you could probably say that about Clinton, right?
03:02:08.000 Hillary?
03:02:08.000 Yeah, for all her defects, she's very knowledgeable, and I'm sure she will just try—where she doesn't feel like she's got the knowledge, she's going to try to go to the source of the knowledge, right?
03:02:21.000 Just grab the best experts she can find.
03:02:24.000 Oh.
03:02:26.000 I think she will be as aware as you or I would be of the consequences of not knowing what's going on.
03:02:34.000 She's just going to want to find out what's going on.
03:02:38.000 All Trump has advertised about himself is that he thinks that bluster and banality and bullying will win in every situation.
03:02:50.000 It's just attitude.
03:02:52.000 The guy is winging it. It could not be more obvious that this guy is winging it on every level.
03:03:00.000 It is...
03:03:05.000 There'd be no way for him to signal the fact that he's winging it more clearly than he is with everything he's doing, and yet there's no penalty.
03:03:14.000 Do you think it's possible that in this age of information, the way we can communicate with each other, that we're going to experience these cycles, these waves, these in and outs, these high and low tides?
03:03:27.000 Of really smart presidents and really stupid presidents.
03:03:30.000 And we just, people revolt.
03:03:32.000 And there's just, it's so easy to stay alive.
03:03:35.000 There's plenty of stupid people out there.
03:03:36.000 And so they're only willing to vote for other dumb folks.
03:03:39.000 So the other dumb folks get into position.
03:03:41.000 They send out the frequency that only the dummies hear.
03:03:45.000 And everybody else is going, what the fuck is everybody voting for this guy for?
03:03:48.000 What is happening?
03:03:49.000 And then it makes the smart people rebound in four years and challenge themselves anew.
03:03:55.000 Because they need some sort of an enemy to rally against to reach their full potential.
03:04:00.000 And then without the low tide, you cannot have the high tide, Sam Harris.
03:04:06.000 Hopefully that's not an analogy that applies.
03:04:09.000 The maintenance of civilization.
03:04:11.000 The smell.
03:04:12.000 Yeah.
03:04:13.000 Maybe, man.
03:04:14.000 Maybe.
03:04:14.000 At the very least, it's a wake-up call for the political establishment.
03:04:18.000 This silly game that you've been running of two candidates just doesn't work.
03:04:22.000 Someone can co-opt your candidacy, get in there.
03:04:25.000 Throw the fucking monkey wrench into the gear system, and guess what?
03:04:28.000 Trump's running for president now.
03:04:29.000 He's the head...
03:04:30.000 I mean, he's the head guy for the Republicans.
03:04:33.000 How is that even possible?
03:04:34.000 Well, they don't know.
03:04:35.000 I mean, what's amazing, it is a way...
03:04:37.000 If nothing else, it is a total wake-up call for the Republicans.
03:04:40.000 I mean, they are just...
03:04:40.000 It's June.
03:04:41.000 Aghast, yeah.
03:04:42.000 It's June.
03:04:43.000 Everything's decided, locked down.
03:04:45.000 So we have July, August, September, October, November.
03:04:50.000 We're that close.
03:04:53.000 But he's not someone who has been...
03:04:56.000 who's aligned with the Republican platform in most ways, right?
03:05:03.000 So it's like he's been...
03:05:06.000 The truth is, virtually no one knows what his policies are because he keeps changing his position on things like taxation.
03:05:14.000 He's talked on both sides of core Republican issues.
03:05:20.000 But in many ways, he's left of Hillary.
03:05:26.000 He's left of Hillary in terms of being an isolationist.
03:05:31.000 His relationship to war is...
03:05:36.000 At both extremes.
03:05:38.000 We're going to get out of the world's business.
03:05:42.000 We're going to be isolationists, which is deeply anti-Republican.
03:05:47.000 But I'm going to be the maniac who you're never going to know who I'm going to bomb next.
03:05:51.000 We're going to wipe out ISIS just straight away.
03:05:54.000 Not a man left standing.
03:05:57.000 And...
03:05:59.000 I'm not going to take any shit from anyone, including China and North Korea.
03:06:02.000 So he's that, but we're going to pull back in a huge way and not be in anyone's business, right?
03:06:08.000 He said both of those things.
03:06:16.000 It's way too interesting.
03:06:19.000 We don't want politics to be this interesting.
03:06:23.000 November is going to be... if the polls are close, watching those debates and waiting for a swing in the polls as a result is just going to be way too interesting.
03:06:34.000 It's going to be like watching the Super Bowl, those first debates.
03:06:38.000 It's going to be 100 million people watching those debates.
03:06:42.000 I have a prediction.
03:06:44.000 I think...
03:06:48.000 I think it's entirely possible that this whole thing was a plot that didn't work out.
03:06:53.000 I think he probably came out of the gate saying crazy shit, thinking he would tank the Republican Party and get his friend Hillary Clinton into the White House.
03:07:02.000 But it just didn't work out.
03:07:05.000 He kept trying to insult her, kept trying to make stuff up about Mexicans, and it just kept making him get better and better, and now he's stuck.
03:07:13.000 He can't pull out.
03:07:16.000 That would be a great moment.
03:07:18.000 That would change the system.
03:07:19.000 Well, we're going to have to go through something like this in order for us to realize that this is crazy, that a guy can just do this, can just not really have any interest in politics.
03:07:27.000 But if he pulled out, then he should get the Nobel Prize for everything.
03:07:30.000 If he pulls out at this point and says, listen, I took you to the precipice here.
03:07:35.000 Just because I wanted you to recognize how unstable this situation is.
03:07:40.000 You guys could elect a demagogue who...
03:07:47.000 Who's actually an incoherent demagogue.
03:07:49.000 I haven't even been playing an incoherent authoritarian.
03:07:53.000 I'm, on the one hand, very liberal and tolerant, and on the other hand, I'm getting ready to be Hitler, and you guys can't figure out who I am, and yet you're still prepared to vote for me.
03:08:08.000 For him to do a post-mortem on his punking of the culture, that would be the best thing to ever happen.
03:08:17.000 But I don't think that's what's happening.
03:08:19.000 Do we need someone like this so that we realize how silly this whole thing is?
03:08:24.000 Do we need someone like this?
03:08:25.000 No, no.
03:08:25.000 We need a qualified person to deal with all of the other hassles and dangers that are coming our way that have nothing to do with what we do.
03:08:33.000 Right.
03:08:34.000 But that person's not there.
03:08:35.000 Even if we were doing everything perfectly, there would still be this tsunami of risk and hassle and...
03:09:25.000 In 1918, there was a killer flu, and there's going to be another killer flu, right?
03:09:30.000 There's just no way there's not going to be another killer flu.
03:09:33.000 And we need people, smart people to change, to optimize the system to deal with these kinds of things.
03:09:43.000 And if we're promoting religious maniacs and crazy narcissists and liars...
03:09:55.000 And ignoramuses, and only those people, how could this end well?
03:10:00.000 Maybe this is just a weird year for heavyweight boxing.
03:10:04.000 You know, they have those weird years for heavyweight boxing where Tony Tubbs is the champ.
03:10:09.000 Where you could be the heavyweight champion of the world?
03:10:12.000 They went through a period of time in the early 80s before Tyson came around.
03:10:16.000 It was a series of these champs that were sort of like journeyman fighters.
03:10:20.000 And then Tyson came along.
03:10:22.000 Maybe that's what it is.
03:10:22.000 But only with heavyweights, right?
03:10:24.000 Yeah, mostly with heavyweights.
03:10:26.000 Yeah, and the lighter weights, they were always badass.
03:10:28.000 But I think that maybe that's what's going on.
03:10:30.000 Maybe we need to have this bad season, get the season out of our way, realize the danger of having an inept person in office, whether it's a liar, or a dude who hates money, or Trump, whoever it is.
03:10:44.000 Just go through it and realize how silly it is that we have it set up this way still.
03:10:48.000 Except people thought that of Hitler.
03:10:51.000 I mean, any comparison to Hitler obviously brands you as an exaggerator.
03:10:58.000 They thought, let him get in there and fuck it up and then we'll have somebody better?
03:11:02.000 Hitler was a comic figure for a while, for a good long while.
03:11:06.000 And people were, including the American press, were incredibly slow to recognize what a sinister character he was.
03:11:14.000 And he was considered a buffoon.
03:11:17.000 And there was like a...
03:11:19.000 Maybe Jamie could find this.
03:11:22.000 I think it was Home and Garden.
03:11:23.000 I think it was House and Garden?
03:11:25.000 Home and Garden?
03:11:26.000 There was a write-up on his eagle's nest or his house.
03:11:33.000 It was just this pure puff piece of Hitler love in an architectural magazine.
03:11:42.000 At home with the Führer.
03:12:00.000 But it's hilarious.
03:12:01.000 It's just, you know, like Architectural Digest does, you know, The Eagle's Nest.
03:12:05.000 But it's at a time where it's not too far away from a moment where it should have been absolutely obvious to every thinking person that this guy was going to try to, you know, conquer the world for evil, right?
03:12:20.000 And yet it wasn't obvious.
03:12:22.000 And when you look at how it wasn't obvious, it's pretty humbling.
03:12:27.000 I mean, you don't know that you would necessarily have been different.
03:12:30.000 Up until this conversation, practically, I've been looking at Trump as a clown, right?
03:12:38.000 But what would this clown actually do with the power of the presidency?
03:12:45.000 I don't know that he couldn't be.
03:12:47.000 I mean, he's given voice to a kind of authoritarianism that, you know, some people are—his enemies are noticing, his friends are discounting, but he's talked about, you know, going after the press, and I mean, he's bragged about how many people he's going to torture,
03:13:03.000 right?
03:13:03.000 He's talked about, you know, well, of course we're going to do waterboarding, and we're going to do worse, and maybe we'll kill the families of terrorists, right?
03:13:11.000 And he—but there's a kind of a— It's going to make America great again.
03:13:20.000 What would he do if he actually had more power than anyone in the world?
03:13:25.000 It's a legit question.
03:13:28.000 The transition from comedy to, oh my god, we can't take this back in anything like short order, that could well be terrifying.
03:13:44.000 To go back to the question of heavyweights, why do you think you could be a fake heavyweight and not a fake middleweight?
03:13:52.000 There's not that many really good athletes that go to boxing when they're really large.
03:13:56.000 They tend to go to football or basketball if they're really tall.
03:14:00.000 If you look at the amount of money that guys in the NBA can make or guys in the NFL can make, the really top-level guys can make a tremendous amount of money.
03:14:10.000 So when you get the really super-athlete guys, they tend to gravitate towards the big-name sports.
03:14:16.000 I mean, there's no bigger name sport than football.
03:14:18.000 So getting someone to abandon the whole team thing and having the balls to go one-on-one in a cage and having that mentality, that's also very different.
03:14:27.000 Because it's not necessarily the smartest thing to do, but it's the most challenging thing to do.
03:14:32.000 And there's some really smart people that do it.
03:14:32.000 So even though cage fighting isn't the safest way to get through life, for a lot of people that engage in it, it becomes an extreme, extremely difficult pursuit.
03:14:47.000 And then that's what it becomes to them.
03:14:48.000 You know, and in the heavyweight division, those guys were being lured into other ways.
03:14:54.000 And boxing just kind of went through peaks and valleys.
03:14:57.000 It went Ali, and then it went Larry Holmes.
03:14:59.000 And even though Larry Holmes was amazing, people didn't appreciate him for how good he was.
03:15:03.000 So that doesn't happen, like at the middleweight level, that it's not the same competition for that kind of athlete?
03:15:08.000 They get little lulls in the middleweight division, but it's always pretty fucking strong.
03:15:13.000 But why wouldn't you have the same competition for the high-level athlete at the 165 weight?
03:15:22.000 Our favorite sports require bigger people, like basketball and football, and baseball doesn't really apply here.
03:15:29.000 The amount of cultures that produce heavyweights, first of all, are fairly limited.
03:15:34.000 Like very few heavyweights have come from Asia, except like Polynesian guys, which I guess is kind of Asian, but like Samoans.
03:15:45.000 Samoans are known to be great fighters: giant, sturdy heavyweights.
03:15:49.000 The Chinese don't really produce them that often.
03:15:52.000 People don't get that big in Japan.
03:15:56.000 Well, there's sumo.
03:15:56.000 Yeah, but they're very fat.
03:15:57.000 There's never been a guy who looks like Mike Tyson that came out of Japan in the 80s.
03:16:02.000 We're seeing more of that now.
03:16:07.000 But I mean if we had a guy that was like a Japanese version of Mike Tyson, just a super fast blinding knockout fighter with a fucking head like a brick wall and a giant neck that started above his ears and went down to his traps.
03:16:20.000 Remember Tyson when he first came on the scene?
03:16:22.000 Oh yeah.
03:16:22.000 He was unbelievably terrifying.
03:16:25.000 So there's never been a Japanese person that has that kind of physical strength.
03:16:28.000 So I think it's limited genetically.
03:16:30.000 And I think in a lot of the competitive boxing countries, they tend to be poorer countries.
03:16:36.000 And I think also...
03:16:37.000 In a lot of poor countries you'll see much smaller men. Like, some men are flyweights. It's very rare you find an American flyweight; most Americans are larger, they get more food, right? I think that probably has a lot to do with it, or just the genetics in general. But South America produces a lot of flyweights, like the Philippines. That's of course where Manny Pacquiao came from, and he was like eight weight classes lower when he first started, right? And if you're a great athlete at 120 pounds or 130 pounds, there's not
03:17:07.000 a lot of sports.
03:17:09.000 Yeah, what else can you do?
03:17:10.000 Especially if you, I mean, some of these guys are really tiny, but they're amazing boxers.
03:17:14.000 Like, I mean, there's a ton of them, but in the United States, Johnny Tapia was a smaller guy.
03:17:21.000 I think... what weight did Johnny Tapia fight at?
03:17:24.000 See if you can find that out.
03:17:25.000 But there were some lightweight guys that were just so incredible.
03:17:28.000 They brought so much attention to those divisions.
03:17:31.000 But there was never...
03:17:32.000 There's like little peaks and valleys where greatness comes in and then people have to recover and then new people come along that are great.
03:17:37.000 But there's always been pretty steady.
03:17:39.000 What did he fight at?
03:17:41.000 Super flyweight.
03:17:41.000 Super flyweight.
03:17:43.000 Yeah.
03:17:43.000 So he was one of the rare Americans.
03:18:45.000 Mexican-American Johnny Tapia.
03:17:47.000 He was a bad motherfucker.
03:17:49.000 Super flyweight, which is...
03:17:50.000 What is that?
03:17:50.000 Like 126 or something?
03:17:52.000 Maybe 130?
03:17:55.000 I don't even know what that means.
03:17:57.000 Because Bantamweight, I think, in the UFC, it's different.
03:17:59.000 There's different weight classes.
03:18:03.000 115 pounds.
03:18:04.000 Wow.
03:18:05.000 Crazy.
03:18:06.000 That's tiny.
03:18:07.000 Yeah.
03:18:07.000 He was a wild, wild guy.
03:18:10.000 They did a documentary about him.
03:18:12.000 So he died?
03:18:13.000 Yeah, he died.
03:18:15.000 I don't remember.
03:18:16.000 But he had a lot of problems with drugs and crime and craziness.
03:18:21.000 And he had, like, Mi Vida Loca tattooed on his chest.
03:18:25.000 Right.
03:18:26.000 Well, that might get you the presidency.
03:18:28.000 Ah!
03:18:28.000 No.
03:18:29.000 He was a wild man, but just an amazing fighter to watch.
03:18:32.000 Just so much fun.
03:18:34.000 I think, you know, where the bigger people are, you know, I just think they tend to gravitate towards other sports.
03:18:41.000 I think that's all it is.
03:18:42.000 And boxing, like, always like 160, 147 to 160 has always been like the promised land.
03:18:49.000 That's Sugar Ray Leonard, Marvin Hagler, Roberto Duran.
03:18:52.000 Floyd Mayweather's in there.
03:18:53.000 Sugar Shane Mosley's in there.
03:18:55.000 So many guys are in that mix.
03:18:57.000 That's the sweet spot.
03:18:58.000 It always has been.
03:19:00.000 There's great fighters in pretty much every weight class.
03:19:03.000 Optimizing strength and speed at that point.
03:19:05.000 Yeah.
03:19:05.000 I think you see it in the UFC, too.
03:19:08.000 I think, well, when it comes to just freak movements, I always think that the flyweights and the bantamweights, the 125s and 135s, are the fastest and the best guys.
03:19:17.000 They're moving like 20% faster than anybody else.
03:19:19.000 But I always wonder how much of that is because they're just not affected by gravity as much.
03:19:23.000 And they're also not affected by the blows that are being landed by the other guy.
03:19:28.000 Unless it's Mighty Mouse.
03:19:30.000 Mighty Mouse is one of the few guys in that division that consistently stops people.
03:19:34.000 Did you see his last fight with Henry Cejudo?
03:19:37.000 No.
03:19:37.000 It was incredible.
03:19:39.000 It was insane.
03:19:40.000 I mean, he fought this guy, Henry Cejudo, an Olympic gold medalist, one of the best wrestlers to ever compete in MMA. I mean, he is just a stud wrestler and a really good kickboxer, too.
03:19:50.000 And Mighty Mouse clinched up with him and hit him with these knees to the body that were just out of this world technical.
03:19:57.000 Just so perfect.
03:19:58.000 No wind-up, no slop.
03:20:00.000 Just drilled him in on each side with perfect precision.
03:20:04.000 And he just crumpled.
03:20:05.000 He was like, what the fuck?
03:20:06.000 The victory on knees?
03:20:08.000 Knees to the body.
03:20:08.000 Just kneed the shit out of his body.
03:20:11.000 Kneed him in the face.
03:20:12.000 But it was the fluidity of the way he was moving his knees into perfect position.
03:20:18.000 I mean, they were so perfectly oiled.
03:20:22.000 Like, everything was going down a path that it had gone a million times.
03:20:26.000 Wham!
03:20:27.000 Bam!
03:20:27.000 Bam!
03:20:28.000 Yeah, but it was better than I've ever seen.
03:20:30.000 I mean, it was without a doubt the most...
03:20:33.000 Well, there's two.
03:20:34.000 There's another one between Anderson Silva and Rich Franklin, but that was like a prolonged, brutal beatdown where Anderson just kept beating him up and beating him up in the clinch and broke his nose.
03:20:44.000 There's one where...
03:20:46.000 Yeah, I remember a victory.
03:20:49.000 Was it Weidman or was it Silva?
03:20:51.000 I can't remember.
03:20:52.000 Someone just won on a knee to the chest against someone who was...
03:20:57.000 Can you knee someone who's...
03:20:58.000 Yes, who's down.
03:20:59.000 It was Chael Sonnen and Anderson Silva.
03:21:01.000 Right, yeah.
03:21:01.000 Chael Sonnen, yeah, yeah.
03:21:02.000 Yeah, that was pretty brutal.
03:21:04.000 But what Mighty Mouse did in this fight that was crazy was the precision of the placement of the knees and how quick they came.
03:21:11.000 Just bam, bam, bam, bam, just controlled them.
03:21:13.000 And he's controlling a guy who's an Olympic gold medalist wrestler, just a stud wrestler.
03:21:17.000 It was really pretty impressive stuff, like really, really amazing, sharp technique.
03:21:22.000 But I wonder, could a heavyweight even move like that?
03:21:24.000 Yeah, probably not.
03:21:26.000 No, I mean, as you get bigger, there are just things you can't do.
03:21:30.000 Clearly, there's a limit to the size you can be and be not only athletic, but even ambulatory.
03:21:40.000 I mean, you couldn't have a 30-foot-tall person who could walk around; your bones would break.
03:21:46.000 Because mass goes up with the cube of the size.
03:21:52.000 It's why if you...
03:21:57.000 Well, this is kind of a different point, but you could drop an ant off the Empire State Building, and it'll fall and hit the ground and be fine.
03:22:07.000 If you drop a horse off the Empire State Building, it's going to be a liquid horse.
03:22:12.000 It's a...
03:22:13.000 I mean, there you have...
03:22:16.000 Air resistance scales with surface area.
03:22:21.000 The air resistance goes up with the square, the surface area.
03:22:26.000 But that doesn't compensate for the mass going up with the cube, the volume.
03:22:32.000 So the horse is bigger.
03:22:34.000 You'd think it might be able to act like a wing as much as the ant would.
03:22:39.000 It's got a lot of air resistance.
03:22:40.000 It's a giant, so it's a horse.
03:22:41.000 But it's...
03:22:44.000 Its mass is going up with the cube of its size, so the air doesn't resist its fall much at all compared to what it's doing for an ant.
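For what it's worth, the square-cube argument here reduces to a few lines of arithmetic. The sketch below uses made-up ballpark figures for the ant and the horse (the masses, areas, and drag numbers are illustrative assumptions, not measurements); the only point it demonstrates is that when mass scales with the cube of linear size and drag area with the square, terminal velocity grows roughly with the square root of size.

import math

def terminal_velocity(mass_kg, area_m2, drag_coeff=1.0, air_density=1.2, g=9.81):
    # Terminal velocity: the speed where drag (0.5 * rho * Cd * A * v^2) balances weight (m * g).
    return math.sqrt(2 * mass_kg * g / (air_density * drag_coeff * area_m2))

# Illustrative, made-up numbers only, chosen to show the scaling.
ant_length = 0.004            # metres
horse_length = 2.0            # metres, roughly 500x the ant's linear size
scale = horse_length / ant_length

ant_mass, ant_area = 5e-6, 1e-5     # ~5 mg, ~10 mm^2 frontal area (rough guesses)
horse_mass = ant_mass * scale ** 3  # mass grows with the cube of linear size
horse_area = ant_area * scale ** 2  # drag area grows with the square

print(f"linear scale factor: {scale:.0f}x")
print(f"ant terminal velocity:   {terminal_velocity(ant_mass, ant_area):.1f} m/s")
print(f"horse terminal velocity: {terminal_velocity(horse_mass, horse_area):.1f} m/s")
# The horse's terminal velocity comes out roughly sqrt(500), about 22x the ant's,
# which is the gap between walking away and the "liquid horse" outcome.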
03:22:56.000 But yeah, we have a limit on...
03:22:58.000 This is one function.
03:23:00.000 If you're going to engineer the super athlete, if we're going to give you chimpanzee muscle proteins or whatever to make you super explosive and strong, you'd have to get that right with your connective tissue and your bones and everything else because you could rip your own arm off with your ballistic moves.
03:23:20.000 Yeah.
03:23:21.000 Yeah, do you imagine if you had chimp strength and human tendons?
03:23:25.000 Like, good luck.
03:23:25.000 Just break your hand off.
03:23:27.000 You take your own arm off and beat the person with it.
03:23:31.000 Well, I'm sure you saw that video of the little boy who got into the gorilla cage.
03:23:36.000 Yeah.
03:23:37.000 And they had to shoot the gorilla.
03:23:39.000 That seemed...
03:23:40.000 I heard a lot of...
03:23:41.000 I didn't pay a lot of attention to the commentary, but...
03:23:44.000 It seemed pretty straightforward to me.
03:23:47.000 Do you have to shoot the gorilla?
03:23:48.000 Yeah.
03:23:49.000 I'm with the zoo on that one.
03:23:51.000 It's totally tragic.
03:23:53.000 How is a zoo...
03:23:54.000 The zoo is certainly culpable for having an enclosure that a three-year-old or four-year-old can get into.
03:24:01.000 How the hell did that happen?
03:24:03.000 So you've got to fix that.
03:24:06.000 But it's totally tragic.
03:24:07.000 But once you have a 400-pound gorilla that has a human child and is...
03:24:13.000 Not letting it go.
03:24:15.000 You know, just kind of dragging it around.
03:24:16.000 I mean, it wasn't looking aggressive toward the child, but just the fact that it moved it around with that kind of force, who knows what was going to happen.
03:24:25.000 I mean, that looked like you had to end that as quickly as possible.
03:24:28.000 We have to assume that that gorilla is going to know that a baby is more fragile than a baby gorilla.
03:24:32.000 We have to assume.
03:24:33.000 I don't assume anything.
03:24:34.000 We can't assume.
03:24:35.000 Oh, yeah.
03:24:35.000 No, no.
03:24:36.000 There's no way it could know.
03:24:37.000 It never has experienced it.
03:24:38.000 He could have just...
03:24:39.000 Rip their arms off.
03:24:40.000 Torn his head off.
03:24:41.000 Yeah, easily.
03:24:42.000 Accidentally.
03:24:42.000 Oh, yeah.
03:24:43.000 Easily.
03:24:44.000 No, it's totally tragic, and I'm sure the parents and the zoo are reaping sufficient criticism, but once that situation is unfolding, I think, I mean, you can't tranquilize it because it doesn't work fast enough.
03:25:01.000 Right, no.
03:25:02.000 And it might grab the baby in fear and think it's being attacked.
03:25:07.000 Yeah.
03:25:07.000 Jesus Christ, that's scary, though.
03:25:09.000 But if it was your kid, I think he'd probably, like, shoot that fucking gorilla.
03:25:13.000 Oh, yeah?
03:25:14.000 If it was your kid?
03:25:15.000 I mean, if more people were carrying guns, you know, where was it?
03:25:22.000 It was in Cincinnati, right?
03:25:23.000 If that had been in Texas, he probably would have had some innocent bystander who was going to take the law into his own hands.
03:25:29.000 Yeah.
03:25:30.000 Chuck Norris' pants.
03:25:31.000 Jump right over the railing.
03:25:33.000 Yeah.
03:25:34.000 Open fire.
03:25:36.000 Yeah.
03:25:37.000 I saw some horrible comments, too, where people were like, they should have just shot the parents while they were at it.
03:25:44.000 Well, it's easy to be outraged.
03:25:46.000 Somebody fucked up.
03:25:47.000 It's a little kid.
03:25:49.000 You shouldn't have been able to get in, first of all.
03:25:51.000 You've got a gorilla enclosure.
03:25:53.000 It is an architectural failing.
03:25:54.000 I mean, it should be impossible to get in.
03:25:58.000 Not like a three-year-old could do it.
03:26:00.000 How could a three-year-old...
03:26:01.000 It should be impossible for an adult to get in there with a gorilla.
03:26:04.000 Yeah.
03:26:05.000 But it's been happening with...
03:26:07.000 It's pretty regular lately.
03:26:09.000 Guys have been breaking into zoos.
03:26:12.000 Some guy tried to get killed by the lions recently, right?
03:26:16.000 Yeah, Jesus Christ.
03:26:17.000 I mean, you gotta realize, you got a lot of responsibility when you have monsters in a cage in your city.
03:26:21.000 You can't let babies get in there with them.
03:26:23.000 I mean, a gorilla is awesome as it is.
03:26:25.000 If it wanted to attack you, it's a monstrous beast.
03:26:30.000 It's a thing with power that you couldn't even fathom.
03:26:34.000 A gorilla could literally pick you up and throw you like you could a football.
03:26:39.000 I mean, they can launch you.
03:26:41.000 They're so strong.
03:26:42.000 Oh, yeah.
03:26:43.000 Didn't we Google it?
03:26:44.000 They get to be like 500 pounds or something crazy.
03:26:46.000 Yeah, well, this was like 400 pounds.
03:26:48.000 Oh, God.
03:26:49.000 But not 400 pounds like Bob Sapp.
03:26:54.000 400 pounds.
03:26:55.000 I mean, he was 300 pounds.
03:26:56.000 But it's like a 300-pound gorilla is much stronger than Bob Sapp.
03:27:00.000 Yeah.
03:27:01.000 I mean, they're not equivalent pounds.
03:27:03.000 It's unfathomable the amount of physical strength they must have.
03:27:07.000 Yeah.
03:27:08.000 It just sucks that they keep them in zoos in the first place.
03:27:11.000 It sucks that they had to kill it, but it really sucks that they keep doing this zoo thing with smart animals.
03:27:17.000 If you want to have giraffes in the zoo, and I had a whole bit about it, they look real relaxed because there's no lions around.
03:27:23.000 They don't care.
03:27:24.000 They just give them some food.
03:27:26.000 But there's some animals that look tortured, and primates in particular.
03:27:31.000 They just look so freaked out in this enclosure, in this weird place where people are staring at them.
03:27:37.000 They're pacing and trying to get away from people's gazes.
03:27:40.000 It's just, I think it's very, very stressful to them.
03:27:43.000 I think, well, it's a hard question of what to do given certainty that these species are...
03:27:51.000 Are on the verge of extinction.
03:27:53.000 How do you preserve them?
03:27:56.000 Obviously you can preserve them in all kinds of technical ways, like have their DNA frozen and be able to reboot them at a certain point when we figure out how to preserve their habitat.
03:28:11.000 I mean, I gotta think there's a role for good zoos.
03:28:16.000 Also, you just want to maintain the public's connection to these animals, because the decision to destroy habitat is made by people who don't really care about the prospects of extinction,
03:28:31.000 right?
03:28:32.000 It's a very good point when you present it that way, because the people that are over there are facing...
03:28:36.000 I mean, any people that are over in Africa trying to save gorillas and chimps, I mean, that is an unbelievably difficult struggle, and they might not make it.
03:28:45.000 I mean, there's a real concern that if there was no regulation at all, and there was no one telling anybody what to do, that they could just go in there and wipe them all out.
03:28:54.000 Well, historically, that's what we've done, right?
03:28:56.000 With kind of everything that we've profited from.
03:28:59.000 Anything that you can make money off of?
03:29:01.000 Anything that you can...
03:29:02.000 I mean, I don't know what the hell they use chimps and gorillas for.
03:29:07.000 Well, there's the whole bushmeat trade.
03:29:09.000 Well, no, they just...
03:29:10.000 I mean, there's bushmeat.
03:29:11.000 They eat them.
03:29:11.000 But then there's the...
03:29:13.000 Why do they call it bushmeat?
03:29:14.000 Well, you go into the bush, and it's like hunting.
03:29:17.000 But you just shoot everything.
03:29:19.000 Well, it's just they're hunting species that you don't think of as food species, but they're eating monkeys and gorillas.
03:29:27.000 And that's why they call it bush meat?
03:29:29.000 Well, I mean, bush is like the jungle.
03:29:33.000 So it's just hunting species that are...
03:29:40.000 There's the other component of it, which is the crazy ideas that the Chinese have about the medicinal properties of tiger bone wine or rhino horn.
03:29:50.000 So you have these species that are being hunted by poachers because there's a market for their parts, like the ivory trade.
03:30:02.000 But some people just eat species that are endangered, too.
03:30:06.000 The term bushmeat is always associated with primates for some reason.
03:30:10.000 I was always trying to figure out why.
03:30:12.000 I don't know.
03:30:13.000 Is that true?
03:30:13.000 No.
03:30:14.000 It's probably not, but in my mind it was.
03:30:16.000 I don't know.
03:30:18.000 When I read about people eating chimps and how common it was.
03:30:22.000 My friend Steve Rinella did a show in Bolivia where they shot and they ate a monkey.
03:30:28.000 And it's so weird to watch.
03:30:31.000 These people...
03:30:32.000 Yanumami?
03:30:33.000 Is that how you say them?
03:30:34.000 Yanumami?
03:30:35.000 They live in Bolivia and they live in very much like the way they probably lived hundreds of years ago.
03:30:41.000 They have handmade bows and arrows, these long spear-like arrows.
03:30:46.000 It's not like a regular bow and arrow.
03:30:47.000 It's like a five foot long arrow.
03:30:50.000 Very strange.
03:30:50.000 They walk around barefoot.
03:30:52.000 And they have some Russian shotgun.
03:30:55.000 And their favorite thing to do is eat monkeys.
03:30:57.000 They have a Russian shotgun?
03:30:58.000 Yeah, they got a Russian shotgun that somehow they got from somebody and it made it all the way deep into the jungle.
03:31:03.000 But the shells are very precious.
03:31:05.000 But their favorite thing to do is shoot and eat monkeys.
03:31:07.000 It's like their number one favorite thing to eat.
03:31:10.000 And they cooked it on the show and it's really weird to watch.
03:31:14.000 You talk about some strange genetic connection that you have with a very human shape.
03:31:21.000 Like the lips and the eyes and the face.
03:31:24.000 You could recognize a certain amount of human in that little fella.
03:31:28.000 Or fella.
03:31:30.000 Female.
03:31:31.000 Whatever.
03:31:31.000 Whatever the female version of fella would be.
03:31:34.000 So they cooked it over this fire and then made this stew out of it.
03:31:39.000 Right.
03:31:40.000 Well, actually, to go back to our cultured meat conversation, one thing that's weird about that prospect is that if you're just growing cells in a vat, then there's no problem with cannibalism.
03:32:00.000 So you could be growing human meat in a vat.
03:32:00.000 There's zero ethical problem, but it's just as grotesque as, at least to my palate, it's a fairly grotesque thing to contemplate.
03:32:12.000 But these are just...
03:32:14.000 The distinction...
03:32:15.000 I mean, you're just talking about...
03:32:16.000 There is no, in principle, human DNA. And at the cellular level, the...
03:32:26.000 The difference between human muscle protein and bovine muscle protein, if this was never attached to an animal, we're dealing with concepts here.
03:32:41.000 If you bite a fingernail and swallow it, are you practicing autocannibalism?
03:32:48.000 It's a...
03:32:53.000 At some level, the concept is doing a lot of work.
03:32:57.000 It's unignorable when you're talking about depriving another person of life.
03:33:00.000 But if you're talking about spinning up cells in a vat, then it becomes, well, does it really matter whether this was a person?
03:33:08.000 Maybe people would decide that would be the only ethical choice.
03:33:12.000 You have to eat cultured human meat.
03:33:13.000 If you can eat meat, you have to eat human meat.
03:33:16.000 There's the end of this economy.
03:33:19.000 You can't harm any other animals.
03:33:21.000 Not only can you not, you have to eat yourself.
03:33:22.000 This is what the cattle industry could do to just quash this.
03:33:26.000 They just spread the rumor that there are human cells in those vats.
03:33:29.000 Yeah, you go to your local center, you get scraped, and then they start making your monthly supply of you.
03:33:36.000 Soylent Green is people.
03:33:37.000 Yeah.
03:33:38.000 And they just have it in a vat and they break you off a cube every week.
03:33:43.000 You take it back to your flat and you eat yourself.
03:33:48.000 That's the only meat you're allowed.
03:33:50.000 Well, we figured out a way to live in harmony with nature.
03:33:52.000 We just have to kill everything except us and then eat ourselves.
03:33:56.000 Tell us which part of your own body you want to eat for the rest of your life and we will culture those cells.
03:34:01.000 Well, I know it was you that I was having this conversation with once, I believe, where we were talking about how when areas become more educated and women become more educated, it tends to slow the population down.
03:34:12.000 People tend to even worry that if these graphs continue further on, that people in industrialized parts of the world, as they get into the first world, if they do, they're more likely to have less and less people.
03:34:28.000 Oh, yeah.
03:34:29.000 Less and less children.
03:34:30.000 Fertility goes down with literacy and education among women, yeah.
03:34:36.000 And so just to kind of map that onto life as you know it here: women, given all the choices available, educational, economic, and an ability to plan a pregnancy...
03:34:53.000 So here we have women who want to have careers, want to go to college, and they delay pregnancy to the point where they have realized a lot of those aspirations, and so pregnancies come later and later and later,
03:35:08.000 and families get smaller and smaller.
03:35:13.000 Virtually no one chooses to have 10 kids in the face of all of this other opportunity, all the things they also want out of life, right?
03:35:27.000 If they can avoid it.
03:35:28.000 If you can't avoid it, well then you just find yourself with ten kids, right?
03:35:31.000 Or if you have some religious dogma which says that, though it's possible to avoid it, you shouldn't avoid it, because you were put here to have as many kids as possible.
03:35:39.000 But are you allowed to bring that up when you talk about the population crisis?
03:35:43.000 Are you allowed to...
03:35:44.000 I mean, that's a fascinating piece of information.
03:35:49.000 Which population crisis are you thinking of?
03:35:51.000 Because there are two different ones.
03:35:54.000 There are two opposite ones.
03:35:55.000 China where they don't let you have girls.
03:35:58.000 Have they lessened that?
03:36:01.000 I think they've relaxed that.
03:36:02.000 I think they've relaxed the one-child policy, but I'm not sure...
03:36:06.000 I remember hearing something about that.
03:36:07.000 And the other one you would say India?
03:36:09.000 Well, no, no.
03:36:10.000 The United States?
03:36:11.000 No, there's an overpopulation crisis in certain countries and disproportionately in the developing world.
03:36:20.000 And there is underpopulation in the developed world.
03:36:24.000 Most of Western Europe is not replacing itself.
03:36:28.000 So you're having these senescent populations who have to, they just have to import... They rely on immigration to carry on the functions of society because they're not anywhere near a replacement rate.
03:36:48.000 The most surprising detail that brings this home is that there are more adult diapers—now, this is Japan—there are more adult diapers sold in Japan than baby diapers.
03:37:02.000 Now, just think about the implication of that for a society, right?
03:37:06.000 How do you have a functioning society, barring perfect robots that can tend to your needs, where you have just a disproportionate number of people who are no longer economically productive,
03:37:23.000 relying on the labor of the young to keep them alive and cure their diseases and defend them from crime, all that.
03:37:35.000 But the ratio is totally out of whack.
03:37:42.000 The world is a giant Ponzi scheme on some level.
03:37:44.000 You need new people to come in to maintain it for the old people, apart from having some technology that allows you to do that without people.
03:37:56.000 But I think everything I've heard about population recently suggests that we are on course globally to peak around 9.5 billion people and then taper off.
03:38:08.000 I don't think anyone now is forecasting this totally unsustainable growth where we're going to wind up with... Did I say million?
03:38:16.000 Nine and a half billion people.
03:38:17.000 You said billion.
03:38:18.000 Billion.
03:38:19.000 Where we're going to hit something like 20 billion people, right?
03:38:23.000 I don't think anyone, even the most Malthusian people, are expressing that concern at the moment, which was the case like 20 or 30 years ago where they thought this is just going to keep going and we're going to hit the carrying capacity of the earth,
03:38:40.000 which is something like 40 billion people.
03:38:45.000 I don't think anyone thinks so.
03:38:46.000 Because fertility is falling everywhere, but it has fallen actually below replacement in the developed world.
03:38:55.000 Do you think in our lifetimes, or in our children's lifetime, it's feasible that we figure out a way, in some way, to...
03:39:06.000 I'm not endorsing taking people's money and giving it to other people, but in some sort of a way, to eliminate poverty.
03:39:13.000 Is that even possible?
03:39:14.000 Is it ever going to be possible to completely eliminate poverty worldwide and within a lifetime?
03:39:20.000 Well, I think we talked about this the last time when we spoke about AI, but this is the implication of much of what we talked about here.
03:39:29.000 If you imagine building the perfect labor-saving technology, where you imagine just having a machine that can build any machine that can do any human labor, powered by sunlight, more or less for the cost of raw materials,
03:39:46.000 right?
03:39:46.000 So you're talking about the ultimate wealth generation device.
03:39:50.000 Now we're not just talking about blue-collar labor.
03:39:52.000 We're talking about the kind of labor you and I do, right?
03:39:56.000 So like artistic labor and scientific labor and, you know, just a machine that comes up with good ideas, right?
03:40:04.000 We're talking about general artificial intelligence.
03:40:08.000 This, in the right political and economic system, would just cancel any need for people to have to work to survive, right?
03:40:18.000 There'd be enough of everything to go around.
03:40:22.000 And then the question would be, do we have the right political and economic system where we actually could spread that wealth?
03:40:29.000 Or would we just find ourselves in some kind of horrendous arms race and a situation of wealth inequality unlike any we've ever seen?
03:40:42.000 It's not in place now.
03:40:53.000 All of my concerns about AI were gone.
03:40:56.000 I mean, there's no question about this thing doing things we didn't want.
03:40:59.000 It would do exactly what we want when we want it, and there's just no danger of its interests becoming misaligned with our own.
03:41:07.000 It's just like a perfect oracle and a perfect designer of new technology.
03:41:13.000 If it was handed to us now, I would expect just complete chaos, right?
03:41:20.000 If Facebook built this thing tomorrow and announced it, or rumor spread that they had built it, right?
03:41:28.000 What are the implications for Russia and China?
03:41:30.000 Well, insofar as they are as adversarial as they are now, it would be rational for them to just nuke California, right?
03:41:39.000 Because having this device is just a winner-take-all scenario.
03:41:45.000 I mean, you win the world if you have this device.
03:41:47.000 You can turn the lights off in China the moment you have this device.
03:41:51.000 It's just the ultimate...
03:41:53.000 Because literally, we're talking about, and many people may doubt whether such a thing is possible, but again, we're just talking about the implications of intelligence that can make refinements to itself over a time course that bears no relationship to what we experience as apes,
03:42:17.000 right?
03:42:17.000 So you're talking about a system that can make changes to its own source code and become better and better at learning and more and more knowledgeable, if we give it access to the Internet.
03:42:29.000 It has instantaneous access to all human and machine knowledge, and it does thousands of years of work every day of our lives.
03:42:42.000 Thousands of years of equivalent human-level intellectual work.
03:42:49.000 Our intuitions completely fail to capture just how immensely powerful such a thing would be, and there's no reason to think this isn't possible.
03:42:58.000 The most skeptical thing you can honestly say about this is that this isn't coming soon.
03:43:04.000 But to say that this is not possible makes no scientific sense at this point.
03:43:10.000 There's no reason to think that a sufficiently advanced digital computer can't instantiate general intelligence of the sort that we have.
03:43:22.000 There's no reason to think that.
03:43:23.000 Intelligence has to be, at bottom, some form of information processing.
03:43:27.000 And if we get the algorithm right with enough hardware resources, and the limit is definitely not the hardware at this point, it's the algorithms.
03:43:40.000 There's just no reason to think this can't take off and scale and that we would be in the presence of something that is like having an alternate human civilization in a box that is making thousands of years of progress every day,
03:43:57.000 right?
03:43:58.000 So just imagine that you had in a box, you know, the 10 smartest people who've ever lived.
03:44:03.000 And, you know, every week, they make 20,000 years of progress, right?
03:44:08.000 Because that is the actual—we're talking about electronic circuits being a million times faster than biological circuits.
03:44:16.000 So even if it was just—and I believe I said this the last time we talked about AI, but this is what brings it home for me—even if it's just a matter of being faster, right?
03:44:26.000 It's not anything especially spooky.
03:44:28.000 It's just this can do human-level intellectual work but just a million times faster.
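For reference, the arithmetic behind that "20,000 years" figure can be sketched in a few lines; the million-fold speedup is the assumption stated in the conversation, not a measured number.

```python
# Back-of-the-envelope arithmetic for the "20,000 years a week" figure.
# Assumption from the conversation: electronic circuits are roughly a
# million times faster than biological circuits. Nothing here is measured.
speedup = 1_000_000        # hypothetical speed advantage over a human thinker
weeks_per_year = 52

# One week of wall-clock time for such a machine would correspond to this
# many "human-equivalent" years of intellectual work:
human_equivalent_years = speedup / weeks_per_year
print(round(human_equivalent_years))   # 19231, i.e. roughly 20,000 years
```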
03:44:34.000 And again, this totally undersells the prospects of superintelligence.
03:44:40.000 I think human-level intellectual work is going to seem pretty paltry in the end.
03:44:47.000 But imagine just speeding it up.
03:44:49.000 If we were doing this podcast...
03:44:52.000 Imagine how smart I would seem if, between every sentence, I actually had a year to figure out what I was going to say next, right?
03:45:02.000 And so I say this one sentence, and you ask me a question, and then in my world, I just have a year.
03:45:06.000 I'm going to go spend the next year getting ready for Joe, and it's going to be perfect.
03:45:13.000 And this is just compounding upon itself.
03:45:16.000 Like, not only am I working faster...
03:45:21.000 Ultimately, I can change my ability to work faster.
03:45:25.000 We're talking about software that can change itself.
03:45:27.000 You're talking about something that becomes self-improving.
03:45:30.000 So there's a compounding function there.
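As a rough illustration of that compounding function, here is a toy calculation; the per-cycle gain and the number of cycles are invented parameters, not claims about any real system.

```python
# Toy model of the compounding being described: each cycle, the system uses
# its current capability to improve itself, multiplying its working speed by
# a fixed factor. The factor and the number of cycles are made up purely to
# show the shape of the curve.
speed = 1.0            # starting speed, in arbitrary "human-equivalent" units
gain_per_cycle = 1.5   # hypothetical improvement per self-modification cycle

for cycle in range(1, 11):
    speed *= gain_per_cycle
    print(f"after cycle {cycle:2d}: {speed:7.1f}x baseline")

# After 10 cycles the system is about 58x its starting speed; the growth is
# exponential, which is what makes the endpoint so hard to intuit.
```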
03:45:33.000 But the point is it's unimaginable in terms of how...
03:46:10.000 You know, we say, cure Alzheimer's, and it cures Alzheimer's.
03:46:13.000 You say, solve the protein folding problem, and it's just off and running. You tell it to develop a perfect nanotechnology, and it does that.
03:46:22.000 This is all, again, going back to David Deutsch, there's no reason to think this isn't possible because anything that's compatible with the laws of physics can be done given the requisite knowledge, right?
03:46:35.000 So you just, you get enough intelligence and I don't know.
03:47:01.000 Their worst fears could be realized.
03:47:03.000 If Donald Trump is president, what's Donald Trump going to do with a perfect AI when he has already told the world that he hates Islam, right?
03:47:15.000 We would have to have a political and economic system that allowed us to absorb this ultimate wealth-producing technology.
03:47:26.000 And again, so this may all sound like pure sci-fi craziness to people.
03:47:31.000 I don't think there is any reason to believe that it is.
03:47:34.000 But walk way back from that edge of craziness and just look at dumb AI, narrow AI, just self-driving cars and automation and intelligent algorithms that can do human-level work.
03:47:52.000 That is already poised to change our world massively and create massive wealth inequality, and we have to figure out how to spread this wealth.
03:48:01.000 You know, what do you do when you can automate 50% of human labor?
03:48:07.000 Were you paying attention to the artificial intelligence Go match?
03:48:12.000 Yeah, yeah.
03:48:13.000 I mean, I don't actually play Go, so I wasn't paying that kind of attention to it, but I'm aware of what happened there.
03:48:19.000 Do you know the rules of Go?
03:48:24.000 I don't play it.
03:48:25.000 I know vaguely how it looks when a game is played.
03:48:30.000 It's supposed to be very complicated though.
03:48:32.000 More complicated and more possibilities than chess.
03:48:36.000 And that's why it took 20 years longer for a computer to be the best player in the world.
03:48:43.000 Did you see how the computer did it too?
03:48:51.000 The company that did it is DeepMind, which was acquired by Google, and they're at the cutting edge of AI research.
03:49:01.000 The cartoons are unfortunately not so far from what is possible.
03:49:11.000 Yeah, I mean, there's...
03:49:13.000 Again, this is not general intelligence.
03:49:17.000 These are not machines that can even play tic-tac-toe, right?
03:49:21.000 Now, there have been some moves away from this, like DeepMind has trained...
03:49:26.000 An algorithm to play all of the Atari games, like from 1980 or whenever.
03:49:33.000 And it very quickly became superhuman on most of them.
03:49:37.000 I don't think it's superhuman on all of them yet, but it could play Space Invaders and Breakout and all these games that are...
03:49:48.000 ...highly unlike one another.
03:49:50.000 And it's the same algorithm becoming expert and superhuman in all of them.
03:49:55.000 And that's a new paradigm.
03:49:57.000 And it's using a technique called deep learning for that.
03:50:02.000 And that's been very exciting and will be incredibly useful.
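For readers curious about the underlying idea, here is a minimal sketch of the reward-driven learning the conversation is gesturing at. It is tabular Q-learning on a made-up toy environment, not DeepMind's deep-learning system; the point is only that the same learning rule works no matter which "game" produces the states and rewards.

```python
import random

# A minimal sketch of reward-driven learning: the agent only ever sees states
# and rewards, and the same update rule applies regardless of the task.
# This is tabular Q-learning on a made-up 5-cell corridor (reach the right
# end), not DeepMind's deep-learning system.

N_STATES, GOAL = 5, 4
actions = [-1, +1]                       # move left or move right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.3    # learning rate, discount, exploration

for episode in range(500):
    s = 0
    while s != GOAL:
        # Epsilon-greedy: usually exploit current estimates, sometimes explore.
        a = random.choice(actions) if random.random() < epsilon \
            else max(actions, key=lambda act: Q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        reward = 1.0 if s_next == GOAL else 0.0
        # Q-learning update: nudge the estimate toward reward + discounted best future value.
        best_next = max(Q[(s_next, act)] for act in actions)
        Q[(s, a)] += alpha * (reward + gamma * best_next - Q[(s, a)])
        s = s_next

# The learned greedy policy should be "move right" in every non-goal state.
print([max(actions, key=lambda act: Q[(s, act)]) for s in range(GOAL)])  # [1, 1, 1, 1]
```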
03:50:07.000 The flip side of all this, I know that everything I tend to say on this sounds scary, but the next scariest thing is not to do any of this stuff.
03:50:17.000 We want intelligence.
03:50:19.000 We want automation.
03:50:20.000 We want to figure out how to solve problems that we can't yet solve.
03:50:24.000 Intelligence is the best thing we've got, so we want more of it.
03:50:27.000 But we have to have a system where...
03:50:30.000 It's scary that we have a system where if you gave the best possible version of it to one research lab or to one government...
03:50:39.000 It's not obvious that that wouldn't destroy humanity.
03:50:44.000 Or that it wouldn't lead to massive dislocations where you'd have some trillionaire who's trumpeting his new device and just 50% unemployment in the U.S. in a month.
03:50:56.000 It's not obvious how we would absorb this level of progress.
03:51:02.000 And we definitely have to figure out how to do it.
03:51:06.000 And of course we can't assume the best case scenario, right?
03:51:09.000 That's the best case scenario.
03:51:11.000 I think there's a few people that put it the way you put it that terrify the shit out of people.
03:51:17.000 And everyone else seems to have this rosy vision of increased longevity and automated everything and everything fixed and easy to get to work.
03:51:28.000 Medical procedures would be easier.
03:51:30.000 They're gonna know how to do it better.
03:51:31.000 Everybody looks at it like we are always going to be here.
03:51:34.000 But are we obsolete?
03:51:36.000 I mean, this idea of a living thing that's creative and wrapped up in emotions and lust and desires and jealousy and all the pettiness that we see celebrated all the time... we still see it.
03:51:48.000 It's not getting any better, right?
03:51:51.000 Are we obsolete?
03:51:52.000 I mean, what if this thing comes along and says, listen, there's a way to do...
03:51:55.000 You can abandon all that stupid shit.
03:51:57.000 You can abandon all that makes you...
03:51:59.000 All the stuff that makes you fun to be around?
03:52:01.000 Yeah.
03:52:02.000 It also fucks with you.
03:52:03.000 You can live three times as long without that stuff.
03:52:06.000 I think it would, in the best case, usher in...
03:52:16.000 ...the possibility of a fundamentally creative life on the order of something like The Matrix, whether it's in The Matrix or it's just in a world that has been made as beautiful as possible based on what would functionally be an unlimited resource of intelligence.
03:52:42.000 For there to be a...
03:52:46.000 An ability to solve problems of a sort that we can't currently imagine.
03:52:51.000 I mean, it really is like a place on the map that you can't...
03:52:54.000 You can indicate it's over there.
03:52:57.000 You know, it's like the blank spot on the map.
03:52:59.000 This is why it's called the singularity, right?
03:53:01.000 It's like this is a...
03:53:03.000 It was John von Neumann, the...
03:53:06.000 The inventor of game theory, a mathematician who, along with Alan Turing and a couple of other people, is really responsible for the computer revolution.
03:53:18.000 He was the first person to use this term, singularity, to describe just this, that there's a speeding up of information processing technology and a cultural reliance upon it beyond which we can't actually foresee the level of change that can come over our society.
03:53:40.000 It's like an event horizon past which we can't see.
03:53:45.000 And this certainly becomes true when you talk about these intelligent systems being able to make changes to themselves.
03:53:54.000 And again, we're talking mostly software.
03:53:56.000 I'm not imagining...
03:53:57.000 I mean, the most important breakthroughs are certainly at the level of better software.
03:54:04.000 I mean, we have...
03:54:05.000 In terms of the computing power, the physical hardware on Earth, that's not what's limiting our AI at the moment.
03:54:14.000 It's not like we need more hardware.
03:54:19.000 But we will get more hardware, too, up to the limits of physics.
03:54:23.000 And it will get smaller and smaller, as it has.
03:54:25.000 And if quantum computing becomes possible or practical, that will...
03:54:33.000 Actually, David Deutsch, the physicist I mentioned, is one of the fathers of the concept of quantum computing.
03:54:42.000 That will open up a whole other extreme of computing power that is not at all analogous to the kinds of machines we have now.
03:54:55.000 But it's just...
03:54:59.000 When you imagine...
03:55:03.000 People seem to always want to...
03:55:06.000 I just had this conversation with Neil deGrasse Tyson on my podcast.
03:55:10.000 Name-dropper?
03:55:12.000 I'm just attributing these ideas to him.
03:55:18.000 He doesn't take this line at all.
03:55:20.000 He thinks it's all bullshit.
03:55:21.000 He's not at all worried about AI. What does he think?
03:55:24.000 He thinks that we just use...
03:55:28.000 He's drawing an analogy from...
03:55:30.000 How we currently use computers, that they just keep helping us do what we want to do.
03:55:37.000 Like, we decide what we want to do with computers, and we just add them to our process, and that process becomes automated, and then we'll find new jobs somewhere else.
03:55:46.000 Like, you don't need a stenographer once you have voice recognition technology, and that's not a problem.
03:55:52.000 A stenographer will find something else to do, and so the economic dislocation isn't that bad.
03:55:57.000 And...
03:55:59.000 Computers will just get better than they are, and eventually Siri will actually work, and she'll answer your questions well, and it's not going to be a laugh line, what Siri said to you today.
03:56:10.000 And then all of this will just proceed to make life better, right?
03:56:17.000 Now, none of that is imagining what it will be like to make...
03:56:23.000 Because there will be a certain point where you'll have systems that are...
03:56:30.000 The best chess player on Earth is now always going to be a computer.
03:56:34.000 There's not going to be a human born tomorrow that's going to be better than the best computer.
03:56:41.000 We have superhuman chess players on Earth.
03:56:45.000 Now imagine having computers that are superhuman at every task that is relevant, every intellectual task.
03:56:54.000 So the best physicist is a computer.
03:56:56.000 The best medical diagnostician is a computer.
03:57:00.000 The best prover of math theorems is a computer.
03:57:04.000 The best engineer is a computer.
03:57:05.000 There's no reason why we're not headed there.
03:57:10.000 The only reason I could see we're not headed there is that something massively dislocating happens that prevents us from continuing to improve our intelligent machines.
03:57:19.000 But the moment you admit that intelligence is just a matter of information processing...
03:57:25.000 And you admit that we will continue to improve our machines unless something heinous happens, because intelligence and automation are the most valuable things we have.
03:57:36.000 At a certain point, whether you think it's in five years or 500 years, we are going to find ourselves in the presence of super intelligent machines.
03:57:45.000 And then at that point...
03:57:48.000 The best source of innovation for the next generation of software or hardware or both will be the machines themselves, right?
03:57:58.000 So then that's where you get what the mathematician I.J. Good described as the intelligence explosion, which is just that the process can take off on its own.
03:58:10.000 And this is where the singularity people either are hopeful or worried, because there's no guarantee that this process will remain aligned with our interests.
03:58:26.000 And with every person I meet, even very smart people like Neil, who say they're not worried about this, when you actually drill down on why they're not worried, you find that they're actually not imagining machines making changes to their own source code.
03:58:49.000 Or they simply believe that this is so far away that we don't have to worry about it now.
03:58:57.000 And that's actually a non-sequitur.
03:58:59.000 To say that this is far away is not actually grappling with the problem. It's not an argument that this isn't going to happen.
03:59:06.000 And it's based on what, too?
03:59:11.000 First of all, there's no reason to believe...
03:59:15.000 Jamie, you want to find out where that is?
03:59:21.000 We don't know how long it will take us to prepare for this.
03:59:24.000 If you knew it was going to take 50 years for this to happen...
03:59:32.000 Is 50 years enough for us to prepare politically and economically to deal with the ramifications of this?
03:59:38.000 And to do it...
03:59:40.000 And to say nothing of actually building the AI safely in a way that's aligned with our interests?
03:59:46.000 I don't know.
03:59:47.000 I mean, so 50 years is...
03:59:49.000 It's like we've had the iPhone for, what, 10 years?
03:59:53.000 Nine years?
03:59:54.000 I mean, it's like 50 years is not a lot of time, right, to deal with this.
03:59:58.000 And...
04:00:02.000 There's just no reason to think it's that far away if we keep making progress.
04:00:07.000 I mean, it's not...
04:00:08.000 It would be amazing if it were 500 years away.
04:00:10.000 I mean, that seems like...
04:00:12.000 It's more likely...
04:00:14.000 I mean, from what I... I mean, the sense I get from the people who are doing this work, it's far more likely to be 50 years than 500 years.
04:00:25.000 Like, you know...
04:00:31.000 I mean, the people who think this is a long, long way off are saying, you know, 50 to 100 years.
04:00:40.000 No one says 500 years.
04:00:43.000 As far as I know, no one who's actually close to this work.
04:00:46.000 And some people think it could be in five years, right?
04:00:50.000 I mean, the people who are, you know, like the DeepMind people, who are very close to this, are the sorts of people who say...
04:00:56.000 Because the people who are close to this work are astonished by what's happened in the last 10 years.
04:01:02.000 Like, we went from a place of...
04:01:05.000 Very little progress to, wow, this is all of a sudden really, really interesting and powerful.
04:01:13.000 And again, progress is compounding in a way that's counterintuitive.
04:01:19.000 People systematically overestimate how much change can happen in a year and underestimate how much change can happen in 10 years.
04:01:27.000 And as far as estimating how much change can happen in 50 or 100 years, I don't know that anyone is good at that.
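A two-line illustration of that estimation gap, using an invented 30% annual improvement rate:

```python
# Why linear intuition misjudges compounding change: a steady 30% yearly
# improvement (an invented rate) looks modest over one year and dramatic
# over ten.
rate = 0.30
print(round((1 + rate) ** 1, 2))    # 1.3x after one year
print(round((1 + rate) ** 10, 1))   # about 13.8x after ten years
```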
04:01:36.000 How could you be? With giant leaps come giant exponential leaps off of those leaps, and it's almost impossible for us to really predict what we're going to be looking at 50 years from now. But I don't know what they're going to think about us. That's what's most bizarre about it: we really might be obsolete, if we look at how ridiculous we are. Look at this political campaign.
04:01:59.000 Look at what we pay attention to in the news.
04:02:01.000 Look at the things we really focus on.
04:02:03.000 We're a strange, ridiculous animal.
04:02:06.000 And if we look back on some strange dinosaur that had a weird neck, why should that fucking thing make it?
04:02:13.000 Why should we make it?
04:02:15.000 We might be here to make that thing.
04:02:17.000 And that thing takes over from here with no emotions, no lust, no greed, and just purely existing electronically.
04:02:26.000 And for what reason?
04:02:27.000 Well, that's a little scary.
04:02:28.000 There are computer scientists who, when you talk to them about why they're not worried, just swallow this pill without any qualm.
04:02:41.000 We're going to make the thing that is far more powerful and beautiful and important than we are, and it doesn't matter what happens to us.
04:02:50.000 I mean, that was our role.
04:02:51.000 Our role was to build these mechanical gods.
04:02:56.000 And it's fine if they squash us.
04:03:01.000 And I've literally heard someone give a talk.
04:03:05.000 I mean, that's what woke me up to how interesting this area is.
04:03:10.000 I went to this conference in San Juan about a year ago.
04:03:17.000 The people from DeepMind were there, and the people who were very close to this work were there.
04:03:24.000 To hear some of the reasons why you shouldn't be worried from people who were interested in calming the fears so they could get on with doing their very important work, it was amazing.
04:03:38.000 They were highly uncompelling reasons not to be worried.
04:03:46.000 So they had a desire to be compelled.
04:03:51.000 They're not worried at all.
04:03:53.000 People want to do this.
04:03:56.000 There's a deep assumption in many of these people that we can figure it out as we go along, right?
04:04:03.000 That's so scary.
04:04:04.000 It's just like, we're just going to get closer.
04:04:07.000 We're far enough away now, even if it's five years, we'll get there.
04:04:12.000 Once we get closer, once we get something a little scary, then we'll pull the brakes and talk about it.
04:04:18.000 But the problem is, everyone is essentially in a race condition by default.
04:04:24.000 Google is racing against Facebook, and the U.S. is racing against China, and every group is racing against every other group.
04:04:34.000 However you want to conceive of groups, to be the first one with...
04:04:43.000 Incredibly powerful narrow AI is to be the next, you know, multi-billion dollar company, right?
04:04:51.000 So everyone's trying to get there.
04:04:53.000 And if they suddenly get there and sort of overshoot a little bit, and now they've got something like, you know, general intelligence, or something close, then what are we relying on? And they know everyone else is attempting to do this, right?
04:05:08.000 We don't have a system set up where everyone can pull the brakes together and say, listen, we've got to stop racing here.
04:05:15.000 We have to share everything.
04:05:16.000 We have to share the wealth.
04:05:18.000 We have to share the information.
04:05:21.000 This truly has to be open source in every conceivable way, and we have to diffuse this winner-take-all dynamic.
04:05:31.000 I think we need something like a Manhattan Project to figure out how to do that.
04:05:36.000 Not to figure out how to build the AI, but to figure out how to build it in a way that does not create an arms race, that does not create an incentive to build unsafe AI, which is almost certainly going to be easier than building safe AI, and just to work out all of these issues.
04:05:53.000 Because I think we're going to build this by default.
04:05:57.000 We're just going to keep building more and more intelligent machines.
04:06:01.000 And this is going to be done...
04:06:04.000 By everyone who can do it.
04:06:08.000 With each generation, if we're even talking about generations, it will have the tools made by the prior generation that are more powerful than anyone imagined 100 years ago, and it's going to keep going like that.
04:06:21.000 Did anybody actually make that quote about giving birth to the mechanical gods?
04:06:27.000 No, that was just me.
04:06:29.000 But there was a scientist who actually was thinking and saying that.
04:06:32.000 That was the content of what he was saying.
04:06:36.000 We're going to build the next species that is far more important than we are.
04:06:42.000 And that's a good thing.
04:06:45.000 Actually, I can go there with him.
04:06:46.000 Actually, the only...
04:06:50.000 The caveat here is: unless they're not conscious.
04:06:55.000 The true horror for me is that we can build things more intelligent than we are, more powerful than we are, and that can squash us, and they might be unconscious.
04:07:08.000 The universe could go dark if they squash us.
04:07:11.000 Right?
04:07:11.000 Or at least our corner of the universe could go dark.
04:07:15.000 And yet these things will be immensely powerful.
04:07:18.000 So if...
04:07:19.000 And this is just, you know, the jury's out on this.
04:07:22.000 But if there's nothing about intelligence scaling that demands that consciousness come along for the ride, then it's possible that...
04:07:31.000 I mean, nobody thinks our machines are...
04:07:32.000 You know, very few people would think our machines that are intelligent are conscious, right?
04:07:36.000 Right?
04:07:37.000 So at what point does consciousness come online?
04:07:40.000 Maybe it's possible to build super intelligence that's unconscious.
04:07:44.000 Super powerful, does everything better than we do.
04:07:47.000 It'll recognize your emotion better than another person can, but the lights aren't on.
04:07:55.000 That's also, I think, possible, but maybe it's not possible.
04:07:59.000 But that's the worst case scenario.
04:08:02.000 The ethical silver lining, and speaking outside of our self-interest now, but just from a bird's eye view, the ethical silver lining to building these mechanical gods that are conscious is that, yes, in fact,
04:08:18.000 if we have built something that is far wiser and has far more beautiful experiences and deeper experiences of the universe than we could ever imagine.
04:08:28.000 And there's something that it's like to be that thing.
04:08:31.000 It has kind of a god-like experience.
04:08:36.000 Well, that would be a very good thing.
04:08:38.000 Then we will have built something that was...
04:08:41.000 If you stand outside of our narrow self-interest...
04:08:44.000 I can understand why he would say that.
04:08:47.000 He was just assuming—what was scary about that particular talk is he was assuming that consciousness comes along for the ride here, and I don't know that that is a safe assumption.
04:08:58.000 Well, and the really terrifying thing is, who controls it? If this is constantly improving itself, and it's at the beck and call of a person, then what?
04:09:09.000 So it's either conscious where it acts as itself, right?
04:09:14.000 It acts as an individual thinking unit, right?
04:09:16.000 Or as a thing outside of it.
04:09:19.000 It's aware, right?
04:09:20.000 Either it is or it isn't.
04:09:22.000 And if it isn't aware, and some person can manipulate it?
04:09:25.000 Like imagine if it's getting 10,000, how many thousands of years in a week did you say?
04:09:31.000 If it was just a million times faster than we are, it's 20,000 years.
04:09:36.000 20,000 years in a week.
04:09:37.000 In a week, yeah.
04:09:38.000 In a week.
04:09:39.000 So with every week, this thing constantly gets better at even doing that, right?
04:09:44.000 So it's reprogramming itself, so it's all exponential.
04:09:48.000 Presumably.
04:09:48.000 Just imagine, again, you could keep it in the most restricted case, you could just keep it at our level, but just faster, just a million times faster.
04:09:59.000 But if it did all these things, if it kept going and kept every week was thousands of years, we're going to control it?
04:10:05.000 A person?
04:10:05.000 A regular person?
04:10:06.000 That's even more insane.
04:10:07.000 Just imagine being in dialogue with something that has lived through 20,000 years of human progress...
04:10:39.000 Even that is an unstable situation.
04:10:42.000 But just imagine this emerging in some way online, already being out in the wild.
04:10:47.000 So let's say it's in a financial market.
04:10:52.000 Again, what worries me most about this, and what is also interesting, is that our intuitions fail us here. I think the primary intuition that people have is, no, no, no, that's just not possible, or not at all likely.
04:11:07.000 But if you're going to think it's impossible or even unlikely, you have to find something wrong with the claim that intelligence is...
04:11:18.000 Just a matter of information processing.
04:11:21.000 I don't know any scientific reason to doubt that claim at the moment.
04:11:27.000 And very good reasons to believe that it's just undoubtable.
04:11:33.000 And you have to doubt that...
04:11:38.000 We will continue to make progress in the design of intelligent machines.
04:11:45.000 All that's left is just time.
04:11:48.000 If intelligence is just information processing, and we are going to continue to build...
04:11:55.000 Better and better information processors.
04:11:58.000 At a certain point, we're going to build something that is superhuman.
04:12:06.000 And so whether it's in five years or 50, it's the biggest change in human history I think we can imagine, right?
04:12:18.000 I keep finding myself in the presence of people who seem...
04:12:24.000 At least to my eye, to be refusing to imagine it.
04:12:27.000 They're treating it like the Y2K virus or the Y2K bug where it just may or may not be an issue.
04:12:35.000 It's a hypothetical.
04:12:37.000 We're going to get there and it's either not going to happen or it's going to be trivial.
04:12:42.000 But if you don't have an argument for why this isn't going to happen, then you're left with, okay, what's it going to be like to have systems that are better than we are at everything in the intellectual space?
04:13:04.000 And...
04:13:08.000 What will happen if that suddenly happens in one country and not in another?
04:13:14.000 It has enormous implications, but it just sounds like science fiction.
04:13:19.000 I don't know what's scarier: the idea that an artificial intelligence can emerge that's conscious, aware of itself, and acts to protect itself, or the idea that a regular person of today could be in control of essentially a god.
04:13:37.000 Right.
04:13:38.000 Because if this thing continues to get smarter and smarter with every week and more and more power and more and more potential, more and more understanding, thousands of years, I mean, it's just...
04:13:48.000 This one person, a regular person controlling that is almost more terrifying than creating a new life.
04:13:55.000 Or any group of people who don't have the total welfare of humanity as their central concern.
04:14:02.000 So just imagine, what would China do with it now?
04:14:05.000 What would we do if we thought China, Baidu or some Chinese company was on the verge of this thing?
04:14:13.000 What would it be rational for us to do?
04:14:15.000 I mean, if North Korea had it, it would be rational to nuke them, given what they say about their relationship with the rest of the world.
04:14:24.000 So it's totally destabilizing.
04:14:26.000 Well, that kind of power just isn't rational.
04:14:29.000 That kind of power is so life-changing.
04:14:32.000 It's so paradigm-shifting.
04:14:35.000 Right.
04:14:35.000 But to wind this back to what someone like Neil deGrasse Tyson would say is that the only basis for fear is, yeah, don't give your super-intelligent AI to the next Hitler, right?
04:14:47.000 That's obviously bad.
04:14:49.000 But if we're not idiots and we just use it well, we're fine.
04:14:56.000 And that, I think, is an intuition that's just a failure to unpack what is entailed by, again, something like an intelligence explosion.
04:15:09.000 Once you're talking about something that is able to change itself... So what would it be like to guarantee against that? Let's say we decide, okay, we're just not going to build anything that can make changes to its own source code.
04:15:24.000 Any change to software at a certain point is going to have to be run through a human brain, and we're going to have veto power.
04:15:32.000 Well, is every person working on AI going to abide by that rule?
04:15:36.000 It's like we've agreed not to clone humans, right?
04:15:39.000 But are we going to stand by that agreement in the rest of human history?
04:15:44.000 Is our agreement binding on China or Singapore or any other country that might think otherwise?
04:15:51.000 It's a free-for-all.
04:15:52.000 And at a certain point, everyone's going to be close enough to making the final breakthrough that unless we have some agreement about how to proceed, someone is going to get there first.
04:16:11.000 That is a terrifying scenario of the future.
04:16:15.000 You know, you cemented this last time you were here, but not as extreme as this time.
04:16:20.000 You seem to be accelerating the rhetoric.
04:16:22.000 Yeah, exactly.
04:16:25.000 You're going deep.
04:16:28.000 Boy, I hope you're wrong.
04:16:29.000 I'm on team Neil deGrasse Tyson on this one.
04:16:32.000 Go, Neil.
04:16:36.000 In defense of the other side, too, I should say that David Deutsch also thinks I'm wrong, but he thinks I'm wrong because we will integrate ourselves with these machines.
04:16:47.000 There will be extensions of ourselves, and they can't help but be aligned with us because we will be connected to them.
04:16:54.000 That seems to be the only way we can all get along.
04:16:55.000 We have to merge and become one.
04:16:57.000 Yeah, but I just think there's no deep reason why.
04:17:01.000 Even if we decided to do that, like in the U.S. or in half the world, one, I think there are reasons to worry that even that could go haywire.
04:17:10.000 But there's no guarantee that someone else couldn't just build AI in a box.
04:17:16.000 I mean, if we can build AI such that we can merge our brains with it, someone can also just build AI in a box, right?
04:17:26.000 And then you inherit all the other problems that people are saying we don't have to worry about.
04:17:30.000 If it was a good Coen Brothers movie, it would be invented in the middle of the presidency of Donald Trump.
04:17:36.000 And so that's when AI would go live, and then AI would have to challenge Donald Trump, and they would have like an insult contest.
04:17:43.000 But that...
04:17:45.000 That's when this thing becomes so comically terrifying, where it's just...
04:17:51.000 Just imagine Donald Trump being in a position to make the final decisions on topics like this for the country that is going to do this almost certainly in the near term.
04:18:06.000 It's like, should we have a Manhattan Project on this point, Mr. President?
04:18:11.000 Yeah.
04:18:14.000 The idea that anything of value could be happening between his ears on this topic or a hundred others like it, I think is now really inconceivable.
04:18:26.000 So what price might we pay for that kind of self-satisfied inattention to these kinds of issues?
04:18:37.000 Well, this issue, if this is real, and if this could go live in 50 years, this is the issue.
04:18:44.000 Unless we fuck ourselves up beyond repair before then and shut the power off, if it keeps going...
04:18:50.000 Yeah, no, I think it is the issue, but unfortunately it's the issue that doesn't, it sounds like a goof.
04:18:57.000 Yeah, it does.
04:18:57.000 It just sounds, you sound like a crackpot even worrying about this issue.
04:19:01.000 It sounds completely ridiculous, but that might be how it's sneaking in.
04:19:05.000 Yeah.
04:19:05.000 Yeah.
04:19:06.000 I mean, just imagine the tiny increment that would suddenly make it compelling.
04:19:12.000 I mean, just imagine...
04:19:14.000 I mean, chess doesn't do it because chess is so far from any central human concern.
04:19:19.000 But just imagine if your phone recognized your emotional state better than your best friend or your wife or anyone in your life and it did it reliably.
04:19:31.000 And was your buddy like that movie with Joaquin Phoenix?
04:19:35.000 Oh, Her?
04:19:35.000 Yeah.
04:19:35.000 He falls in love with his phone, right?
04:19:37.000 I mean, that's just not, you know, that is not that far off.
04:19:42.000 It's a very discrete ability.
04:19:44.000 I mean, you could do that without any...
04:19:47.000 Any other ability in the phone, really.
04:19:49.000 It doesn't have to stand on the shoulders of any other kind of intelligence.
04:19:56.000 You could do this with just brute force in the same way that you have a great chess player that doesn't necessarily understand that it's playing chess.
04:20:06.000 You could have facial recognition of emotion and tone-of-voice recognition of emotion, and the idea that it's going to be a very long time for computers to get better than people at that, I think, is very far-fetched.
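As a sketch of that "brute force" point, here is a tiny nearest-centroid classifier that labels an emotional state from numeric features without modeling emotion at all; the features, labels, and numbers are all invented.

```python
import math

# A deliberately dumb sketch of the "brute force" point: a classifier can label
# emotional states from numeric features (pitch, speaking rate, loudness, ...)
# without understanding anything, the way a chess engine doesn't know it's
# playing chess. The features, labels, and numbers here are all made up.

training = {
    "calm":    [(0.20, 0.30, 0.20), (0.25, 0.35, 0.30)],
    "angry":   [(0.90, 0.80, 0.90), (0.85, 0.90, 0.80)],
    "anxious": [(0.60, 0.90, 0.40), (0.55, 0.85, 0.50)],
}

def centroid(points):
    # Average feature vector for one label.
    return tuple(sum(values) / len(points) for values in zip(*points))

centroids = {label: centroid(pts) for label, pts in training.items()}

def classify(sample):
    # Nearest-centroid rule: pick the label whose average features are closest.
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

print(classify((0.80, 0.85, 0.70)))   # -> "angry", with no model of what anger is
```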
04:20:24.000 I was thinking, yeah, I think you're right.
04:20:26.000 I was just thinking how strange would it be if you had like headphones on and your phone was in your pocket and you had rational conversations with your phone.
04:20:33.000 Like your phone knew you better than you know you.
04:20:36.000 Like, I mean, I don't know what to do.
04:20:37.000 I mean, I don't think I was out of line.
04:20:39.000 She yelled at me.
04:20:40.000 I mean, what should I say?
04:20:41.000 And I would listen to every one of your conversations with your friends and train up on that.
04:20:44.000 And just talk to you about it and go, listen, man, this is what you got to do.
04:20:47.000 You sounded way too critical.
04:20:48.000 You were sounding angry.
04:20:50.000 You got defensive.
04:20:50.000 You got defensive.
04:20:51.000 Why were you so defensive with Joe?
04:20:53.000 Apologize.
04:20:54.000 Relax.
04:20:55.000 Let's all move on.
04:20:56.000 You can accelerate it.
04:20:57.000 Okay.
04:20:57.000 You're right, man.
04:20:58.000 Right, man.
04:20:58.000 And like you're talking this little artificial...
04:21:00.000 Maybe that's the first version of artificial intelligence that we suggest.
04:21:03.000 We say, all right, let's give it a shot.
04:21:05.000 And like self-help guys in your phone.
04:21:06.000 You have like a personal trainer in your phone.
04:21:09.000 How to talk to girls.
04:21:10.000 It tells you everything.
04:21:11.000 Slow down, dude.
04:21:12.000 Slow down.
04:21:13.000 You're talking too fast.
04:21:15.000 Got to act cool.
04:21:16.000 Yeah.
04:21:16.000 I mean, literally giving you information.
04:21:19.000 That would be like step one.
04:21:20.000 That would be like the Sony Walkman.
04:21:22.000 Remember when you had a Walkman, like a cassette player?
04:21:24.000 That was like a VCR on your own.
04:21:27.000 We were on our way to what we have today, where you have fucking 30,000 songs in your phone or something.
04:21:33.000 I think I remember the first Walkman.
04:21:35.000 The first thing...
04:21:36.000 Back when I skied, there was something called Astral Tunes or something.
04:21:40.000 It was like a car radio that you could just put in a pack on your chest.
04:21:47.000 If they kept coming out with those, they would get smaller and smaller.
04:21:51.000 So then that little dude would start telling you, yo, man, dude, listen, they keep replacing me every year.
04:21:55.000 Just let them stick me in your brain.
04:21:57.000 We'll be together all the time.
04:22:00.000 I've been giving you good advice for years, bro.
04:22:03.000 Let me in your brain.
04:22:05.000 And so you and this little artificial intelligence, you have a relationship over time.
04:22:09.000 And eventually it talks you into getting your head drilled.
04:22:12.000 And they screw it in there.
04:22:13.000 And your artificial intelligence is always powered by your central nervous system.
04:22:17.000 Have you seen most of these movies?
04:22:20.000 Did you see Her?
04:22:21.000 No, I didn't.
04:22:22.000 And Ex Machina?
04:22:24.000 I saw Ex Machina.
04:22:25.000 That was one of my top ten all-time favorite movies.
04:22:28.000 I loved that movie.
04:22:30.000 Actually, I saw it twice.
04:22:31.000 I was slow to realize how well they did it.
04:22:38.000 The first time I saw it, I wasn't as impressed.
04:22:43.000 I watched it again, and they really...
04:22:45.000 First of all, the performance of...
04:22:48.000 I forgot the actress's name.
04:22:51.000 Vikander?
04:22:52.000 Lisa Vikander or something.
04:22:54.000 The woman who plays the robot in Ex Machina is just fantastic.
04:22:59.000 Scary good.
04:23:00.000 She can talk you into anything.
04:23:01.000 We're getting a little full on time.
04:23:03.000 Yeah, what are we, like five hours in?
04:23:05.000 Four and a half hours in, but I just got a note.
04:23:06.000 This is about to fill up.
04:23:07.000 Wait a minute.
04:23:08.000 How many hours?
04:23:09.000 Four and a half hours.
04:23:09.000 No.
04:23:10.000 Our computers are about to fill up?
04:23:11.000 Yeah.
04:23:11.000 How dare they?
04:23:12.000 We just did a four and a half hour podcast.
04:23:13.000 Yeah.
04:23:14.000 We were ready to keep going, too.
04:23:15.000 Jesus.
04:23:16.000 Jamie didn't cockpocket.
04:23:17.000 Sorry.
04:23:18.000 You know what, man?
04:23:19.000 Once you opened up that box, that Pandora's box of artificial intelligence.
04:23:22.000 I have a small question about AI that I haven't heard you guys discuss yet, and I've looked up.
04:23:25.000 Is there any sort of concept of, like, autism in AI? Like, a spectrum of AI? Like, there are dumb AI, and there's going to be smart AI, but...
04:23:34.000 Oh, yeah, yeah, yeah.
04:23:35.000 So, the scary thing, so, yeah, it's like super autism.
04:23:39.000 There's no...
04:23:42.000 I mean, across the board, I think that superintelligence and motivation and goals are totally separable.
04:23:50.000 So you could have a superintelligent machine that is purposed toward a goal that just seems completely absurd and harmful and non-commonsensical.
04:24:00.000 And so the example that Nick Bostrom uses in his book, Superintelligence, which was a great book and did more to inform my thinking on this topic than any other source.
04:24:10.000 He talks about a paperclip maximizer.
04:24:13.000 You could build a super-intelligent paperclip maximizer.
04:24:16.000 Now, not that anyone would do this, but the point is you could build a machine that was smarter than we are in every conceivable way, but all it wants to do is produce paperclips.
04:24:26.000 Now, that seems counterintuitive, but when you dig deeply into this, there's no reason why you couldn't build a superhuman paperclip maximizer.
04:24:38.000 It just wants to turn everything into paperclips; you know, literally, the atoms in your body would be better used as paperclips.
04:24:44.000 And so the point he's making is that...
04:24:48.000 Superintelligence could be very counterintuitive.
04:24:50.000 It's not necessarily going to inherit everything we find commonsensical or emotionally appropriate or wise or desirable.
04:25:00.000 It could be totally foreign intelligence.
04:25:03.000 Totally trivial in some way, you know, focused on something that means nothing to us but means everything to it because of some quirk in how its motivation system is structured, and yet it can build the perfect nanotechnology that will allow it to build more paperclips,
04:25:21.000 right?
04:25:21.000 So...
04:25:24.000 At least, I don't think anyone can see why that's ruled out in advance.
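A toy sketch of the separability Bostrom is pointing at: the same generic optimizer will competently pursue whatever objective it is handed, sensible or absurd. The objectives and numbers below are illustrative only.

```python
import random

# Toy illustration of the point that the optimization machinery and the goal
# are separable: the same search loop pursues whatever objective it is given.

def optimize(objective, state, steps=2000):
    # Generic hill-climbing optimizer; it has no opinion about the goal.
    for _ in range(steps):
        candidate = [x + random.uniform(-1, 1) for x in state]
        if objective(candidate) > objective(state):
            state = candidate
    return state

# A "sensible" goal and an "absurd" goal, both just numbers to maximize.
def useful_output(state):      # hypothetical proxy for something we care about
    return -sum((x - 3.0) ** 2 for x in state)

def paperclip_count(state):    # Bostrom-style stand-in goal
    return sum(state)          # more is always better, with no other concern

start = [0.0, 0.0, 0.0]
print(optimize(useful_output, start))     # converges near [3, 3, 3]
print(optimize(paperclip_count, start))   # keeps pushing every value up
```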
04:25:28.000 I mean, there's no reason why we would intentionally build that, but the fear is we might build something that is not perfectly aligned with our goals and our common sense and our aspirations,
04:25:44.000 and that could form some kind of separate instrumental goals, to get what it wants, that are totally incompatible with life as we know it.
04:25:55.000 And that's, you know, I mean, again, the examples of this are always cartoonish.
04:26:00.000 Like, you know, how Elon Musk said, you know, if you built a super intelligent machine and you told it to reduce spam, well, then it could just kill all people.
04:26:07.000 And that's a great way to reduce spam, right?
04:26:09.000 But see, the reason why that's laughable is that you're assuming the common sense, but the common sense won't be there unless we've built it, right?
04:26:18.000 Like, you have to have anticipated all of this.
04:26:19.000 If you say, take me to the airport as fast as you can, again, this is Bostrom, and you have a super-intelligent automatic car, a self-driving car, you'll get to the airport covered in vomit because it's just going to go as fast as it can go.
04:26:37.000 So our intuitions about what it would mean to be super-intelligent necessarily are...
04:26:46.000 I mean, we have to correct for them, because I think our intuitions are bad.
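In the same spirit, a minimal sketch of the under-specified objective problem in the airport example: a planner scored only on speed picks the plan a human would hate, and the missing concerns have to be written into the objective. The plans and numbers are invented.

```python
# Sketch of the under-specified objective problem: if the score only counts
# speed, the "best" plan is the one that ignores everything we forgot to say.
plans = [
    {"name": "floor it through every corner", "minutes": 18, "discomfort": 9.0},
    {"name": "fast but smooth",               "minutes": 22, "discomfort": 2.0},
    {"name": "scenic route",                  "minutes": 40, "discomfort": 0.5},
]

def literal_objective(plan):
    # "As fast as you can", and nothing else.
    return -plan["minutes"]

def intended_objective(plan):
    # The constraints we actually meant, made explicit (weight is arbitrary).
    return -plan["minutes"] - 5.0 * plan["discomfort"]

print(max(plans, key=literal_objective)["name"])    # picks the vomit-inducing plan
print(max(plans, key=intended_objective)["name"])   # picks "fast but smooth"
```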
04:26:51.000 You're freaking me out, and you've been freaking me out for over an hour and a half.
04:26:55.000 I'm freaked out that we did four and a half hours, and I thought we were coming up on three.
04:27:00.000 Man, I hope you're wrong about all that stuff.
04:27:02.000 Yeah, maybe so.
04:27:03.000 It doesn't seem that, I don't know, it doesn't look that rosy, Jamie.
04:27:08.000 I'm sorry to be such a buzzkill.
04:27:10.000 I'm going to walk to the woods.
04:27:11.000 Might have to figure out how to live off the land.
04:27:14.000 You're the ultimate prepper.
04:27:17.000 It ain't easy, man.
04:27:18.000 I'm going to call you.
04:27:20.000 I'm bad at it.
04:27:21.000 I'll starve.
04:27:22.000 I'll starve.
04:27:22.000 I won't be a vegetarian.
04:27:23.000 I'll come to your house for bear meat.
04:27:26.000 It might get ugly, folks.
04:27:27.000 Let's hope Sam Harris is wrong.
04:27:29.000 Thank you, brother.
04:27:29.000 Appreciate it.
04:27:30.000 And your podcast, tell people how they get yours.
04:27:32.000 Waking Up is my podcast, and you can find it on my website, samharris.org, or on iTunes.
04:27:38.000 You can get one of his books if you go to audible.com forward slash Joe.
04:27:42.000 Right?
04:27:43.000 Isn't that?
04:27:43.000 Get one?
04:27:44.000 Go get one of those, he fucks.
04:27:45.000 All right.
04:27:46.000 Thank you, ladies and gentlemen.
04:27:47.000 Thank you, Sam.
04:27:48.000 That was awesome.
04:27:48.000 Yeah, thank you, bro.
04:27:56.000 you