The Joe Rogan Experience - April 29, 2015


Joe Rogan Experience #641 - Sam Harris


Episode Stats

Length: 2 hours and 55 minutes

Words per Minute: 157.17026

Word Count: 27,586

Sentence Count: 1,866

Misogynist Sentences: 18

Hate Speech Sentences: 94
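
(If the words-per-minute figure is simply word count divided by runtime in minutes, then 27,586 words over roughly 175 minutes gives about 157.6; the listed 157.17026 would imply a runtime just over 2 hours and 55 minutes before rounding.)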


Summary

Sam Harris joins us to talk about Jon Jones being stripped of his UFC title, Jose Aldo vs. Conor McGregor, and much, much more. We also talk about the recent car crash that left a pregnant woman in need of medical care and how the driver fled the scene of the crime. And we answer some of your burning questions.


Transcript

00:00:11.000 Sam Harris, ladies and gentlemen.
00:00:13.000 All right, there we go.
00:00:13.000 We're live.
00:00:14.000 To the world.
00:00:15.000 You're not going to do a read?
00:00:17.000 Nah, fuck no.
00:00:17.000 I don't do that anymore.
00:00:18.000 Oh, cool.
00:00:19.000 I do it before or after.
00:00:22.000 That way the conversation isn't garbled by five minutes of ads.
00:00:26.000 It just got annoying, you know?
00:00:28.000 Yeah, no, that's a good call.
00:00:29.000 Yeah, well, there's two different versions of this.
00:00:30.000 The one that goes up on YouTube.
00:00:32.000 So the one that goes on YouTube has no ads in it unless YouTube puts up ads.
00:00:35.000 And the one that goes on podcasts or iTunes rather has the ads for it.
00:00:41.000 So there you go.
00:00:42.000 So we could just talk.
00:00:43.000 Yeah.
00:00:43.000 Well, good to see you.
00:00:44.000 Good to see you too, man.
00:00:44.000 Yeah.
00:00:45.000 Absolutely.
00:00:46.000 It's an interesting time.
00:00:47.000 I know you're an MMA fan.
00:00:48.000 You know about the Jon Jones situation?
00:00:49.000 You know, I've heard rumors about it, but I don't actually know how far the misbehavior runs.
00:00:54.000 He got stripped of his title today.
00:00:56.000 Wow.
00:00:56.000 Yesterday, actually.
00:00:57.000 They announced a new title fight between Daniel Cormier and...
00:01:01.000 And Anthony Rumble Johnson, they'll be fighting for the now-vacated title.
00:01:06.000 Jon Jones is likely going to jail.
00:01:08.000 Was it a hit-and-run thing?
00:01:10.000 He crashed into a pregnant woman, allegedly, I should say.
00:01:13.000 It broke her arm.
00:01:15.000 She was rushed to the hospital, but pregnant.
00:01:19.000 It's got to be terrifying for her.
00:01:21.000 She had her car smashed by a guy who runs a red light and then flees the scene of the crime.
00:01:27.000 He drove away or he ran away on foot?
00:01:29.000 Ran away.
00:01:29.000 The car was wrecked.
00:01:30.000 Wow.
00:01:30.000 Yeah, the car was wrecked.
00:01:31.000 You see the images of her car?
00:01:33.000 I couldn't imagine, unless he was driving a fucking Humvee, how he could get away.
00:01:37.000 It was pretty bad.
00:01:38.000 Was it a DUI? We don't know because he disappeared for like...
00:01:42.000 24 hours at least.
00:01:44.000 Probably even more.
00:01:45.000 A couple days I think he disappeared for.
00:01:47.000 Which, you know, obviously the speculation would be that he was on something that he would worry about getting tested for for 24 hours or 48 hours or whatever it was.
00:01:55.000 Think of how desperate a move and hapless a move that is, to run away from the car that you're leaving at the scene, which is obviously traceable to you.
00:02:04.000 It's not like you're getting out of the problem.
00:02:07.000 Worse.
00:02:07.000 And you're leaving an injured person there to...
00:02:09.000 Yeah.
00:02:10.000 I mean, it's a disaster.
00:02:11.000 But it's worse, because he actually ran back to the car, got stuff out of the car, and then ran away again.
00:02:16.000 Oh, man.
00:02:17.000 ID'd by a cop.
00:02:18.000 Right.
00:02:19.000 Jesus.
00:02:19.000 Yeah.
00:02:20.000 Disaster.
00:02:21.000 All across the board.
00:02:22.000 But not what we're here for.
00:02:23.000 No.
00:02:24.000 But I know you're an MMA fan, so...
00:02:25.000 No, we have a...
00:02:26.000 Bring it up.
00:02:26.000 Yeah, I actually have a question for you, though, on a close topic.
00:02:30.000 And we have a list of questions that, as you know, we got from Twitter, but McGregor or Aldo?
00:02:37.000 Who knows?
00:02:38.000 That's my answer for almost every fight.
00:02:40.000 It's going to be a very interesting fight.
00:02:42.000 We've never seen McGregor in with anybody remotely as talented as Jose Aldo.
00:02:47.000 Not even remotely.
00:02:48.000 It doesn't mean he can't win.
00:02:49.000 Because everybody that McGregor has been in with, he's steamrolled.
00:02:52.000 I mean, he's steamrolled really talented guys like Dustin Poirier.
00:02:56.000 Just, he's fucking good.
00:02:58.000 He's really good.
00:02:59.000 He's really good, really confident, and he really fucks with people's heads.
00:03:02.000 He's so confident and so good at talking and so, there's so much charisma about him that I think it's intimidating to his opponents.
00:03:11.000 I think when they get in there with him, his just overwhelming belief... And belief is an incredibly powerful thing.
00:03:19.000 If you really truly believe that you are the best, and you say it, your opponents have to wonder.
00:03:28.000 Because everybody questions themselves, especially in the world of fighting.
00:03:32.000 Because no one is...
00:03:34.000 If you were born a rhino, okay?
00:03:37.000 And you were about to get into a fight with a parakeet, you'd absolutely be confident because you've always been a rhino and that parakeet is just a parakeet.
00:03:46.000 That parakeet's fucked.
00:03:47.000 But a person is a work in progress, and not just in your fighting style but in your ability to manage your own mind, your ability to manage insecurities and anxieties, just the existential angst that comes with being a human being that's navigating their way through this complicated world.
00:04:06.000 There's variables.
00:04:07.000 There's days you're up and days you're down, and then you add into that martial arts.
00:04:12.000 And you don't get good at martial arts unless you get your ass kicked.
00:04:15.000 There's only one way through.
00:04:17.000 I mean, you can be one of those Jon Jones types who's unbelievably physically talented and has a leg up on a lot of people, but even Jon had to get his ass kicked.
00:04:26.000 You had to.
00:04:27.000 He had to compete in wrestling against guys who are better than him.
00:04:32.000 He had to learn martial arts.
00:04:33.000 He had to spar.
00:04:34.000 I mean, there's gonna be days you're up and days you're down.
00:04:36.000 There's no way.
00:04:37.000 So when you get in there with a guy like Conor McGregor, we're not here!
00:04:41.000 To take part!
00:04:42.000 We're here to take over!
00:04:43.000 All that crazy shit.
00:04:44.000 You're just like, fuck, this guy's overwhelming.
00:04:46.000 It's like, if you can put on a show like that, if you can put your peacock feathers up and puff up your back hairs, get those bristles up nice and high like a cat when it's angry and hunch your back up, there's a reason why that exists in nature.
00:05:00.000 Well, it's beyond what Muhammad Ali used to do.
00:05:03.000 It's going to be fascinating psychologically to see him when he loses.
00:05:07.000 When that day comes, it'll come eventually.
00:05:10.000 Well, he has lost.
00:05:11.000 He's lost to a guy named Joe Duffy, who's now in the UFC, and he's very good.
00:05:15.000 This guy Joe Duffy is fucking very good.
00:05:17.000 But has he lost since his rise?
00:05:20.000 No.
00:05:20.000 He has not lost since he's been in the UFC. But in all fairness, the one style that he has not faced, which is the most difficult style, he's never faced a wrestler.
00:05:32.000 He's never faced a guy who can get in there and get him to the ground and outwork him and just sap his energy.
00:05:41.000 Wrestlers, they get on top of you and you can't get them off and you get exhausted trying and they're punching you in the face and elbowing you in the face and punching you in the body and constantly trying to strangle you and then the round is up and you get up and the next round comes and they do it again.
00:05:56.000 Boom!
00:05:56.000 You're on your back and boom!
00:05:58.000 You're getting punched in the face.
00:05:59.000 He's never faced a guy like that before.
00:06:01.000 All the fighters he's faced have chosen to stand up with him, and he's a very high-level boxer.
00:06:06.000 He was an amateur champion as a boxer.
00:06:08.000 He's got very good jujitsu skills, very good Muay Thai, very good everything.
00:06:13.000 We've just never seen him against a top-flight wrestler.
00:06:17.000 But Aldo's not that, right?
00:06:18.000 Nope.
00:06:18.000 Aldo's not that.
00:06:19.000 But Aldo is a world-class Brazilian jiu-jitsu fighter.
00:06:23.000 Like, people who are not aware of his ground skills.
00:06:26.000 Aldo beat a guy named Cobrinha in an actual jiu-jitsu competition, which is very high level.
00:06:32.000 Cobrinha's world championship caliber.
00:06:34.000 So Aldo is going to present him with some things inside the octagon that he's never faced before.
00:06:41.000 However, Aldo has been fighting professionally for a very long time, and he's like a race car that doesn't ever get its tires changed or doesn't ever get its suspension redone.
00:06:54.000 As a human being, your body can literally only take so many miles.
00:06:59.000 There's only so many times you can go to war.
00:07:02.000 There's so many sparring sessions you can take part in.
00:07:05.000 Only so many wrestling sessions you can take part in.
00:07:08.000 There's just a certain amount your body can take and we don't know when that number is.
00:07:13.000 When you reach that number though, that's it.
00:07:16.000 There's no turning back unless you're using testosterone or growth hormone or some things that turn your body into the superhuman sort of experiment.
00:07:25.000 Yeah, yeah.
00:07:25.000 Unless he's doing that, which we've seen from Vitor Belfort.
00:07:28.000 Like, Vitor Belfort's the only guy who actually got better as he got older.
00:07:31.000 Right.
00:07:31.000 Talking about a guy from UFC 12. The miracle of science.
00:07:34.000 It's exactly what it is.
00:07:35.000 It's absolutely fascinating.
00:07:37.000 And you can't completely discount his training, because his training is what made him better.
00:07:42.000 But his ability to recover was essentially supernatural.
00:07:46.000 You know, he's fighting on the same card that Jones was supposed to fight on.
00:07:49.000 He's fighting Chris Weidman.
00:07:50.000 But now Nevada has made testosterone replacement therapy illegal.
00:07:55.000 So his last three performances, which are some of the best performances of his career, the knockout of Michael Bisping, the knockout of Luke Rockhold, and the knockout of Dan Henderson, those were all while he was on testosterone.
00:08:07.000 So things get weird now.
00:08:11.000 Now you see what a 37-year-old man really looks like, optimizing his hormones as best he can naturally, hopefully.
00:08:20.000 Right.
00:08:20.000 Well, 37 sounds young.
00:08:22.000 It sounds young, yeah.
00:08:23.000 I would like that level of testosterone.
00:08:25.000 Well, you know, it's also the level of testosterone of a regular 37-year-old man versus the level of testosterone of someone going through a camp.
00:08:33.000 Right.
00:08:33.000 When you're going through those camps, it's absolutely brutal.
00:08:36.000 Like, Jon Jones and Daniel Cormier both got tested randomly.
00:08:40.000 Before their title fight, and their testosterone levels were so low, people were wondering, like, hey, maybe these guys are doing something.
00:08:48.000 Maybe they were doing steroids, and that made their testosterone levels drop.
00:08:52.000 But what's most likely, it's testosterone to epitestosterone.
00:08:55.000 It was very low testosterone levels.
00:08:57.000 Most likely it's just the brutality of training.
00:09:00.000 It's so hard.
00:09:02.000 It's so hard to do.
00:09:04.000 You're showing up every day exhausted, and your muscles are sore, and your body's exhausted, and you gotta go through it all again tomorrow, and you're getting kicked and punched, and you're lifting weights, and you're doing sprints, and you're jumping up on boxes, and then the next day, all over again.
00:09:18.000 Your body just can't keep up, especially when you get into your 30s.
00:09:22.000 Yeah, yeah.
00:09:24.000 So that's the answer to that.
00:09:25.000 I will watch it with interest.
00:09:26.000 It's going to be very exciting.
00:09:28.000 Let me know if you want to go.
00:09:28.000 It's in Vegas.
00:09:30.000 Yeah, we'll talk about that.
00:09:32.000 We'll talk about it.
00:09:33.000 Sam Harris goes to the fights.
00:09:35.000 As a neuroscientist, does it disturb you at all when you're seeing these guys getting their heads rattled?
00:09:42.000 When you're very aware of what's going on behind the scenes in the actual brain itself? Yeah, well, I just talked about this on my blog with this writer, Jonathan Gottschall, who I told you about at one point.
00:09:54.000 Yeah, I'm trying to get him on the podcast.
00:09:56.000 We're working out a date.
00:09:58.000 Yeah, he's great.
00:09:59.000 So we had a conversation, which we published the transcript of, but he's got this book, The Professor in the Cage, where he's an academic, he's an English professor, and he decided to get really into MMA and fight a real cage match.
00:10:15.000 It's an interesting book.
00:10:17.000 He and I were talking about the discomfort we have just seeing people get brain damage in real time for our amusement.
00:10:26.000 It does make me uncomfortable, but it's also part of what's thrilling.
00:10:31.000 I'm as thrilled by the prospect of a knockout, too.
00:10:34.000 It's not even a conscious thrill.
00:10:36.000 It's just when things start going that way, you feel your own testosterone or something kick in.
00:10:44.000 But I mean, his recommendation was that, and I'm sure he's not the only person who's thought of this, he thought the gloves should come off, and the gloves are making it realistic to just send endless overhand rights and other crazy punches, you know, full force into people's heads for,
00:11:01.000 you know, 30 minutes.
00:11:03.000 Whereas if it was a bare-knuckle fight, you couldn't really... It'd be fewer knockouts, but you couldn't deliver that kind of brain trauma, because you'd be breaking your hands. Now obviously there are things like elbows and knees and kicks, and so people would still be getting hurt. But what do you think about that, with that change?
00:11:22.000 I'm a big advocate of that, and I've said it many times. I've said it on the air, I've talked about it on this podcast. I think it's very unrealistic the way you're allowed to not just put gloves on but also wrap your hands up and tape your wrists. Yeah.
00:11:56.000 Foreheads are far harder than your knuckles are. Most likely you're gonna break your hand unless you hit them on the nose, in the eye, on the jaw. You're gonna hurt your hand, and you can only throw so many punches like that. Even just hitting a heavy bag without being wrapped up, you just screw up your wrist and your hands. Yeah, you have to really learn how to tighten everything up, and you have to really strengthen your wrist and you have to strengthen your hand muscles. It's a completely different thing,
00:12:19.000 which is why the karatekas, the karate students, would hit a makiwara, which is a board that's wrapped with rope.
00:12:27.000 And the idea behind punching that board wrapped in rope over and over again is you develop these intense calluses all over the front two knuckles, which is really the only way you're supposed to hit someone.
00:12:37.000 Those are the knuckles that are reinforced by the wrist, whereas where the pinky finger is and the ring finger, those knuckles are not reinforced nearly as well, especially if you have larger hands. Like your hand, like my hand, it spreads out past my wrist.
00:12:51.000 It doesn't go in a straight line from my wrist to the pinky.
00:12:55.000 It actually goes out to the side.
00:12:57.000 So that knuckle is not reinforced by anything.
00:13:00.000 If you punch someone with a boxing glove on with that knuckle, you're fine.
00:13:03.000 If you punch someone with an MMA glove on, it's less supported.
00:13:07.000 If you punch someone bare knuckle, you are very likely to break your hand.
00:13:11.000 If you punch someone full force on the forehead and you hit with a pinky knuckle, you're very likely to break your hand.
00:13:17.000 It's a high possibility.
00:13:19.000 It also impedes your grappling to have those gloves on.
00:13:23.000 Right, right.
00:13:24.000 Marcelo Garcia.
00:13:24.000 That was his point, too.
00:13:25.000 Yeah, it was a huge issue.
00:13:27.000 Marcelo Garcia fought in Japan, I believe it was.
00:13:30.000 Had this guy's back and couldn't finish him.
00:13:33.000 Right.
00:13:33.000 The guy was just grabbing his gloves.
00:13:34.000 Just grabbing his gloves.
00:13:35.000 Grabbing his gloves, holding onto the gloves.
00:13:36.000 The guy just worked on his defense.
00:13:38.000 And the gloves...
00:13:41.000 Marcelo's specialty is the rear naked choke, and the rear naked choke, a big part of it is sliding your hands underneath the guy's chin.
00:13:50.000 And you have the glove there, there's all this extra padding that makes it very difficult to get your hand in the proper position to get the choke right.
00:13:58.000 And the back of the head.
00:14:00.000 It's also very difficult to get the glove in the back of the head.
00:14:03.000 So a lot of times you'll see in an MMA fight, guys who use poor technique and still finish the choke, well they'll use their palm on the top or the back of the head because they can't do this.
00:14:12.000 They can't do this movement where it's actually the back of your hand that should touch the back of your opponent's head.
00:14:19.000 This is all, like, to someone who doesn't understand jiu-jitsu, this is very complicated, but I agree with him.
00:14:24.000 I think no gloves would help a lot.
00:14:26.000 Yeah.
00:14:27.000 Gloves off.
00:14:27.000 No gloves.
00:14:28.000 Yeah, I think it would help a lot.
00:14:29.000 But I think it's also, it would be beneficial for everyone to have some intensely...
00:14:40.000 Comprehensive scans done on a regular basis.
00:14:43.000 I don't know if that would be prohibitive financially. I don't know how much that would cost. I don't know how much that would even reveal, because apparently one of the things that is troublesome for these NFL players is that when they die and they do autopsies on them, they reveal damage that they had no idea about before. I don't know how much an fMRI or MRI can detect when it comes to the actual damage.
00:15:06.000 Well, it can, but to what end?
00:15:08.000 Because you know you're getting it.
00:15:09.000 If you're just going to be in a job where you have to get hit in the head, forget about competition, just training.
00:15:15.000 I mean, these guys train hard, as you know, and so they're just getting hit in the head to prepare them to get hit in the head in the match.
00:15:24.000 You're just...
00:15:25.000 It's not even...
00:15:28.000 It's like smoking.
00:15:29.000 The causal linkage between getting hit in the head and brain trauma is 100%.
00:15:35.000 It's just a matter of how much you individually, by dint of luck, can take until you actually have damage that matters.
00:15:45.000 So, you know, I obviously haven't had an experience of anything like an MMA fighter, but I regret all the...
00:15:54.000 The head injuries I took, just training as a...
00:15:57.000 I mean, now in martial arts, I just don't let myself get hit in the head.
00:16:00.000 But as a teenager, I got hit in the head a fair amount, and I played soccer and headed the soccer ball, and that always felt totally weird.
00:16:07.000 Did you play soccer?
00:16:09.000 You know, you head a soccer ball, you immediately get a kind of a rusty taste in your mouth, you know?
00:16:13.000 It's just unlike anything else that happened that day, except the other time you got hit in the head.
00:16:19.000 Isn't that crazy?
00:16:20.000 No one would ever think that soccer...
00:16:21.000 It can somehow or another give you traumatic brain injury.
00:16:24.000 Because it doesn't knock you out.
00:16:26.000 Until recently, like in the last couple decades, we had this erroneous assumption that you had to get a knockout.
00:16:33.000 You had to get knocked out to have brain damage.
00:16:35.000 But these little thuds, just a little...
00:16:38.000 I was talking to a doctor.
00:16:40.000 Who said that water skiing can give you brain damage.
00:16:43.000 Right.
00:16:43.000 Water skiing.
00:16:44.000 Just the bouncing on the waves.
00:16:46.000 Oh, just the bouncing, yeah.
00:16:47.000 Just wave riding, you know, when you're doing that, like that bouncing, that stuff can give you brain damage.
00:16:52.000 Right.
00:16:52.000 Like your brain gets rattled around inside your head, the connective tissue dislodges, and it doesn't heal back.
00:16:59.000 I spoke about this with Jonathan too, that there's obviously all of these sports and just forms of recreation that entail some risk of injury and death, right?
00:17:11.000 And people should be able to do these things informed of the risks.
00:17:17.000 And so, you know, cheerleaders, and the example he brought up is cheerleading.
00:17:20.000 I mean, cheerleaders sometimes hit the ground and just are, you know, fantastically injured.
00:17:24.000 So all these things that don't necessarily seem like high testosterone, high risk, you know, just foolishly reckless sports can be very dangerous.
00:17:37.000 Skiing is very dangerous, too, and rock climbing.
00:17:39.000 There are things that are even just non-violent that don't entail much risk of injury until they kill you, like free solo rock climbing.
00:17:48.000 You're climbing, everything's fine.
00:17:49.000 Maybe you've hurt your hands in the past, but then all of a sudden you're dead because you just went up 500 feet without a rope and fell.
00:17:57.000 So there's all these kinds of risks that people can take, but the problem that I think differentiates striking sports from even something like football is that the progress in the sport is synonymous with the damage.
00:18:15.000 So if you and I are in a boxing match or a kickboxing match hitting each other, every instance of successfully degrading each other's performance with respect to the head, hitting someone in the head, is synonymous with delivering injury to the brain.
00:18:33.000 It's not incidental like in football, where I was trying to tackle you, I was not hoping to hurt your brain, but...
00:18:40.000 You know, you fell down hard and it did.
00:18:44.000 This is just, you know, a trade of brain damage.
00:18:48.000 And yeah, so it's interesting ethically.
00:18:53.000 You know, I don't know.
00:18:54.000 Again, I think people should be free to do it.
00:18:56.000 But I think people, you know, we should be informed about it.
00:18:59.000 And I would certainly vote to...
00:19:02.000 It would just make it more realistic combatively, too.
00:19:05.000 Insofar as you want to see what works combatively, I'm more interested to see what two people can do just with their bare hands than when they've got these tape and pillows.
00:19:31.000 Or does it somehow or another, is it correlated to brain damage?
00:19:35.000 It certainly can be.
00:19:37.000 It can be, right?
00:19:37.000 I mean, that's one of the issues with brain damage, impulsive behavior.
00:19:41.000 Yeah, especially in the frontal lobes, because your frontal lobes regulate your emotions and your behavior.
00:19:49.000 And when those connections, when either the cell bodies or the connections between the gray matter and the frontal lobes and your limbic system and your basal ganglia and other areas in the brain, when that gets damaged, yeah, you have these classic impulse control problems where you just reach out and grab the woman standing next to you at Starbucks because you couldn't dampen the impulse to do it.
00:20:16.000 That's hard for people to grasp, because, I mean, again, this should be really clear.
00:20:21.000 I am, without a doubt, not trying to let him off the hook.
00:20:25.000 What he did was horrible.
00:20:28.000 If it was someone in my family that he hit with that car, I would be unbelievably furious.
00:20:33.000 I'm incredibly disappointed in him.
00:20:35.000 I think the UFC absolutely did the right thing in stripping him of his title, and I think law enforcement is going to do the right thing by putting him in jail.
00:20:43.000 I mean, they're going to.
00:20:44.000 It's just...
00:20:45.000 You can't do that. You can't hit someone with a car and leave the scene of the crime. It is a crime. Yeah, but there are things that people do because they have brain damage, and that's where the real question comes up. Obviously,
00:21:02.000 they're responsible ultimately for their own actions, but what is it that's responsible for making them do that action?
00:21:09.000 I mean we had this long conversation once Two podcasts ago, I think, about free will and determinism.
00:21:16.000 These are variables that come into play when it comes to the ultimate actions that you choose to do.
00:21:22.000 The ultimate movements that you choose to take, the thought processes, are unquestionably dependent upon the brain itself.
00:21:32.000 And if the brain is getting damaged, and if we have proven that some of the issues with people that have brain damage involve impulse control.
00:21:42.000 You gotta wonder, man, when you see fighters do wild crazy shit, how much of that is due to getting just bonked in the fucking head all the time?
00:21:51.000 Yeah, except for me it breaks down a little differently, because my views on free will change the picture of how I view moral culpability in those situations.
00:22:02.000 So even if we knew his brain wasn't damaged, right?
00:22:06.000 So he, let's say, had never got hit in the head or we did a scan on him before the car accident and we saw...
00:22:12.000 And it's the perfect scan.
00:22:13.000 It's the scan that we'll have 50 years from now if we don't fuck ourselves up.
00:22:19.000 And so we just know that he's got a totally healthy brain.
00:22:27.000 I think?
00:22:43.000 This has sort of the punchline, which has certain consequences.
00:22:48.000 But one of the consequences is not that we can't respond to his misbehavior, that we can't put him in jail, that we couldn't have intervened at any point along the way to have made him a better person.
00:23:01.000 There's a difference between voluntary and involuntary behavior, even if it's all just causally propagating from the moment of the Big Bang.
00:23:11.000 But I do view it as...
00:23:13.000 I think the brain damage case is a little bit of a red herring because it's, on some level, it's all just unconscious causes that the person himself can't ultimately account for.
00:23:27.000 So there are situations in which he...
00:23:32.000 We're good to go.
00:23:48.000 One hour more sleep the night before and hadn't had a fight with his girlfriend and his blood sugar level was a little bit higher and hadn't had a friend who had told him to drink one more beer, which he normally would have resisted but couldn't because of all the other factors I just mentioned.
00:24:07.000 And that is the difference that made the difference that caused him to be this total misfit on the road.
00:24:15.000 Whereas, if you had just tweaked those dials a little bit, you know, no fight with a girlfriend, you know, one more bite of food in the morning, he would have been, he would have acted as you would have acted, in that case, say.
00:24:26.000 So, ultimately, you're not...
00:24:28.000 Let's just say, let's say that's true, then that, there's something, there's a kind of bad luck component to all of this creeping in.
00:24:37.000 There's a concept of moral luck, which is...
00:24:44.000 We're good to go.
00:24:50.000 It does seem unfair that there are many situations in which people create immense harms doing stuff that you and I have gotten away with.
00:25:03.000 They're not worse people than we are.
00:25:06.000 You and I have both driven when we shouldn't have driven.
00:25:11.000 We've had one beer too many.
00:25:13.000 There are things that we did.
00:25:14.000 You look at a text when you know you should never look at your phone when you're driving, but you decide, oh, I'm expecting a text, and you look.
00:25:22.000 And there are people who are looking at that text right now and just killing some child in the crosswalk, right?
00:25:29.000 And their lives are going to be ruined, and they're going to go to prison, and they're exactly like you and me, right?
00:25:35.000 So there's an aspect of luck here.
00:25:39.000 The luck actually propagates backward into the kind of brain you have, the kind of upbringing you had, the kind of parents you had, the fact that you got hit in the head as hard as you did or didn't, and no one has made themselves.
00:25:52.000 So I'm a little bit more...
00:25:55.000 I'm less judgmental about some of these things, given my view of free will, but I'm not...
00:26:00.000 It's not that I'm not interested in making the interventions that would make a difference.
00:26:05.000 Whatever we could have done to have gotten him to behave differently, we should have done.
00:26:09.000 Whatever we need to do now to him to make society better and to make him better and to get restitution for the woman, we should do all that.
00:26:17.000 And so this does entail locking up certain dangerous people.
00:26:21.000 It does entail, you know, we have to keep ourselves safe from people who are going to reliably act badly.
00:26:29.000 And I don't know where he falls on that spectrum, but...
00:26:32.000 It's just it's not the difference between the feeling you get when you hear, oh, it was brain damage.
00:26:41.000 I sort of have that feeling about everything.
00:26:43.000 If he gets a brain scan, if he goes to trial now, he gets a brain scan, and we find that his brain is just massively damaged in all the right areas that would have eroded his impulse control, that would seem to let him off the hook a little bit.
00:26:57.000 He would look like someone who was unlucky more than he would look like a bad person.
00:27:03.000 And I sort of see bad people as unlucky, too.
00:27:08.000 I recognize that there are certain people who are classically bad people.
00:27:12.000 There are psychopaths who you just...
00:27:13.000 Not only can you not rely on them, you can rely on them to be bad actors.
00:27:18.000 So you have to be in a posture of self-defense with respect to these people.
00:27:24.000 But I do view them as unlucky on some fundamental level.
00:27:30.000 I share that thought, and I share that thought much more as I get older, and I have a more philosophical point of view when it comes to people that live in impoverished neighborhoods, especially like this Baltimore thing that was going on.
00:27:44.000 We were just having this conversation the other day about, or last podcast, about these kids that robbed the RT reporter.
00:27:51.000 I don't know if you've seen the video of it.
00:27:52.000 There's an RT reporter interviewing these kids that are on the street that are causing all this havoc in Baltimore, and they start swarming this reporter, and then they rob her, take her purse, and take off.
00:28:07.000 Imagine being one of those kids.
00:28:10.000 Imagine being in that environment.
00:28:12.000 You want to talk about determinism.
00:28:14.000 Imagine being born into this crime-ridden environment.
00:28:17.000 Who knows what kind of family you have?
00:28:19.000 Who knows what kind of influences you have?
00:28:21.000 Who knows what kind of experiences you've had that you've had to react to and protect yourself from, and develop this hardened, thick skin and attitude, and also survival instincts.
00:28:34.000 And you also, your family or the people that you can reliably count on are the people that you hang out in the street, your gang.
00:28:41.000 I mean, that is the big thing with gang violence.
00:28:43.000 One of the big things with gang violence, one of the dirty secrets of it, is that a lot of it comes from broken homes.
00:28:48.000 When people don't have a strong family environment and people they can count on and trust, they don't have anybody that's there for them.
00:28:55.000 And then they find someone that's there for them in the gang.
00:28:57.000 The gang becomes their new family.
00:28:59.000 And they will do anything to keep that love, to reinforce that love.
00:29:04.000 And we all want to look at it as, they're criminals.
00:29:07.000 They should be home by 10. There's a curfew on the street.
00:29:09.000 It's completely unrealistic.
00:29:11.000 And if you were in their point of view, or if you were in their life, rather, and if you saw it through their point of view, what they see...
00:29:19.000 You would look at life the way they look at life.
00:29:21.000 Also, there's another variable here, which is just the influence of mob behavior.
00:29:26.000 People will behave in crowds in ways that they wouldn't otherwise.
00:29:30.000 Why is that?
00:29:30.000 What is that?
00:29:31.000 What's the mechanism behind that?
00:29:33.000 Yeah, it's a...
00:29:34.000 Well, I can't speak to the mechanism neurologically, but it's a fascinating social phenomenon that has been thought about for at least a century.
00:29:44.000 There was a...
00:29:46.000 There was a philosopher, Elias Canetti, who wrote a book, Crowds and Power, which is very interesting on this topic.
00:29:52.000 A crowd is almost like a fire.
00:29:54.000 Once it gets started, the mob will behave according to its own dynamics, which aren't really explained by the individual intentions of the individuals in the mob.
00:30:07.000 Actually, it was a great book.
00:30:08.000 Did you ever hear this book, Among the Thugs, by Bill Buford?
00:30:13.000 I've never heard of it.
00:30:16.000 He's a really nice writer.
00:30:18.000 He edited this literary magazine, Granta, I believe, back in the day.
00:30:24.000 And he got fascinated with the phenomenon of soccer hooliganism.
00:30:28.000 And he went to the UK and just started hanging out with these just diehard, I guess they were, I don't know...
00:30:37.000 Manchester United or Arsenal fans, but he just got in with these guys who were normal guys, like plumbers and electricians and people who had real lives.
00:30:46.000 These were not just teenagers who were thugs.
00:30:49.000 They were people who had families, but soccer was their life, and they became soccer hooligans.
00:30:57.000 But what's brilliant about the book, and again, it's been at least 20 years since I read it, so I could be a little off in my recollection here, but what I recall is that he wrote it in such a way that these guys he was hanging out with were really the protagonists.
00:31:14.000 He got you in on their side for about 75 pages or so.
00:31:18.000 And then when they start misbehaving, when they go to their first game against the Italians and form a mob and start just marauding the streets and bashing kids in the head, they start behaving like sociopaths in this crowd.
00:31:36.000 But he catches you out totally because you're on their team for about 75 pages.
00:31:41.000 And you've identified with them.
00:31:42.000 You've sort of laughed with them.
00:31:43.000 You bonded with them as he did.
00:31:46.000 And then he reveals the level of thuggery that they're capable of as a mob.
00:31:52.000 And it was really...
00:31:53.000 I recall it being a fascinating book.
00:31:55.000 But it's just a fact that people will do things in a crowd.
00:32:00.000 When you see...
00:32:01.000 Part of it's...
00:32:03.000 The social proof situation, where you see everyone doing something, and that, on some level, gives you license to do it.
00:32:14.000 It's just contagious.
00:32:16.000 When you see people breaking windows or jumping on a car or turning over a car or looting, it takes less of any individual to participate in that.
00:32:27.000 It takes less for you to go in and grab a television set when you've seen a hundred of your neighbors do it, and you wouldn't have that morning just woken up deciding to rob the store yourself.
00:32:43.000 I mean, we all like to think we're the sort of people who would stand against the mob.
00:32:48.000 We would be the German who would have hid Jews in our basement and stood against the Nazis.
00:32:53.000 And you can multiply those cases ad nauseum.
00:32:59.000 But what...
00:33:00.000 A lot of psychological science shows that, yeah, there are those people.
00:33:04.000 There are the people who will stand against the tide of imbeciles who are going to do something heinous, but most people are part of the tide, and it's just a very common phenomenon.
00:33:15.000 The social license, that's a really interesting way to describe it, because that is what it is, right?
00:33:21.000 I mean, isn't that a big part of war?
00:33:23.000 I mean, a big part of war is doing things that you would never do on a normal basis, in a normal scenario.
00:33:30.000 On a regular basis, you are asked to put bullets into other human beings.
00:33:35.000 One of the things that I thought was really interesting about the controversy about American Sniper, the Chris Kyle movie, was he was talking about what it was like the first time he killed someone.
00:33:47.000 That is in the book.
00:33:49.000 I don't believe this is in the movie, but he had this feeling before he shot someone, like, is this okay?
00:33:55.000 I can actually do this.
00:33:57.000 It's okay to do this. And then he grew to enjoy it, and then it became commonplace and normal, and he's like, yeah, they're bad guys and I'm gonna shoot them. But this license, the social license, is then accentuated by this mob mentality that means you're a part of an army and you have an enemy.
00:34:19.000 And it's the life or death consequences, a life or death scenario that you're a part of.
00:34:26.000 The whole thing is escalated.
00:34:28.000 It's the highest level of that type of behavior that we have in society, in our culture today.
00:34:36.000 Well, interestingly, it takes a lot to get people to kill in war.
00:34:43.000 I think there's some myths around how easy it is for soldiers to shoot at the bad guy, but there have been studies done in prior wars where some shocking percentage of soldiers either never fired their guns or fired above their targets on purpose.
00:35:01.000 They didn't want to kill anyone.
00:35:05.000 And so some of the discipline of training soldiers has been against the grain of those tendencies, trying to get people to actually try to kill the other person.
00:35:18.000 And, you know, I think we've become more successful probably at doing that.
00:35:22.000 You know, this is not something I know a ton about.
00:35:24.000 I just know that this research is out there.
00:35:27.000 And the main dynamic, I think, with soldiers is you are trying to keep your buddy safe, and he or she now is trying to keep you safe.
00:35:39.000 And they're not only firing at you We're good to go.
00:35:48.000 We're good to go.
00:36:08.000 And so now obviously there are aspects of war-making that don't fit that mold, and...
00:36:15.000 Some of the more disturbing aspects that actually require less of us in terms of you're dropping bombs from 30,000 feet or you're flying a drone from an office park outside of Las Vegas or wherever they are.
00:36:28.000 And so we find that sort of telescopic approach to war different ethically.
00:36:33.000 And I think it's different in a variety of ways that are interesting.
00:36:40.000 I think it's not so much that war unleashes in most people this bloodlust that they're struggling to contain in the civilized world, and that once the tap is open in a foreign country,
00:36:57.000 you just have Rambos everywhere.
00:37:00.000 People are really conflicted about what they do, and a lot of people try to not do anything of consequence.
00:37:06.000 There's a great episode of one of Dan Carlin's podcasts, one of the Hardcore History podcasts about World War I. And I believe it was about the Germans and the English that they had been in battle with each other and they had...
00:37:23.000 Sort of, without verbally agreeing to this, they had sort of agreed to a ceasefire during lunch.
00:37:29.000 Yeah, it was fascinating.
00:37:31.000 Do you know the story?
00:37:32.000 Yeah, yeah.
00:37:32.000 Please.
00:37:33.000 Because I've heard...
00:37:34.000 I knew the story, but I've also listened to Dan's podcast, which is...
00:37:38.000 I think I got from you.
00:37:40.000 It's just fantastic.
00:37:41.000 He's the best.
00:37:42.000 It's amazing.
00:37:42.000 I think that...
00:37:43.000 All of them are great, but that series on World War I is just a masterpiece.
00:37:49.000 It's really...
00:37:50.000 He's doing something remarkable there.
00:37:54.000 But yeah, this trench warfare was the most brutal.
00:38:00.000 It was just this...
00:38:02.000 You know, horror compounded upon horror endlessly for years to no evident gain.
00:38:09.000 I mean, these people, they're fighting for yards of ground forever, and just tens of thousands of people are dying, and they're basically camped out on the decomposing bodies of the people who died before them.
00:38:24.000 And it's the most horrible version of warfare you've ever heard about.
00:38:28.000 And then there's this no man's land between the trenches where people who run out there trying to make an incursion into the enemy trench will get caught on barbed wire or they'll get shot.
00:38:40.000 So you have this spectacle of injured and dying people in the no man's land between the trenches.
00:38:49.000 You know, howling for hours and hours and hours in misery, and when someone goes to try to rescue them, they get shot.
00:38:54.000 And so, but there were periods where the two sides just agreed that this was just, and again, how that was communicated was kind of interesting.
00:39:07.000 I don't actually recall the details there, but it was kind of a tacit agreement that emerged where, okay, we're going to let you get your, we're not going to shoot at you when you get the injured person or the dead bodies.
00:39:19.000 And there was one Christmas, I believe, where they just basically went out and exchanged cigarettes and had an impromptu soccer game.
00:39:27.000 And they basically called the war off at a certain point and then got chastised by the higher-ups for doing that.
00:39:33.000 And then the war started all over again.
00:39:35.000 But yeah, they actually socialized at one point.
00:39:38.000 It's amazing.
00:39:38.000 It really is an amazing depiction of what must have been an impossible place to be in.
00:39:47.000 To imagine being a person standing on the decomposing bodies, being forced to shit in a coffee cup and throw it over the top of the trench, and know that no one's getting out of this.
00:40:00.000 I mean, you might be one of a thousand people that's gonna die.
00:40:05.000 In the next couple hours, you might be, you know, you might make it to next week.
00:40:09.000 You might not.
00:40:10.000 I mean, and just the stress that you're dealing with, the non-human aspect of that life.
00:40:18.000 This is not a normal thing that you ever expected to deal with.
00:40:22.000 There's not a normal set of scenarios.
00:40:24.000 It's not your brain, the way you grew up.
00:40:27.000 You're not prepared for this life.
00:40:29.000 You're just thrust into it and it doesn't make any sense.
00:40:32.000 And then to have that all sort of eroded to the point where on Christmas you guys are hanging out, and then the generals come in and say, fuck this, you've got to kill those people, and the next thing you're killing each other again. So you had this brief glimpse of, you know, some utopia inside of war. Yeah,
00:40:51.000 well, what was so weird about that war in particular was that the run-up to it was so romanticized and idealistic.
00:41:00.000 I mean, you had a kind of war fever that happened throughout Europe where this was just looked at in the rosiest possible terms.
00:41:12.000 Like, this is just the true glory of manhood being expressed.
00:41:16.000 Finally, we have a...
00:41:18.000 It was approached like the World Cup or something.
00:41:21.000 It was like pure exuberance around the prospect of fighting this war in many quarters.
00:41:26.000 And you'd be surprised if that ever happened again.
00:41:30.000 So it's a little bit like what's happening with jihadists globally.
00:41:33.000 But they have beliefs that cause that to make more sense.
00:41:37.000 I mean, they believe they're going to go to paradise when they get killed in this war.
00:41:41.000 But it's hard to...
00:41:43.000 Map your own psychology onto the cream of English youth where they were just going off with this level of enthusiasm, having no idea...
00:41:54.000 I guess part of it was they had no idea just how horrible it was going to be.
00:41:57.000 But they...
00:41:59.000 Yeah, you read Homer and war is this glorious thing.
00:42:06.000 The war ethic you get from ancient civilizations is something that we have...
00:42:13.000 I think largely outgrown, but you can really see it in World War I. Don't you think a lot of people had that similar attitude post 9-11, especially when the World Trade Center towers went down and there was this flag-waving fever in America,
00:42:29.000 unlike anything I had ever seen.
00:42:31.000 I remember post 9-11, I remember driving down the street, leaving the street near my house, And entering into this main street and every car, every car had an American flag.
00:42:46.000 Every car.
00:42:46.000 It was insane.
00:42:48.000 I mean, if you did, I didn't have an American flag.
00:42:50.000 I was, like, looked at odd.
00:42:53.000 You know, like, this is an unprecedented time in history.
00:42:58.000 And then all these people were signing up for war.
00:42:59.000 All these people were signing up because they wanted to go over there.
00:43:01.000 They wanted to fight the good fight.
00:43:03.000 And then you start hearing things from people like Pat Tillman, who left a career as an NFL player, a very promising career as a pro athlete, and all of a sudden he's over there in this war.
00:43:15.000 And his impression of it was that it was a huge clusterfuck.
00:43:19.000 It was nothing like what he wanted.
00:43:21.000 It was nothing like what he expected and he was very vocal about that.
00:43:24.000 Very, very, very openly critical about that.
00:43:26.000 And a lot of people think that's one of the reasons why he died.
00:43:29.000 You know, there's a giant conspiracy theory that they killed him because he was talking and he's killed by friendly fire.
00:43:34.000 He was killed by American troops.
00:43:36.000 And the conspiracy theory was that they shut him up.
00:43:39.000 Because he was so openly critical of what was going on over there, that it wasn't what he thought it was going to be.
00:43:47.000 He thought it was going to be this incredibly organized group of heroes that went over there to fight these evil bad guys that are hell-bent on destruction and suicide bombing their way into America to kill the American dream.
00:43:59.000 I mean, this is the idealistic version of it.
00:44:01.000 Well, I think there is an idealistic version of good and bad actors in this case.
00:44:07.000 It's just the reality of fighting this war is so messy.
00:44:14.000 Afghanistan, I think, was pretty clear-cut morally that we had to do something against al-Qaeda.
00:44:20.000 And by definition, once the Taliban wouldn't release Osama bin Laden to us,
00:44:28.000 we had to do something against the Taliban, and that's where he was, and they were sheltering him. And so I didn't feel ethically conflicted over that. But that was such a mess.
00:44:41.000 I mean, you're just going into Afghanistan... The reality of what it takes to go into Afghanistan and kill the bad guys is so messy that there's arguably no good way to do it.
00:44:51.000 There's no way to do it which at the end of the day is going to look like a success.
00:44:56.000 And so maybe that's something we're now learning that you have to, this is so messy that you have to be, you really have to pick your moments.
00:45:06.000 And be far more surgical than we've ever been inclined to be, and not even think about defeating the enemy, ultimately, but just kind of keeping the enemy at bay, containing this problem for long enough to change minds or change culture in some other way.
00:45:24.000 Because even in this case, I think it was very clear-cut.
00:45:28.000 Killing members of Al-Qaeda was a good idea, and I think it's still a good idea.
00:45:33.000 It's just, you know, a drone strike kills some of them, and it also kills some of the hostages, as we now see, and it also kills some of the people standing too close to the bomb blast, and it's ethically messy,
00:45:49.000 you know?
00:45:50.000 But I think there are instances of it that are certainly necessary, but...
00:45:55.000 Someone has to be thinking very clearly about how we proceed in a world where there really are people trying to destroy us.
00:46:02.000 It's not that there's no bad guys.
00:46:05.000 There are bad guys.
00:46:07.000 Isn't that where the foreign policy argument comes into play?
00:46:10.000 Because some people say those bad guys are bad guys because of US foreign policy, because of the way we have intervened and dominated natural resources.
00:46:20.000 You think it's confused?
00:46:21.000 Yeah.
00:46:22.000 Yeah.
00:46:22.000 Well, before we dive into that, I'm looking at a list of topics that were brought up by our Twitter people.
00:46:29.000 And I'll read the list just so we have it in our heads and you can decide what you want to deal with here.
00:46:34.000 But Islam, anything but Islam.
00:46:37.000 Abby Martin, Abby Martin, Abby Martin.
00:46:41.000 So I think we have to deal with Abby Martin.
00:46:42.000 Okay.
00:46:43.000 Abby, who is a good friend.
00:46:45.000 I love Abby.
00:46:46.000 She's crazy, though.
00:46:47.000 In a good way.
00:46:48.000 But she's wild.
00:46:50.000 And she accused you of being one of the new atheists with your anti-Islamic rhetoric.
00:46:58.000 And, you know, that's nothing new to you.
00:47:00.000 You've been accused of that in the past.
00:47:03.000 Did she misrepresent your point of view?
00:47:05.000 Yeah.
00:47:06.000 Did you listen to it?
00:47:07.000 Yeah, I did.
00:47:07.000 I did listen to it.
00:47:08.000 Should we play it or no?
00:47:10.000 I don't think you need to.
00:47:11.000 You don't need to.
00:47:11.000 What did she say and do you not agree with what she said?
00:47:14.000 Well, it was really interesting listening to her because...
00:47:18.000 So I listened to the whole podcast and she didn't mention me until like the second hour.
00:47:24.000 And I'm listening to this and I'm thinking...
00:47:26.000 So I'm actually having a conversation with you in my head as I'm listening to this and I'm thinking...
00:47:32.000 Joe, it's kind of remarkable what you are able to do here, because you're having a conversation with her.
00:47:36.000 From my point of view, you are just drinking from a fire hose of bullshit, right?
00:47:41.000 What she's saying, there's so much wrong with what she's saying.
00:47:47.000 But yet you're in a position to have a conversation with her where there's just a ton of goodwill and it doesn't run into the ditch at all.
00:47:54.000 And you can have a conversation with me in the same vein.
00:47:57.000 But then I was thinking, I'm sure she's a perfectly nice person and I would be very nice talking to her.
00:48:06.000 I have a feeling now of more or less total hopelessness talking to someone as polarized on these issues as I view her to be.
00:48:16.000 And so I was kind of praising you in my mind thinking, you know, I couldn't do what you're doing here.
00:48:23.000 And at that instance, she just mentioned me, right?
00:48:27.000 So it was like one of those bad scenes in a movie where the television starts talking to the character.
00:48:32.000 She just kind of called me out and then more or less totally misrepresented my views.
00:48:41.000 So she said many things that are just inaccurate, which we can talk about, but in terms of what she attributes to me, she said that I only care about intentions, right?
00:48:54.000 So that intentions are all that matter.
00:48:56.000 So if we kill a billion people, but meant well, We're fine.
00:49:01.000 And if the Muslims kill a million people but don't mean well, they're far worse than we are ethically.
00:49:09.000 Intentions are all that matter.
00:49:10.000 And she was, I think in her defense, I'm sure she's never read anything I've written, but she was reacting to a snippet of a podcast where I push back against some of the things that Noam Chomsky has said.
00:49:26.000 And I haven't thought that...
00:49:29.000 I've said in my first book that Chomsky doesn't value the ethical role of intentions enough.
00:49:33.000 And I said something very brief in a podcast that bounced around.
00:49:37.000 And so that's what she heard.
00:49:38.000 So she misconstrued me there.
00:49:40.000 Intentions are not all that matters, obviously.
00:49:43.000 But intentions do matter.
00:49:45.000 So if someone stabs you, right...
00:49:48.000 The difference between them doing it on purpose because they want to kill you and them doing it by accident because they were cutting...
00:49:57.000 You guys were cooking in the kitchen and they didn't know you were there.
00:50:00.000 They turned and they stuck a knife into your belly.
00:50:04.000 It's a world of difference, ethically.
00:50:07.000 And the crucial difference is...
00:50:21.000 I mean, he's trying to kill you.
00:50:28.000 Is it going to be rushing you to the hospital in the next instance?
00:50:31.000 Right, but this is a kind of a disingenuous comparison.
00:50:34.000 Because, I mean, are you describing the difference between accidentally killing civilians with a surgical strike, in quotes, of a drone strike, versus killing someone with a suicide bomb?
00:50:47.000 Are you trying to kill as many people that are random as possible?
00:50:50.000 So I'm using a very idealized example just to show you that The role of intention is not all that matters, because getting stabbed still sucks, right?
00:50:59.000 So if you assume the same stab wound, you still have the same problem.
00:51:03.000 But one of your scenarios is completely innocent and accidental.
00:51:07.000 The other one is murderous intent.
00:51:08.000 Okay, so those are the extremes.
00:51:10.000 Right.
00:51:10.000 So then you can have gradations along that continuum, right?
00:51:13.000 Where you have...
00:51:14.000 And somewhere more in the middle would be...
00:51:16.000 You're trying to kill a bad guy and you accidentally kill an innocent person as well.
00:51:19.000 Yes, absolutely.
00:51:20.000 And it's totally...
00:51:21.000 Or you think you've got the bad guy and you've just got bad intelligence and all you kill is an innocent person, right?
00:51:28.000 Right.
00:51:29.000 Let's say you're being totally surgical.
00:51:31.000 You're a sniper.
00:51:32.000 You're going to just kill one person with one bullet, but you've got the wrong person through no fault of your own, right?
00:51:37.000 Or worse yet, the bullet goes through that person and kills someone else, which also happens.
00:51:41.000 Right.
00:51:41.000 So all kinds of scenarios like that.
00:51:42.000 And there's a very common scenario, I think, which is...
00:51:48.000 I think?
00:52:06.000 You know, 500 yards from the next person.
00:52:10.000 So if you want to fight this war with drones, say, you have to accept some level of collateral damage.
00:52:16.000 Now, I don't actually...
00:52:18.000 I mean, I'm not privy to any kind of intelligence.
00:52:21.000 You know, I'm not in those circles, and I'm not...
00:52:24.000 One of those people.
00:52:25.000 So I don't know just how Obama or anyone in a position of responsibility makes those calculations.
00:52:32.000 What is acceptable collateral damage?
00:52:34.000 But we know that some level of collateral damage is acceptable because otherwise it would be impossible to fight war at all, right?
00:52:42.000 So we know that some level of collateral damage is acceptable just driving on our roads.
00:52:48.000 You know, 30,000 people die every year on our roads. We could dial that number down to zero, right?
00:52:56.000 If we were committed to no death on our roads, we could get there.
00:53:00.000 We would just all have to drive five miles an hour.
00:53:02.000 Right, but the difference is that when you're driving, you're not intending on killing someone.
00:53:06.000 It's an unintended consequence.
00:53:08.000 There's a big difference between that and the unintended consequence of violence.
00:53:12.000 Which is definitely deliberate.
00:53:14.000 Let's get into that.
00:53:15.000 Let's see if there is.
00:53:16.000 This is an unintended but foreseeable consequence.
00:53:21.000 In fact, certain consequence of our keeping the speed limit where it is.
00:53:26.000 You and I both know...
00:53:27.000 Let's say we could vote on this.
00:53:30.000 What do you want the speed limit to be?
00:53:31.000 Let's say it's 75 miles an hour.
00:53:34.000 We know that if we reduced it to 5...
00:53:37.000 There'd be some other costs, and I'm sure there'd be some other ways in which people might die.
00:53:43.000 An ambulance getting to the hospital would be hitting a traffic jam, and some people would die on the way to the hospital.
00:53:49.000 Leave that aside.
00:53:51.000 We would save tens of thousands of lives every year if we just took all the fun out of driving.
00:53:57.000 Or just forget about that.
00:53:58.000 Let's keep the speed limit exactly where it is, but no matter what car you have, there's a governor on it, and you cannot go past the legal speed limit ever.
00:54:10.000 So if you're in a 25-mile-an-hour zone, whatever your car is, you've got a Porsche or whatever you like to drive, it can only go 25 miles an hour, not a mile an hour more, no matter how you hit the throttle, and that would be true in every zone.
00:54:26.000 There are people who would resist that, and their reason for resisting it is just that driving would be less fun.
00:54:33.000 If anything is indefensible when you're talking about kids being killed, that is.
00:54:39.000 That's a far more superficial commitment.
00:54:43.000 Than wanting to get the higher-ups in Al-Qaeda who are trying to, at some point, blow up an American city, right?
00:54:52.000 Right.
00:54:52.000 But imagine if as many innocent people died from driving from one activity.
00:54:59.000 Like, think about the amount of people that die.
00:55:01.000 They do.
00:55:02.000 But they don't.
00:55:03.000 They do.
00:55:04.000 The numbers are nowhere near...
00:55:05.000 Let's talk about the numbers.
00:55:07.000 Drone numbers.
00:55:08.000 How about the drone numbers?
00:55:09.000 What is the percentage of people that have died, the innocent people that have died, because of drone strikes?
00:55:13.000 But it's more than 80%.
00:55:15.000 It's more than 80%.
00:55:17.000 I don't actually...
00:55:17.000 I just have to plead ignorance on that.
00:55:19.000 I don't know those numbers.
00:55:20.000 They're crazy.
00:55:21.000 They're very high.
00:55:22.000 They're very high.
00:55:22.000 But hold on for a second.
00:55:23.000 But hold on for a second.
00:55:25.000 Because you're talking about something like driving.
00:55:28.000 Right.
00:55:28.000 30,000 a year, every year, reliably.
00:55:31.000 Over the last 10 years, that's been 300,000 people in the U.S. But how many people who drive on a daily basis wind up driving their whole life and never killing anybody?
00:55:39.000 Most.
00:55:40.000 Yeah.
00:55:40.000 Most.
00:55:41.000 How many drone strikes wind up not killing innocent people?
00:55:46.000 Almost none.
00:55:47.000 But that's not necessarily the way to analyze it, or at least I would argue that's not the way.
00:55:51.000 But let's just talk about numbers, for instance, because there's another problem I had with Abby Martin.
00:55:56.000 She was using this number 2 million dead in Iraq and Afghanistan.
00:56:00.000 Where did she get that number?
00:56:02.000 No credible person is using that number.
00:56:05.000 What do you think the number is?
00:56:06.000 That number is almost certainly an order of magnitude too high.
00:56:13.000 The sober estimates are like 200,000.
00:56:16.000 And most of that, most, is the result of sectarian violence, right?
00:56:22.000 We didn't kill 200,000 people.
00:56:24.000 We went into Iraq.
00:56:27.000 We're mostly talking about Iraq.
00:56:29.000 The numbers are much higher there than in Afghanistan.
00:56:32.000 We went into Iraq.
00:56:35.000 We did some very understandable things and also some very stupid things, but we took the lid off of a simmering civil war.
00:56:44.000 The real catastrophe of Iraq, apart from our going in in the first place, which I never supported.
00:56:51.000 But the real catastrophe is that having gone in, we failed to anticipate the level of sectarian hatred.
00:56:58.000 And we did very little to hedge against it.
00:57:02.000 And we kicked off a civil war, which someone like Abby Martin clearly thinks we are entirely responsible for.
00:57:08.000 So when Shia death squads are taking out the power drills and drilling holes into the heads of their neighbors, and the Sunni are returning the favor.
00:57:18.000 That's us.
00:57:19.000 We are culpable for that.
00:57:21.000 Now, I don't accept that.
00:57:22.000 These people were...
00:57:23.000 They're killing one another.
00:57:26.000 They've got a blood feud going back over a millennium now.
00:57:30.000 And we...
00:57:32.000 Popped the cork on it in Iraq.
00:57:34.000 And that's a terrible thing to have collaborated in, and we probably should have foreseen it.
00:57:39.000 So if we're culpable, it's for not having anticipated certain of these consequences of our actions.
00:57:45.000 But we are not the people, we are not the Sunni who are killing Shia, and we're not the Shia who are killing Sunni.
00:57:51.000 And the same is true in Afghanistan.
00:57:53.000 We are not the Taliban who are blowing themselves up in crowds of fellow Afghans.
00:58:25.000 Insofar as we could have anticipated the rise of ISIS and all of this consequent death toll, you are faulting us for leaving because our political interests and our stomach were no longer aligned with this project.
00:58:43.000 I'm not sure that's an argument that someone like Abby Martin wants to make, that we should have stayed longer, that we should have spent more money, that we should have killed more people in an effort to keep the locals from killing so many more people.
00:58:56.000 So anyway, the number 2 million is plucked out of a bad dream.
00:59:02.000 Who says 2 million?
00:59:04.000 I don't know anyone.
00:59:06.000 Obviously, you're not going over there counting bodies.
00:59:09.000 No.
00:59:09.000 So who is saying it's 200,000 and who is saying it's 2 million?
00:59:12.000 Okay, so I'll tell you, the highest number that at one point seemed credible was based on a Lancet article.
00:59:22.000 Lancet is a British medical journal.
00:59:24.000 Very well regarded.
00:59:26.000 There was something, Jamie just put this up on the screen here.
00:59:29.000 What is this from, Jamie?
00:59:30.000 Iraq Body Count.
00:59:31.000 Iraq Body Count.
00:59:32.000 So that's 200,000.
00:59:33.000 But who is making this iraqbodycount.org website?
00:59:37.000 It says...
00:59:37.000 Let's just read what it says.
00:59:39.000 Documented civilian deaths from violence, 138,000 to 156,000.
00:59:44.000 Total violent deaths including combatants, 211,000.
00:59:49.000 And this is, what is this up from?
00:59:52.000 It says following the 2003 invasion, but is this current?
00:59:56.000 I think this is 2014, 2015. They keep counting.
01:00:00.000 They keep counting.
01:00:00.000 So there are different ways to do this.
01:00:02.000 One is you can count bodies, right?
01:00:06.000 And the information there is not perfect because some people die and their death doesn't get reported.
01:00:11.000 Not everyone has a death certificate.
01:00:13.000 So you can count bodies, you can get reports of the actual deaths.
01:00:21.000 The other thing you can do is you can estimate the amount of death that would have occurred in the absence of an invasion, and then compare the reports of...
01:00:30.000 You do a statistical sample of an area and compare the reports of death...
01:00:35.000 Based on the past.
01:00:36.000 Yeah, based on what's happening now, and you see a differential there, and then you extrapolate to the rest of the population.
01:00:42.000 And so that's what this...
01:00:43.000 The authors of this Lancet article, they did that, and they came up with a number of 600,000 or 650,000.
01:00:51.000 And that article has been widely criticized, not to the point of it being unpublished, I don't think it's been retracted, but I don't think any serious person thinks that article is representative of the facts.
01:01:05.000 And so what they did, for instance, is they...
01:01:07.000 They would take a cluster of, I think, 40 homes in an area and ask the people, you know, who has died, who do you know who has died, and how did they die?
01:01:18.000 And then they would get, they would just, based on that sample, they'd do that in many different sites around Iraq.
01:01:25.000 Based on those samples, they would extrapolate to the rest of the population, and they came up with 600,000 or 650,000.
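For readers who want to see the mechanics of the extrapolation being described, here is a minimal sketch in Python. Every number in it (cluster sizes, death counts, the national population) is invented for illustration; none of it is taken from the Lancet survey or any other study, and real mortality surveys weight and stratify their clusters far more carefully than this.

# A minimal sketch (with invented numbers) of the cluster-survey method described above:
# survey a few household clusters, compare reported deaths before and after the invasion,
# turn the difference into an excess-death rate, and scale it up to the whole population.

# Each tuple: (people surveyed in the cluster, deaths reported for the pre-invasion period,
# deaths reported for an equal-length post-invasion period). All values are hypothetical.
clusters = [
    (300, 2, 9),
    (280, 1, 7),
    (320, 3, 12),
    (290, 2, 6),
]

people_surveyed = sum(n for n, _, _ in clusters)
pre_invasion_deaths = sum(pre for _, pre, _ in clusters)
post_invasion_deaths = sum(post for _, _, post in clusters)

# Excess deaths per surveyed person over the period covered by the survey.
excess_rate = (post_invasion_deaths - pre_invasion_deaths) / people_surveyed

# Extrapolate to the national population (also an assumed figure).
population = 26_000_000
estimated_excess_deaths = excess_rate * population

print(f"Surveyed {people_surveyed} people; excess-death rate {excess_rate:.4f} per person")
print(f"Extrapolated excess deaths: {estimated_excess_deaths:,.0f}")

The criticism raised next falls straight out of this arithmetic: if the surveyed clusters sit disproportionately in the most violent spots, the excess-death rate, and therefore the national estimate, is inflated by exactly that sampling bias.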
01:01:31.000 So one criticism I read about that article was that they seemed to have focused on areas near major thoroughfares in big cities, where IEDs were far more common than in other places in those same cities or elsewhere in Iraq.
01:01:50.000 So the very place you would most likely plant an IED... is not an especially representative place to poll all the families in the area to see whether they've lost loved ones in the war, right?
01:02:04.000 So that was one way to get an unrealistic number.
01:02:08.000 The other thing is that it seemed that there were just some shady things with the researchers where they weren't releasing their data and their methods and the communication with them broke down.
01:02:16.000 And so, anyway, the sober people I trust who focus on these things think that's a fictional number.
01:02:22.000 And that's one-third, not even one-third of what Abby's working with.
01:02:26.000 Excuse me.
01:02:29.000 I think I caught Aubrey de Grey's cold.
01:02:31.000 Just watching that podcast, I think I got his cold.
01:02:33.000 I think that we can agree that even 200,000 people dead is a tremendous tragedy.
01:02:39.000 Horrific, tremendous tragedy.
01:02:41.000 So the semantics argument over whether or not it's tenfold, that number, or whatever it is, you just disagree with her.
01:02:49.000 That's not semantics.
01:02:50.000 Okay, it's not.
01:02:51.000 It's not semantics.
01:02:51.000 Semantics is a bad word.
01:02:53.000 Tenfold is tenfold.
01:02:55.000 But even more important is...
01:02:57.000 We didn't go in and kill 200,000 people.
01:03:00.000 We went in and killed, I'm sure, some tens of thousands of people, many of whom were the Baathist, the Revolutionary Guard, right?
01:03:13.000 And we unleashed a maelstrom of internecine sectarian conflict, religious conflict.
01:03:22.000 And we failed to contain it.
01:03:25.000 And it would have taken more blood and treasure to contain it.
01:03:29.000 And so it's a huge problem.
01:03:32.000 I'm not minimizing the horror of Iraq.
01:03:34.000 Again, I never supported our invasion of Iraq.
01:03:38.000 The things I've said that have been spun as support of it...
01:04:29.000 Sam thinks that we went into Iraq for humanitarian reasons.
01:04:32.000 He thinks we went into Iraq just to make it like Marin County, right?
01:04:35.000 So that's the sort of pushback I get from the Abby Martins of the world.
01:04:38.000 No, I've never said that.
01:04:39.000 I'm just saying that the truth is so sinister that even if our intentions were perfectly benign and we're just trying to raise the standard of living there and even just give them the freedom to practice their own religion, right?
01:04:53.000 We're trying to make this like Nebraska.
01:04:57.000 It would have been a bloodbath, given the beliefs of the sorts of people who now populate a group like ISIS. So, anyway, that's my claim.
01:05:08.000 So this is just one aspect of what you disagreed with what she said.
01:05:13.000 The two million number.
01:05:14.000 So we've beaten that down to the ground.
01:05:17.000 I mean, she just has this...
01:05:19.000 She's a lefty.
01:05:20.000 Well, but she has a kind of confabulatory style.
01:05:23.000 Again, I'm not really denigrating her personally, but I don't know her.
01:05:27.000 I'm sure she's a good person.
01:05:29.000 But there's a style of talking that you run into with people where there's just...
01:05:32.000 It's kind of confabulatory, where you're just sort of talking, and it's sounding good, and you're just sort of spitballing, but you're using numbers, right?
01:05:40.000 You're using numbers like two million, or you're saying things like, our biggest export is weapons.
01:05:57.000 Okay.
01:06:10.000 Let me just finish it just so we can get to it.
01:06:12.000 97-page report.
01:06:13.000 By the Nobel Peace Prize winning Doctors Group is the first to tally up the total number of civilian casualties from the U.S.-led counterterrorism interventions in Iraq, Afghanistan, and Pakistan.
01:06:24.000 Right.
01:06:25.000 Okay.
01:06:25.000 Well, so clearly that's where she got this number.
01:06:28.000 She got it from somebody who got this number there.
01:06:32.000 I think that number is...
01:06:35.000 So, for instance, as a sanity check, when I heard Abby Martin, I sent an email to my friend Steven Pinker, who's an incredibly sober scientist, just a very careful researcher.
01:06:49.000 He wrote this truly landmark book on the decline of violence in the last century, The Better Angels of Our Nature.
01:06:57.000 It came out a few years ago.
01:06:58.000 It was like 800 pages on this topic.
01:07:01.000 Very data-driven book.
01:07:03.000 He did a tremendous amount of research for this.
01:07:06.000 And he's an incredibly well-respected Harvard scientist.
01:07:13.000 So I pinged him about this.
01:07:15.000 I said, I'm hearing in liberal circles that we killed two million people in Iraq and Afghanistan.
01:07:20.000 Is there any chance that this is true?
01:07:22.000 And he said more or less what I'm saying to you now.
01:07:26.000 It's almost certainly an order of magnitude too high.
01:07:30.000 The highest briefly credible study was the Lancet one.
01:07:34.000 I didn't know about this, and I don't know what he in particular would say about this study, but undoubtedly they used the same sort of extrapolation methods.
01:07:43.000 They're not counting bodies.
01:07:45.000 They're doing, based on sort of the ambient level of death over the years, they think it's gone up to the tune of two million in those countries.
01:07:56.000 But anyway, so Steve said, no, this is a totally fanciful number, and here's why.
01:08:04.000 And he broke it down for me, and then I did a little more reading on the topic.
01:08:11.000 But again, the crucial...
01:08:13.000 I think it really matters whether the number is 200,000 or 2 million.
01:08:16.000 I don't want to be loose on that.
01:08:19.000 But the crucial ethical difference is, did we go in and perform our own sort of final solution against the Iraqis and the Afghanis trying to kill millions of people a la Hitler?
01:08:33.000 Or did we wander into a situation where we unleashed a civil war And are we culpable for that?
01:08:42.000 And I don't think we are.
01:08:43.000 We're culpable for something, but we are not the Sunnis killing the Shia, and vice versa.
01:08:49.000 And you believe that that is the majority of the deaths?
01:08:51.000 Absolutely, absolutely.
01:08:52.000 I don't know if the new study necessarily agrees with that.
01:08:56.000 Well, I would be astonished if they didn't.
01:08:59.000 It talks about that study in here, that Lancet study.
01:09:01.000 It says it's likely to be far more accurate than the figures initially.
01:09:06.000 Which study is likely to be?
01:09:08.000 The 200,000 one?
01:09:09.000 655,000 deaths.
01:09:11.000 So wait a minute, the people who are saying it may be 2 million, but it's at least 1.3 million?
01:09:17.000 According to the PSR study, the much disputed Lancet study that estimated 655,000 Iraq deaths up to 2006, and over a million until today by extrapolation, was likely to be far more accurate than the IBC's figure.
01:09:31.000 In fact, the report confirms that there is a virtual consensus among epidemiologists on the reliability of the Lancet study.
01:09:42.000 This is coming from the...
01:09:43.000 So I don't want to totally...
01:09:45.000 I'm not saying I'm not open to this information, but the website you're pulling this from is just trash.
01:09:53.000 Middle East Eye is just...
01:09:55.000 These guys publish...
01:09:58.000 The serial plagiarist who's been stalking me, who I vowed not to name.
01:10:04.000 The stuff they publish is just pure insanity.
01:10:08.000 I mean, Google me on this site and you'll get madness.
01:10:12.000 But they're talking about a study.
01:10:15.000 They didn't perform the study, right?
01:10:16.000 Right, but I can't...
01:10:17.000 They're publishing the study?
01:10:18.000 In real time, I can't vet their representation of it.
01:10:20.000 Click on that study and see who the hell...
01:10:22.000 Who does it say performed that study?
01:10:24.000 I can't read that.
01:10:25.000 PSR, Physicians for Social Responsibility.
01:10:28.000 That's a real group.
01:10:29.000 But again, the problem...
01:10:31.000 I mean, one problem here is that this whole area has become so politicized that it's hard to...
01:10:39.000 I mean, even Amnesty International has embarrassed itself with supporting a jihadist organization in the UK. They just, at the 11th hour...
01:10:48.000 Pulled out their support, but for a very long time, they were just in the same trench with jihadists and not knowing it, or they should have known it.
01:10:58.000 I mean, people were telling them, but they were very slow to realize it.
01:11:00.000 So, you should be slow to take even a humanitarian organization's word for the significance of a given study.
01:11:14.000 But I would find it, frankly, amazing...
01:11:20.000 If we had killed anything like that number of people.
01:11:23.000 Why?
01:11:24.000 Well, I would find it amazing if that number of people had died.
01:11:31.000 It's unthinkable to me that we killed 2 million people.
01:11:35.000 But it's over 12 years of war.
01:11:38.000 Even if you killed, you know, think about the numbers.
01:11:41.000 No, but you just know where all this death is coming from.
01:11:43.000 You know where the bombings, the IEDs, the truck bombs, the blowing up of mosques...
01:11:49.000 We're not doing that, right?
01:11:51.000 So the body count, when you look at the penalty we're paying for killing people, and when we look at how much our own soldiers don't want to die unnecessarily,
01:12:06.000 and our own level of casualties, we're not on the other side of all those guns.
01:12:14.000 I mean, there's just a tremendous amount of internecine violence in both Afghanistan and Iraq that is just killing, you know, like a bomb will go off and a hundred people at a mosque will be dead.
01:12:26.000 So, Abby Martin, I think, I don't think I'm being uncharitable here, I think she thinks we're responsible for that.
01:12:35.000 For the Sunni Shia violence?
01:12:37.000 Yes.
01:12:37.000 Okay.
01:12:38.000 So whatever the number is, that's your argument with that.
01:12:41.000 You were saying that I was swimming in a sea of bullshit before that.
01:12:44.000 Right.
01:12:45.000 What else was bullshit?
01:12:47.000 Well, I think...
01:12:48.000 I don't remember all the details, but so, for instance, one thing she said is that our main export is arms.
01:12:56.000 That's not true, right?
01:12:56.000 That's not true.
01:12:57.000 That's not true.
01:12:58.000 So it may be...
01:12:59.000 What is our main export?
01:13:01.000 I think our main export is like machinery.
01:13:03.000 Everything like farming equipment and pumps and road making.
01:13:08.000 I thought it was corn.
01:13:10.000 That was a guess.
01:13:11.000 I think it's airplanes and everything that's a machine.
01:13:17.000 So maybe arms falls into that category.
01:13:19.000 But when you look at...
01:13:21.000 Machines, number one.
01:13:23.000 $219 billion in machines.
01:13:27.000 13.5% of total exports.
01:13:29.000 Number two, electronic equipment, $171 billion.
01:13:32.000 Number three, oil, $157 billion.
01:13:33.000 Number four, vehicles, $135 billion.
01:13:36.000 Number five, aircraft and spacecraft, $124 billion.
01:13:40.000 Number six, medical, technical equipment.
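To make the shares being read out here concrete, a quick back-of-the-envelope calculation in Python. The total export figure is not stated on the show; it is backed out from the claim that $219 billion in machines is 13.5% of total exports, so treat it, and the arms figure at the end, as assumptions for illustration only.

# Back-of-the-envelope shares for the export figures quoted above (billions of USD).
# The total is inferred from "13.5% of total exports" for machines, not stated directly.
exports_billion = {
    "machines": 219,
    "electronic equipment": 171,
    "oil": 157,
    "vehicles": 135,
    "aircraft and spacecraft": 124,
}

total_exports_billion = 219 / 0.135  # roughly 1,622

for category, value in exports_billion.items():
    share = value / total_exports_billion
    print(f"{category}: ${value}B, about {share:.1%} of total exports")

# Even a hypothetical $50 billion in arms sales, the figure floated a moment later,
# would come to only about 3% of total exports on these numbers.
print(f"$50B in arms would be about {50 / total_exports_billion:.1%} of total exports")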
01:13:55.000 So, guns aren't even in the top ten unless machines are guns?
01:14:04.000 Well, you have to think that weaponry is somehow spread across machines and aircraft and vehicles.
01:14:14.000 I'll give her the benefit of the doubt there.
01:14:18.000 Let's find a more comprehensive list that actually includes arms.
01:14:23.000 Because that's pretty sneaky.
01:14:24.000 If it is machines, then she might be somehow or another correct.
01:14:28.000 Let's say we export $50 billion in arms a year.
01:14:32.000 I don't know what it is.
01:14:35.000 And that's a whole other conversation, whether we should be doing that.
01:14:39.000 I think it's suicidally stupid in certain cases that we're arming people who are eventually going to be using these arms against us.
01:14:46.000 Or our friends, but...
01:14:48.000 Well, the conspiracy theory would be that that's how they perpetuate this whole constant cycle of war, is that you have to keep arming your enemies.
01:14:54.000 Right.
01:14:54.000 Well, no, so I think, yeah, the economic interests of defense contractors is not something that I am especially sanguine about.
01:15:04.000 It's not, you know, I think the...
01:15:07.000 The possible role of corruption there and a sort of callous indifference to the effects of being in this business, I think that's a very real concern.
01:15:19.000 So, to say that this is our main export.
01:15:22.000 It's bombs, essentially.
01:15:24.000 That's not true.
01:15:25.000 Now, maybe what she meant to say is we are the main exporter of weaponry in the world.
01:15:31.000 It's probably true.
01:15:32.000 Which is probably true.
01:15:32.000 But that's not what she said.
01:15:34.000 Or maybe she said both, and I only heard one.
01:15:36.000 No, I think you're correct in what she said.
01:15:37.000 That's what she said.
01:15:38.000 She might be getting, you know, that's the problem.
01:15:41.000 Unless you're doing, look, we live in a world that's so broad and comprehensive that unless you're doing the actual research yourself, and not just doing it, but doing it over a long period of time, and very meticulous, most likely, you don't know the actual numbers.
01:15:55.000 There's very few things that I could talk about with utmost certainty that aren't involved directly in my own life.
01:16:00.000 And when you deal with numbers, like numbers of imported guns, exported guns, people dying in a place that you've never even visited.
01:16:08.000 Boy, you're relying on a lot of people's data.
01:16:11.000 Yeah, yeah, yeah.
01:16:12.000 And one thing that really is depressing is the degree to which this conversation is so politicized that you just...
01:16:20.000 It's like the climate change conversation.
01:16:23.000 The fact that there are people who are...
01:16:27.000 You can always find someone with a PhD to sit up there and say, you know, I don't think cigarettes cause cancer, right?
01:16:35.000 I mean, you can find those people.
01:16:37.000 You can find the people who are engineers who say that 9-11 had to be an inside job because, you know, the melting point of steel, blah, blah, blah.
01:16:46.000 And you can find on all these issues, you get incredibly politicized science.
01:16:54.000 But certain things don't pass the smell test.
01:16:57.000 And to me, two million people doesn't pass the smell test, certainly if you're going to say that we killed those two million people.
01:17:04.000 That we...
01:17:06.000 We did double the level of a Rwandan genocide intentionally.
01:17:11.000 That was what we did.
01:17:13.000 That just seems completely masochistically hallucinatory to me.
01:17:21.000 I see your point.
01:17:22.000 I don't know who's right, but I see your point.
01:17:24.000 Now, was there anything else that she said that you needed to dispute?
01:17:29.000 I don't think so.
01:17:30.000 The thing I can say just categorically is that what she said about my concern about intention is just not true.
01:17:38.000 Intentions matter because they are the best predictor as to what the person is likely to do in the future.
01:17:45.000 If you know someone is killing people because he intends that, he wants that, he wants to cause grief and suffering and death, well then you know this is a person you have to jail or kill and this is not a good actor.
01:17:58.000 If someone does it because they did it by accident, or they didn't foresee the consequences of their actions, or they were trying to get the bad guy and they produced collateral damage, it's a very different scenario, and yet the body count may be the same.
01:18:13.000 And so the thing I've faulted Chomsky for in the past is that he seems to talk about situations where all you need to concern yourself with is body count.
01:18:24.000 So the example I dealt with in my first book, The End of Faith, and this was in reaction to a short book he did right after 9-11 called 9-11, he talked about the...
01:18:35.000 Clinton's bombing of the al-Shifa pharmaceutical plant in Sudan in retaliation for the embassy bombings, al-Qaeda bombings in Kenya in the 90s.
01:18:46.000 And he talked about this bombing of the pharmaceutical plant as a great atrocity, seemingly equivalent to the atrocity of 9-11 or worse.
01:19:01.000 Because of the consequences for Sudan of having half the supply of pharmaceuticals destroyed.
01:19:07.000 People would die from preventable illness as a result of this.
01:19:11.000 What an incredible atrocity.
01:19:13.000 Except...
01:19:15.000 The representation of our government was, and I think any rational thinking on this topic would suggest, that our intention was not to destroy a pharmaceutical plant.
01:19:24.000 We claimed to be bombing what we thought was a chemical weapons factory, run by Al-Qaeda, and we wanted to degrade that capacity of theirs after they had just bombed two embassies in East Africa.
01:19:38.000 So...
01:19:40.000 Let's just say that's true.
01:19:41.000 I mean, who knows what our actual intentions were?
01:19:43.000 But if our intention was to bomb a chemical weapons plant that we didn't know was a pharmaceutical plant, and we bombed a pharmaceutical plant that was being used for peaceful purposes, and as a result, tens of thousands of people didn't get their medicine and died...
01:20:21.000 If you accept that to be true, then the fact that tens of thousands of people died as a result doesn't have the same ethical significance.
01:20:29.000 It is much more like you and I are just trying to get home at 55 miles an hour, but we're participating in a system that's going to kill 30,000 people this year based on our speed limits.
01:20:40.000 We're not intending to kill any of those people, right?
01:20:42.000 It's just...
01:20:43.000 But perhaps we should have foreseen...
01:20:46.000 So it was bad data.
01:20:47.000 Was it bad data?
01:20:48.000 Well, no.
01:20:48.000 I mean, that's what our government said about its actions.
01:20:51.000 Now, let's say that's not true.
01:20:54.000 Let's say we knew it was a pharmaceutical plant, but we also thought it was a chemical weapons plant.
01:21:02.000 And we bombed it...
01:21:05.000 Really knowing what the bad effects would be, you know, we thought we were going to get the chemical weapons facility, but we also knew we were going to destroy all of their pharmaceutical infrastructure, and that would have cascading bad effects that, all things considered, we didn't care that much about.
01:21:20.000 Let's say it was that place on this continuum of moral callousness.
01:21:27.000 Well, that's still different than trying to kill 10,000 people by taking away their medicine, right?
01:21:34.000 In my view, it may not be so different, and it's something that we would be culpable for.
01:21:41.000 But I think you have to...
01:21:42.000 The reason why intentions matter is because they are...
01:21:49.000 They're the clear expression of what we are committed to, the ends to which we're committed, the kind of world we want to live in.
01:21:56.000 What I did in that first book, I asked, this is a thought experiment called The Perfect Weapon, where I said, just imagine what a group would do.
01:22:07.000 If they had perfect weapons, right?
01:22:09.000 There's no such thing as collateral damage.
01:22:11.000 They could just target everyone they wanted to target.
01:22:14.000 They would never hit Osama Bin Laden's mom, who happened to be standing too close to him.
01:22:19.000 They're just going to hit Osama Bin Laden.
01:22:23.000 So, what would any one group do with the perfect weapon?
01:22:26.000 What would Bibi Netanyahu do with perfect weapons?
01:22:29.000 What would Hitler have done with perfect weapons?
01:22:32.000 What would Bill Clinton do with perfect weapons?
01:22:34.000 People like Chomsky and Abby Martin talk about the Clintons and the Bushes and the Netanyahus and the Dick Cheneys of the world.
01:22:44.000 I'm not necessarily equating all of those people, but they're all sort of in a certain area for me.
01:23:10.000 The intentions of our government are to go around the world killing brown-skinned people.
01:23:16.000 That was a phrase she used.
01:23:18.000 The spirit in which she talked about our culpability on the world stage is very much in the sense that we have intentionally murdered millions of brown-skinned people because we don't care about them, and maybe it's part of the reason why we want them dead,
01:23:35.000 right?
01:23:38.000 I do not believe that's the situation we're in.
01:23:40.000 I certainly don't believe that someone like President Obama wants to create massive collateral damage.
01:23:47.000 And if you gave him the perfect weapon, I'm reasonably sure he would target the bad guys.
01:23:53.000 If you and I could vote on whether these people should go down, we would have a 90% convergence with him.
01:24:01.000 We wouldn't find ourselves in the presence of a psychopath who just was so amped up on his power to kill.
01:24:07.000 That he would be killing, you know, Anne Frank, right?
01:24:12.000 Whereas there are people who really did kill Anne Frank because they intended to kill Anne Frank and everyone like her, right?
01:24:18.000 There's a difference.
01:24:19.000 Is there any culpability?
01:24:22.000 Is there any...
01:24:23.000 Do you put any blame on the United States government and our foreign policy and our decisions as far as the domination of global natural resources, whatever we've done overseas?
01:24:37.000 Is that in any way responsible for the hatred that these people have for America in the first place?
01:24:45.000 Yeah.
01:24:45.000 Well, yeah.
01:24:46.000 And beyond the hatred, responsible for our alliances with people who commit outright human rights abuses.
01:24:56.000 Like Saudi Arabia.
01:24:56.000 Yeah.
01:24:57.000 So the fact that we can't...
01:25:00.000 Break all ties with Saudi Arabia.
01:25:02.000 The fact that we can't twist their arm and get them to behave like a civilized culture on the world stage at this point.
01:25:13.000 So they're jailing and caning bloggers.
01:25:18.000 This one atheist blogger, Raif Badawi, I'm sorry if I'm mispronouncing his name.
01:25:29.000 It's an absolute scandal, the fact that we can't apply more pressure to them.
01:25:34.000 And that isn't, as far as I can tell, entirely explained by our dependence on oil and our unwillingness to break it.
01:25:40.000 We have the technology to break this dependence on oil.
01:25:43.000 The fact that we have such entrenched financial interests that's keeping us tied to oil.
01:25:50.000 And that the whole military-industrial complex is tuned to safeguard those interests for us in the world out of necessity.
01:25:58.000 Because those interests have been monetized.
01:26:01.000 They've been controlled.
01:26:02.000 They've been monopolized.
01:26:03.000 Roll back the clock 50 years.
01:26:06.000 There, I'm sure, was not an alternative to being dependent on oil.
01:26:10.000 There's a certain point.
01:26:11.000 So we're totally dependent on oil.
01:26:13.000 Civilization just needs petrochemicals to survive.
01:26:16.000 And they all happen to be buried in the ground, inconveniently, under the palaces of these religious maniacs.
01:26:26.000 That may have been the situation then.
01:26:28.000 So how culpable are we for securing our interests, and not just we, the U.S., but the West, at that point, by entering a relationship with the House of Saud?
01:26:41.000 That's one question.
01:26:42.000 That may have been a marriage of necessity, and there have been marriages of necessity with tyrants, I think, in the past.
01:26:49.000 Now, the fact that we can't sprint to the finish line and get off of oil, right?
01:26:55.000 We know this is a dwindling resource.
01:26:57.000 We know it's a disaster for climate change.
01:26:59.000 We know that there would be the financial and technological renaissance that's waiting if we all just grab Elon Musk's coattails and go towards sustainable energy.
01:27:17.000 All of this, you know, our interests...
01:27:19.000 We're funding both sides of the war on terror.
01:27:22.000 It makes absolutely no sense.
01:27:23.000 So we should just make a full court press in the direction of sustainability, energy security, and getting ourselves into a position to say to people like the Saudis, you treat your bloggers better or we're going to bankrupt you, right?
01:27:39.000 All their wealth is coming out of the ground, right?
01:27:42.000 So the moment we don't need this wealth or need to defend it, We'd be in a much better position to demand that people treat women better throughout the world, and they honor free speech, etc.
01:27:55.000 So I think it's a scandal that we are not doing that, and I think, yes, we are culpable for doing that, but given what would happen to us in the near term, if we lost access to oil,
01:28:12.000 and again, I'm not just talking about us, I'm talking about Europe and just the whole world, it's been a very difficult situation to be in, and it's understandable that we have gotten into this situation, but I don't find it understandable now that we aren't sprinting away from it.
01:28:28.000 So if I could define your point of view.
01:28:30.000 Your point of view is more of a pragmatic take on what the world is currently at this stage.
01:28:37.000 You're not taking away the responsibility of the United States government.
01:28:40.000 You're not saying that they haven't made horrific decisions.
01:28:43.000 You're not saying that they haven't been manipulated by these gigantic corporations that are profiting off of the war that we're currently involved in.
01:28:52.000 That you are just saying that if you want to look at the actual reality of the good guys and the bad guys and where the world is fucked right now, there's certain things that have to be done and there's certain people that have to be taken out.
01:29:03.000 If you do not, you put everyone else at risk.
01:29:05.000 Is that a good...
01:29:06.000 Yeah, that's fair.
01:29:07.000 I guess I would only add that...
01:29:12.000 I've been saying this for 10 years at least, or now closer to 15 years, and it just never gets heard.
01:29:21.000 I can grant someone like Chomsky, you know, 80, 90% of his thesis, right?
01:29:29.000 So I think he pushes forward into kind of masochistic voodoo a little bit, but we have done horrific things historically, right?
01:29:40.000 And the question is just how far you want to walk back in your time machine.
01:29:45.000 But, you know, starting with, you know, our treatment of the Native Americans on up, it depends on who the we is, but we being the United States, right?
01:29:54.000 We get here, we start behaving badly, and we behave badly for a very long while, and we have done terrible things, and yet it is also true that we have enemies we haven't made.
01:30:07.000 There are people who have had the benefit of everything the West has to offer.
01:30:12.000 Who are waking up today deciding to join ISIS for reasons that have nothing to do with U.S. foreign policy, or if they do have something to do with U.S. foreign policy, it's based on a theological grievance.
01:30:23.000 It's not based on any real political concern for the lives of the Palestinians.
01:30:27.000 It's based on, you've got infidels too close to Muslim holy sites.
01:30:32.000 And you have...
01:30:33.000 The problem, the intellectual and moral problem I've spent more time focused on is the problem of someone like Jihadi John, right?
01:30:42.000 The guy who's got a degree in computer science, right?
01:30:45.000 He comes from a middle, upper middle class background in the UK. He's got all the opportunity anyone could want.
01:30:52.000 There are...
01:30:53.000 At least 3 billion people, probably something like 5 billion people who would trade places with him to be in a position of such opportunity in this world.
01:31:07.000 And yet the opportunity he wants to take is to move to Iraq or Syria and cut the heads off of journalists and aid workers.
01:31:18.000 Journalists and aid workers, not Navy SEALs they captured.
01:31:23.000 They want to kill the aid workers.
01:31:26.000 It's not an accident.
01:31:27.000 It's not like it's a perversion of their impulse.
01:31:30.000 It's not like, oh, I really wish this guy wasn't an aid worker or a journalist, you know, but he's the only guy we have.
01:31:36.000 No, this is...
01:31:38.000 Their commitments are that horrible, right?
01:31:41.000 And you have to explain how...
01:31:46.000 And this is something that someone like Abby Martin and someone like Noam Chomsky...
01:31:50.000 This is the phenomenon they really don't explain.
01:31:52.000 How is it that someone with all the opportunity, who's never been victimized by anyone, how is it that he is committed to the most abhorrent and profligate misuse of human life, where he's just ready to burn up the world, right?
01:32:09.000 And how do you get tens of thousands of people like this coming from first world societies?
01:32:14.000 And so given that phenomenon, then what explains the commitments of the people who don't have all those opportunities, right?
01:32:24.000 The people who are born in these societies and are shell-shocked and have been mistreated and who are...
01:32:35.000 I mean, some of them still love the West.
01:32:41.000 Some of them still are trying to get out.
01:32:44.000 I hear from atheists in these countries who don't hate the West.
01:32:48.000 I mean, they don't follow Abby Martin's line on this.
01:32:51.000 They understand why we were bombing in their neighborhoods, right?
01:32:55.000 But the fact is, this is really like a science experiment.
01:33:00.000 There are pristine cases of people who have no rational grievance, who devote their lives to waging jihad, and they're not mentally ill.
01:33:09.000 And that's the problem that I... That problem is scaling.
01:33:15.000 The thing that I worry about is that it is a meme that is spreadable.
01:33:17.000 You don't have to ever meet anyone affiliated with a terrorist organization to get this idea into your head.
01:33:23.000 And so that's the piece I have focused on.
01:33:28.000 And it's not that I've denied the reality of the other pieces.
01:33:31.000 Is this related in any way to just the natural instinct that a certain amount of people have to be contrarians?
01:33:37.000 I mean, there's a certain amount of people that when they find any sort of large group that's in power, they want to oppose them.
01:33:44.000 If they find a band that's popular, they want to hate it.
01:33:46.000 If they find a political party that's in control, they want to oppose it.
01:33:50.000 There's a certain amount of people that are just natural contrarians.
01:33:53.000 When they find a group that is absolutely committed and completely involved in an ideology to the point where they're rabid about it.
01:34:04.000 It becomes attractive to them and they want to join that resistance to fight against the Death Star that is the United States.
01:34:11.000 Now, I'm not religious by any stretch of the imagination.
01:34:14.000 But what I am is curious.
01:34:16.000 And one of the things that I like to do is I like to watch really pious or really obsessed religious people.
01:34:25.000 I love to watch videos of them.
01:34:27.000 Because I find it fascinating.
01:34:28.000 And there's a certain amount of, when I see the Islamic scholars that are talking in absolute confidence about their beliefs, there's a certain amount of that that I personally find attractive.
01:34:41.000 I don't want to join ISIS. I don't want to become a Muslim.
01:34:45.000 But when I see someone, almost like what we were talking about with Conor McGregor earlier, where he just fucking believes, man.
01:34:51.000 When someone believes.
01:34:52.000 I was watching this guy, I forget his name, but he's a guy from, he lives in the UK and he's this rabid Islamic scholar that, you know, all of his tweets are on how Islam is superior and it doesn't have to be adjusted to the laws of modern society, and secular wisdom is inferior to Islamic wisdom, and blah blah blah.
01:35:14.000 I watched this guy do this YouTube video where he's describing how Islamic culture is superior to Western culture in terms of the way they manage money and he made a lot of fucking good points.
01:35:27.000 He made a lot of good points about wealth and about building economies and about how you take a company that's only worth $100,000 but you could sell it for a million dollars or trade it.
01:35:40.000 You have stocks and this is invisible wealth and Islam doesn't allow invisible wealth because that's how societies get crushed and that's how other economies crumble.
01:35:50.000 And I'm watching this guy with his moral certainty.
01:35:56.000 And his extreme confidence in what he's saying, absolute, and it becomes compelling.
01:36:01.000 And I'm not joining, I'm not saying that he got me, but I'm saying that I'm just absolutely admitting there's a certain aspect of human nature that gets compelled to join groups.
01:36:12.000 Oh yeah, yeah.
01:36:13.000 Well, there's something, there's that component of it, which I understand, but there's also just the religious ecstasy component, the aesthetics, the emotional component of it, which...
01:36:26.000 I really understand and I'm susceptible to.
01:36:29.000 I have a blog post, I believe it's called Islam and the Misuses of Ecstasy.
01:36:35.000 This is actually the first blog post I ever wrote where I realized I could not possibly have written this in book form or in a newspaper because it relied on embedded video.
01:36:45.000 The only way to have done this was with embedded video.
01:36:51.000 I wrote this, I think, once again over protests, something was said about me by Glenn Greenwald or somebody.
01:37:00.000 The charge had been that I totally lack empathy.
01:37:03.000 I don't even know what it's like to be...
01:37:05.000 what these people are getting out of their religion.
01:37:08.000 I've just demonized a whole people.
01:37:10.000 I don't understand religion.
01:37:12.000 And so I wrote this blog post to try to indicate how far from the truth that was.
01:37:18.000 So I put an example of the call to prayer, right?
01:37:23.000 Which I think, I mean, there's some that sound kind of ratty, but a nice call to prayer...
01:37:28.000 I think it's one of the most beautiful sounding things humanity has ever produced.
01:37:33.000 That hits me, that gets into my bones.
01:37:36.000 I don't have to imagine what a devout Muslim is feeling when he hears the call to prayer.
01:37:41.000 I think it's absolutely beautiful.
01:37:44.000 Without even knowing the language.
01:37:45.000 Exactly.
01:37:46.000 And I'm not without ever having been a Muslim or believing any...
01:37:49.000 So if that sound...
01:37:51.000 Again, your listeners can just read that blog post.
01:37:54.000 I only dimly remember what I wrote.
01:37:57.000 But if that ritual was purposed towards some other end, right?
01:38:04.000 If that ritual just was signifying...
01:38:07.000 Let's all get up in the morning and consider how profound human consciousness is and consider our togetherness on this rock spinning through empty space and realize that we just have this common project to make the world beautiful.
01:38:23.000 If that was what that meant, right?
01:38:26.000 I would just want a minaret right next to my house.
01:38:29.000 I would be totally on board with the experience of participating in that.
01:38:37.000 So I'm totally empathetic there.
01:38:39.000 And so I went through many other instances of this where something I'm seeing in the Muslim world, I really grok how beautiful and meaningful and captivating this is for people.
01:38:51.000 But then at the end, I put in a Quranic recitation and sermon by a...
01:39:00.000 I forgot his name now, but some sheikh who's got, you know, like ten times the number of Twitter followers you have, right?
01:39:05.000 I mean, he's like...
01:39:06.000 He's not a fringe figure.
01:39:08.000 He's a Muslim rock star.
01:39:11.000 And, you know, you see the translation of what he...
01:39:14.000 He's giving this tear-filled recitation of the Quran, which, again, is beautiful, right?
01:39:18.000 He's a great singer.
01:39:22.000 And, you know, it's a packed house in wherever it was, Saudi Arabia or Yemen.
01:39:29.000 But what is being said there is so ethically ugly, right?
01:39:35.000 Essentially celebrating the tortures of hell, right?
01:39:39.000 Just expressing a certainty that infidels are going to go to hell and how, you know, just this is a...
01:39:46.000 You have to organize your life around this question about how to escape the torments of hell.
01:39:54.000 And the only way to do it is to be a true believer in the Quran and Muhammad, etc.
01:39:59.000 And this is at the center of the mandala of their ethical concern.
01:40:07.000 Nothing in this life matters but avoiding hellfire.
01:40:13.000 And so there's a kind of a ghastly perversion of this impulse that I think many of us feel, I certainly feel it, to... It's very much like Burning Man for people.
01:40:35.000 Imagine if Burning Man were just as ecstatic as it was and attracting all the smart people that it attracts.
01:40:44.000 But strewn throughout it was a message of just true divisiveness.
01:40:50.000 Like, everyone else who's not here is going to be tortured for eternity and they deserve it and we shouldn't be their friends and we should fuck them over any way we can when we get out of this place.
01:41:02.000 If God had wanted to enlighten them, He would have, but He hasn't.
01:41:06.000 So we're the only ones here.
01:41:08.000 And just a kind of a durable message of us-them thinking that just cannot be dissolved, right?
01:41:16.000 That's what's going on in the Muslim world.
01:41:19.000 And it's a huge problem, because it's pulling all the strings of... I mean, it's not just Islam, obviously.
01:41:28.000 Christianity has a version of this, and all religions in principle have a version of this, but there are differences.
01:41:34.000 There is no version of jihad.
01:41:39.000 There's no Buddhist jihad.
01:41:41.000 It's not to say that Buddhists can't do terrible things, and it's not to say you can't find Buddhist reasons for doing terrible things, but...
01:41:50.000 Jihad, martyrdom, paradise, this is the jewel, the horrible jewel that so many millions of people are contemplating in Islamic context, and that's what I'm worried about,
01:42:05.000 and I'm not insensitive to the experience people are having.
01:42:10.000 Is this version of Islam recent in human history?
01:42:13.000 No.
01:42:14.000 This extreme radical version?
01:42:16.000 There are some things you can say that have, you know, with Wahhabism and Salafi-style Islam generally, that have been politicized and tuned up in a negative way in the last century.
01:42:32.000 You can say that, but the reality is that jihad is...
01:42:37.000 As old as Islam.
01:42:39.000 And Islam spread by jihad.
01:42:42.000 But isn't the original version of jihad a war on your own vices?
01:42:45.000 No, no.
01:42:46.000 That's just, no.
01:42:47.000 I mean, there is that component to it.
01:42:48.000 There is an inner jihad and an outer jihad, but there was always an outer jihad, and that's how Muhammad spread the faith.
01:42:55.000 And Muhammad, I mean, to answer your question very simply, as I did...
01:42:58.000 Somewhere, I just said, there's absolutely nothing ISIS is doing, the Islamic State is doing, that Muhammad didn't do, right?
01:43:09.000 I think I said, good luck finding something significant, some difference between them.
01:43:14.000 I mean, taking sex slaves, right?
01:43:16.000 Muhammad took sex slaves and gave sex slaves to his generals.
01:43:21.000 It was totally kosher.
01:43:22.000 It's a kosher thing to do.
01:43:24.000 If you're going to follow the example of Muhammad...
01:43:25.000 It's definitely not kosher.
01:43:26.000 Yeah, no, but halal.
01:43:28.000 Halal.
01:43:28.000 I mean, if you're going to follow Muhammad's example, which is perhaps the main lens through which you have to look at this, I mean, there's just what's in the Quran, and there's what's in the Hadith, the larger literature, and there's the example of Muhammad,
01:43:45.000 which is attested to in both those literatures and in the early biographies about him.
01:43:51.000 Muhammad was not like the Buddha.
01:43:54.000 He was not like Jesus.
01:43:55.000 He was a conquering warlord who succeeded, right?
01:44:02.000 And that is an example that is very different from the example of a guy who got crucified, or the example of a guy who spent his life meditating and then teaching, right?
01:44:11.000 If the Buddha had been lopping heads off, you know, at every sermon and advocating, just talking endlessly about when to kill people and how many people to kill and, you know, how to treat your sex slaves.
01:44:27.000 If that was just strewn throughout the Buddhist teaching, I would expect Buddhists to behave exactly the way we see members of ISIS and Al-Qaeda and Al-Shabaab and Boko Haram behave.
01:44:38.000 How to treat your sex slaves?
01:44:40.000 Sure.
01:44:42.000 How many...
01:44:42.000 So, yes, you...
01:44:44.000 How do you treat your sex slaves?
01:44:46.000 Not you.
01:44:48.000 How is one?
01:44:49.000 Taking sex slaves, it's not adultery if you're having sex with sex slaves, it's adultery if you're having sex with other women, right?
01:45:00.000 Other Muslim women.
01:45:01.000 How convenient.
01:45:03.000 No, you can...
01:45:05.000 Slavery...
01:45:06.000 I mean, this is the horror of Abrahamic religion generally.
01:45:11.000 I mean, this is why we know these books were not authored by a moral genius.
01:45:16.000 The Bible and the Quran can't give you a basis to resist slavery.
01:45:23.000 Slavery is supported in both traditions.
01:45:25.000 So, the fact that we have...
01:45:29.000 You know, after centuries, decided, more or less unanimously, that slavery is an abomination, that proves that there's more moral wisdom to be found outside of these books than inside, at least on that point.
01:45:43.000 And I would argue on virtually every other point of consequence.
01:45:47.000 Now, it's not to say there aren't gems of moral wisdom in some of these books, but...
01:46:12.000 The members of ISIS right now have the theology on their side.
01:46:17.000 It's not like they're ignoring the books.
01:46:19.000 They're looking at the books very literally.
01:46:21.000 And they're saying, what are we doing that you don't find in the books, essentially?
01:46:31.000 This is just connect the dots.
01:46:33.000 That was one of the videos that you had posted up on your blog that you and I discussed, with the guy that was standing in front of all those people, talking about stoning people for adultery and the treatment of homosexuals, and how this is not radical Islam, this is just Islam. And that was shocking. And that's one of those videos where you post it or you talk about it and you get a million people that get upset at you over it.
01:46:57.000 You get a million people that call you Islamophobic or what have you and get upset about it and a lot of those are the same people.
01:47:05.000 There was a weird thing that happened after Charlie Hebdo that really kind of freaked me out.
01:47:15.000 Yeah.
01:47:16.000 Yeah.
01:47:20.000 Yeah.
01:47:34.000 Justified or at least rationalized the fact that they could just gun down cartoonists?
01:47:41.000 Fucking cartoonists!
01:47:42.000 You're not talking about people that are doing experiments on monkeys or people that are torturing animals.
01:47:49.000 You're not talking about people that are imprisoning other human beings.
01:47:52.000 You're not talking about people that are, you know, even stopping people from doing anything.
01:47:59.000 Just mocking them in cartoon form.
01:48:02.000 And in this case, mocking also Christianity and the Vatican and many of the things that were interpreted as racist weren't even racist if you understood French or French politics.
01:48:12.000 So it's shocking.
01:48:14.000 And the people who missed the train on this, people like Garry Trudeau, the Doonesbury creator, he just came out against Charlie Hebdo, and a bunch of writers who belong to the PEN America organization,
01:48:31.000 the whole point of which is to defend free speech.
01:48:34.000 They just walked out of a gala event or declined to show up because PEN had given Charlie Hebdo the Freedom of Expression Award this year, as they should have.
01:48:45.000 And some prominent people left in protest.
01:48:49.000 And it's...
01:48:52.000 No, the fact that...
01:48:53.000 What is that?
01:48:54.000 What is that?
01:48:55.000 Well, it's political correctness and fears about being perceived as a racist and this notion that you should...
01:49:07.000 That it makes sense to have a double standard here where you can...
01:49:12.000 That there's some trade-off between freedom of expression and freedom of religion, where when the freedom that's being claimed on the religious side is the freedom not to be offended, right?
01:49:26.000 So, I mean, really what's happening here is some number of Muslims...
01:49:31.000 Are demanding that non-Muslims follow the taboos of Islam.
01:49:37.000 So, it's taboo for you to say anything negative about the Prophet, or even to depict him in a drawing, right?
01:49:44.000 That's where it gets really crazy, right?
01:49:47.000 And we want you to follow this taboo, though you are not Muslim.
01:49:50.000 And we feel so strongly about this that we're going to kill you, or make credible threats of killing you, or...
01:49:59.000 We're just going to—when people do kill you, we're going to blame you for having gotten yourself killed, for having been so stupid and insensitive by caricaturing the prophet.
01:50:07.000 And that whole—I mean, that just has to lose.
01:50:11.000 I mean, we have to hold to free speech so unequivocally that all the people over here who think that there's this trade-off between religious sensitivity and free speech just have to realize that they've lost.
01:50:27.000 Because we don't play this game with any other religion.
01:50:29.000 Just think about this analogy I've used before, but the Book of Mormon, right?
01:50:34.000 It just pillories Mormonism.
01:50:37.000 It makes Mormonism look ridiculous, right?
01:50:40.000 What did the Mormons do in response?
01:50:41.000 The Mormons took out ads in Playbill.
01:50:44.000 It was very cute, what they did.
01:50:46.000 They took out ads, like, if you like the play, come learn the real stuff.
01:50:50.000 It was totally civil, good-natured, fine.
01:50:54.000 They're my favorite cult.
01:50:56.000 They really are.
01:50:57.000 I like them way more than I even like Scientology, which is my second favorite cult.
01:51:01.000 But Trey Parker and Matt Stone are not looking over their shoulders for the Mormon assassins.
01:51:07.000 But they were about Muslims.
01:51:10.000 Briefly, they put Muhammad in a bear suit.
01:51:13.000 It was just a bear, right?
01:51:15.000 And then they had to put the bear suit in a van.
01:51:17.000 Yeah, and then they pulled it off the air.
01:51:18.000 Worse still, they had to pull it off the air.
01:51:20.000 And worse still, it made sense for them to pull it off the air, given the actual nature of the threat.
01:51:26.000 So, as I've argued, we have already lost our freedom of speech on this issue.
01:51:31.000 On that one individual issue, we've almost...
01:51:34.000 The only issue on Earth, really.
01:51:37.000 And there are people on the liberal side of this argument who think that is a good thing, that you are a racist to question the...
01:52:05.000 Do you think that that's fear?
01:52:08.000 That it's a fear of Islam, a fear of retaliation, that they want to be on the side of the others because it's so dangerous, because they are the only religion that will come out and kill you.
01:52:16.000 And these same people, I've found, that will call people out on being Islamophobic will not say a fucking peep about anti-Christian rhetoric.
01:52:25.000 If you start talking shit about Jesus or mocking Christianity, they never have a word to say about it because it's not dangerous.
01:52:32.000 Because it's not dangerous to be on that side.
01:52:34.000 I think it's much more just white guilt and political correctness.
01:52:42.000 There's definitely some of that as well.
01:52:43.000 Just a sense.
01:52:44.000 It's just, it is, if you take, again, I don't mean to trash Abby per se.
01:52:49.000 If you met her, you'd love her.
01:52:51.000 I'm telling you, she's a great person.
01:52:52.000 I'm sure she's cool.
01:52:54.000 If you take her view of our foreign policy, if you just agreed with her down the line, just check all those boxes, two million people, we did it all, we just kill brown-skinned people all over the world because we just like to sell bombs, and that's really our moral core, then we should have a fair amount of white guilt,
01:53:13.000 right?
01:53:14.000 Then it's understandable that you think that more or less any...
01:53:20.000 Non-Western population that expresses a grievance against us has a point.
01:53:24.000 Well, isn't there a real problem with saying our?
01:53:27.000 Because you and I have nothing to do with that, and we're a part of this weird gang called the United States of America.
01:53:33.000 Whenever you say us, what we've done, us, I mean, we haven't done shit, but we're somehow or another lumped into this group.
01:53:40.000 That's a big part of it.
01:53:41.000 But we've participated in a system the existence of which is predicated on some of this shit.
01:53:47.000 A system which existed long before you and I were ever born.
01:53:50.000 We're born into a system we have zero control over.
01:53:53.000 And that's why I think that some of the greatest...
01:53:56.000 Ethical changes, the greatest ethical progress for us as a species is going to come not with each one of us developing an ethical code that allows us to be a hero, you know, personally and just bucking a system and bucking a trend, you know,
01:54:12.000 from morning till night.
01:54:13.000 We need to design systems that are more benign.
01:54:16.000 It comes down to our smartphones.
01:54:19.000 Is there a way to produce a smartphone that is ethically benign?
01:54:23.000 At the moment, it seems like there isn't, or at least we're not being so scrupulous as to find one.
01:54:29.000 You mean as far as conflict minerals?
01:54:31.000 Exactly.
01:54:32.000 All of it.
01:54:32.000 Could we actually be good people all the way down the supply chain?
01:54:36.000 Slave labor.
01:54:37.000 All the way down.
01:54:39.000 Now, I would pay more for that phone.
01:54:41.000 There's no question.
01:54:41.000 Well, did you know there was a phone that they were trying to produce about that?
01:54:44.000 It was called the Fair Phone, and it was non-conflict minerals, but it was only 3G. Nobody wanted that piece of shit.
01:54:53.000 I'm not kidding.
01:54:54.000 Pull that up, James.
01:54:55.000 It's called the Fair Phone.
01:54:57.000 Our hold on our better nature is so tenuous that the difference between 4G and 3G could make the difference, right?
01:55:04.000 It's like, let's see if they've moved up to 4G. I'll fucking buy it.
01:55:09.000 Right.
01:55:09.000 But it's got to be...
01:55:11.000 Sold out.
01:55:13.000 Look at that.
01:55:13.000 They're sold out.
01:55:14.000 Wow.
01:55:15.000 My point is no one should have to have a bad phone to be a good person, right?
01:55:19.000 True.
01:55:20.000 So we want systems...
01:55:22.000 That is adorable, right?
01:55:24.000 When you see liberals with an iPhone 6. Right.
01:55:26.000 Like, listen, son...
01:55:27.000 No, but we are those liberals, too.
01:55:30.000 Look at that guy with a fair phone.
01:55:31.000 That's what you get when you get a fair phone.
01:55:33.000 That's perfect.
01:55:33.000 That goddamn brick.
01:55:34.000 Something the size of a toaster.
01:55:36.000 That's some shit from an Ice-T video from 1988. Look at that brick that that guy's got up to his ear.
01:55:42.000 That is an unfair phone.
01:55:43.000 That is a terrible way to sell your phone.
01:55:46.000 Why would you have that fucking ridiculous phone?
01:55:49.000 You can't put that in your pocket, son.
01:55:51.000 Yeah, well, all of those people that buy those things that have those extreme liberal values, progressive values, you have to deal with the absolute reality that, at the very least, your phone is being produced in a factory where people are jumping off the roof.
01:56:08.000 That's a fact.
01:56:09.000 Unless they're making them in Korea, the Samsung phones, I think ethically, I think they have like a leg up on the iPhone in the sense that, you know, those Foxconn buildings where they have nets installed all around the building to keep people from jumping off the roof because it sucks so bad there.
01:56:26.000 And I've heard the argument against that.
01:56:28.000 Well, the amount of people that...
01:56:30.000 You've got to deal with the fact that these factories employ half a million people, and the number of people that commit suicide is roughly proportionate to the number that would commit suicide in the regular population.
01:56:40.000 But they're killing themselves at work.
01:56:42.000 Like, how many people kill themselves at work?
01:56:43.000 That's not normal, and they live at work.
01:56:46.000 Okay, well, that's not normal either.
01:56:47.000 You've got slaves.
01:56:49.000 These are essentially wage slaves.
01:56:51.000 Again, these are situations where there's often...
01:56:55.000 Or at least sometimes, no good option immediately.
01:56:59.000 So when you think of child labor laws in a place like Pakistan...
01:57:03.000 I know, but look at that phone, dude.
01:57:05.000 Look how beautiful it is.
01:57:06.000 Look at that.
01:57:07.000 Talks to you and shit.
01:57:08.000 Come on, man.
01:57:09.000 Look at that screen.
01:57:10.000 Pretty.
01:57:12.000 I'm not going to lose any sleep at night over your owning that phone.
01:57:15.000 Thank you.
01:57:15.000 But it's a...
01:57:17.000 I think you and I would, and millions of other people would probably, I mean, I know I would, but I think millions, if we could make the problem transparent, we would pay more to be truly good actors across,
01:57:36.000 you know, in all of the streams of influence.
01:57:40.000 But there are certain situations, again, where, you know, I just mentioned, you know, child labor laws in Pakistan.
01:57:44.000 If you go...
01:57:46.000 If you just say, no kids can work, right?
01:57:49.000 Because this is obscene.
01:57:51.000 We haven't done this in the West for over 100 years.
01:57:53.000 We don't want kids stitching together our soccer balls, right?
01:57:59.000 These kids should be in school.
01:58:00.000 Well, there are situations where that may be workable, right?
01:58:04.000 Where you get the kid out of the factory, and where he's been working 14 hours a day, and you get him into school, and he's got a better life.
01:58:11.000 But there are many situations in places like Pakistan where...
01:58:15.000 No, what you've just done is you've made it impossible for this kid to work and you've further impoverished his family.
01:58:21.000 Because he wasn't going to go to school anyway.
01:58:22.000 Now he's going to be picking stuff out of a trash heap or whatever it is.
01:58:26.000 And you haven't put in place an alternative that's workable.
01:58:30.000 And so with many problems of this sort, we have to find a path forward where the first doors we open, the choice between the doors we have to open, all suck, right?
01:58:45.000 And there are situations, you know, geopolitically that are like that, where you can either back a guy who's a dictator, right, but he's secular and he's committed to a first approximation to basically sane relations with the rest of the world.
01:59:03.000 But he really is a dictator, and he really has a history of treating people badly, and he's going to treat political dissent very badly because of the possible consequences for him if he doesn't, because the society is bursting, coming apart at the seams.
01:59:19.000 Or you can just let the Islamists and the jihadists run the place, right?
01:59:24.000 And that is a, you know, there's no good option, and it's understandable that we have in many cases chosen the dictator there.
01:59:32.000 Well, that was sort of the situation with Saddam Hussein.
01:59:34.000 Right?
01:59:35.000 Yeah.
01:59:36.000 I mean, a psychopath.
01:59:37.000 His children were psychopaths, murderers, serial killers.
01:59:40.000 He did horrific things, but he was very secular in the way he ran his country.
01:59:44.000 Yeah, and so we're facing this on many fronts.
01:59:49.000 I want to ask you this, because you have these extreme opinions about these things.
01:59:52.000 You have these extreme criticisms.
01:59:55.000 If you could, if ultimately someone said, look, Sam, you're going to be king of the world.
02:00:01.000 You are going to be the guy that gets to sort this mess out.
02:00:06.000 We need someone to engineer a global culture.
02:00:09.000 What would be the step that you would take to try to alleviate some of the suffering of the world, alleviate some of the bloodshed, alleviate all these conflicts, these geopolitical conflicts?
02:00:20.000 Well, in this area, the first few things I would do, we've already talked about.
02:00:25.000 One is I would make it absolutely clear that free speech just wins, right?
02:00:33.000 So whenever you got into a Charlie Hebdo situation or the Danish cartoons, you know, the riots over those cartoons, we've had half a dozen situations like that in the last 10 years.
02:00:47.000 The people...
02:00:50.000 Even our own government can't...
02:00:52.000 We're fighting a war on terror, and we still can't defend free speech when those situations erupt.
02:00:59.000 So, for instance, this was over the Innocence of Muslims film.
02:01:04.000 I don't know if you remember that film.
02:01:05.000 It was a YouTube film that kicked off...
02:01:09.000 Riots everywhere.
02:01:10.000 Was that true, though?
02:01:11.000 Because I've heard so many versions.
02:01:14.000 No, it did.
02:01:15.000 The Benghazi thing, it's true that it did kick off riots everywhere, but the thing that was egregious about our...
02:01:28.000 Government statement there was that we basically just, rather than take the totally sane line of saying, listen, in our society we're committed to freedom of speech, and you can make films about anything here, and that never gives you license to kill people,
02:01:44.000 right?
02:01:44.000 Or to burn embassies, you know, full stop.
02:01:47.000 What was the name of the documentary?
02:01:49.000 Well, it was a film called The Innocence of Muslims, or Innocence of Muslims, made by some crackpot somewhere.
02:01:58.000 And it was just a YouTube video, but it got spun as this major scandal in the Muslim world, and it reliably produced this reaction of the sort that the Danish cartoons had.
02:02:12.000 And we, rather than just hold the line for free speech...
02:02:17.000 We, I mean, the State Department said something like, you know, we totally repudiate this attack upon Islam.
02:02:27.000 And we just distanced ourselves from it just as a way of trying to contain the madness, right?
02:02:33.000 It was a symptom of just how afraid we are that this sort of thing can get out of hand in the Muslim world, because it can, right?
02:02:42.000 If there's a rumor that a Quran got burned, or if some, you know, pastor...
02:02:47.000 In Florida, threatens to burn a Quran.
02:02:51.000 Reliably, people by the dozens get killed in places like Afghanistan.
02:02:56.000 And in a way that a suicide bombing between Sunni and Shia never produces a response of that sort.
02:03:03.000 So I would hold to free speech, and I would just make that...
02:03:10.000 Because free speech is the freedom that safeguards every other freedom.
02:03:14.000 If you can't speak freely, if you can't criticize powerful people...
02:03:19.000 Or powerfully bad ideas.
02:03:21.000 There's just no way to defend society from slipping back into theocracy or any other kind of medieval situation.
02:03:34.000 So you have to defend free speech.
02:03:36.000 Even speech you don't like.
02:03:38.000 It's like these Holocaust denial laws in Western Europe.
02:03:42.000 It's illegal to deny the Holocaust in Germany and a few other countries.
02:03:47.000 I think Austria.
02:03:50.000 I think even France.
02:03:52.000 And it's a ludicrous law.
02:03:54.000 You should be totally free to deny the Holocaust, and then everyone else should be free to treat you like an idiot.
02:03:59.000 And you should be free to destroy your reputation, right?
02:04:03.000 The fact that they are putting people in jail For denying the Holocaust is totally counterproductive.
02:04:10.000 And it does look like, in defense of Muslim apologists, it does look like a double standard.
02:04:15.000 You're going to put people in jail for denying the Holocaust, but you're going to allow Charlie Hebdo to criticize the Prophet?
02:04:21.000 How does that make sense, right?
02:04:23.000 I totally agree with them there.
02:04:24.000 We should not be criminalizing any form of speech.
02:04:29.000 Regardless of how stupid.
02:04:30.000 Yeah, but there are people trying to push through blasphemy laws.
02:04:34.000 There's a politician in the UK who recently just said he would make Islamophobia a criminal offense, right?
02:04:41.000 I'm sure he would make the sorts of things I say about Islam criminally actionable in the UK, right?
02:04:48.000 This is a disaster.
02:04:49.000 That's the wrong road to go down.
02:04:51.000 So, first thing.
02:04:53.000 And I think that's a hugely important thing.
02:04:55.000 And the other piece we just talked about is just getting off of oil.
02:04:59.000 Just imagine that one change, right?
02:05:01.000 We could get off of oil.
02:05:04.000 And that would prove, beyond any shadow of a doubt, that spending your life splitting hairs about...
02:05:16.000 Muslim theology and demonizing the rest of the world and exporting crazy madrasas by the tens of thousands all over the world, as the Saudis do.
02:05:29.000 It would prove that that is not a way to join the community of functional nations, because absent an ability to pull their wealth out of the ground, they have no intellectual content.
02:05:40.000 They don't produce anything of value that anyone wants.
02:05:44.000 That's a problem they would have to solve, right, if they don't want to be beggared in a global community.
02:05:49.000 Well, isn't that an issue also with the ideology of the religion, is that you're not allowed to question or change or manipulate the way you approach life because it's all dictated by the religion?
02:06:00.000 Even their finances.
02:06:01.000 Well, it's part of it.
02:06:03.000 And just when you look at societies where they keep half the population, the female half, more or less hostage and unable to get educated or to work or to drive cars, depending on which society you're talking about.
02:06:19.000 This economically and socially doesn't make any sense in a context where you need to produce intellectual content to be part of a global conversation.
02:06:31.000 So the only way they've been able to do this is because of the fact that they have an extreme amount of money that comes from oil.
02:06:36.000 Well, certainly if you're talking about the oil states, yeah.
02:06:39.000 And so that's...
02:06:41.000 If oil were no longer valuable...
02:06:45.000 And we actually could get to a time where that would be the case, where oil is just a dirty fluid that no one wants to have anything to do with, right?
02:06:58.000 That would be a huge change.
02:07:00.000 Now, I'm sure there's another side to this argument where it would be a destabilizing change.
02:07:04.000 I mean, just imagine how things will start to run off the rails in the Middle East if oil is worthless, right?
02:07:10.000 And what's Saudi Arabia going to be like?
02:07:12.000 Arguably, I think they've probably hedged their bets and they have so much money in other investments now that at least the royal family would be fine.
02:07:19.000 But it's a huge part of the problem.
02:07:25.000 And as you pointed out, it keeps us double-dealing and being...
02:07:53.000 But how do you get someone to abandon such a rigid ideology?
02:07:56.000 How do you get someone to open their mind up to the possibility that this was just written by people?
02:08:01.000 There's just a way of governing people and keeping people in line, which is essentially every single religion that's ever been created.
02:08:10.000 Well, but see, it happens.
02:08:13.000 Actually, this is another point that Abby Martin made, which I agree with.
02:08:19.000 We don't agree with it.
02:08:21.000 We don't think this thought for the same reasons.
02:08:23.000 But she pointed out that religions change, right?
02:08:25.000 That you roll back the clock 500 or so years, Christians were burning people alive and actively prosecuting people for sins.
02:08:35.000 For blasphemy, you had the Inquisition in Europe, and that was every bit as much of a horror show as what's going on in Iraq now.
02:08:41.000 So look, Christianity can be just as bad as Islam.
02:08:45.000 Now, it's true, as a matter of history, that is in fact true.
02:08:50.000 There are differences between Islam and Christianity that are nevertheless important, but the crucial piece is that Christianity did not change from the inside.
02:08:59.000 You know, Christianity got hammered from the outside by a Renaissance and, I mean, a Reformation initially, which was bloody and horrible, but it got hammered by the forces of a scientific revolution and an attendant industrial revolution and capitalism and the rest of culture that didn't want to be shackled to theocracy for very good reason,
02:09:26.000 right?
02:09:27.000 And so, like, once you have a real science of medicine, you don't have to ask the priest why your child is flopping around on the floor.
02:09:34.000 And when the priest diagnoses it as demonic possession, you don't take it seriously, right?
02:09:39.000 You now have a neurologist to talk to.
02:09:41.000 I mean, there's progress made outside of religion, which propagates back to religion and applies a lot of pressure to it.
02:09:50.000 So...
02:09:52.000 Christianity has been humbled and mastered by the secular world, by humanism, by science, by the rest of our conversation with ourselves.
02:10:02.000 And this has not happened in the Muslim world.
02:10:04.000 And it should happen.
02:10:05.000 It has to happen.
02:10:06.000 We have to figure out how to engineer it for Muslims.
02:10:08.000 And again, it's not going to come from the outside.
02:10:11.000 Non-Muslims are not going to force it on Muslims.
02:10:13.000 But we have to support the genuine reformers And the people who are fighting for the rights of women and gays and free thinkers in the Muslim world.
02:10:25.000 And the horrible thing is that the liberals on our side don't do that.
02:10:29.000 The liberals on our side criticize people like me and even Ayaan Hirsi Ali, a former Muslim who has been hunted by theocrats, right?
02:11:00.000 We are concerned about the rights of women in Silicon Valley, right?
02:11:05.000 That's how effete our concerns are now.
02:11:08.000 Why isn't there an equal number of women in venture capital now?
02:11:13.000 What a fucking scandal, right?
02:11:14.000 There are people who can go on for hours about that, right?
02:11:17.000 I'm not saying it's not a potential scandal.
02:11:19.000 Great, let's talk about that scandal.
02:11:21.000 But let's talk about the fact that girls six years of age are getting clitorectomies by barbarians in septic conditions, and everyone around them thinks it's a good and necessary thing, right?
02:11:38.000 And, you know, women who get raped get killed because they brought dishonor on their family.
02:11:43.000 I mean, there's another planet over there that we have to interact with because violence is coming our way for no other reason.
02:11:50.000 But there's another reason.
02:11:52.000 There's the ethical imperative of figuring out how to help people who are hostage to a bad system.
02:11:59.000 And so, yeah, let's be for women's rights globally, but what does that look like?
02:12:05.000 That looks like a rather staunch criticism of the way women are treated under Islam.
02:12:10.000 There's a lot of seemingly open-minded European cultures that have opened the door for a lot of Islamic immigrants or Muslim immigrants to come over to their country.
02:12:19.000 Now they're dealing with a lot of the issues that involve these ideologies being a part of their culture now.
02:12:26.000 Yeah.
02:12:27.000 Well, they are in a situation similar to ours with Latin America, where they need immigrant labor, right?
02:12:36.000 They're actually worse than the U.S. in terms of their replacement rate.
02:12:39.000 You've got a bunch of countries in Western Europe who...
02:12:59.000 We have political refugees who are leaving war-torn places for obvious reasons and winding up in the closest shores across the Mediterranean.
02:13:09.000 So yeah, the people they're attracting are different from many of the Muslim immigrants we get in the U.S. who are coming to work for Google or they get engineering degrees.
02:13:22.000 It's a different demographic, largely.
02:13:26.000 Okay, I think we've covered that subject.
02:13:29.000 Into the ground.
02:13:30.000 So let me...
02:13:31.000 I'll just mention the things on this list, and you can choose.
02:13:33.000 Send Abby a big kiss.
02:13:34.000 And again, I hope what I said about Abby didn't seem mean-spirited.
02:13:39.000 No.
02:13:39.000 She actually is wrong about me, and if I'm wrong about her, I'm happy to be enlightened on that topic.
02:13:45.000 Well, it'd be interesting to have two of you sit down together.
02:13:48.000 That'd be hilarious.
02:13:50.000 I'll bring the tequila.
02:13:51.000 Okay.
02:13:51.000 Is that what's necessary?
02:13:53.000 For me.
02:13:54.000 I'll bring my own tequila.
02:13:55.000 Okay.
02:13:56.000 So we did free will, I think, unless that comes up again.
02:14:01.000 AI, which...
02:14:02.000 Is that something you're concerned with?
02:14:04.000 Yeah.
02:14:05.000 I actually just blogged about this and spoke about it.
02:14:08.000 But I think it was in my head to talk about it because I heard you talk about it with someone.
02:14:12.000 It might have been Duncan Trussell.
02:14:14.000 Most likely.
02:14:14.000 We've talked about it many times.
02:14:16.000 But AI just...
02:14:18.000 I got onto this on the bandwagon here because I hadn't really thought about it at all.
02:14:24.000 I'm not really a sci-fi geek.
02:14:26.000 I don't read science fiction.
02:14:28.000 And the word in neuroscience has been, for a very long time, and really science generally, is that AI hasn't panned out.
02:14:39.000 It's not that it's inconceivable that something interesting is going to happen, but the sense has been that old-style AI was really a dead end, and we never really got out of that particular cul-de-sac, and we just haven't made much progress, and so we have, you know,
02:14:54.000 the best chess player in the world is a computer that's the size of this table, but...
02:15:00.000 The prospect of having truly general artificial intelligence and superhuman level intelligence, that's not something we have to worry about in the near term at all.
02:15:12.000 But then I heard, as many people did, my friend Elon Musk say something which seemed quite hyperbolic.
02:15:23.000 He thought it was the greatest threat to humanity, probably worse than nuclear weapons.
02:15:28.000 And there was a lot of pushback against him there.
02:15:33.000 But I actually know Elon, and I knew he just wouldn't say that without any basis for it.
02:15:40.000 And it just so happened there was a conference that had been scheduled long before in Puerto Rico, in San Juan, that was really like a closed-door conference for the people who are really at the cutting edge of AI research.
02:15:52.000 It was organized by the Future of Life Institute, which is a...
02:15:57.000 A non-profit purposed toward looking at existential threats and, in this case,
02:16:07.000 foreseeing the existential problems around the development of AI. And it was a conference.
02:16:16.000 I mean, maybe there were 70 people at the conference.
02:16:18.000 And it was all people who were doing this work and...
02:16:25.000 I literally think I was one of maybe two people who had sort of talked his way into the conference.
02:16:29.000 Everyone else was just invited and they had a good reason to be there.
02:16:35.000 What was interesting is that outside this conference, Elon was getting a lot of pushback.
02:16:39.000 Like, dude, you don't know what you're talking about.
02:16:42.000 Go back to your rockets and your cars, but you don't know anything about computers, apparently.
02:16:46.000 And he was getting this pushback from serious people, people who are on the edge.org website, where I am also occasionally published.
02:16:57.000 You know, roboticists at MIT and people who should, you know, former top people at Microsoft, people who you think are very close to this, would say, no, no, this is 50 or 100 years out, and this is crazy.
02:17:13.000 And so anyway, I went to this conference just wanting to see, you know, what was up.
02:17:20.000 And what was interesting and, frankly, scary was that at the conference, even among people who have clearly drunk the Kool-Aid and are just not willing to pull the brakes on this at all, I mean, they don't even...
02:17:34.000 Arguably, it's hard to conceive of how you would pull the brakes on this, because the incentive to make these breakthroughs financially is just so huge that, you know, if you don't do it, someone will, and so everyone's just pedal to the metal.
02:17:46.000 But...
02:17:51.000 Basically, even the people who were going at this most aggressively were people who were conceding that huge...
02:18:03.000 It was not at all fanciful to say that huge breakthroughs in artificial general intelligence could come in five or ten years, given the nature of the progress that had been made in the last ten years.
02:18:14.000 And the scary thing is that when you look at the details, it's not at all obvious to see a path forward that doesn't just destroy us.
02:18:25.000 Because it's not...
02:18:27.000 You'd think that...
02:18:29.000 I mean, I think most people's default...
02:18:35.000 Yeah, I think.
02:18:48.000 The clock changes and nothing happens, right?
02:18:51.000 So this is just you got a bunch of nerds worried about something that just doesn't happen, right?
02:18:56.000 So is that an analogy for the situation?
02:19:00.000 It really isn't.
02:19:02.000 What's going on here is you're talking about Even in the most benign...
02:19:09.000 Well, let me just step back, because I'm assuming that a lot of people understand what we're talking about here.
02:19:16.000 What we're talking about is producing what's called strong AI, or often called AGI, like artificial general intelligence.
02:19:25.000 You're talking about a machine that is...
02:19:28.000 The intelligence is not brittle.
02:19:31.000 The best chess-playing computer in the world can't play tic-tac-toe.
02:19:36.000 All it can do is play chess.
02:19:38.000 It's not a general intelligence.
02:19:40.000 You're talking about something that is...
02:19:44.000 That learns how to learn in such a way that the learning transfers to novel situations, and it doesn't degrade.
02:19:55.000 If you give it a new problem, it won't get worse at the other problems that it got good at because you're giving it new problems now.
02:20:03.000 So you're giving it something that scales, that can move into new territories, that can become better at learning, and in the ultimate case...
02:20:13.000 Can make improvements to itself.
02:20:15.000 I mean, once these machines become the best designers of the next iteration of software and hardware, well, then you get this sort of...
02:20:24.000 this exponential takeoff function, or, you know, often called the singularity, where you have something where there's a runaway effect, where it's just...
02:20:31.000 you can't...
02:20:32.000 this is now...
02:20:33.000 the capacities are...
02:20:35.000 it's just gotten away from you.
02:20:39.000 And...
02:20:41.000 So you imagine, what's often said is that we're going to build something, the near-term goal is to build something that's human-level intelligence, right?
02:20:49.000 So you're going to build, we have a chess computer that's not quite as good as a person, and then it is as good as a person, and now it's, you know, a little better than a person, but it's still not so much better as to be completely uncanny to us.
02:21:01.000 And we're thinking of doing that for everything, but...
02:21:04.000 The truth is that is a mirage.
02:21:06.000 We're not going to build a human-level AGI. Once we build an AGI, it's going to be better at...
02:21:12.000 Which is to say, once we build a truly generalizable intelligence, something that can prove mathematical theorems and make scientific hypotheses and test them, and everything a human can do...
02:21:25.000 It's going to be so much better than a human for a variety of reasons.
02:21:30.000 One is that your phone is already better than you.
02:21:33.000 It's superhuman in many respects.
02:21:35.000 It has a superhuman memory.
02:21:37.000 It has a superhuman capacity to calculate.
02:21:41.000 And if you hook it to the internet, it has potential access to all of human knowledge.
02:21:47.000 So we're not going to build a human-level AGI. We're going to build something that is going to be not an AGI, right?
02:21:52.000 It's going to be like a dumb chess-playing computer until it isn't.
02:21:56.000 And then it's going to be superhuman, right?
02:21:59.000 And when you're talking about something that runs...
02:22:04.000 Potentially a million times or more faster than a human brain, because you're not talking about a biological system now.
02:22:09.000 You're talking about photons.
02:22:13.000 It could make...
02:22:15.000 You just do the math and you see that this thing is running for a week.
02:22:19.000 That is the equivalent of 20,000 years of intellectual progress.
02:22:23.000 So it's just like, get the smartest people alive in a room for 20,000 years with access to all of the world's information.
02:22:32.000 And an ability to model new experiments and computational abilities of the sort that we can't imagine.
02:22:40.000 And 20,000 years from now, what are they going to come back to you with?
02:22:43.000 That's going to be one week of this machine running, right?
02:22:46.000 So this is how this thing sort of escapes.
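[Editor's note: the back-of-the-envelope arithmetic behind that claim can be written out explicitly. A minimal sketch in Python, assuming the million-fold speedup and one-week runtime figures used in the conversation above; both numbers are illustrative, not measured values.]

```python
# Rough arithmetic behind the "one week of runtime ~ 20,000 years of thinking" claim.
# All inputs are the illustrative assumptions from the conversation, not real measurements.

speedup = 1_000_000        # assumed: the machine "thinks" a million times faster than a human
runtime_weeks = 1          # assumed: the machine runs for one week of wall-clock time
weeks_per_year = 52.18     # average number of weeks in a year

equivalent_years = runtime_weeks * speedup / weeks_per_year
print(f"Roughly {equivalent_years:,.0f} human-years of intellectual work")  # ~19,165, i.e. about 20,000
```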
02:22:48.000 How do we feel that we can control the goals...
02:22:55.000 The behavior of a system that is capable of making 20,000 years of progress in a week, right?
02:23:03.000 And when you hear about how they're going about designing these systems, it is kind of uncanny.
02:23:10.000 They're talking about blackboxing these systems where...
02:23:15.000 And the first thing you want to do is not give it access to the Internet, right?
02:23:18.000 You're just going to cage this thing, right?
02:23:20.000 Because you don't want it to get out, right?
02:23:22.000 But you want to tempt it.
02:23:24.000 You want to see if it's trying to get out, right?
02:23:26.000 So you're going to give it like a dummy Ethernet port that you're monitoring, right?
02:23:30.000 I mean, this is...
02:23:32.000 The people doing this work at the highest level...
02:23:35.000 Are talking about games like this.
02:23:37.000 We're like, how do you know whether the thing is lying to you that it tried to make...
02:23:41.000 How do you know whether it knows about the Internet?
02:23:44.000 How do you know whether it is...
02:23:46.000 This is called a honeypot strategy, where you tempt it to make certain moves in the direction of acquiring more power than you wanted to give it, and then you can just, you know, shut it off immediately.
02:23:56.000 But you're talking about guys who are a lot younger than us, many of whom are somewhere on the Asperger's continuum, who are drinking a lot of Red Bull and have billions of dollars at their disposal to do this work.
02:24:20.000 There's a huge responsibility not to do anything that obviously destroys the world.
02:24:26.000 And the problem is, even when you think about the most benign versions of this, the possibility of destroying the world is not fanciful.
02:24:36.000 Forget about what I just said about 20,000 years of progress.
02:24:39.000 Just imagine we have an AI. Someone working for Facebook or whatever builds this thing.
02:24:47.000 We've solved what's called the control problem.
02:24:49.000 We've figured out how to keep this thing doing our bidding.
02:24:53.000 It's not going to come up with near-term goals that are antithetical to human happiness.
02:25:00.000 It's just a non-trivial problem.
02:25:02.000 If you say, you have to be committed to human well-being, if that's the foundational architecture of the system, It depends what that means in certain cases.
02:25:12.000 I mean, what if the thing decides, well, okay, if I'm committed to human well-being, I'm going to kill all the unhappy people, right?
02:25:17.000 Or I'm just going to plug, you know, electrodes into the right part of the brain of every human being and give them just pure pleasure, right?
02:25:22.000 You have to solve these problems.
02:25:26.000 So let's say we build something that's totally under our control, it works perfectly, and we don't have to worry about the control problem.
02:25:32.000 We still have to worry about the political and economic effects of building something that's going to put the better part of humanity out of work.
02:25:41.000 I mean, you're talking about now something that can build further iterations of itself, where the cost of building versions of itself now is going to plummet to more or less the cost of raw materials.
02:25:52.000 You're talking about a labor-saving device of a sort that no one has ever anticipated.
02:25:58.000 And we don't have a political system that can absorb that.
02:26:02.000 We have a political system where we would see the picture of some trillionaire on the cover of Inc.
02:26:10.000 magazine, and we would hear that unemployment now was at 30% even among white-collar people.
02:26:18.000 And so we need to...
02:26:20.000 It's humbling to realize that even if we were given the perfect labor-saving device, it could screw up the world.
02:26:28.000 We couldn't reliably share that wealth with all of humanity, which is, of course, what we should do.
02:26:34.000 But we're in a system where...
02:26:36.000 The Chinese and the Russians would probably reasonably worry that we're going to use this thing as the ultimate tool of war, right?
02:26:44.000 Both terrestrial and cyber.
02:26:46.000 So just imagine the cyber war and the drone war we could unleash on the rest of the world if we had the ultimate war-making computer, right?
02:26:57.000 And they didn't.
02:26:57.000 So this is like a winner-take-all scenario that is...
02:27:01.000 It's unsustainable politically.
02:27:04.000 Politically, we have to be in a position, and economically, where if this thing were handed to us, we could use it for benign purposes and share the wealth, and we're not there yet.
02:27:14.000 And that is the best case scenario.
02:27:17.000 That isn't even dealing with any of the problems of this thing having a will of its own, which of course it would.
02:27:23.000 Is it possible that it's the next stage of life?
02:27:27.000 Yeah, well, that's the other uncanny thing at this conference.
02:27:30.000 You had a few people whose names escaped me, unfortunately.
02:27:36.000 Actually, no, the rules of the conference were I couldn't even mention their names if they hadn't escaped me.
02:27:40.000 Really?
02:27:41.000 Yeah.
02:27:41.000 Whoa.
02:27:42.000 There's secret rules?
02:27:44.000 Well, no, it's called the Chatham House Rules.
02:27:46.000 Certain conferences are organized under, you know, board meetings are organized under these...
02:27:52.000 Rules where you, because you want to encourage, so there's no press there, right?
02:27:55.000 And you want to encourage just a free exchange of ideas, and you can talk about what was talked about there, but you can't give any attribution, and you can't, I mean, nothing's formally on the record.
02:28:07.000 Where did this conference take place?
02:28:08.000 Puerto Rico.
02:28:09.000 Whoa, you had to go to another country, sort of.
02:28:11.000 Not really, but sort of, right?
02:28:13.000 Yeah, I don't know that that was...
02:28:14.000 I think it was just...
02:28:15.000 They were looking for good weather.
02:28:16.000 It was in the middle of winter.
02:28:17.000 Please.
02:28:17.000 They had plans.
02:28:19.000 They wanted to go to where that giant Arecibo...
02:28:21.000 Isn't that where the...
02:28:22.000 One of the big telescopes they used to search for extraterrestrial intelligence from the movie Contact?
02:28:29.000 Wasn't that in Puerto Rico?
02:28:30.000 No, I don't know.
02:28:30.000 The Arecibo dish?
02:28:32.000 I feel like that's it.
02:28:33.000 They have some of those things in the southern hemisphere.
02:28:39.000 So...
02:28:40.000 Isn't it?
02:28:41.000 Oh, so your question...
02:28:42.000 Your question about whether this could be the next form of life.
02:28:44.000 But it's a new stage of life.
02:28:45.000 I mean, are we a caterpillar that's giving birth to a butterfly that we're not aware of?
02:28:49.000 Essentially, one of these guys gave a talk that was all purposed toward making that ethical case, that this is...
02:28:58.000 Is that Puerto Rico?
02:28:58.000 Yeah.
02:28:59.000 They're talking to aliens, bro.
02:29:01.000 They're not even letting you know.
02:29:02.000 They're already planning.
02:29:03.000 Aliens are making these things.
02:29:05.000 They're making an alien.
02:29:06.000 That's what an alien is.
02:29:07.000 But that's the thing.
02:29:08.000 This thing then gets that weird.
02:29:11.000 Like, when you imagine...
02:29:14.000 This thing getting away from us, yeah, it would be its own...
02:29:17.000 Now, whether or not it would be conscious...
02:29:18.000 I'm actually agnostic as to whether or not a super-intelligent computer would, by definition, be conscious.
02:29:23.000 It could be unconscious.
02:29:25.000 It could be nothing that it's like to be that sort of system, or it could be conscious.
02:29:30.000 But, in any case, this one guy gave a talk...
02:29:34.000 This one guy gave a talk where he just speculated about this thing taking off and...
02:29:42.000 More or less standing in relation to us the way we stand in relation to bacteria or snails or, you know, life forms that we just squash without a qualm.
02:29:52.000 And that's a totally benign, acceptable, not only acceptable, but to be hoped for, right?
02:30:00.000 And I can follow him there at least halfway.
02:30:05.000 If you imagine that these are conscious and actually become the center of the greatest possible happiness in the universe, right?
02:30:14.000 So, like, if you imagine we build—if we give birth to a conscious machine that is essentially a god, right, that has interests and states of pleasure and insight and meaning that we can't even imagine, right— That is,
02:30:32.000 that thing by definition, by my definition, becomes more important than us, right?
02:30:36.000 Then we really are like the chickens that, you know, hope we don't kill them to eat them, but they're just chickens and we're more important because we have a, you know, this greater scope to our pains and pleasures.
02:30:48.000 And that's not to say that I don't see any moral problem with killing chickens.
02:30:51.000 I'm just saying that we are more important than chickens because of the nature of human experience.
02:30:56.000 And its possibilities.
02:30:57.000 But if we build a machine that stands in relation to us the way we stand in relation to chickens, or far beyond, right?
02:31:05.000 It's nowhere written that the spectrum of possible intelligence ends somewhere close to where we are.
02:31:12.000 Not only that, there's nowhere written that they cannot create far better versions than we could ever possibly imagine.
02:31:19.000 That's implicit in what I'm saying.
02:31:21.000 We're imagining that this takeoff would be This machine makes recursive improvements to itself or to new generations.
02:31:31.000 Yeah, so it's changing its own code, it's learning how to build better versions of itself, and it just takes off.
02:31:39.000 But one horrible possibility...
02:31:43.000 Is that this is not conscious, right?
02:31:45.000 That there's no good that has come of this.
02:31:47.000 This is just blind mechanism, which still is godlike in its power.
02:31:57.000 And it could be antithetical to our survival, or it could just sort of part ways with us, you know?
02:32:04.000 That's the mindfuck of all mindfucks, is that we really are just a caterpillar, and we're giving birth to this ultimate fractal intelligence that's infinite in its span.
02:32:15.000 You could create something within, like as you said, a week, 20,000 years of life.
02:32:31.000 Oh, yeah.
02:32:39.000 It's sort of kind of crazy that, as you said, a lot of these guys that are creating these things are on the spectrum.
02:32:45.000 What is that from?
02:32:47.000 Is it possible that these super intelligent human beings, that a lot of them do have this sort of Asperger's-y way of approaching life, and a lot of them are not on the spectrum.
02:33:02.000 Did nature sort of design that in order to make sure that we do create these things?
02:33:09.000 I mean, if everything in life, if life itself, everything in life... We look at alpha wolves and the way caterpillars interact with their environment, and bugs and whatever, and all that stuff's natural. Is human behavior, human cognitive thinking, human creativity, is all that nature? Is all that just a part of human beings' ultimate curiosity, almost inevitably leading to the creation of artificial intelligence?
02:33:34.000 Was it sort of programmed into the system to create something far better than what we are?
02:33:39.000 Well, I wouldn't say it's programmed into the system necessarily.
02:33:42.000 I think you can explain all this just by...
02:33:46.000 Everything being pushed from behind by...
02:33:48.000 We're following our own interests.
02:33:50.000 We're trying to survive.
02:33:50.000 We have all of the inclinations and abilities that evolution has selected for in us, and we have an ability to create increasingly powerful technology.
02:34:01.000 And the...
02:34:03.000 But the inevitability of this is hard to escape.
02:34:07.000 There are really only two assumptions.
02:34:10.000 All you have to assume is that we are going to build better and better computers, which I think you have to assume, apart from the possibility that we're just going to destroy ourselves and lose the ability to do so.
02:34:22.000 But if we don't destroy ourselves some other way, we are going to continue to make progress in both hardware and software design.
02:34:31.000 And the only other thing you have to assume is that there's nothing magical about the wetware we have inside our heads as far as information processing is concerned and that it's possible to build intelligent machines in silicon, right?
02:34:47.000 I can't imagine any, at this point, serious scientist fundamentally doubting either of those two assumptions.
02:34:56.000 Nobody thinks that there's something magical about, you know, neural material when it comes to, you know, the processing of information that underlies intelligence.
02:35:10.000 And we're just going to keep making progress.
02:35:12.000 So at some point this progress is going to birth a generation of computers that is better able to make this sort of progress than we are.
02:35:22.000 And then it takes off.
02:35:24.000 And so the benign version of this that some people imagine, and this is where the whole singularity begins to sound like a religion, but there are many people in Silicon Valley imagining that we are going to merge. We're going to upload our consciousnesses onto the internet eventually and become immortal and just live in the dreamscape of the paradise that we have engineered
02:35:54.000 into our machines.
02:35:57.000 That vision presupposes a few other things that are much more far-fetched than the first two assumptions I just listed.
02:36:09.000 One is that before this happens, we will crack the neural code and truly understand how to upload the information in a human brain into another medium, and that you could move consciousness, mind and consciousness, into the Internet.
02:36:46.000 Haven't we just doubled you?
02:36:48.000 And then when you die, aren't you just dying every bit as much as you would be dying if we hadn't done that?
02:36:52.000 I mean, there are problems of sort of identity that come in there that are sort of hard to solve.
02:36:58.000 But no, there are people who are looking at this as a, you know, it's very much like we're building the matrix in some sense, and we're going to leap into it at the opportune moment, and it's going to be glorious.
02:37:11.000 That is such a utopian possibility.
02:37:14.000 Like, that's the utopian version of Ex Machina, right?
02:37:18.000 Which comes out, isn't that out right now?
02:37:20.000 Yeah, I saw it.
02:37:21.000 I haven't seen it yet.
02:37:22.000 But I mean, this is what we're talking about.
02:37:24.000 I mean, that's just, that is probably the most benign version of it.
02:37:28.000 An artificial person.
02:37:30.000 It's like, you can't distinguish between that and a real person.
02:37:33.000 Although not if you've seen the film.
02:37:34.000 Well, I haven't yet.
02:37:35.000 But our own consciousness is, I mean, it's so archaic in comparison.
02:37:40.000 If you're talking about something that can exponentially increase in one week, 20,000 years, and then on and on and on from there, why would you want to take your consciousness and download it?
02:37:51.000 It's like a chicken asking, you know, how can I stay a chicken?
02:37:55.000 Where are my feathers going to go in this new world?
02:37:58.000 Yeah, if you could take an ant and turn it into Einstein, would it really want to go back to being an ant?
02:38:02.000 I prefer digging in the dirt and just dropping my eggs and cutting leaves.
02:38:08.000 What would it do?
02:38:11.000 Ultimately, this is inescapable, it seems like.
02:38:13.000 Our thirst for ingenuity and innovation is just never going to slow down.
02:38:19.000 And our ability to do that is never going to slow down either, unless a supervolcano hits.
02:38:26.000 Yeah.
02:38:26.000 Well, and the other side of this is there's so many problems that we would want artificial intelligence to solve for us.
02:38:33.000 I mean, you think of, you know, curing Alzheimer's or solving, you know, global economic problems.
02:38:38.000 How great would it be to have a reliably benign superintelligence, which literally would be like an oracle, right?
02:38:47.000 Or a god, to...
02:38:51.000 Help us solve problems that we're not smart enough to solve.
02:38:54.000 But the prospect of building that and keeping it reliably benign or keeping ourselves from going nuts in its presence, it's just a non-trivial problem.
02:39:06.000 Wouldn't it almost instantaneously recognize that part of the problem is us itself?
02:39:11.000 We're the problem!
02:39:12.000 That's one reasonable fear, yeah.
02:39:15.000 And it would also immediately recognize, like, hey, this planet has only got, like, another billion years of reliable sunlight.
02:39:22.000 Like, we've got to get the fuck out of here and propagate the universe.
02:39:25.000 Well, there's a great book, if you really want to get into this, there's a book by the philosopher Nick Bostrom and...
02:39:32.000 He actually might have been the one who convinced Elon this is such a problem.
02:39:37.000 He was one of the organizers at this conference, and virtually everyone had read his book at the conference.
02:39:42.000 He wrote a book called Superintelligence, which just lays out the whole case.
02:39:47.000 Virtually everything that you've heard me say on this topic is some version of a concern that he expresses in that book.
02:39:57.000 And it's very interesting because he just goes through it.
02:40:02.000 It's like 400 pages of systematically closing the door to every utopian way this could go right for us.
02:40:14.000 And he just is like, yeah, well, here are the things you're not foreseeing about how even a...
02:40:22.000 You just have to anticipate absolutely everything.
02:40:25.000 So if you're trying to create a machine that is going to block spam, you need to create a machine that will not, as a strategy for reducing spam, just kill people.
02:40:38.000 That's a way to reduce spam.
02:40:40.000 That's the only way.
02:40:42.000 Yeah, it's like common sense things.
02:40:44.000 You can't assume common sense in a super intelligent machine unless you have engineered it into the architecture or you have taught it, you've built it to emulate your values.
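[Editor's note: the spam example can be made concrete with a toy sketch. This is a purely hypothetical illustration of a misspecified objective, not any real system: an "agent" that scores actions only by how much spam remains will rank degenerate actions just as highly, because nothing in its objective encodes the common-sense constraints discussed above.]

```python
# Toy illustration of a misspecified objective (hypothetical, for illustration only).
# The "agent" picks whichever action scores best under a naive objective:
# minimize spam remaining. No term encodes harm to humans.

actions = {
    "filter_spam":       {"spam_remaining": 10, "humans_harmed": 0},
    "delete_all_email":  {"spam_remaining": 0,  "humans_harmed": 0},
    "eliminate_senders": {"spam_remaining": 0,  "humans_harmed": 1_000_000},
}

def naive_score(outcome):
    # Only cares about spam; "common sense" constraints were never engineered in.
    return -outcome["spam_remaining"]

best = max(actions, key=lambda a: naive_score(actions[a]))
print(best)  # prints "delete_all_email", but "eliminate_senders" scores just as well,
             # because nothing in the objective penalizes harming people
```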
02:40:59.000 There's strategies where...
02:41:02.000 You would build a machine where it would not merely emulate current human values.
02:41:12.000 Ultimately, you want a machine that instantiates the values that we should have, not that we necessarily do in any moment.
02:41:22.000 One thing that's interesting to me in thinking about this is that the moment you think about building a machine like this, You realize that you have to solve some fundamental philosophical problems.
02:41:34.000 You can't pretend that everyone has equivalently valuable values, right?
02:41:41.000 Because you have to decide what to engineer into this thing.
02:41:44.000 So do you want to engineer the values of jihadists and Islamists?
02:41:49.000 Did the Taliban get a vote on how we should design the values of this thing?
02:41:54.000 I think it's pretty obvious that the answer to that is no.
02:41:59.000 But then you have to cut through basically every other moral quandary we have, because this thing is going to be acting on the basis of values.
02:42:08.000 But initially, if it's independent and autonomous, it's going to automatically realize that a lot of our ideas are based on our own biological needs, and that a lot of those are unnecessary for it.
02:42:22.000 Oh, yeah, but we will be building it, I mean, if we're sane, we're going to build it not to be merely self-interested.
02:42:30.000 We're going to build it to conserve our interests, whatever those deepest interests are, ultimately.
02:42:36.000 Again, utopian, though.
02:42:38.000 Otherwise, we're building, you know, Satan.
02:42:43.000 Which is what Elon Musk said.
02:42:45.000 Summoning the demon, yeah.
02:42:47.000 We're building a wrecking ball, and we're going to swing it out away from the planet and watch it hurtle back.
02:42:53.000 So essentially, the Unabomber was right.
02:42:57.000 Have you ever seen people, a few people have done this with his text, because there are sections of his text that read as totally rational.
02:43:06.000 And people occasionally will put a section there and then it's not until you turn the page and have already agreed with it that you see who wrote it.
02:43:13.000 Right.
02:43:15.000 But, yeah, well, yeah, it's interesting just to see that we are kind of headed toward some kind of precipice here.
02:43:25.000 Do you know how he lost his mind?
02:43:27.000 Do you know the story about Ted Kaczynski?
02:43:29.000 I don't know.
02:43:30.000 He was part of the Harvard LSD studies.
02:43:32.000 They dosed the shit out of that dude.
02:43:35.000 Yeah, they dosed him.
02:43:36.000 He went to Berkeley, started teaching, and saved up all his money from teaching, and went to the woods and started blowing up people that were involved in technology.
02:43:43.000 Yeah, there's a documentary called The Net, and I believe it's from Germany.
02:43:48.000 I believe it was a German documentary, but it's very secretive, like, who was and was not involved in those Harvard LSD studies.
02:43:56.000 Wow.
02:43:57.000 I know people on the other side of those studies, and I knew Richard Alpert, who became Ram Dass.
02:44:02.000 Right, yeah.
02:44:04.000 But...
02:44:04.000 Didn't kill everybody.
02:44:05.000 Didn't break everybody's brain.
02:44:06.000 No, no.
02:44:07.000 But, I mean, he might have had a vision that he chased down, you know...
02:44:12.000 To the final point, and he recognized from his experiences, like, whoa, if we keep going, this is inevitable, and became obsessed with it.
02:44:21.000 Obviously, one of the things that people try to connect is various drugs with schizophrenia and mental illnesses.
02:44:31.000 And most of those have not been able to stick because there's a certain percentage of people that will inevitably have issues.
02:44:38.000 And the percentage of people that have issues with schizophrenia or various mental illnesses is almost mirrored by the percentage of people who do psychedelic drugs or various psychoactive drugs and develop
02:44:52.000 these mental issues.
02:44:53.000 So it might not be the cause, but it's a concern.
02:44:58.000 And if you get a guy who may have a propensity for mental illness and you dose the shit out of him with LSD, you might get a Ted Kaczynski.
02:45:07.000 No, I think there are some people who certainly shouldn't take any drugs.
02:45:10.000 Yeah, anything.
02:45:13.000 And, you know, I've had bad experiences on a variety of psychedelics, as well as good ones, obviously, but the bad experiences I could see in the wrong mind affecting you permanently in a way that's not good for you or anyone else.
02:45:29.000 You can go off the rails.
02:45:30.000 I went off the rails for a couple weeks once.
02:45:33.000 Not really off the rails.
02:45:34.000 I was totally functional.
02:45:35.000 Most people probably didn't even know that I was off the rails.
02:45:38.000 But the way I describe it is that my grip on reality had gotten very slippery.
02:45:44.000 Like I was kind of hanging on.
02:45:46.000 Have you ever done chin-ups with sweaty hands?
02:45:48.000 Yeah.
02:45:49.000 You're not exactly sure how many you can get in before your hands give out.
02:45:54.000 And that's kind of how I felt.
02:45:55.000 I didn't feel like I had chalk on my hands and wrist straps.
02:45:59.000 I felt like I had a slippery grip on reality.
02:46:02.000 And thankfully, within a couple of weeks, it came back.
02:46:05.000 It felt normal.
02:46:07.000 But especially the first few days afterwards, just a very intense psychedelic experience that was as boundary-dissolving as you can get.
02:46:17.000 I mean, you might be able to make
02:46:18.000 an argument that there are probably several versions of each person, based on your reactions to whatever experiences you've had, but that might have been version 2.0 of me.
02:46:29.000 After that, I'm a different person.
02:46:31.000 I became a different person because of that.
02:46:33.000 And that could easily be what happened to poor old Ted.
02:46:37.000 Yeah, like you said, some of his assertions, if you look at the direction the technology is headed, obviously he was fucking batshit crazy, but he said some things that weren't batshit crazy.
02:46:52.000 Yeah, yeah.
02:46:53.000 Well, there's nothing...
02:46:54.000 So if we can make a meme, if you could just say, Ted Kaczynski was right, and we'll just put that in quotes.
02:46:58.000 Put that on Twitter.
02:47:00.000 Isn't that what it boils down to today?
02:47:03.000 Is putting a photo of you with a quote taken completely out of context, and everybody sort of shits on it, or agrees or disagrees.
02:47:13.000 It is...
02:47:14.000 It's fun, though.
02:47:16.000 But honestly, I think it's...
02:47:18.000 You're doing well if you never knowingly do that.
02:47:24.000 I mean, if you never knowingly misrepresent your opponent, then you can get into just knock-down, drag-out arguments about anything.
02:47:31.000 Then it's all fine.
02:47:33.000 But as long as you're interacting with...
02:47:36.000 A person's actual views, then condemn those views and criticize those views to whatever degree.
02:47:42.000 But if part of your project, or the entirety of your project, is simply knowingly sliming them with a misrepresentation of their views, because you can get away with it, because you know their views are either so hard to understand for most people,
02:48:00.000 or people just aren't going to take the time to understand them, then you're just defaming people.
02:48:05.000 Well, it's also that there's a desire to win that a lot of people have that they apply to debates, and it makes them intellectually dishonest, because they don't want to agree that someone they might have a disagreement with may have a point or two.
02:48:22.000 You might disagree with the entirety of what they're saying, but somewhere along the line, it might be possible that you could see where they're coming from, even if you don't agree, but it just throws your argument into a bad position,
02:48:37.000 so you abandon it.
02:48:39.000 The thing is, the merit of an argument has no relationship to its source.
02:48:44.000 Really.
02:48:45.000 It's like either the argument succeeds or fails based on the structure of the argument and its connection to evidence, or it doesn't, right?
02:48:55.000 And it doesn't matter if it's Hitler's argument for the destruction of the Jews or, you know, Ted Kaczynski said something true about the progress of technology.
02:49:07.000 Whether it's true or not about the progress of technology has nothing to do with the source, but...
02:49:11.000 People imagine that if you don't like the source, there's no burden to actually address the arguments.
02:49:20.000 And if you don't like the arguments, a successful rejoinder is just to trash the source.
02:49:25.000 Neither of those are true.
02:49:27.000 If you want to deal with...
02:49:30.000 If you want to get at what's true in the world, you have to deal with arguments on their merits.
02:49:33.000 You have to deal with evidence.
02:49:35.000 And it doesn't matter if the evidence is coming from a thoroughly obnoxious source.
02:49:41.000 It's not sufficient to say, well, I hate the source.
02:49:46.000 As shorthand, we all have to privilege our attention and time.
02:49:51.000 So if you know a source...
02:49:53.000 Is disreputable.
02:49:54.000 At a certain point, you can just decide, well, I don't need to hear it from this person because I know this person doesn't understand what he's saying and has lied in the past, so I'll wait to hear it from somebody else.
02:50:06.000 So yeah, it's not that the source doesn't matter at all, but you're not actually addressing truth claims if you're just disparaging the source of those claims.
02:50:16.000 We don't have much time left.
02:50:18.000 We have less than 10 minutes.
02:50:19.000 Is there anything else you'd want to get into?
02:50:24.000 The only other thing on this list, which is just too big for 10 minutes and we're going to get in trouble, is a lot of people hit us with cops, Baltimore, self-defense, violence, weapons, all that stuff.
02:50:39.000 Yeah, that's a big one.
02:50:41.000 That's an hour.
02:50:42.000 At least.
02:50:43.000 Yeah, and also, we don't need to talk about it. Once artificial intelligence kicks in, we're not going to have crime anymore.
02:50:52.000 We've pretty much cured it all in the final hour.
02:50:56.000 The artificial intelligence conversation sort of trumps the whole thing, because Islam's not going to be important when there's robots that can...
02:51:03.000 Read your brain.
02:51:04.000 I mean, we're going to not need people anymore.
02:51:06.000 So you don't need religion.
02:51:08.000 You don't need lies.
02:51:09.000 Charlie Hebdo's completely irrelevant.
02:51:11.000 It'll be a footnote in history.
02:51:13.000 It'll be like there were monkeys, they threw their shit, and then there were robots that could think for themselves, and that was it.
02:51:19.000 Don't forget about all that building the Eiffel Tower.
02:51:22.000 And what's scary about this thing is that it's very hard to...
02:51:26.000 So I read the book.
02:51:27.000 I went to the conference.
02:51:28.000 I've written about this.
02:51:29.000 I've spoken about this.
02:51:31.000 I hang out with people who are worried about this.
02:51:33.000 And it's actually still hard to keep this concern in view.
02:51:38.000 The moment you spend ten minutes not thinking about it...
02:51:43.000 To start thinking about it again makes you feel like, I mean, that's just all crazy, right?
02:51:47.000 So this is all bullshit.
02:51:48.000 I mean, what am I, like, oh, what, there's really going to be a super-intelligent machine that's going to, you know, swallow the world, right?
02:51:54.000 Wasn't that the devil's greatest trick?
02:51:56.000 Yeah, right.
02:51:57.000 Isn't that what the expression is?
02:51:59.000 The devil's greatest trick is convincing the world it doesn't exist?
02:52:02.000 Yeah, yeah.
02:52:03.000 But, I mean, this is... unlike other... I mean, other things have this character.
02:52:09.000 Like, it's hard to really worry about climate change...
02:52:34.000 And yet it's much easier to worry about other far more...
02:52:38.000 It's easier to worry about Twitter than it is to worry about that.
02:52:41.000 But this thing is so kind of lampoonable, and it's just kind of a goofy notion which seems just too strange to even be the substance of good, credible fiction.
02:52:54.000 And yet, when you look at the assumptions you need to get on the train, there's only two.
02:53:02.000 And they're very hard to doubt the truth of.
02:53:08.000 Again, just that we have to keep making progress, and that there's nothing magical about biological material in terms of an intelligent system.
02:53:16.000 And by the time it becomes a threat to everyone, by the time we recognize it as a threat, it might already be too late.
02:53:26.000 Well, people have different ideas about the timing of what they call the takeoff, you know, whether it's a hard takeoff or something more gradual.
02:53:32.000 But, yeah, it's the kind of thing that could happen in secret, and all of a sudden, things are different in a way that no one understands.
02:53:45.000 And you can also make this argument that if you look at all the issues that we have in this world, that so many of them are almost unfixable without this.
02:53:53.000 Yeah, again, that's what I said.
02:53:55.000 I said, I think, in my blog post, the only thing scarier than the development of strong artificial intelligence is not developing it.
02:54:04.000 Because we need...
02:54:05.000 We have problems for which we need...
02:54:09.000 I mean, intelligence is our only asset, ultimately.
02:54:12.000 I mean, it's giving us everything good.
02:54:14.000 Right, and why should we accept our limited biological intelligence when we can come up with something infinitely more intelligent and godlike?
02:54:21.000 Progress in this area seems almost an intrinsic good.
02:54:27.000 Because we want to be able to, whatever you want...
02:54:30.000 Clean the ocean.
02:54:31.000 You want to be able to solve problems, and you want to be able to anticipate the negative consequences of your doing good things and mitigate those.
02:54:42.000 And intelligence is just the thing you use to do that.
02:54:47.000 Let's end here.
02:54:48.000 Freak everybody out.
02:54:49.000 Sam Harris, blog.
02:54:51.000 What is your blog?
02:54:53.000 SamHarris.org.
02:54:54.000 SamHarris.org.
02:54:55.000 Blog and podcast.
02:54:56.000 SamHarris on Twitter.
02:54:58.000 SamHarrisOrg.
02:54:59.000 SamHarrisOrg on Twitter.
02:55:00.000 Yeah.
02:55:01.000 Thanks, man.
02:55:01.000 I really appreciate it.
02:55:02.000 Always a good time.
02:55:03.000 Lots of fun.
02:55:03.000 All right.
02:55:04.000 Much love, my friends.
02:55:05.000 We will be back on Friday with Rich Roll.
02:55:09.000 Until then, see you soon.
02:55:11.000 Bye-bye.
02:55:29.000 I've never met him, but...