Sam Harris joins us to talk about Jon Jones being stripped of his UFC title, Jose Aldo vs. Conor McGregor, and much, much more. We also talk about the recent car crash that left a pregnant woman in need of medical care and how the driver fled the scene of the crime. And we answer some of your burning questions. We are part of the Robots Radio Podcast Network. See all the great network shows at RobotsRadio.net.
00:01:45.000A couple days I think he disappeared for.
00:01:47.000Which, you know, obviously the speculation would be that he was on something that he would worry about getting tested for for 24 hours or 48 hours or whatever it was.
00:01:55.000Think of how desperate a move and hapless a move that is, to run away from the car that you're leaving at the scene, which is obviously traceable to you.
00:02:04.000It's not like you're getting out of the problem.
00:03:37.000And you were about to get into a fight with a parakeet, you'd absolutely be confident because you've always been a rhino and that parakeet is just a parakeet.
00:03:47.000But a person is a work in progress, and not just in your fighting style but in your ability to manage your own mind, your ability to manage insecurities and anxieties, just the existential angst that comes with being a human being that's navigating their way through this complicated world.
00:04:17.000I mean, you can be one of those Jon Jones types who's unbelievably physically talented and have a leg up on a lot of people, but even Jon had to get his ass kicked.
00:04:44.000You're just like, fuck, this guy's overwhelming.
00:04:46.000It's like, if you can put on a show like that, if you can put your peacock feathers up and puff up your back hairs, get those bristles up nice and high like a cat when it's angry and hunch your back up, there's a reason why that exists in nature.
00:05:00.000Well, it's beyond what Muhammad Ali used to do.
00:05:03.000It's going to be fascinating psychologically to see him when he loses.
00:05:07.000When that day comes, it'll come eventually.
00:05:20.000He has not lost since he's been in the UFC. But in all fairness, the one style that he has not faced, which is the most difficult style, he's never faced a wrestler.
00:05:32.000He's never faced a guy who can get in there and get him to the ground and outwork him and just sap his energy.
00:05:41.000Wrestlers, they get on top of you and you can't get them off and you get exhausted trying and they're punching you in the face and elbowing you in the face and punching you in the body and constantly trying to strangle you and then the round is up and you get up and the next round comes and they do it again.
00:06:34.000So Aldo is going to present him with some things inside the octagon that he's never faced before.
00:06:41.000However, Aldo has been fighting professionally for a very long time, and like a race car that doesn't ever get its tires changed or doesn't ever get its suspension redone.
00:06:54.000As a human being, your body can literally only take so many miles.
00:06:59.000There's only so many times you can go to war.
00:07:02.000There's only so many sparring sessions you can take part in.
00:07:05.000Only so many wrestling sessions you can take part in.
00:07:08.000There's just a certain amount your body can take, and we don't know what that number is.
00:07:13.000When you reach that number though, that's it.
00:07:16.000There's no turning back unless you're using testosterone or growth hormone or some things that turn your body into the superhuman sort of experiment.
00:07:50.000But now Nevada has made testosterone replacement therapy illegal.
00:07:55.000So his last three performances, which are some of the best performances of his career, the knockout of Michael Bisping, the knockout of Luke Rockhold, and the knockout of Dan Henderson, those were all while he was on testosterone.
00:08:23.000I would like that level of testosterone.
00:08:25.000Well, you know, it's also the level of testosterone of a regular 37-year-old man versus the level of testosterone of someone going through a camp.
00:08:33.000When you're going through those camps, it's absolutely brutal.
00:08:36.000Like, Jon Jones and Daniel Cormier both got tested randomly.
00:08:40.000Before their title fight, and their testosterone levels were so low that people were wondering, like, hey, maybe these guys are doing something.
00:08:48.000Maybe they were doing steroids, and that made their testosterone levels drop.
00:08:52.000But what's most likely, it's the ratio of testosterone to epitestosterone.
00:09:04.000You're showing up every day exhausted, and your muscles are sore, and your body's exhausted, and you gotta go through it all again tomorrow, and you're getting kicked and punched, and you're lifting weights, and you're doing sprints, and you're jumping up on boxes, and then the next day, all over again.
00:09:18.000Your body just can't keep up, especially when you get into your 30s.
00:09:35.000As a neuroscientist, does it disturb you at all when you're seeing these guys getting their heads rattled?
00:09:42.000When you're very aware of what's going on behind the scenes in the actual brain itself?
Yeah, well, I just talked about this on my blog with this writer, Jonathan Gottschall, who I told you about at one point.
00:09:54.000Yeah, I'm trying to get him on the podcast.
00:09:59.000So we had a conversation, which we published the transcript of, but he's got this book, The Professor in the Cage, where he's an academic, he's an English professor, and he decided to get really into MMA and fight a real cage match.
00:10:36.000It's just when things start going that way, you feel your own testosterone or something kick in.
00:10:44.000But I mean, his recommendation was that, and I'm sure he's not the only person who's thought of this, he thought the gloves should come off, and the gloves are making it possible to just send endless overhand rights and other crazy punches, you know, full force into people's heads.
00:11:03.000Whereas if it was a bare-knuckle fight, you couldn't really, there'd be fewer knockouts, but you couldn't deliver that kind of brain trauma because you'd be breaking your hands. Now obviously there are things like elbows and knees and kicks, and so people would still be getting hurt. But what do you think about that change?
00:11:22.000I'm a big advocate of that, and I've said it many times. I've said it on the air, I've talked about it on this podcast. I think it's very unrealistic the way you're allowed to not just put gloves on, but also wrap your hands up and tape your wrists. Yeah.
00:11:56.000Foreheads are far harder than your knuckles are. Most likely you're gonna break it. Unless you hit them on the nose, in the eye, on the jaw, you're gonna hurt your hand, and you can only throw so many punches like that. Even just hitting a heavy bag without being wrapped up, you just screw up your wrist and your hands. Yeah, you have to really learn how to tighten everything up, and you have to really strengthen your wrist and you have to strengthen your hand muscles. It's a completely different thing,
00:12:19.000which is why The karatekas, the karate students, would hit a makiwara, which is a board that's wrapped with rope.
00:12:27.000And the idea behind punching that board wrapped in rope over and over again is you develop these intense calluses all over the front two knuckles, which is really the only way you're supposed to hit someone.
00:12:37.000Those are the knuckles that are reinforced by the wrist, whereas where the pinky finger is and the ring finger, those knuckles are not reinforced nearly as well, especially if you have larger hands, like your hand, like my hand, spreads out past my wrist.
00:12:51.000It doesn't go in a straight line from my wrist to the pinky.
00:13:41.000Marcello's specialty is the rear naked choke, and the rear naked choke, a big part of it is sliding your hands underneath the guy's chin.
00:13:50.000And you have the glove there, there's all this extra padding that makes it very difficult to get your hand in the proper position to get the choke right.
00:14:00.000It's also very difficult to get the glove in the back of the head.
00:14:03.000So a lot of times you'll see in an MMA fight, guys who use poor technique and still finish the choke will use their palm on the top or the back of the head, because they can't do this.
00:14:12.000They can't do this movement where it's actually the back of your hand that should touch the back of your opponent's head.
00:14:19.000This is all, like, to someone who doesn't understand jiu-jitsu, this is very complicated, but I agree with him.
00:14:29.000But I think it's also, it would be beneficial for everyone to have some intensely...
00:14:40.000Comprehensive scans done on a regular basis.
00:14:43.000I don't know if that would be prohibitive financially. I don't know how much that would cost. I don't know how much that would even reveal, because apparently one of the things that is troublesome for these NFL players is when they die and they do autopsies on them, they reveal damage that they had no idea about before. I don't know how much, like, an fMRI or MRI can detect when it comes to the actual damage.
00:15:09.000If you're just going to be in a job where you have to get hit in the head, forget about competition, just training.
00:15:15.000I mean, these guys train hard, as you know, and so they're just getting hit in the head to prepare them to get hit in the head in the match.
00:15:29.000The causal linkage between getting hit in the head and brain trauma is 100%.
00:15:35.000It's just a matter of how much you individually, by dint of luck, can take until you actually have damage that matters.
00:15:45.000So, you know, I obviously haven't had an experience of anything like an MMA fighter, but I regret all the...
00:15:54.000The head injuries I took, just training as a...
00:15:57.000I mean, now in martial arts, I just don't let myself get hit in the head.
00:16:00.000But as a teenager, I got hit in the head a fair amount, and I played soccer and headed the soccer ball, and that always felt totally weird.
00:16:52.000Like your brain gets rattled around inside your head, the connective tissue dislodges, and it doesn't heal back.
00:16:59.000I spoke about this with Jonathan too, that there's obviously all of these sports and just forms of recreation that entail some risk of injury and death, right?
00:17:11.000And people should be able to do these things informed of the risks.
00:17:17.000And so, you know, cheerleaders, and the example he brought up is cheerleading.
00:17:20.000I mean, cheerleaders sometimes hit the ground and just are, you know, fantastically injured.
00:17:24.000So all these things that don't necessarily seem like high testosterone, high risk, you know, just foolishly reckless sports can be very dangerous.
00:17:37.000Skiing is very dangerous, too, and rock climbing.
00:17:39.000There are things that are even just non-violent that don't entail much risk of injury until they kill you, like free solo rock climbing.
00:17:49.000Maybe you've hurt your hands in the past, but then all of a sudden you're dead because you just went up 500 feet without a rope and fell.
00:17:57.000So there's all these kinds of risks that people can take, but the problem that I think differentiates striking sports from even something like football is that the progress in the sport is synonymous with the damage.
00:18:15.000So if you and I are in a boxing match or a kickboxing match hitting each other, every instance of successfully degrading each other's performance with respect to the head, hitting someone in the head, is synonymous with delivering injury to the brain.
00:18:33.000It's not incidental like in football, where I was trying to tackle you, I was not hoping to hurt your brain, but...
00:18:40.000You know, you fell down hard and it did.
00:18:44.000This is just, you know, a trade of brain damage.
00:18:48.000And yeah, so it's interesting ethically.
00:19:02.000It would just make it more realistic combatively, too.
00:19:05.000Insofar as you want to see what works combatively, I'm more interested to see what two people can do just with their bare hands than when they've got these tape and pillows.
00:19:31.000Or does it somehow or another, is it correlated to brain damage?
00:19:37.000I mean, that's one of the issues with brain damage, impulsive behavior.
00:19:41.000Yeah, especially in the frontal lobes, because your frontal lobes regulate your emotions and behavior.
00:19:49.000And when those connections, when either the cell bodies or the connections between the gray matter and the frontal lobes and your limbic system and your basal ganglia and other areas in the brain, when that gets damaged, yeah, you have these classic impulse control problems where you just reach out and grab the woman standing next to you at Starbucks because you couldn't dampen the impulse to do it.
00:20:16.000That's hard for people to grasp, because, I mean, again, this should be really clear.
00:20:21.000I am, without a doubt, not trying to let him off the hook.
00:20:35.000I think the UFC absolutely did the right thing in stripping him of his title, and I think law enforcement is going to do the right thing by putting him in jail.
00:20:45.000You can't do that. You can't hit someone with a car and leave the scene of the crime. It is a crime. Yeah, but there are things that people do because they have brain damage, and that's where the real question comes up. Obviously,
00:21:02.000they're responsible ultimately for their own actions, but what is it that's responsible for making them do that action?
00:21:09.000I mean, we had this long conversation once, two podcasts ago, I think, about free will and determinism.
00:21:16.000These are variables that come into play when it comes to the ultimate actions that you choose to do.
00:21:22.000The ultimate movements that you choose to take, the thought processes, are unquestionably dependent upon the brain itself.
00:21:32.000And if the brain is getting damaged, and if we have proven that some of the issues with people that have brain damage is impulse control.
00:21:42.000You gotta wonder, man, when you see fighters do wild crazy shit, how much of that is due to getting just bonked in the fucking head all the time?
00:21:51.000Yeah, except for me it breaks down a little differently, because my views on free will change the picture of how I view moral culpability in those situations.
00:22:02.000So even if we knew his brain wasn't damaged, right?
00:22:06.000So he, let's say, had never got hit in the head or we did a scan on him before the car accident and we saw...
00:22:43.000This is sort of the punchline, which has certain consequences.
00:22:48.000But one of the consequences is not that we can't respond to his misbehavior, that we can't put him in jail, that we couldn't have intervened at any point along the way to have made him a better person.
00:23:01.000There's a difference between voluntary and involuntary behavior, even if it's all just causally propagating from the moment of the Big Bang.
00:23:13.000I think the brain damage case is a little bit of a red herring because it's, on some level, it's all just unconscious causes that the person himself can't ultimately account for.
00:23:27.000So there are situations in which he...
00:23:48.000One hour more sleep the night before and hadn't had a fight with his girlfriend and his blood sugar level was a little bit higher and hadn't had a friend who had told him to drink one more beer, which he normally would have resisted but couldn't because of all the other factors I just mentioned.
00:24:07.000And that is the difference that made the difference that caused him to be this total misfit on the road.
00:24:15.000Whereas, if you had just tweaked those dials a little bit, you know, no fight with a girlfriend, you know, one more bite of food in the morning, he would have been, he would have acted as you would have acted, in that case, say.
00:25:14.000You look at a text when you know you should never look at your phone when you're driving, but you decide, oh, I'm expecting a text, and you look.
00:25:22.000And there are people who are looking at that text right now and just killing some child in the crosswalk, right?
00:25:29.000And their lives are going to be ruined, and they're going to go to prison, and they're exactly like you and me, right?
00:25:39.000The luck actually propagates backward into the kind of brain you have, the kind of upbringing you had, the kind of parents you had, the fact that you got hit in the head as hard as you did or didn't, and no one has made themselves.
00:25:55.000I'm less judgmental about some of these things, given my view of free will, but I'm not...
00:26:00.000It's not that I'm not interested in making the interventions that would make a difference.
00:26:05.000Whatever we could have done to have gotten him to behave differently, we should have done.
00:26:09.000Whatever we need to do now, to him, to make society better and to make him better and to get restitution for the woman, we should do all that.
00:26:17.000And so this does entail locking up certain dangerous people.
00:26:21.000It does entail, you know, we have to keep ourselves safe from people who are going to reliably act badly.
00:26:29.000And I don't know where he falls on that spectrum, but...
00:26:32.000It's just it's not the difference between the feeling you get when you hear, oh, it was brain damage.
00:26:41.000I sort of have that feeling about everything.
00:26:43.000If he gets a brain scan, if he goes to trial now, he gets a brain scan, and we find that his brain is just massively damaged in all the right areas that would have eroded his impulse control, that would seem to let him off the hook a little bit.
00:26:57.000He would look like someone who was unlucky more than he would look like a bad person.
00:27:03.000And I sort of see bad people as unlucky, too.
00:27:08.000I recognize that there are certain people who are classically bad people.
00:27:13.000Not only can you not rely on them, you can rely on them to be bad actors.
00:27:18.000So you have to be in a posture of self-defense with respect to these people.
00:27:24.000But I do view them as unlucky on some fundamental level.
00:27:30.000I share that thought, and I share that thought much more as I get older, and I have a more philosophical point of view when it comes to people that live in impoverished neighborhoods, especially like this Baltimore thing that was going on.
00:27:44.000We were just having this conversation the other day about, or last podcast, about these kids that robbed the RT reporter.
00:27:51.000I don't know if you've seen the video of it.
00:27:52.000There's an RT reporter interviewing these kids that are on the street that are causing all this havoc in Baltimore, and they start swarming this reporter, and then they rob her and take her purse and take off.
00:28:14.000Imagine being born into this crime-ridden environment.
00:28:17.000Who knows what kind of family you have?
00:28:19.000Who knows what kind of influences you have?
00:28:21.000Who knows what kind of experiences you've had that you've had to react to and protect yourself from, and develop this hardened, thick skin and attitude, and also survival instincts.
00:28:34.000And also, your family, or the people that you can reliably count on, are the people that you hang out with in the street, your gang.
00:28:41.000I mean, that is the big thing with gang violence.
00:28:43.000One of the big things with gang violence, one of the dirty secrets of it, is that a lot of it comes from broken homes.
00:28:48.000When people don't have a strong family environment and people they can count on and trust, they don't have anybody that's there for them.
00:28:55.000And then they find someone that's there for them in the gang.
00:29:11.000And if you were in their point of view, or if you were in their life, rather, and if you saw it through their point of view, what they see.
00:29:19.000You would look at life the way they look at life.
00:29:21.000Also, there's another variable here, which is just the influence of mob behavior.
00:29:26.000People will behave in crowds in ways that they wouldn't otherwise.
00:29:34.000Well, I can't speak to the mechanism neurologically, but it's a fascinating social phenomenon that has been thought about for at least a century.
00:29:54.000Once it gets started, the mob will behave by its dynamics that aren't really explained by the individual intentions of the individuals in the mob.
00:30:18.000He edited this literary magazine, Granta, I believe, back in the day.
00:30:24.000And he got fascinated with the phenomenon of soccer hooliganism.
00:30:28.000And he went to the UK and just started hanging out with these just diehard, I guess they were, I don't know...
00:30:37.000Manchester United or Arsenal fans, but he just got in with these guys who were normal guys, like plumbers and electricians and people who had real lives.
00:30:46.000These were not just teenagers who were thugs.
00:30:49.000They were people who had families, but soccer was their life, and they became soccer hooligans.
00:30:57.000But what's brilliant about the book, and again, it's been at least 20 years since I read it, so I could be a little off in my recollection here, but what I recall is that he wrote it in such a way that these guys he was hanging out with were really the protagonists.
00:31:14.000He got you in on their side for about 75 pages or so.
00:31:18.000And then when they start misbehaving, when they go to their first game against the Italians and form a mob and start just marauding the streets and bash kids in the head, they start behaving like sociopaths in this crowd.
00:31:36.000But he catches you out totally because you're on their team for about 75 pages.
00:32:16.000When you see people breaking windows or jumping on a car or turning over a car or looting, it takes less for any individual to participate in that.
00:32:27.000It takes less for you to go in and grab a television set when you've seen a hundred of your neighbors do it, and you wouldn't have that morning just woken up deciding to rob the store yourself.
00:32:43.000I mean, we all like to think we're the sort of people who would stand against the mob.
00:32:48.000We would be the German who would have hid Jews in our basement and stood against the Nazis.
00:32:53.000And you can multiply those cases ad nauseam.
00:33:00.000A lot of psychological science shows that, yeah, there are those people.
00:33:04.000There are the people who will stand against the tide of imbeciles who are going to do something heinous, but most people are part of the tide, and it's just a very common phenomenon.
00:33:15.000The social license, that's a really interesting way to describe it, because that is what it is, right?
00:33:23.000I mean, a big part of war is doing things that you would never do on a normal basis, in a normal scenario.
00:33:30.000On a regular basis, you are asked to put bullets into other human beings.
00:33:35.000One of the things that I thought was really interesting about the controversy about American Sniper, the Chris Kyle movie, was he was talking about what it was like the first time he killed someone.
00:33:57.000It's okay to do this, and then he grew to enjoy it, and it became commonplace and normal. He's like, yeah, they're bad guys and I'm gonna shoot them. But this license, the social license, is then accentuated with this mob mentality that means you're a part of an army and you have an enemy.
00:34:19.000And it's the life or death consequences, a life or death scenario that you're a part of.
00:34:28.000It's the highest level of that type of behavior that we have in society, in our culture today.
00:34:36.000Well, interestingly, it takes a lot to get people to kill in war.
00:34:43.000I think there's some myths around how easy it is for soldiers to shoot at the bad guy, but there have been studies done in prior wars where some shocking percentage of soldiers either never fired their guns or fired above their targets on purpose.
00:35:05.000And so some of the discipline of training soldiers has been against the grain of those tendencies, trying to get people to actually try to kill the other person.
00:35:18.000And, you know, I think we've become more successful probably at doing that.
00:35:22.000You know, this is not something I know a ton about.
00:35:24.000I just know that this research is out there.
00:35:27.000And the main dynamic, I think, with soldiers is you are trying to keep your buddy safe, and he or she now is trying to keep you safe.
00:35:39.000And they're not only firing at you...
00:36:08.000And so now obviously there are aspects of war-making that don't fit that mold, and...
00:36:15.000Some of the more disturbing aspects that actually require less of us in terms of you're dropping bombs from 30,000 feet or you're flying a drone from an office park outside of Las Vegas or wherever they are.
00:36:28.000And so we find that sort of telescopic approach to war different ethically.
00:36:33.000And I think it's different in a variety of ways that are interesting.
00:36:40.000I think it's not so much that war unleashes in most people this bloodlust that they're struggling to contain in the civilized world, and that once the tap is open in a foreign country,
00:37:00.000People are really conflicted about what they do, and a lot of people try to not do anything of consequence.
00:37:06.000There's a great episode of one of Dan Carlin's podcasts, one of the Hardcore History podcasts about World War I. And I believe it was about the Germans and the English that they had been in battle with each other and they had...
00:37:23.000Sort of, without verbally agreeing to this, they had sort of agreed to a ceasefire during lunch.
00:38:02.000You know, horror compounded upon horror endlessly for years to no evident gain.
00:38:09.000I mean, these people, they're fighting for yards of ground forever, and just tens of thousands of people are dying, and they're basically camped out on the decomposing bodies of the people who died before them.
00:38:24.000And it's the most horrible version of warfare you've ever heard about.
00:38:28.000And then there's this no man's land between the trenches where people who run out there trying to make an incursion into the enemy trench will get caught on barbed wire or they'll get shot.
00:38:40.000So you have this spectacle of injured and dying people in the no man's land between the trenches.
00:38:49.000You know, howling for hours and hours and hours in misery, and when someone goes to try to rescue them, they get shot.
00:38:54.000And so, but there were periods where the two sides just agreed that this was just, and again, how that was communicated was kind of interesting.
00:39:07.000I don't actually recall the details there, but it was kind of a tacit agreement that emerged where, okay, we're going to let you get your, we're not going to shoot at you when you get the injured person or the dead bodies.
00:39:19.000And there was one Christmas, I believe, where they just basically went out and exchanged cigarettes and had an impromptu soccer game.
00:39:27.000And they basically called the war off at a certain point and then got chastised by the higher-ups for doing that.
00:39:33.000And then the war started all over again.
00:39:35.000But yeah, they actually socialized at one point.
00:39:38.000It really is an amazing depiction of what must have been an impossible place to be in.
00:39:47.000To imagine being a person standing on the decomposing bodies, being forced to shit in a coffee cup and throw it over the top of the trench, and know that no one's getting out of this.
00:40:00.000I mean, you might be one of a thousand people that's gonna die.
00:40:05.000In the next couple hours, you might be, you know, you might make it to next week.
00:40:29.000You're just thrust into it and it doesn't make any sense.
00:40:32.000And then to have that all sort of eroded to the point where on Christmas you guys are hanging out, and then the generals come in and say, fuck this, you got to kill those people, and the next thing you're killing each other again. So you had this brief glimpse of, you know, some utopia inside of war.
00:40:51.000Yeah, well, what was so weird about that war in particular was that the run-up to it was so romanticized and idealistic.
00:41:00.000I mean, you had a kind of war fever that happened throughout Europe where this was just looked at in the rosiest possible terms.
00:41:12.000Like, this is just the true glory of manhood being expressed.
00:41:59.000Yeah, you read Homer and war is this glorious thing.
00:42:06.000The war ethic you get from ancient civilizations is something that we have...
00:42:13.000I think largely outgrown, but you can really see it in World War I. Don't you think a lot of people had that similar attitude post 9-11, especially when the World Trade Center towers went down and there was this flag waving fewer in America,
00:42:31.000I remember post 9-11, I remember driving down the street, leaving the street near my house, And entering into this main street and every car, every car had an American flag.
00:43:03.000And then you start hearing things from people like Pat Tillman, who left a career as an NFL player, a very promising career as a pro athlete, and all of a sudden he's over there in this war.
00:43:15.000And his impression of it was that it was a huge clusterfuck.
00:43:36.000And the conspiracy theory was that they shut him up.
00:43:39.000Because he was so openly critical of what was going on over there, that it wasn't what he thought it was going to be.
00:43:47.000He thought it was going to be this incredibly organized group of heroes that went over there to fight these evil bad guys that are hell-bent on destruction and suicide bombing their way into America to kill the American dream.
00:43:59.000I mean, this is the idealistic version of it.
00:44:01.000Well, I think there is an idealistic version of good and bad actors in this case.
00:44:07.000It's just the reality of fighting this war is so messy.
00:44:14.000Afghanistan, I think, was pretty clear-cut morally that we had to do something against al-Qaeda.
00:44:20.000And by definition, once the Taliban wouldn't release Osama bin Laden to us,
00:44:28.000we had to do something against the Taliban. That's where he was, and they were sheltering him. And so I didn't feel ethically conflicted over that. But that was such a mess.
00:44:41.000I mean, you're just going into Afghanistan. The reality of what it takes to go into Afghanistan and kill the bad guys is so messy that there's arguably no good way to do it.
00:44:51.000There's no way to do it which at the end of the day is going to look like a success.
00:44:56.000And so maybe that's something we're now learning that you have to, this is so messy that you have to be, you really have to pick your moments.
00:45:06.000And be far more surgical than we've ever been inclined to be, and not even think about defeating the enemy, ultimately, but just kind of keeping the enemy at bay, containing this problem for long enough to change minds or change culture in some other way.
00:45:24.000Because even in this case, I think it was very clear-cut.
00:45:28.000Killing members of Al-Qaeda was a good idea, and I think it's still a good idea.
00:45:33.000It's just, you know, a drone strike kills some of them, and it also kills some of the hostages, as we now see, and it also kills some of the people standing too close to the bomb blast, and it's ethically messy,
00:46:07.000Isn't that where the foreign policy argument comes into play?
00:46:10.000Because some people say those bad guys are bad guys because of US foreign policy, because of the way we have intervened and dominated natural resources.
00:47:11.000What did she say and do you not agree with what she said?
00:47:14.000Well, it was really interesting listening to her because...
00:47:18.000So I listened to the whole podcast and she didn't mention me until like the second hour.
00:47:24.000And I'm listening to this and I'm thinking...
00:47:26.000So I'm actually having a conversation with you in my head as I'm listening to this and I'm thinking...
00:47:32.000Joe, it's kind of remarkable what you are able to do here, because you're having a conversation with her.
00:47:36.000From my point of view, you are just drinking from a fire hose of bullshit, right?
00:47:41.000What she's saying, there's so much wrong with what she's saying.
00:47:47.000But yet you're in a position to have a conversation with her that is where there's just a ton of goodwill and it doesn't run to the ditch at all.
00:47:54.000And you can have a conversation with me in the same vein.
00:47:57.000But then I was thinking, I'm sure she's a perfectly nice person and I would be very nice talking to her.
00:48:06.000I have a feeling now of more or less total hopelessness talking to someone as polarized on these issues as I view her to be.
00:48:16.000And so I was kind of praising you in my mind thinking, you know, I couldn't do what you're doing here.
00:48:23.000And at that instance, she just mentioned me, right?
00:48:27.000So it was like one of those bad scenes in a movie where the television starts talking to the character.
00:48:32.000She just kind of called me out and then more or less totally misrepresented my views.
00:48:41.000So she said many things that are just inaccurate, which we can talk about, but in terms of what she attributes to me, she said that I only care about intentions, right?
00:48:54.000So that intentions are all that matter.
00:48:56.000So if we kill a billion people but meant well, we're fine.
00:49:01.000And if the Muslims kill a million people but don't mean well, they're far worse than we are ethically.
00:49:10.000And she was, I think in her defense, I'm sure she's never read anything I've written, but she was reacting to a snippet of a podcast where I push back against some of the things that Noam Chomsky has said.
00:50:28.000Is it going to be rushing you to the hospital in the next instance?
00:50:31.000Right, but this is a kind of a disingenuous comparison.
00:50:34.000Because, I mean, are you describing the difference between accidentally killing civilians with a surgical strike, in quotes, of a drone strike, versus killing someone with a suicide bomb?
00:50:47.000Are you trying to kill as many people that are random as possible?
00:50:50.000So I'm using a very idealized example just to show you that the role of intention is not all that matters, because getting stabbed still sucks, right?
00:50:59.000So if you assume the same stab wound, you still have the same problem.
00:51:03.000But one of your scenarios is completely innocent and accidental.
00:53:58.000Let's keep the speed limit exactly where it is, but no matter what car you have, there's a governor on it, and you cannot go past the legal speed limit ever.
00:54:10.000So if you're in a 25-mile-an-hour zone, whatever your car is, you've got a Porsche or whatever you like to drive, it can only go 25 miles an hour, not a mile an hour more, no matter how you hit the throttle, and that would be true in every zone.
00:54:26.000There are people who would resist that, and their reasons for resisting it is just that driving would be less fun.
00:54:33.000If anything is indefensible when you're talking about kids being killed, that is.
00:54:39.000That's a far more superficial commitment.
00:54:43.000Than wanting to get the higher-ups in Al-Qaeda who are trying to, at some point, blow up an American city, right?
00:55:31.000Over the last 10 years it's been 300,000 people in the U.S. But how many people who drive on a daily basis wind up driving their whole life and never killing anybody?
00:56:35.000We did some very understandable things and also some very stupid things, but we took the lid off of a simmering civil war.
00:56:44.000The real catastrophe of Iraq, apart from our going in in the first place, which I never supported.
00:56:51.000But the real catastrophe is that having gone in, we failed to anticipate the level of sectarian hatred.
00:56:58.000And we did very little to hedge against it.
00:57:02.000And we kicked off a civil war, which someone like Abby Martin clearly thinks we are entirely responsible for.
00:57:08.000So when Shia death squads are taking out the power drills and drilling holes into the heads of their neighbors, and the Sunni are returning the favor.
00:58:25.000Insofar as we could have anticipated the rise of ISIS and all of this consequent death toll, you are faulting us for leaving because our political interests and our stomach were no longer aligned with this project.
00:58:43.000I'm not sure that's an argument that someone like Abby Martin wants to make, that we should have stayed longer, that we should have spent more money, that we should have killed more people in an effort to keep the locals from killing so many more people.
00:58:56.000So anyway, the number 2 million is plucked out of a bad dream.
01:00:13.000So you can count bodies, you can get reports of the actual deaths.
01:00:21.000The other thing you can do is you can estimate the amount of death that would have occurred in the absence of an invasion, and then compare the reports of...
01:00:30.000You do a statistical sample of an area and compare the reports of death...
01:00:43.000The authors of this Lancet article did that, and they came up with a number of 600,000 or 650,000.
01:00:51.000And that article has been widely criticized. I don't think it's been retracted, but I don't think any serious person thinks that article is representative of the facts.
01:01:05.000And so what they did, for instance, is they...
01:01:07.000They would take a cluster of, I think, 40 homes in an area and ask the people, you know, who has died, who do you know who has died, and how did they die?
01:01:18.000And then they would get, they would just, based on that sample, they'd do that in many different sites around Iraq.
01:01:25.000Based on those samples, they would extrapolate to the rest of the population, and they came up with 600,000 or 650,000.
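[Editor's note: the cluster-survey extrapolation described here can be sketched in a few lines of Python. All numbers below are purely illustrative, not the Lancet study's actual data; the point is only to show the mechanics of scaling a sampled death rate up to a whole population, and why a biased choice of clusters inflates the estimate.]

```python
# Hypothetical sketch of the cluster-survey extrapolation described above.
# Figures are made up for illustration; they are NOT the Lancet study's data.

def extrapolate_excess_deaths(cluster_deaths, people_per_cluster, population):
    """Estimate total excess deaths from per-cluster survey counts.

    cluster_deaths: excess-death count reported in each surveyed cluster
    people_per_cluster: average number of people covered by one cluster
    population: total population being extrapolated to
    """
    surveyed_people = len(cluster_deaths) * people_per_cluster
    deaths_per_person = sum(cluster_deaths) / surveyed_people
    return deaths_per_person * population

# Illustrative: 50 clusters of roughly 40 households (~280 people each),
# scaled to a national population of ~27 million. If the clusters sit near
# IED-heavy thoroughfares, their death rate, and hence the national
# estimate, will be biased upward.
estimate = extrapolate_excess_deaths([1, 0, 2, 1, 0] * 10, 280, 27_000_000)
```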
01:01:31.000So one criticism I read of that article was that they seemed to have focused on areas near major thoroughfares in big cities, where IEDs were far more common than in other places in those same cities or elsewhere in Iraq.
01:01:50.000So the very place you would most likely plant an IED is a not especially representative place to poll all the families in the area to see whether they've lost loved ones in the war, right?
01:02:04.000So that was one way to get an unrealistic number.
01:02:08.000The other thing is that it seemed that there were just some shady things with the researchers where they weren't releasing their data and their methods and the communication with them broke down.
01:02:16.000And so, anyway, the sober people I trust who focus on these things think that's a fictional number.
01:02:22.000And that's one-third, not even one-third of what Abby's working with.
01:04:39.000I'm just saying that the truth is so sinister that even if our intentions were perfectly benign and we're just trying to raise the standard of living there and even just give them the freedom to practice their own religion, right?
01:04:53.000We're trying to make this like Nebraska.
01:04:57.000It would have been a bloodbath, given the beliefs of the sorts of people who now populate a group like ISIS. So, anyway, that's my claim.
01:05:08.000So this is just one aspect of what you disagreed with what she said.
01:05:29.000But there's a style of talking that you run into with people where there's just...
01:05:32.000It's kind of confabulatory, where you're just sort of talking, and it's sounding good, and you're just sort of spitballing, but you're using numbers, right?
01:05:40.000You're using numbers like two million, or you're saying things like, our biggest export is weapons.
01:06:13.000The study by the Nobel Peace Prize-winning doctors' group is the first to tally up the total number of civilian casualties from the U.S.-led counterterrorism interventions in Iraq, Afghanistan, and Pakistan.
01:06:35.000So, for instance, as a sanity check, when I heard Abby Martin, I sent an email to my friend Steven Pinker, who's an incredibly sober scientist, just a very careful researcher.
01:06:49.000He wrote this truly landmark book on the decline of violence in the last century, The Better Angels of Our Nature.
01:07:15.000I said, I'm hearing in liberal circles that we killed two million people in Iraq and Afghanistan.
01:07:20.000Is there any chance that this is true?
01:07:22.000And he said more or less what I'm saying to you now.
01:07:26.000It's almost certainly an order of magnitude too high.
01:07:30.000The highest briefly credible study was the Lancet one.
01:07:34.000I didn't know about this, and I don't know what he in particular would say about this study, but undoubtedly they used the same sort of extrapolation methods.
01:07:45.000They're doing, based on sort of the ambient level of death over the years, they think it's gone up to the tune of two million in those countries.
01:07:56.000But anyway, so Steve said, no, this is a totally fanciful number, and here's why.
01:08:04.000And he broke it down for me, and then I did a little more reading on the topic.
01:08:19.000But the crucial ethical difference is, did we go in and perform our own sort of final solution against the Iraqis and the Afghanis trying to kill millions of people a la Hitler?
01:08:33.000Or did we wander into a situation where we unleashed a civil war And are we culpable for that?
01:09:11.000So wait a minute, the people who are saying it may be 2 million, but it's at least 1.3 million?
01:09:17.000According to the PSR study, the much-disputed Lancet study that estimated 655,000 Iraq deaths up to 2006, and by extrapolation over a million until today, was likely to be far more accurate than the IBC's figure.
01:09:31.000In fact, the report confirms a virtual consensus among epidemiologists on the reliability of the Lancet study.
01:10:31.000I mean, one problem here is that this whole area has become so politicized that it's hard to...
01:10:39.000I mean, even Amnesty International has embarrassed itself by supporting a jihadist organization in the UK. They just, at the 11th hour,
01:10:48.000pulled out their support, but for a very long time they were just in the same trench with jihadists and not knowing it, or they should have known it.
01:10:58.000I mean, people were telling them, but they were very slow to realize it.
01:11:00.000So, you should be slow to take even a humanitarian organization's word for the significance of a given study.
01:11:14.000But I would find it, frankly, amazing if
01:11:20.000we had killed anything like that number of people.
01:11:51.000So the body count, when you look at the penalty we're paying for killing people, and when we look at how much our own soldiers don't want to die unnecessarily,
01:12:06.000and our own level of casualties, we're not on the other side of all those guns.
01:12:14.000I mean, there's just a tremendous amount of internecine violence in both Afghanistan and Iraq that is just killing, you know, like a bomb will go off and a hundred people at a mosque will be dead.
01:12:26.000So, Abby Martin, I think, I don't think I'm being uncharitable here, I think she thinks We're responsible for that.
01:14:48.000Well, the conspiracy theory would be that that's how they perpetuate this whole constant cycle of war, is that you have to keep arming your enemies.
01:15:07.000The possible role of corruption there and a sort of callous indifference to the effects of being in this business, I think that's a very real concern.
01:15:19.000So, to say that this is our main export.
01:15:38.000She might be getting, you know, that's the problem.
01:15:41.000Unless you're doing... look, we live in a world that's so broad and comprehensive that unless you're doing the actual research yourself, and not just doing it, but doing it over a long period of time, and very meticulously, most likely you don't know the actual numbers.
01:15:55.000There's very few things that I could talk about with utmost certainty that aren't involved directly in my own life.
01:16:00.000And when you deal with numbers, like numbers of imported guns, exported guns, people dying in a place that you've never even visited...
01:16:08.000Boy, you're relying on a lot of people's data.
01:16:37.000You can find the people who are engineers who say that 9-11 had to be an inside job because, you know, the melting point of steel, blah, blah, blah.
01:16:46.000And you can find on all these issues, you get incredibly politicized science.
01:16:54.000But certain things don't pass the smell test.
01:16:57.000And to me, two million people doesn't pass the smell test, certainly if you're going to say that we killed those two million people.
01:17:30.000The thing I can say just categorically is that what she said about my concern about intention is just not true.
01:17:38.000Intentions matter because they are the best predictor as to what the person is likely to do in the future.
01:17:45.000If you know someone is killing people because he intends that, he wants that, he wants to cause grief and suffering and death, well then you know this is a person you have to jail or kill and this is not a good actor.
01:17:58.000If someone does it because they did it by accident, or they didn't foresee the consequences of their actions, or they were trying to get the bad guy and they produced collateral damage, it's a very different scenario, and yet the body count may be the same.
01:18:13.000And so the thing I've faulted Chomsky for in the past is that he seems to talk about situations where all you need to concern yourself with is body count.
01:18:24.000So the example I dealt with in my first book, The End of Faith, and this was in reaction to a short book he did right after 9-11 called 9-11, he talked about the...
01:18:35.000Clinton's bombing of the al-Shifa pharmaceutical plant in Sudan in retaliation for the embassy bombings, al-Qaeda bombings in Kenya in the 90s.
01:18:46.000And he talked about this bombing of the pharmaceutical plant as a great atrocity, seemingly equivalent to the atrocity of 9-11 or worse.
01:19:01.000Because of the consequences for Sudan of having half the supply of pharmaceuticals destroyed.
01:19:07.000People would die from preventable illness as a result of this.
01:19:15.000The representation of our government was that it was not, and I think any rational thinking on this topic would suggest that our intention was not, to destroy a pharmaceutical plant.
01:19:24.000We claimed to be bombing what we thought was a chemical weapons factory, run by Al-Qaeda, and we wanted to degrade that capacity of theirs after they had just bombed two embassies in East Africa.
01:19:41.000I mean, who knows what our actual intentions were?
01:19:43.000But if our intention was to bomb a chemical weapons plant that we didn't know was a pharmaceutical plant, and we bombed a pharmaceutical plant that was being used for peaceful purposes, and as a result, tens of thousands of people didn't get their medicine and died, I think?
01:20:21.000If you accept that to be true, then the fact that tens of thousands of people died as a result doesn't have the same ethical significance.
01:20:29.000It is much more like you and I are just trying to get home at 55 miles an hour, but we're participating in a system that's going to kill 30,000 people this year based on our speed limits.
01:20:40.000We're not intending to kill any of those people, right?
01:21:05.000Really knowing what the bad, you know, we thought we were going to get the chemical weapons facility, but we also knew we were going to destroy all of their pharmaceutical infrastructure and that would have cascading bad effects that, all things considered, we didn't care that much about.
01:21:20.000Let's say it was that place on this continuum of moral callousness.
01:21:27.000Well, that's still different than trying to kill 10,000 people by taking away their medicine, right?
01:21:34.000In my view, it may not be so different, and it's something that we would be culpable for.
01:21:42.000The reason why intentions matter is because they are...
01:21:49.000They're the clear expression of what we are committed to, the ends to which we're committed, the kind of world we want to live in.
01:21:56.000What I did in that first book, I asked, this is a thought experiment called The Perfect Weapon, where I said, just imagine what a group would do.
01:23:18.000The spirit in which she talked about our culpability on the world stage is very much in the sense that we have intentionally murdered millions of brown-skinned people because we don't care about them, and maybe it's part of the reason why we want them dead,
01:24:23.000Do you put any blame on the United States government and our foreign policy and our decisions as far as the domination of global natural resources, whatever we've done overseas?
01:24:37.000Is that in any way responsible for the hatred that these people have for America in the first place?
01:26:13.000Civilization just needs petrochemicals to survive.
01:26:16.000And they all happen to be buried in the ground, inconveniently, under the palaces of these religious maniacs.
01:26:26.000That may have been the situation then.
01:26:28.000So how culpable are we for securing our interests, and not just we, the U.S., but the West, at that point, by entering a relationship with the House of Saud?
01:26:57.000We know it's a disaster for climate change.
01:26:59.000We know that there's a financial and technological renaissance waiting if we all just grab Elon Musk's coattails and go toward sustainable energy.
01:27:17.000All of this, you know, our interests...
01:27:19.000We're funding both sides of the war on terror.
01:27:23.000So we should just make a full court press in the direction of sustainability, energy security, and getting ourselves into a position to say to people like the Saudis, you treat your bloggers better or we're going to bankrupt you, right?
01:27:39.000All their wealth is coming out of the ground, right?
01:27:42.000So the moment we don't need this wealth or need to defend it, We'd be in a much better position to demand that people treat women better throughout the world, and they honor free speech, etc.
01:27:55.000So I think it's a scandal that we are not doing that, and I think, yes, we are culpable for doing that, but given what would happen to us in the near term, If we lost access to oil,
01:28:12.000and again, I'm not just talking about us, I'm talking about Europe and just the whole world. It's been a very difficult situation to be in, and it's understandable that we have gotten into this situation, but I don't find it understandable now that we aren't sprinting away from it.
01:28:28.000So if I could define your point of view.
01:28:30.000Your point of view is more of a pragmatic take on what the world is currently at this stage.
01:28:37.000You're not taking away the responsibility of the United States government.
01:28:40.000You're not saying that they haven't made horrific decisions.
01:28:43.000You're not saying that they haven't been manipulated by these gigantic corporations that are profiting off of the war that we're currently involved in.
01:28:52.000That you are just saying that if you want to look at the actual reality of the good guys and the bad guys and where the world is fucked right now, there's certain things that have to be done and there's certain people that have to be taken out.
01:29:03.000If you do not, you put everyone else at risk.
01:29:12.000I've been saying this for 10 years at least, or now closer to 15 years, and it just never gets heard.
01:29:21.000I can grant someone like Chomsky, you know, 80, 90% of his thesis, right?
01:29:29.000So I think he pushes forward into kind of masochistic voodoo a little bit, but we have done horrific things historically, right?
01:29:40.000And the question is just how far you want to walk back in your time machine.
01:29:45.000But, you know, starting with, you know, our treatment of the Native Americans on up, it depends on who the we is, but we being the United States, right?
01:29:54.000We get here, we start behaving badly, and we behave badly for a very long while, and we have done terrible things, and yet it is also true that we have enemies we haven't made.
01:30:07.000There are people who have had the benefit of everything the West has to offer.
01:30:12.000Who are waking up today deciding to join ISIS for reasons that have nothing to do with U.S. foreign policy, or if they do have something to do with U.S. foreign policy, it's based on a theological grievance.
01:30:23.000It's not based on any real political concern for the lives of the Palestinians.
01:30:27.000It's based on, you've got infidels too close to Muslim holy sites.
01:30:53.000There are at least 3 billion people, probably something like 5 billion people, who would trade places with him to be in a position of such opportunity in this world.
01:31:07.000And yet the opportunity he wants to take is to move to Iraq or Syria and cut the heads off of journalists and aid workers.
01:31:18.000Journalists and aid workers, not Navy SEALs they captured.
01:31:46.000And this is something that someone like Abby Martin and someone like Noam Chomsky...
01:31:50.000This is the phenomenon they really don't explain.
01:31:52.000How is it that someone with all the opportunity, who's never been victimized by anyone, how is it that he is committed to the most abhorrent and profligate misuse of human life, where he's just ready to burn up the world, right?
01:32:09.000And how do you get tens of thousands of people like this coming from first world societies?
01:32:14.000And so given that phenomenon, then what explains the commitments of the people who don't have all those opportunities, right?
01:32:24.000The people who are born in these societies and are shell-shocked and have been mistreated and who are...
01:32:35.000I mean, some of them still love the West.
01:32:41.000Some of them still are trying to get out.
01:32:44.000I hear from atheists in these countries who don't hate the West.
01:32:48.000I mean, they don't follow Abby Martin's line on this.
01:32:51.000They understand why we were bombing in their neighborhoods, right?
01:32:55.000But the fact is, this is really like a science experiment.
01:33:00.000There are pristine cases of people who have no rational grievance, who devote their lives to waging jihad, and they're not mentally ill.
01:33:09.000And that's the problem that I... That problem is scaling.
01:33:15.000The thing that I worry about is that is a meme that is spreadable.
01:33:17.000You don't have to ever meet anyone affiliated with a terrorist organization to get this idea into your head.
01:33:23.000And so that's the piece I have focused on.
01:33:28.000And it's not that I've denied the reality of the other pieces.
01:33:31.000Is this related in any way to just the natural instinct that a certain amount of people have to be contrarians?
01:33:37.000I mean, there's a certain amount of people that when they find any sort of large group that's in power, they want to oppose them.
01:33:44.000If they find a ban that's popular, they want to hate it.
01:33:46.000If they find a political party that's in control, they want to oppose it.
01:33:50.000There's a certain amount of people that are just natural contrarians.
01:33:53.000When they find a group that is absolutely committed and completely involved in an ideology, to the point where they're rabid about it,
01:34:04.000it becomes attractive to them, and they want to join that resistance to fight against the Death Star that is the United States.
01:34:11.000And I'm not religious by any stretch of the imagination.
01:34:28.000And there's a certain amount of, when I see the Islamic scholars that are talking in absolute confidence about their beliefs, there's a certain amount of that that I personally find attractive.
01:34:41.000I don't want to join ISIS. I don't want to become a Muslim.
01:34:45.000But when I see someone, almost like what we were talking about with Conor McGregor earlier, where he just fucking believes, man.
01:34:52.000I was watching this guy, I forget his name, but he lives in the UK and he's this rabid Islamic scholar. You know, all of his tweets are about how Islam is superior and doesn't have to be adjusted to the laws of modern society, and secular wisdom is inferior to Islamic wisdom, and blah blah blah.
01:35:14.000I watched this guy do this YouTube video where he's describing how Islamic culture is superior to Western culture in terms of the way they manage money, and he made a lot of fucking good points.
01:35:27.000He made a lot of good points about wealth and about building economies, and about how you take a company that's only worth $100,000 but you could sell it for a million dollars or trade it.
01:35:40.000You have stocks and this is invisible wealth and Islam doesn't allow invisible wealth because that's how societies get crushed and that's how other economies crumble.
01:35:50.000And I'm watching this guy with his moral certainty.
01:35:56.000And his extreme confidence in what he's saying, absolute, and it becomes compelling.
01:36:01.000And I'm not joining, I'm not saying that he got me, but I'm saying that I'm just absolutely admitting there's a certain aspect of human nature that gets compelled to join groups.
01:36:13.000Well, there's something, there's that component of it, which I understand, but there's also just the religious ecstasy component, the aesthetics, the emotional component of it, which...
01:36:26.000I really understand and I'm susceptible to.
01:36:29.000I have a blog post, I believe it's called Islam and the Misuses of Ecstasy.
01:36:35.000This is actually the first blog post I ever wrote where I realized I could not possibly have written this in book form or in a newspaper because it relied on embedded video.
01:36:45.000The only way to have done this was with embedded video.
01:36:51.000I wrote this, I think, once again over protests, something was said about me by Glenn Greenwald or somebody.
01:37:00.000The charge had been that I totally lack empathy.
01:37:03.000I don't even know what it's like to be...
01:37:05.000what these people are getting out of their religion.
01:38:07.000Let's all get up in the morning and consider how profound human consciousness is and consider our togetherness on this rock spinning through empty space and realize that we just have this common project to make the world beautiful.
01:38:39.000And so I went through many other instances of this where something I'm seeing in the Muslim world, I really grok how beautiful and meaningful and captivating this is for people.
01:38:51.000But then at the end, I put in a Quranic recitation and sermon by a...
01:39:00.000I forgot his name now, but some sheikh who's got, you know, like ten times the number of Twitter followers you have, right?
01:39:22.000And, you know, it's a packed house in wherever it was, Saudi Arabia or Yemen.
01:39:29.000But what is being said there is so ethically ugly, right?
01:39:35.000Essentially celebrating the tortures of hell, right?
01:39:39.000Just expressing a certainty that infidels are going to go to hell and how, you know, just this is a...
01:39:46.000You have to organize your life around this question about how to escape the torments of hell.
01:39:54.000And the only way to do it is to be a true believer in the Quran and Muhammad, etc.
01:39:59.000And this is at the center of the mandala of their ethical concern.
01:40:07.000Nothing in this life matters but avoiding hellfire.
01:40:13.000And so there's a kind of a ghastly perversion of this impulse that I think many of us feel, I certainly feel it. It's very much like Burning Man for people.
01:40:35.000Imagine if Burning Man were just as ecstatic as it was and attracting all the smart people that it attracts.
01:40:44.000But strewn throughout it was a message of just true divisiveness.
01:40:50.000Like, everyone else who's not here is going to be tortured for eternity and they deserve it and we shouldn't be their friends and we should fuck them over any way we can when we get out of this place.
01:41:02.000If God had wanted to enlighten them, He would have, but He hasn't.
01:41:41.000It's not to say that Buddhists can't do terrible things, and it's not to say you can't find Buddhist reasons for doing terrible things, but...
01:41:50.000Jihad, martyrdom, paradise, this is the jewel, the horrible jewel that so many millions of people are contemplating in Islamic context, and that's what I'm worried about,
01:42:05.000and I'm not insensitive to the experience people are having.
01:42:10.000Is this version of Islam recent in human history?
01:42:16.000There are some things you can say that have, you know, with Wahhabism and Salafi-style Islam generally, that have been politicized and tuned up in a negative way in the last century.
01:42:32.000You can say that, but the reality is that jihad is...
01:43:28.000I mean, if you're going to follow Muhammad's example, which is perhaps the main lens through which you have to look at this, I mean, there's just what's in the Quran, and there's what's in the Hadith, the larger literature, and there's the example of Muhammad,
01:43:45.000which is attested to in both those literatures and in the early biographies about him.
01:43:55.000He was a conquering warlord who succeeded, right?
01:44:02.000And that is an example that is very different from the example of a guy who got crucified, or the example of a guy who spent his life meditating and then teaching, right?
01:44:11.000If the Buddha had been lopping heads off, you know, at every sermon, and advocating, just talking endlessly about when to kill people and how many people to kill and, you know, how to treat your sex slaves,
01:44:27.000if that was just strewn throughout the Buddhist teaching, I would expect Buddhists to behave exactly the way we see members of ISIS and Al-Qaeda and Al-Shabaab and Boko Haram behave.
01:45:29.000You know, after centuries, decided, more or less unanimously, that slavery is an abomination, that proves that there's more moral wisdom to be found outside of these books than inside, at least on that point.
01:45:43.000And I would argue on virtually every other point of consequence.
01:45:47.000Now, it's not to say there aren't gems of moral wisdom in some of these books, but...
01:46:12.000The members of ISIS right now have the theology on their side.
01:46:17.000It's not like they're ignoring the books.
01:46:19.000They're looking at the books very literally.
01:46:21.000And they're saying, what are we doing that you don't find in the books, essentially?
01:46:33.000That was one of the videos that you had posted up on your blog that you and I discussed, with the guy standing in front of all those people, talking about stoning people for adultery or the treatment of homosexuals, and how this is not radical Islam, this is just Islam. That was shocking, and that's one of those videos where you post it or you talk about it and you get a million people that get upset at you over it.
01:46:57.000You get a million people that call you Islamophobic or what have you and get upset about it and a lot of those are the same people.
01:47:05.000There was a weird thing that happened after Charlie Hebdo that really kind of freaked me out.
01:48:02.000And in this case, mocking also Christianity and the Vatican and many of the things that were interpreted as racist weren't even racist if you understood French or French politics.
01:48:14.000And the people who missed the train on this, people like Garry Trudeau, the Doonesbury creator, he just came out against Charlie Hebdo, and a bunch of writers who belong to the PEN America organization,
01:48:31.000the whole point of which is to defend free speech.
01:48:34.000They just walked out of a gala event or declined to show up because PEN had given Charlie Hebdo the Freedom of Expression Award this year, as they should have.
01:48:45.000And some prominent people left in protest.
01:48:55.000Well, it's political correctness and fears about being perceived as a racist and this notion that you should...
01:49:07.000That it makes sense to have a double standard here where you can...
01:49:12.000That there's some trade-off between freedom of expression and freedom of religion, where when the freedom that's being claimed on the religious side is the freedom not to be offended, right?
01:49:26.000So, I mean, really what's happening here is some number of Muslims...
01:49:31.000are demanding that non-Muslims follow the taboos of Islam.
01:49:37.000So, it's taboo for you to say anything negative about the Prophet, or even to depict him in a drawing, right?
01:49:44.000That's where it gets really crazy, right?
01:49:47.000And we want you to follow this taboo, though you are not Muslim.
01:49:50.000And we feel so strongly about this that we're going to kill you, or make credible threats of killing you, or...
01:49:59.000We're just going to—when people do kill you, we're going to blame you for having gotten yourself killed, for having been so stupid and insensitive by caricaturing the prophet.
01:50:07.000And that whole—I mean, that just has to lose.
01:50:11.000I mean, we have to hold to free speech so unequivocally that all the people over here who think that there's this trade-off between religious sensitivity and free speech just have to realize that they've lost.
01:50:27.000Because we don't play this game with any other religion.
01:50:29.000Just think about this analogy I've used before, but the Book of Mormon, right?
01:52:08.000That it's a fear of Islam, a fear of retaliation, that they want to be on the side of the others because it's so dangerous, because they are the only religion that will come out and kill you.
01:52:16.000And these same people, I've found, that will call people out on being Islamophobic will not say a fucking peep about anti-Christian rhetoric.
01:52:25.000If you start talking shit about Jesus or mocking Christianity, they never have a word to say about it because it's not dangerous.
01:52:32.000Because it's not dangerous to be on that side.
01:52:34.000I think it's much more just white guilt and political correctness.
01:52:42.000There's definitely some of that as well.
01:52:54.000If you take her view of our foreign policy, if you just agreed with her down the line, just check all those boxes, two million people, we did it all, we just kill brown-skinned people all over the world because we just like to sell bombs, and that's really our moral core, then we should have a fair amount of white guilt,
01:53:41.000But we've participated in a system the existence of which is predicated on some of this shit.
01:53:47.000The existence of which existed long before you and I were ever born.
01:53:50.000We're born into a system we have zero control over.
01:53:53.000And that's why I think that some of the greatest...
01:53:56.000ethical changes, the greatest ethical progress for us as a species, is going to come not with each one of us developing an ethical code that allows us to be a hero, you know, personally, and just bucking a system and bucking a trend, you know,
01:55:43.000That is a terrible way to sell your phone.
01:55:46.000Why would you have that fucking ridiculous phone?
01:55:49.000You can't put that in your pocket, son.
01:55:51.000Yeah, well, all of those people that buy those things that have those extreme liberal values, progressive values, you have to deal with the absolute reality that, at the very least, your phone is being produced in a factory where people are jumping off the roof.
01:56:09.000Unless they're making them in Korea, the Samsung phones, I think ethically they have like a leg up on the iPhone, in the sense that, you know, those Foxconn buildings have nets installed all around them to keep people from jumping off the roof because it sucks so bad there.
01:56:26.000And I've heard the argument against that.
01:56:30.000You've got to deal with the fact that these factories employ half a million people, and the suicide rate among them is roughly the same as the suicide rate in the general population.
01:56:40.000But they're killing themselves at work.
01:56:42.000Like, how many people kill themselves at work?
01:56:43.000That's not normal, and they live at work.
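The base-rate argument being disputed here can be made concrete with a back-of-the-envelope calculation. The figures below are illustrative assumptions, not sourced statistics:

```python
# Back-of-the-envelope base-rate check (illustrative numbers only):
# if a workforce the size of Foxconn's (~500,000 people) had the same
# annual suicide rate as a hypothetical general population (~10 per
# 100,000), how many suicides per year would be "expected" anyway?

workforce = 500_000
assumed_rate_per_100k = 10  # hypothetical general-population rate

expected_suicides_per_year = workforce * assumed_rate_per_100k / 100_000
print(expected_suicides_per_year)  # prints 50.0
```

The counterargument in the conversation is that even if the raw count matches this expectation, suicides clustered at the workplace, where the workers also live, are not comparable to a general-population base rate.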
01:57:17.000I think you and I would, and millions of other people would probably, I mean, I know I would, but I think millions, if we could make the problem transparent, we would pay more to be truly good actors across,
01:57:36.000you know, in all of the streams of influence.
01:57:40.000But there are certain situations, again, where, you know, I just mentioned, you know, child labor laws in Pakistan.
01:58:00.000Well, there are situations where that may be workable, right?
01:58:04.000Where you get the kid out of the factory, and where he's been working 14 hours a day, and you get him into school, and he's got a better life.
01:58:11.000But there are many situations in places like Pakistan where...
01:58:15.000No, what you've just done is you've made it impossible for this kid to work and you've further impoverished his family.
01:58:21.000Because he wasn't going to go to school anyway.
01:58:22.000Now he's going to be picking stuff out of a trash heap or whatever it is.
01:58:26.000And you haven't put in place an alternative that's workable.
01:58:30.000And so with many problems of this sort, we have to find a path forward where the first doors we open, the choices between the doors we have to open, all suck, right?
01:58:45.000And there are situations, you know, geopolitically that are like that, where you can either back a guy who's a dictator, right, but he's secular and he's committed to a first approximation to basically sane relations with the rest of the world.
01:59:03.000But he really is a dictator, and he really has a history of treating people badly, and he's going to treat political dissent very badly because of the possible consequences for him if he doesn't, because the society is bursting, coming apart at the seams.
01:59:19.000Or you can just let the Islamists and the jihadists run the place, right?
01:59:24.000And that is a, you know, there's no good option, and it's understandable that we have In many cases, chosen the dictator there.
01:59:32.000Well, that was sort of the situation with Saddam Hussein.
01:59:55.000If you could, if ultimately someone said, look, Sam, you're going to be king of the world.
02:00:01.000You are going to be the guy that gets to sort this mess out.
02:00:06.000We need someone to engineer a global culture.
02:00:09.000What would be the step that you would take to try to alleviate some of the suffering of the world, alleviate some of the bloodshed, alleviate all these conflicts, these geopolitical conflicts?
02:00:20.000Well, in this area, the first few things I would do, we've already talked about.
02:00:25.000One is I would make it absolutely clear that free speech just wins, right?
02:00:33.000So whenever you got into a Charlie Hebdo situation or the Danish cartoons, you know, the riots over those cartoons, we've had half a dozen situations like that in the last 10 years.
02:01:15.000The Benghazi thing, it's true that it did kick off riots everywhere, but the thing that was egregious about our...
02:01:28.000government statement there was that we basically just, rather than take the totally sane line of saying, listen, in our society we're committed to freedom of speech, and you can make films about anything here, and that never gives you license to kill people,
02:01:49.000Well, it was a film called The Innocence of Muslims, or Innocence of Muslims, made by some crackpot somewhere.
02:01:58.000And it was just a YouTube video, but it got spun as this major scandal in the Muslim world, and it reliably produced this reaction of the sort that the Danish cartoons had.
02:02:12.000And we, rather than just hold the line for free speech...
02:02:17.000We, I mean, the State Department said something like, you know, we totally repudiate this attack upon Islam.
02:02:27.000And we just distanced ourselves from it just as a way of trying to contain the madness, right?
02:02:33.000It was a symptom of just how afraid we are that this sort of thing can get out of hand in the Muslim world, because it can, right?
02:02:42.000If there's a rumor that a Quran got burned, or if some, you know, pastor...
02:02:47.000in Florida threatens to burn a Quran.
02:02:51.000Reliably, people by the dozens get killed in places like Afghanistan.
02:02:56.000And that happens in a way that a suicide bombing between Sunni and Shia never produces a response of that sort.
02:03:03.000So I would hold to free speech, and I would just make that...
02:03:10.000Because free speech is the freedom that safeguards every other freedom.
02:03:14.000If you can't speak freely, if you can't criticize powerful people...
02:05:04.000And that would prove, beyond any shadow of a doubt, that spending your life splitting hairs about...
02:05:16.000Muslim theology and demonizing the rest of the world and exporting crazy madrasas by the tens of thousands all over the world, as the Saudis do.
02:05:29.000It would prove that that is not a way to join the community of functional nations, because absent an ability to pull their wealth out of the ground, they have no intellectual content.
02:05:40.000They don't produce anything of value that anyone wants.
02:05:44.000That's a problem they would have to solve, right, if they don't want to be beggared in a global community.
02:05:49.000Well, isn't that an issue also with the ideology of the religion, is that you're not allowed to question or change or manipulate the way you approach life because it's all dictated by the religion?
02:06:03.000And just when you look at societies where they keep half the population, the female half, more or less hostage and unable to get educated or to work or to drive cars, depending on which society you're talking about.
02:06:19.000This economically and socially doesn't make any sense in a context where you need to produce intellectual content to be part of a global conversation.
02:06:31.000So the only way they've been able to do this is because of the fact that they have an extreme amount of money that comes from oil.
02:06:36.000Well, certainly if you're talking about the oil states, yeah.
02:06:45.000And we actually could get to a time where that would be the case, where oil is just a dirty fluid that no one wants to have anything to do with, right?
02:07:00.000Now, I'm sure there's another side to this argument where it would be a destabilizing change.
02:07:04.000I mean, just imagine how things will start to run off the rails in the Middle East if oil is worthless, right?
02:07:10.000And what's Saudi Arabia going to be like?
02:07:12.000Arguably, I think they've probably hedged their bets and they have so much money in other investments now that at least the royal family would be fine.
02:08:21.000We don't think this thought for the same reasons.
02:08:23.000But she pointed out that religions change, right?
02:08:25.000That you roll back the clock 500 or so years, Christians were burning people alive and actively prosecuting people for sins.
02:08:35.000For blasphemy, you had the Inquisition in Europe, and that was every bit as much of a horror show as what's going on in Iraq now.
02:08:41.000So look, Christianity can be just as bad as Islam.
02:08:45.000Now, as a matter of history, that is in fact true.
02:08:50.000There are differences between Islam and Christianity that are nevertheless important, but the crucial piece is that Christianity did not change from the inside.
02:08:59.000You know, Christianity got hammered from the outside, by a Renaissance and, I mean, a Reformation initially, which was bloody and horrible, but it got hammered by the forces of a scientific revolution and an attendant industrial revolution and capitalism and the rest of culture that didn't want to be shackled to theocracy, for very good reason,
02:10:06.000We have to figure out how to engineer it for Muslims.
02:10:08.000And again, it's not going to come from the outside.
02:10:11.000Non-Muslims are not going to force it on Muslims.
02:10:13.000But we have to support the genuine reformers And the people who are fighting for the rights of women and gays and free thinkers in the Muslim world.
02:10:25.000And the horrible thing is that the liberals on our side don't do that.
02:10:29.000The liberals on our side criticize people like me and even Ayaan Hirsi Ali, a former Muslim who has been hunted by theocrats, right?
02:11:21.000But let's talk about the fact that girls six years of age are getting clitoridectomies by barbarians in septic conditions, and everyone around them thinks it's a good and necessary thing, right?
02:11:38.000And, you know, women who get raped get killed because they brought dishonor on their family.
02:11:43.000I mean, there's another planet over there that we have to interact with because violence is coming our way for no other reason.
02:11:52.000There's the ethical imperative of figuring out how to help people who are hostage to a bad system.
02:11:59.000And so, yeah, let's be for women's rights globally, but what does that look like?
02:12:05.000That looks like a rather staunch criticism of the way women are treated under Islam.
02:12:10.000There's a lot of seemingly open-minded European cultures that have opened the door for a lot of Islamic immigrants or Muslim immigrants to come over to their country.
02:12:19.000Now they're dealing with a lot of the issues that involve these ideologies being a part of their culture now.
02:12:59.000We have political refugees who are leaving war-torn places for obvious reasons and winding up in the closest shores across the Mediterranean.
02:13:09.000So yeah, the people they're attracting are different from many of the Muslim immigrants we get in the U.S. who are coming to work for Google or they get engineering degrees.
02:13:22.000It's a different demographic, largely.
02:13:26.000Okay, I think we've covered that subject.
02:14:28.000And the word in neuroscience has been, for a very long time, and really science generally, is that AI hasn't panned out.
02:14:39.000It's not that it's inconceivable that something interesting is going to happen, but old-style AI was really a dead end, and we never really got out of that particular cul-de-sac, and we just haven't made much progress, and so we have, you know,
02:14:54.000the best chess player in the world is a computer that's the size of this table, but...
02:15:00.000The prospect of having truly general artificial intelligence and superhuman level intelligence, that's not something we have to worry about in the near term at all.
02:15:12.000But then I heard, as many people did, my friend Elon Musk say something which seemed quite hyperbolic.
02:15:23.000He thought it was the greatest threat to humanity, probably worse than nuclear weapons.
02:15:28.000And there was a lot of pushback against him there.
02:15:33.000But I actually know Elon, and I knew he just wouldn't say that without any basis for it.
02:15:40.000And it just so happened there was a conference that had been scheduled long before in Puerto Rico, in San Juan, that was really like a closed-door conference for the people who are really at the cutting edge of AI research.
02:15:52.000It was organized by the Future of Life Institute, which is a...
02:15:57.000non-profit purposed toward looking at existential threats and, in this case,
02:16:07.000foreseeing the existential problems around the development of AI. And it was a conference.
02:16:16.000I mean, maybe there were 70 people at the conference.
02:16:18.000And it was all people who were doing this work and...
02:16:25.000I literally think I was one of maybe two people who had sort of talked his way into the conference.
02:16:29.000Everyone else was just invited and they had a good reason to be there.
02:16:35.000What was interesting is that outside this conference, Elon was getting a lot of pushback.
02:16:39.000Like, dude, you don't know what you're talking about.
02:16:42.000Go back to your rockets and your cars, but you don't know anything about computers, apparently.
02:16:46.000And he was getting this pushback from serious people, people who are on the edge.org website, where I am also occasionally published.
02:16:57.000You know, roboticists at MIT and people who should know, you know, former top people at Microsoft, people who you'd think are very close to this, would say, no, no, this is 50 or 100 years out, and this is crazy.
02:17:13.000And so anyway, I went to this conference just wanting to see, you know, what was up.
02:17:20.000And what was interesting and, frankly, scary was that at the conference, even among people who have clearly drunk the Kool-Aid and are just not willing to pull the brakes on this at all, I mean, they don't even...
02:17:34.000Arguably, it's hard to conceive of how you would pull the brakes on this, because the incentive to make these breakthroughs financially is just so huge that, you know, if you don't do it, someone will, and so everyone's just pedal to the metal.
02:17:51.000Basically, even the people who were going at this most aggressively were people who were conceding that huge...
02:18:03.000It was not at all fanciful to say that huge breakthroughs in artificial general intelligence could come in five or ten years, given the nature of the progress that had been made in the last ten years.
02:18:14.000And the scary thing is that when you look at the details, it's not at all obvious to see a path forward that doesn't just destroy us.
02:19:40.000You're talking about something that is...
02:19:44.000that learns how to learn in such a way that the learning transfers to novel situations, and it doesn't degrade.
02:19:55.000If you give it a new problem, it won't get worse at the other problems that it got good at because you're giving it new problems now.
02:20:03.000So you're giving it something that scales, that can move into new territories, that can become better at learning, and in the ultimate case...
02:20:15.000I mean, once these machines become the best designers of the next iteration of software and hardware, well, then you get this sort of...
02:20:24.000this exponential takeoff function, or, you know, often called the singularity, where you have something where there's a runaway effect, where it's just...
02:20:41.000So you imagine, what's often said is that we're going to build something, the near-term goal is to build something that's human-level intelligence, right?
02:20:49.000So you're going to build, we have a chess computer that's not quite as good as a person, and then it is as good as a person, and now it's, you know, a little better than a person, but it's still not so much better as to be completely uncanny to us.
02:21:01.000And we're thinking of doing that for everything, but...
02:21:06.000We're not going to build a human-level AGI. Once we build an AGI, it's going to be better at...
02:21:12.000Which is to say, once we build a truly generalizable intelligence, something that can prove mathematical theorems and make scientific hypotheses and test them, and everything a human can do...
02:21:25.000It's going to be so much better than a human for a variety of reasons.
02:21:30.000One is that your phone is already better than you.
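The runaway "takeoff function" described above can be sketched as a toy model. The growth parameters here are purely hypothetical, chosen only to show the shape of the curve, not to predict anything:

```python
# Toy model of recursive self-improvement (hypothetical parameters):
# each generation, the system redesigns itself, and the size of the
# improvement scales with its current capability, so growth is faster
# than exponential: the "runaway effect" described above.

def takeoff(initial_capability=1.0, gain=0.1, generations=20):
    capability = initial_capability
    history = [capability]
    for _ in range(generations):
        # A more capable designer produces a proportionally bigger upgrade.
        capability *= 1.0 + gain * capability
        history.append(capability)
    return history

curve = takeoff()
# Each successive growth ratio exceeds the last, unlike plain
# exponential growth, where the ratio stays constant.
```

The point of the sketch is only the shape: once improvement feeds back into the improver, the curve bends upward faster than any fixed exponential.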
02:23:46.000This is called a honeypot strategy, where you tempt it to make certain moves in the direction of acquiring more power than you wanted to give it, and then you can just, you know, shut it off immediately.
02:23:56.000But you're talking about guys who are a lot younger than us, many of whom are somewhere on the Asperger's continuum, who are drinking a lot of Red Bull and have billions of dollars at their disposal to do this work.
02:24:20.000There's a huge responsibility not to do anything that obviously destroys the world.
02:24:26.000And the problem is, even when you think about the most benign versions of this, the possibility of destroying the world is not fanciful.
02:24:36.000Forget about what I just said about 20,000 years of progress.
02:24:39.000Just imagine we have an AI. Someone working for Facebook or whatever builds this thing.
02:24:47.000We've solved what's called the control problem.
02:24:49.000We've figured out how to keep this thing doing our bidding.
02:24:53.000It's not going to come up with near-term goals that are antithetical to human happiness.
02:25:02.000If you say the thing has to be committed to human well-being, if that's the foundational architecture of the system, it depends what that means in certain cases.
02:25:12.000I mean, what if the thing decides, well, okay, if I'm committed to human well-being, I'm going to kill all the unhappy people, right?
02:25:17.000Or I'm just going to plug, you know, electrodes into the right part of the brain of every human being and give them just pure pleasure, right?
02:25:26.000So let's say we build something that's totally under our control, it works perfectly, and we don't have to worry about the control problem.
02:25:32.000We still have to worry about the political and economic effects of building something that's going to put the better part of humanity out of work.
02:25:41.000I mean, you're talking about now something that can build further iterations of itself, where the cost of building versions of itself now is going to plummet to more or less the cost of raw materials.
02:25:52.000You're talking about a labor-saving device of a sort that no one has ever anticipated.
02:25:58.000And we don't have a political system that can absorb that.
02:26:02.000We have a political system where we would see the picture of some trillionaire on the cover of Inc.
02:26:10.000magazine, and we would hear that unemployment now was at 30% even among white-collar people.
02:26:46.000So just imagine the cyber war and the drone war we could unleash on the rest of the world if we had the ultimate war-making computer, right?
02:27:04.000Politically, we have to be in a position, and economically, where if this thing were handed to us, we could use it for benign purposes and share the wealth, and we're not there yet.
02:27:44.000Well, no, it's called the Chatham House Rule.
02:27:46.000Certain conferences are organized under it, you know, board meetings are organized under these
02:27:52.000rules because you want to encourage a free exchange of ideas, so there's no press there, right?
02:27:55.000And you can talk about what was talked about there, but you can't give any attribution, and, I mean, nothing's formally on the record.
02:29:25.000It could be nothing that it's like to be that sort of system, or it could be conscious.
02:29:30.000But, in any case, this one guy gave a talk...
02:29:34.000This one guy gave a talk where he just speculated about this thing taking off and...
02:29:42.000More or less standing in relation to us the way we stand in relation to bacteria or snails or, you know, life forms that we just squash without a qualm.
02:29:52.000And to him that's a totally benign, acceptable, not only acceptable but to be hoped for, outcome, right?
02:30:00.000And I can follow him there at least halfway.
02:30:05.000If you imagine that these are conscious and actually become the center of the greatest possible happiness in the universe, right?
02:30:14.000So, like, if you imagine we build, if we give birth to, a conscious machine that is essentially a god, right, that has interests and states of pleasure and insight and meaning that we can't even imagine, right?
02:30:32.000That thing, by definition, by my definition, becomes more important than us, right?
02:30:36.000Then we really are like the chickens that, you know, hope we don't kill them to eat them, but they're just chickens and we're more important because we have a, you know, this greater scope to our pains and pleasures.
02:30:48.000And that's not to say that I don't see any moral problem with killing chickens.
02:30:51.000I'm just saying that we are more important than chickens because of the nature of human experience.
02:31:45.000That there's no good that has come of this.
02:31:47.000This is just blind mechanism, which still is godlike in its power.
02:31:57.000And it could be antithetical to our survival, or it could just sort of part ways with us, you know?
02:32:04.000That's the mindfuck of all mindfucks, is that we really are just a caterpillar, and we're giving birth to this ultimate fractal intelligence that's infinite in its span.
02:32:15.000You could create something within, like as you said, a week, 20,000 years of life.
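The "20,000 years in a week" figure is consistent with the order-of-magnitude speed comparison usually made in this literature, assuming, hypothetically, that electronic circuits run about a million times faster than biological neurons:

```python
# If an electronic mind ran ~1,000,000x faster than a biological one
# (a hypothetical, commonly cited order-of-magnitude figure), one week
# of wall-clock time would cover roughly 20,000 subjective years.

speedup = 1_000_000       # assumed silicon-to-neuron speed ratio
weeks_per_year = 52.18    # average weeks in a year

subjective_years = speedup / weeks_per_year  # subjective years per real week
print(round(subjective_years))  # on the order of 20,000
```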
02:32:47.000Is it possible that these super-intelligent human beings, that a lot of them do have this sort of Asperger's-y way of approaching life, and a lot of them are on the spectrum?
02:33:02.000Did nature sort of design that in order to make sure that we do create these things?
02:33:09.000I mean, if everything in life, if life itself... We look at alpha wolves and the way caterpillars interact with their environment, and bugs, and whatever, and all that stuff is natural. Is human behavior natural? Is human cognitive thinking? Is human creativity? Is all that nature? Is all that just a part of human beings' ultimate curiosity almost inevitably leading to the creation of artificial intelligence?
02:33:34.000Was it sort of programmed into the system to create something far better than what we are?
02:33:39.000Well, I wouldn't say it's programmed into the system necessarily.
02:33:42.000I think you can explain all this just by...
02:33:46.000Everything being pushed from behind by...
02:33:50.000We have all of the inclinations and abilities that evolution has selected for in us, and we have an ability to create increasingly powerful technology.
02:34:03.000But the inevitability of this is hard to escape.
02:34:07.000There are really only two assumptions.
02:34:10.000All you have to assume is that we are going to build better and better computers, which I think you have to assume, apart from the possibility that we're just going to destroy ourselves and lose the ability to do so.
02:34:22.000But if we don't destroy ourselves some other way, we are going to continue to make progress in both hardware and software design.
02:34:31.000And the only other thing you have to assume is that there's nothing magical about the wetware we have inside our heads as far as information processing is concerned and that it's possible to build intelligent machines in silicon, right?
02:34:47.000I can't imagine any serious scientist, at this point, fundamentally doubting either of those two assumptions.
02:34:56.000Nobody thinks that there's something magical about, you know, neural material when it comes to the processing of information that underlies intelligence.
02:35:10.000And we're just going to keep making progress.
02:35:12.000So at some point this progress is going to birth a generation of computers that is better able to make this sort of progress than we are.
02:35:24.000And so the benign version of this that some people imagine, and this is where the whole singularity begins to sound like a religion, but there are many people in Silicon Valley imagining that we are going to merge with this technology: we're going to upload our consciousnesses onto the internet eventually and become immortal and just live in the dreamscape of the paradise that we have engineered.
02:35:57.000That vision presupposes a few other things that are much more far-fetched than the first two assumptions I just listed.
02:36:09.000One is that before this happens, we will crack the neural code and truly understand how to upload the information in a human brain into another medium, and that you could move consciousness, mind and consciousness, into the Internet.
02:36:48.000And then when you die, aren't you just dying every bit as much as you would be dying if we hadn't done that?
02:36:52.000I mean, there are problems of sort of identity that come in there that are sort of hard to solve.
02:36:58.000But no, there are people who are looking at this as a, you know, it's very much like we're building the matrix in some sense, and we're going to leap into it at the opportune moment, and it's going to be glorious.
02:37:35.000But our own consciousness is, I mean, it's so archaic in comparison.
02:37:40.000If you're talking about something that can exponentially increase in one week, 20,000 years, and then on and on and on from there, why would you want to take your consciousness and download it?
02:37:51.000It's like a chicken asking, you know, how can I stay a chicken?
02:37:55.000Where are my feathers going to go in this new world?
02:37:58.000Yeah, if you could take an ant and turn it into Einstein, would it really want to go back to being an ant?
02:38:02.000I prefer digging in the dirt and just dropping my eggs and cutting leaves.
02:38:51.000Help us solve problems that we're not smart enough to solve.
02:38:54.000But the prospect of building that and keeping it reliably benign or keeping ourselves from going nuts in its presence, it's just a non-trivial problem.
02:39:06.000Wouldn't it almost instantaneously recognize that part of the problem is us itself?
02:39:15.000And it would also immediately recognize, like, hey, this planet has only got, like, another billion years of reliable sunlight.
02:39:22.000Like, we've got to get the fuck out of here and propagate the universe.
02:39:25.000Well, there's a great book, if you really want to get into this, there's a book by the philosopher Nick Bostrom and...
02:39:32.000He actually might have been the one who convinced Elon this is such a problem.
02:39:37.000He was one of the organizers at this conference, and virtually everyone had read his book at the conference.
02:39:42.000He wrote a book called Superintelligence, which just lays out the whole case.
02:39:47.000Virtually everything that you've heard me say on this topic is some version of a concern that he expresses in that book.
02:39:57.000And it's very interesting because he just goes through it.
02:40:02.000It's like 400 pages of systematically closing the door to every utopian way this could go right for us.
02:40:14.000And he just is like, yeah, well, here are the things you're not foreseeing about how even a...
02:40:22.000You just have to anticipate absolutely everything.
02:40:25.000So if you're trying to create a machine that is going to block spam, you need to create a machine that will not, as a strategy for reducing spam, just kill people.
02:40:44.000You can't assume common sense in a super intelligent machine unless you have engineered it into the architecture or you have taught it, you've built it to emulate your values.
02:41:02.000You would build a machine where it would not merely emulate current human values.
02:41:12.000Ultimately, you want a machine that instantiates the values that we should have, not that we necessarily do in any moment.
02:41:22.000One thing that's interesting to me in thinking about this is that the moment you think about building a machine like this, you realize that you have to solve some fundamental philosophical problems.
02:41:34.000You can't pretend that everyone has equivalently valuable values, right?
02:41:41.000Because you have to decide what to engineer into this thing.
02:41:44.000So do you want to engineer the values of jihadists and Islamists?
02:41:49.000Did the Taliban get a vote on how we should design the values of this thing?
02:41:54.000I think it's pretty obvious that the answer to that is no.
02:41:59.000But then you have to cut through basically every other moral quandary we have, because this thing is going to be acting on the basis of values.
02:42:08.000But initially, if it's independent and autonomous, it's going to automatically realize that a lot of our ideas are based on our own biological needs, and that a lot of those are unnecessary for it.
02:42:22.000Oh, yeah, but we will be building it, I mean, if we're sane, we're going to build it not to be merely self-interested.
02:42:30.000We're going to build it to conserve our interests, whatever those deepest interests are, ultimately.
02:42:47.000We're building a wrecking ball, and we're going to swing it out away from the planet and watch it hurtle back.
02:42:53.000So essentially, the Unabomber was right.
02:42:57.000Have you ever seen this? A few people have done this with his text, because there are sections of his text that can read as totally rational.
02:43:06.000And people occasionally will put a section there and then it's not until you turn the page and have already agreed with it that you see who wrote it.
02:43:36.000He went to Berkeley, started teaching, and saved up all his money from teaching, and went to the woods and started blowing up people that were involved in technology.
02:43:43.000Yeah, there's a documentary called The Net, and I believe it's from Germany.
02:43:48.000I believe it was a German documentary, but it's very secretive, like, who was and was not involved in those Harvard LSD studies.
02:44:07.000But, I mean, he might have had a vision that he chased down, you know...
02:44:12.000Took it to the final point, and he recognized from his experiences, like, whoa, if we keep going, this is inevitable, and became obsessed with it.
02:44:21.000Obviously, one of the things that people try to connect is various drugs with schizophrenia and mental illnesses.
02:44:31.000And most of those have not been able to stick because there's a certain percentage of people that will inevitably have issues.
02:44:38.000And the percentage of people that have issues with schizophrenia or various mental illnesses is almost mirrored by the percentage of people who do psychedelic drugs, various psychoactive drugs, and develop problems.
02:44:53.000So it might not be the cause, but it's a concern.
02:44:58.000And if you get a guy who may have a propensity for mental illness and you dose the shit out of him with LSD, you might get a Ted Kaczynski.
02:45:07.000No, I think there are some people who certainly shouldn't take any drugs.
02:45:13.000And, you know, I've had bad experiences on a variety of psychedelics, as well as good ones, obviously, but the bad experiences I could see in the wrong mind affecting you permanently in a way that's not good for you or anyone else.
02:46:18.000You could make an argument that there's probably several versions of each person, based on your reactions to whatever experiences you had, but that might have been version 2.0 of me.
02:46:31.000I became a different person because of that.
02:46:33.000And that could easily be what happened to poor old Ted.
02:46:37.000Yeah, like you said, some of his assertions, if you look at the direction the technology is headed, obviously he was fucking batshit crazy, but he said some things that weren't batshit crazy.
02:47:33.000But as long as you're interacting with...
02:47:36.000A person's actual views, then condemn those views and criticize those views to whatever degree.
02:47:42.000But if part of your project, or the entirety of your project, is simply knowingly sliming them with a misrepresentation of their views, because you can get away with it, because you know their views are either so hard to understand for most people,
02:48:00.000or just people aren't going to take the time to understand them, then you're just defaming people.
02:48:05.000Well, it's also that there's a desire to win that a lot of people have that they apply to debates, and it makes them intellectually dishonest, because they don't want to agree that someone that they might have a disagreement with may have a point or two.
02:48:22.000You might disagree with the entirety of what they're saying, but somewhere along the line, it might be possible that you could see where they're coming from, even if you don't agree, but it just throws your argument into a bad position,
02:48:45.000It's like either the argument succeeds or fails based on the structure of the argument and its connection to evidence, or it doesn't, right?
02:48:55.000And it doesn't matter if it's Hitler's argument for the destruction of the Jews or, you know, Ted Kaczynski said something true about the progress of technology.
02:49:07.000Whether it's true or not about the progress of technology has nothing to do with the source, but...
02:49:11.000People imagine that if you don't like the source, there's no burden to actually address the arguments.
02:49:20.000And if you don't like the arguments, a successful rejoinder is just to trash the source.
02:49:54.000At a certain point, you can just decide, well, I don't need to hear it from this person because I know this person doesn't understand what he's saying and has lied in the past, so I'll wait to hear it from somebody else.
02:50:06.000So yeah, it's not that the source doesn't matter at all, but you're not actually addressing truth claims if you're just disparaging the source of those claims.
02:50:19.000Is there anything else you'd want to get into?
02:50:24.000The only other thing on this list, which is just too big for 10 minutes and we're going to get in trouble, is a lot of people hit us with cops, Baltimore, self-defense, violence, weapons, all that stuff.
02:50:43.000Yeah, and it's also, we don't need to talk about it once artificial intelligence kicks in, we're not going to have crime anymore.
02:50:52.000We've pretty much cured it all in the final hour.
02:50:56.000The artificial intelligence conversation sort of trumps the whole thing, because Islam's not going to be important when there's robots that can...
02:51:48.000I mean, what am I, like, oh, what, they're really going to be a super-intelligent machine that's going to take, you know, swallow the world, right?
02:51:54.000Wasn't that the devil's greatest trick?
02:52:03.000But it's, I mean, this is on, unlike other, I mean, other things have this character.
02:52:09.000Like, it's hard to really worry about climate change.
02:52:34.000And yet it's much easier to worry about other far more...
02:52:38.000It's easier to worry about Twitter than it is to worry about that.
02:52:41.000But this thing is so kind of lampoonable, and it's just kind of a goofy notion which seems just too strange to even be the substance of good, credible fiction.
02:52:54.000And yet, when you look at the assumptions you need to get on the train, there's only two.
02:53:02.000And they're very hard to doubt the truth of.
02:53:08.000Again, we just have to keep making progress and there's nothing magical about biological material in terms of an intelligent system.
02:53:16.000And by the time it becomes a threat to everyone, by the time we recognize it as a threat, it may already be too late.
02:53:26.000Well, people have different ideas about the timing of what they call the takeoff, you know, whether it's a hard takeoff or something more gradual.
02:53:32.000But, yeah, it's the kind of thing that could happen in secret, and all of a sudden, things are different in a way that no one understands.
02:53:45.000And you can also make this argument that if you look at all the issues that we have in this world, that so many of them are almost unfixable without this.
02:54:09.000I mean, intelligence is our only asset, ultimately.
02:54:12.000I mean, it's giving us everything good.
02:54:14.000Right, and why should we accept our limited biological intelligence when we can come up with something infinitely more intelligent and godlike?
02:54:21.000Progress in this area seems almost an intrinsic good.
02:54:27.000Because we want to be able to, whatever you want...
02:54:31.000You want to be able to solve problems, and you want to be able to anticipate the negative consequences of your doing good things and mitigate those.
02:54:42.000And intelligence is just the thing you use to do that.