The Joe Rogan Experience - May 07, 2019


Joe Rogan Experience #1292 - Lex Fridman


Episode Stats

Length

2 hours and 59 minutes

Words per Minute

167.45206

Word Count

30,133

Sentence Count

2,887

Misogynist Sentences

43

Hate Speech Sentences

43
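For readers checking the stats above, the words-per-minute figure is simply word count divided by episode length in minutes. A quick sketch follows (plain Python, not from the source); the exact unrounded duration behind the listed figure is an assumption.

```python
# Sanity check of the stats above; the precise duration used by the stats page is assumed.
word_count = 30_133
listed_minutes = 2 * 60 + 59                 # "2 hours and 59 minutes", as listed

print(word_count / listed_minutes)           # ~168.3 wpm using the rounded length

# The listed 167.45206 wpm implies a slightly longer, unrounded duration:
print(word_count / 167.45206)                # ~179.95 minutes, i.e. just shy of 3 hours
```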


Summary

In this episode, we discuss the best sequel of all time, Al Pacino's performance in 'The Godfather,' and whether 'Scent of a Woman' holds up as one of the great films. We also talk about the high school in North Bergen, New Jersey, that put on an Alien play, with Sigourney Weaver showing up to tell the students they did an awesome job, and about which movies are transformational for society and which aren't. From there the conversation moves to steroids and EPO in combat sports, Fedor Emelianenko, ultramarathon runners, playing guitar, Tesla Autopilot, the camera-versus-LiDAR debate, and the future of autonomous vehicles. Thank you for listening and supporting the podcast.


Transcript

00:00:02.000 Ready?
00:00:03.000 Boom!
00:00:04.000 And we're live.
00:00:04.000 Hello, Lex.
00:00:05.000 Hey.
00:00:06.000 What's going on?
00:00:06.000 The sequel, part two.
00:00:08.000 You have a very similar, if not the exact same suit on.
00:00:11.000 This is all I wear.
00:00:12.000 You look very professional.
00:00:13.000 Yeah.
00:00:14.000 Very, um, Reservoir Dogs.
00:00:16.000 Reservoir Dogs?
00:00:17.000 Well, let's go to the best sequel of all time, Godfather Part 2. Is that the best sequel of all time?
00:00:22.000 I think John Wick might be.
00:00:23.000 I haven't seen John Wick.
00:00:25.000 Same suit?
00:00:25.000 How dare you, sir?
00:00:28.000 Godfather Part 2, I mean, that has to be the best sequel.
00:00:31.000 Okay.
00:00:32.000 And if this is Godfather Part 2, let's definitely not do Part 3. Yeah, part three was terrible, right?
00:00:39.000 Well, let's not offend anyone, but it was not up to par.
00:00:42.000 It wasn't as good.
00:00:43.000 Yeah.
00:00:43.000 I don't remember it.
00:00:45.000 It was the older Pacino.
00:00:47.000 Oh.
00:00:48.000 With that deeper voice.
00:00:49.000 Oh, that was like way later, right?
00:00:51.000 Yeah.
00:00:51.000 That was 90s.
00:00:53.000 Oh, okay.
00:00:54.000 So it's like Point Break, the remix.
00:00:57.000 Yeah.
00:00:58.000 Yes.
00:00:58.000 When they try to redo things way, way, way later, they almost never...
00:01:02.000 Except the Alien franchise.
00:01:04.000 They've done a pretty fucking good job with the Alien franchise.
00:01:06.000 They had a couple of duds in there, but for the most part.
00:01:10.000 I've actually never seen the Alien franchise.
00:01:12.000 What?
00:01:13.000 Who are you?
00:01:14.000 What's wrong with you?
00:01:15.000 Aren't you into science?
00:01:17.000 Intelligent men can disagree, but...
00:01:20.000 You're not into science?
00:01:21.000 I don't know.
00:01:22.000 I prefer Al Pacino, I would say.
00:01:24.000 Okay.
00:01:25.000 The older Al Pacino, Scent of a Woman, you know that movie?
00:01:28.000 Really?
00:01:29.000 Yeah.
00:01:30.000 What?
00:01:30.000 Over Alien?
00:01:32.000 Come on.
00:01:33.000 Yeah, he got the Oscar for that one.
00:01:35.000 What about the one when he played the devil?
00:01:37.000 And the devil likes to rant!
00:01:39.000 Okay, there's two.
00:01:41.000 There's duds for everybody.
00:01:43.000 What was that one?
00:01:43.000 That was Advocate.
00:01:44.000 Uh-huh, there you go.
00:01:45.000 That was with Keanu Reeves.
00:01:47.000 Ah, Keanu Reeves from John Wick.
00:01:50.000 I heard John Wick Part 3 is very good.
00:01:51.000 Or The Matrix.
00:01:52.000 Whatever.
00:01:53.000 The better movie.
00:01:54.000 Not true!
00:01:55.000 Did you see this?
00:01:56.000 That happened?
00:01:56.000 This high school in North Bergen, Jersey, put on the Alien play a couple weeks ago.
00:02:00.000 And Sigourney Weaver showed up, right?
00:02:02.000 No, she showed up just the other day to say thanks or whatever and tell them they did awesome.
00:02:06.000 It looked crazy.
00:02:08.000 And I was just wondering, if they did this when you were in high school, do you think you might have joined drama?
00:02:11.000 Like if they did the Alien play?
00:02:13.000 No.
00:02:13.000 I would have loved it.
00:02:14.000 I would have watched it.
00:02:15.000 I'm not getting into drama.
00:02:17.000 Those people cry too much.
00:02:18.000 It's too much work.
00:02:19.000 It's a cool suit, though.
00:02:22.000 Yeah, okay.
00:02:23.000 Let's talk about this.
00:02:24.000 There's two kinds of movies.
00:02:25.000 There's fun movies and there's movies that are like transformational for society.
00:02:29.000 Okay, Scent of a Woman.
00:02:30.000 What?
00:02:31.000 Are you going to say Scent of a Woman is transformational for society?
00:02:34.000 Yeah, it's one of the greatest scenes between a man and a woman on film.
00:02:39.000 The tango scene.
00:02:40.000 Are you not married?
00:02:41.000 No, I'm not married.
00:02:42.000 Yeah.
00:02:45.000 It's like someone talking about French who can't speak French.
00:02:48.000 I see, yeah.
00:02:49.000 It's nonsense.
00:02:49.000 That movie sucked.
00:02:50.000 I read about French in a book.
00:02:53.000 Talking about France without ever having been to Paris, yeah.
00:02:56.000 Is Scent of a Woman your favorite movie?
00:02:58.000 No, it's not my favorite movie.
00:02:59.000 I don't really think that movie sucked either, by the way, if you're getting mad right now.
00:03:01.000 I barely remember it.
00:03:02.000 But it is up there.
00:03:04.000 I'm sure it's a good movie.
00:03:05.000 It's one of the greatest performances by any actor ever.
00:03:08.000 Jesus Christ, bro.
00:03:08.000 Jesus Christ, you okay?
00:03:09.000 We're off to a good start.
00:03:10.000 Yeah, yeah.
00:03:12.000 I bet you there's thousands of people who agree with me right now.
00:03:14.000 Yeah, there's millions that don't.
00:03:18.000 And they're called haters.
00:03:20.000 Everyone who disagrees is always a hater.
00:03:22.000 Okay, so what's your favorite love scene in a movie?
00:03:27.000 You need to get married, bro.
00:03:30.000 You're into love movies and shit?
00:03:32.000 No, I'm not.
00:03:32.000 You need to find yourself a gal.
00:03:33.000 Settle down.
00:03:34.000 We're not talking about romantic comedies.
00:03:35.000 Adam Sandler here.
00:03:38.000 Rom-coms.
00:03:38.000 We're talking about serious, dramatic moments, right?
00:03:42.000 Okay.
00:03:43.000 So, Godfather.
00:03:45.000 Good movie.
00:03:45.000 Yeah, great movie.
00:03:46.000 Great movie.
00:03:46.000 Does it have to have two guys shooting each other?
00:03:49.000 No.
00:03:50.000 Okay.
00:03:51.000 So, you ever seen Scent of a Woman?
00:03:53.000 Yeah, I'm sure I saw it.
00:03:54.000 Yeah, but I barely remember it.
00:03:55.000 All right.
00:03:56.000 I can watch it today.
00:03:57.000 It'll be like a new movie to me.
00:03:58.000 Okay.
00:03:58.000 There's this broken man.
00:04:00.000 Spoiler alert.
00:04:01.000 Considering suicide, right?
00:04:02.000 Oh.
00:04:03.000 Okay.
00:04:03.000 It's deep.
00:04:04.000 So he is tortured by, you know, by his involvement in the war, by being responsible, all this kind of stuff.
00:04:13.000 He's now mentoring a younger version of himself who has more character, more integrity.
00:04:17.000 And throughout all of this, he meets this beautiful young woman.
00:04:22.000 He's blind.
00:04:23.000 He asks her for the dance and there's this beautiful moment where they connect.
00:04:27.000 I mean, okay, listen.
00:04:29.000 What's the purpose of film, right?
00:04:31.000 Entertainment.
00:04:33.000 Or make us think.
00:04:35.000 Make us think.
00:04:37.000 You're going to think if you want to think.
00:04:39.000 Nothing makes you think.
00:04:41.000 A film can engage you.
00:04:43.000 It can resonate with you or not.
00:04:46.000 I have a movie that I throw by people whenever I want to find out whether or not I want to listen to anything they have to say about movies.
00:04:52.000 The Big Lebowski.
00:04:55.000 Yeah, that's one of the greatest movies of all time.
00:04:57.000 Oh, look at you.
00:04:58.000 Okay.
00:04:58.000 That could be slightly better than Scent of a Woman.
00:05:03.000 Oh, boy.
00:05:05.000 That also is one of the greatest scenes between a man and a woman when the fine young lady is painting her toenails.
00:05:13.000 Right.
00:05:14.000 And she's offering him sex for money.
00:05:16.000 Yeah.
00:05:17.000 That's a beautiful moment too.
00:05:19.000 You think so?
00:05:19.000 Yeah.
00:05:20.000 Really?
00:05:21.000 Isn't that that girl that used to be a hot mess?
00:05:24.000 What's her name?
00:05:25.000 Tara Reid.
00:05:25.000 Tara Reid?
00:05:26.000 Yeah.
00:05:27.000 Is she still a hot mess or does she get her shit together?
00:05:30.000 She's been like the Sharknado series is what she's been doing recently.
00:05:33.000 Oh, she's in that?
00:05:34.000 Yeah.
00:05:34.000 Yeah.
00:05:35.000 So I passed the Big Lebowski test, and you failed the Scent of a Woman test.
00:05:39.000 Well, I don't remember it.
00:05:39.000 I gotta wrap this conversation up.
00:05:42.000 I legitimately don't remember it.
00:05:45.000 I mean, I'm sure it's great.
00:05:46.000 I'm sure it's great.
00:05:47.000 You're a wise man.
00:05:48.000 If you like it, I'm sure it's good.
00:05:49.000 And you also recognize that Godfather 3 kind of sucks.
00:05:53.000 Yeah, yeah.
00:05:54.000 But I like the old Pacino.
00:05:56.000 Listen, Godfather is about...
00:05:58.000 Your people, the Italian people, have dominated the mob, the brilliant mob movies, right?
00:06:05.000 I mean, Godfather is about family, right?
00:06:08.000 There's something deeply genuine about that, that in our modern society we really crave for.
00:06:15.000 So it's bigger than the individual, bigger than the rules of society, the government, the man.
00:06:21.000 It's family above all.
00:06:23.000 That's like a...
00:06:25.000 I don't know.
00:06:25.000 That's timeless.
00:06:27.000 I agree with that.
00:06:29.000 The moment with the young Pacino when he talks to his brother, Fredo, says don't ever take sides against the family again, ever.
00:06:38.000 That's one of the greatest moments ever.
00:06:40.000 That's a great moment.
00:06:43.000 Alright, alright, alright.
00:06:45.000 I'm romanticizing movies here.
00:06:48.000 You didn't like John Wick though, huh?
00:06:51.000 Never seen it.
00:06:52.000 Whoa.
00:06:53.000 I've never seen it.
00:06:54.000 It's a good movie to watch on the treadmill.
00:06:59.000 Is he playing a Russian mobster in that?
00:07:01.000 No, he kills a bunch of them, though.
00:07:04.000 And he speaks Russian.
00:07:05.000 He works for the Russians.
00:07:07.000 Kills people for the Russians.
00:07:09.000 Keanu Reeves is one of the greatest human beings ever.
00:07:11.000 You think so?
00:07:12.000 Yeah, he's like the nicest guy.
00:07:13.000 I heard he's a really nice guy.
00:07:15.000 But he plays a badass gangster.
00:07:17.000 I would like him to be a little bit more fit.
00:07:19.000 Work out a little bit more.
00:07:20.000 You see him without a shirt on, you're like, hmm, not quite buying it.
00:07:23.000 But that's okay.
00:07:25.000 Average man.
00:07:26.000 Yeah, but the average man's not the fucking best assassin of all time.
00:07:30.000 With all this martial arts skill.
00:07:33.000 Fedor.
00:07:34.000 Yeah, but Fedor's big.
00:07:36.000 Fedor might have like a gut, but he's a thick motherfucker.
00:07:42.000 Okay, what about...
00:07:42.000 Especially the young Fedor.
00:07:44.000 You ever see young Fedor when he was in his prime?
00:07:46.000 Like back when he fought like Fujita.
00:07:48.000 Like back when...
00:07:49.000 There's a picture of Fedor standing around with a bunch of kettlebells.
00:07:53.000 You ever see that picture?
00:07:54.000 Nope.
00:07:55.000 That was Fedor in his lifting days.
00:07:57.000 I suspect, and this is coming from...
00:08:00.000 That's one when Fedor was fairly young up there.
00:08:03.000 But that's not the one I'm talking about.
00:08:05.000 You know that one with the kettlebells?
00:08:07.000 Is that picture up?
00:08:09.000 See if you find that picture.
00:08:10.000 Never a six-pack in sight.
00:08:11.000 No, no six-pack.
00:08:13.000 But I suspect that Fedor might have been on some performance-enhancing substances during his prime.
00:08:20.000 You mean like hard training, lots of drilling, technique, sort of strategy?
00:08:26.000 Steroids.
00:08:26.000 How dare you, sir?
00:08:27.000 Dude, he was in pride.
00:08:28.000 Everybody was on steroids.
00:08:31.000 Yeah, that's him.
00:08:32.000 Look at him.
00:08:32.000 That's him in his prime.
00:08:34.000 That's a big motherfucker.
00:08:36.000 Now, I do not know if he was on anything, but everybody else was.
00:08:39.000 I mean, literally everybody.
00:08:41.000 They had it in their contract that we will not test for steroids.
00:08:47.000 Enson Inoue told me that they essentially encouraged people to take steroids.
00:08:51.000 Yeah, the pride days.
00:08:53.000 That's right.
00:08:54.000 It's not like Russians don't have a long history of using performance-enhancing substances.
00:08:59.000 I'm sure you saw that movie, Icarus.
00:09:02.000 Did you see it?
00:09:03.000 Yep.
00:09:03.000 Fascinating, right?
00:09:04.000 It's fascinating.
00:09:08.000 Steroids often feel to me like a bit of a witch hunt.
00:09:12.000 Yeah.
00:09:28.000 But, you know, with Fedor, the technique, the execution, the timing, the brilliance of his movement, the heart, the guts.
00:09:38.000 He's phenomenal.
00:09:39.000 If not the greatest heavyweight of all time, he's certainly one of them.
00:09:42.000 And I don't think steroids would help that guy.
00:09:44.000 Yes, they do.
00:09:44.000 They help.
00:09:45.000 That guy in particular?
00:09:46.000 Yeah, they help everything.
00:09:47.000 They help your training, they help your ability to recover, they help your explosive power, they help your speed, they help everything.
00:09:55.000 But they also, it's not just steroids.
00:09:57.000 Like, a lot of them are on EPO. EPO radically enhances your endurance.
00:10:03.000 And they're starting to catch people.
00:10:05.000 They just stripped TJ Dillashaw, UFC bantamweight champion for EPO. It's tragic.
00:10:12.000 Yes, it is tragic.
00:10:13.000 Especially TJ. I mean, he's just a phenomenal fighter.
00:10:17.000 If not, I mean, certainly top 10 pound for pound.
00:10:21.000 And then this is one of those things that comes up and you go, oh.
00:10:26.000 Man, it's a legacy killer.
00:10:28.000 In this world, we have to kind of reconsider what should be allowed or not.
00:10:34.000 Yeah, I agree with that.
00:10:35.000 There is an idea where you should make steroids legal, right?
00:10:40.000 Or not legal, sorry.
00:10:42.000 Allowed or some kind of supplementation.
00:10:45.000 Where's the line when you start to talk about the future of martial arts, the future of sport?
00:10:50.000 If you can control the levels so that they're healthy, I mean, isn't that the reason that they're not allowed is because if abused, they become unhealthy?
00:10:59.000 They damage long-term well-being of the person?
00:11:03.000 Look, if that was the case, we wouldn't allow fighting.
00:11:07.000 Because fighting is more damaging than steroids, for sure.
00:11:09.000 For sure.
00:11:10.000 Getting punched and kicked and fucking kneed in the face and elbowed into unconsciousness, that is way worse for you than steroids.
00:11:17.000 The concern is not for the athlete.
00:11:19.000 The concern is for the opponent.
00:11:21.000 The idea is that you will be able to inflict punishment that you would not ordinarily be able to inflict.
00:11:27.000 You will have more endurance.
00:11:29.000 You will have more power.
00:11:30.000 You will hurt someone, potentially even...
00:11:33.000 Look, there's going to be a time where someone dies in a mixed martial arts event.
00:11:37.000 And if that's someone who was the victor, who did not die, was on steroids, it is going to be a huge national tragedy and a massive disaster for the sport, for everything, if that ever does happen.
00:11:53.000 We can only hope it never does.
00:11:56.000 For sure.
00:11:57.000 It's a very, very dangerous game you're playing.
00:12:02.000 Martial arts is a very dangerous game.
00:12:04.000 And when you are enhancing your body with chemicals that are illegal while you're doing that game.
00:12:10.000 The real question is, though, here's my take on it.
00:12:15.000 It's one of the most human subjects, meaning that it's messy.
00:12:21.000 Humans are messy.
00:12:23.000 There's good and there's bad.
00:12:24.000 Look, abortion is a messy subject.
00:12:27.000 It's messy.
00:12:28.000 Whether you agree with someone's right to have it or not, it is what you're doing, especially as the fetus gets older, it's messy.
00:12:39.000 You know, when it's a complicated discussion, it's not a clear, it's not like you should drink water.
00:12:44.000 You know what I mean?
00:12:45.000 It's a very complicated discussion.
00:12:48.000 Steroids are a very complicated discussion.
00:12:50.000 You're not allowed to do them, but they exist for a reason.
00:12:54.000 The reason why they exist is they're really effective.
00:12:57.000 They're really effective at enhancing your body.
00:12:59.000 But how much of that will we allow?
00:13:01.000 We allow creatine, we allow supplements. There's certain things that can slightly elevate your testosterone, slightly elevate your growth hormone.
00:13:12.000 We allow sauna and ice baths and all these things that have shown to enhance recovery, but that's too much.
00:13:19.000 It's too good.
00:13:19.000 They're too effective.
00:13:21.000 But it's weird.
00:13:22.000 It's weird that this thing that we found that makes you better, you can't use.
00:13:26.000 Yeah, and so I have to go back a little bit and disagree with you on something.
00:13:31.000 So in terms of fighting, being dangerous, and that's if we wanted to forbid things that are dangerous for you, we would forbid fighting.
00:13:38.000 I think the main thing you're doing can be dangerous, right?
00:13:42.000 The main thing that we're talking about, the sport...
00:13:45.000 The combat event.
00:13:47.000 That can be dangerous because that is what we watch.
00:13:50.000 Two people at the height of their skill, ability, heart, passion, putting their life at risk.
00:13:57.000 That can be dangerous.
00:13:58.000 But the supplementation around it, the way to make their training better, more effective, that can't be dangerous.
00:14:07.000 That can't be dangerous?
00:14:09.000 Can't be dangerous.
00:14:10.000 So I thought steroids were considered, were sort of banned because abuses lead to long-term damage to health.
00:14:18.000 Now we see steroids as cheating, but it was banned initially because it has detrimental effects to your health.
00:14:26.000 No, because there's no real evidence that it's detrimental.
00:14:30.000 It's not as detrimental as alcohol when you allow people to drink.
00:14:32.000 But even when abused, where are the bodies?
00:14:36.000 There's a great documentary on it called Bigger, Stronger, Faster.
00:14:41.000 It's by my friend Chris Bell.
00:14:43.000 And when you watch that documentary and you realize, oh, well, the real negative consequences of taking steroids are that it shuts down your endocrine system.
00:14:54.000 So it stops your body's natural production.
00:14:57.000 Of testosterone and growth hormone and hormones.
00:15:00.000 That's the real problem.
00:15:02.000 And for young people, that can be very devastating.
00:15:04.000 And it can lead to depression and suicidal thoughts and all sorts of really bad things when your testosterone shuts down.
00:15:11.000 But as far as like death, boy, I mean, there's...
00:15:17.000 People are prescribed pain pills every day of the week, and fighters that are on injuries that have gotten surgery, they're prescribed pain pills every day of the week, and those pain pills kill people left and right.
00:15:30.000 That's just a fact.
00:15:31.000 People die of those things all the time, much more so than die of steroids.
00:15:36.000 I'm not advocating for the use of steroids.
00:15:39.000 I'm being pretty objective and neutral about this, but I'm just looking at it like it's a very messy subject.
00:15:46.000 Yeah, it's very eloquently put.
00:15:47.000 So your problem in terms of damaging the opponent is if one side takes steroids and the other doesn't.
00:15:54.000 Yes, exactly.
00:15:54.000 What happens if both?
00:15:55.000 The problem is you would require someone to do that.
00:15:58.000 Maybe someone's a holistic person.
00:15:59.000 They don't want to...
00:16:11.000 Yeah, you had C.T. Fletcher here yesterday, right?
00:16:18.000 Yes.
00:16:18.000 He's a natural bodybuilder.
00:16:20.000 Yes.
00:16:20.000 Or not bodybuilder.
00:16:21.000 Power lifter.
00:16:22.000 Power lifter.
00:16:23.000 Yeah.
00:16:23.000 But that's not required, right?
00:16:26.000 You're not requiring people.
00:16:28.000 You're giving them the choice.
00:16:29.000 So, you know, it's an interesting possibility where in moderation you'll be able to allow steroids in future athletics, with the argument that if done in moderation you can actually create healthier athletes.
00:16:43.000 Yeah, that's a real argument for the Tour de France.
00:16:46.000 The Tour de France, they say that you actually are better off and healthier taking steroids and EPO than you are doing it without it because it's so unbelievably grueling on the body.
00:16:57.000 Yeah.
00:16:57.000 I mean, those athletes are basically some of the best people in the world at suffering.
00:17:02.000 Yeah.
00:17:02.000 Long-term suffering.
00:17:03.000 It's incredible.
00:17:04.000 Yeah.
00:17:04.000 Ultra-marathon runners, all those guys.
00:17:07.000 Yeah.
00:17:07.000 It's a different sort of thing.
00:17:12.000 The thing about ultramarathon runners is they don't even test them.
00:17:15.000 Because they're like, good luck.
00:17:17.000 Those people have iron wills.
00:17:19.000 I don't know.
00:17:20.000 Like Courtney Dauwalter is a woman who, you know who she is?
00:17:23.000 Yeah.
00:17:23.000 She's been in here.
00:17:24.000 She eats candy.
00:17:25.000 She drinks beer, eats candy and pizza.
00:17:27.000 That doesn't make sense.
00:17:28.000 Yeah.
00:17:28.000 I mean, she's just got a fucking iron will.
00:17:31.000 Her will is indomitable.
00:17:33.000 And you could take all the steroids you want.
00:17:35.000 When you're running for three days, that chick is going to beat you.
00:17:39.000 Yeah.
00:17:40.000 She just doesn't know how to quit.
00:17:42.000 Yeah.
00:17:42.000 Just has no quit in her.
00:17:43.000 Did you see the podcast with her where she talked about how she fell?
00:17:47.000 She couldn't see.
00:17:48.000 She was experiencing, I think it was intraocular hemorrhaging?
00:17:52.000 Yeah.
00:17:52.000 So her eyeballs were bleeding internally, something like that, where it was impeding her vision.
00:17:58.000 She couldn't see.
00:18:00.000 I would stop.
00:18:01.000 I would stop running.
00:18:02.000 No, she fell because she couldn't see, busted her head open, bleeding all down her face, keeps running.
00:18:09.000 Barely can see her feet as she's running.
00:18:11.000 Keeps running.
00:18:12.000 I'm glad those people are out there.
00:18:14.000 I'm actually a bit like that.
00:18:16.000 I don't know how to quit.
00:18:17.000 Really?
00:18:18.000 Yeah.
00:18:19.000 I do a lot of stuff like that.
00:18:21.000 I ran yesterday.
00:18:23.000 I couldn't sleep.
00:18:24.000 I ran here yesterday, 13 miles.
00:18:26.000 I'm not a runner.
00:18:27.000 Just this weird obsession.
00:18:29.000 You don't run?
00:18:30.000 No, I run, but I'm not a runner.
00:18:32.000 Look at my body.
00:18:33.000 I have a similar body like yours.
00:18:36.000 We do better build for short sprinting and then maybe killing somebody with our hands.
00:18:44.000 Versus long distance.
00:18:46.000 You're a black belt in jiu-jitsu, right?
00:18:47.000 Yeah.
00:18:48.000 Where do you train at?
00:18:49.000 I now train at Broadway Jiu Jitsu in Boston.
00:18:52.000 Nice.
00:18:52.000 And before, I was in Philly, Balance Studios with Phil Migliarese and so on.
00:18:58.000 Right.
00:18:59.000 But Kyle Bochniak actually trains at Broadway.
00:19:03.000 Oh, really?
00:19:03.000 I love that Kyle.
00:19:04.000 Yeah, last time I was on, I actually wanted to talk about the Zabit fight.
00:19:08.000 Because I'm Russian, so I love the Russian way.
00:19:11.000 But I also love the...
00:19:13.000 I mean, Kyle to me represents like the American...
00:19:16.000 He's like the Rocky.
00:19:17.000 If you remember that fight against Zabit?
00:19:20.000 The third round, he was winning.
00:19:21.000 I mean, that's the best of what martial arts is.
00:19:25.000 MMA is, to me, is like you have two technicians that just throw everything away.
00:19:32.000 Like, screw this.
00:19:33.000 I'm just going to throw down.
00:19:34.000 Well, Zabit had broken his hand.
00:19:36.000 He broke his hand somewhere, I think, in the second round.
00:19:38.000 So he was pretty compromised going into that third round.
00:19:41.000 He couldn't really fire back.
00:19:43.000 And Kyle just has zero quit in him.
00:19:45.000 That guy's an animal.
00:19:46.000 Yeah.
00:19:46.000 I mean, that's the most beautiful...
00:19:48.000 You talk about technical fights on the ground or technical striking.
00:19:51.000 When two technicians throw everything away, I'm sorry, but that's what I love the most about any kind of fighting, any kind of sport.
00:20:01.000 I enjoy it in the moment.
00:20:02.000 I discourage it.
00:20:04.000 Heavily.
00:20:04.000 I don't think it's a smart way to fight.
00:20:06.000 But I get it.
00:20:08.000 That's probably your job.
00:20:09.000 Well, it's not just my job.
00:20:10.000 It's what I like.
00:20:11.000 I get the impulse.
00:20:16.000 But I don't want people to give in to the impulse.
00:20:18.000 I think fighting is something that you should do correctly.
00:20:24.000 There's principles that you should follow to fight correctly.
00:20:28.000 It doesn't mean that you shouldn't take chances.
00:20:31.000 But you know there's moments like Ricardo Lamas, when he fought Max Holloway, and they just stood in the center of the ring for the last few seconds of the fight, and Max Holloway pointed down at the ground.
00:20:48.000 He's like, come on, right here, right here.
00:20:49.000 And they just started swinging haymakers.
00:20:52.000 It was amazing.
00:20:53.000 Well, it happened.
00:20:54.000 But if I was in Max's corner, I'd be like, don't!
00:20:57.000 No!
00:20:59.000 Don't do that, man.
00:21:00.000 This macho shit is going to give you fucking brain damage.
00:21:03.000 You're going to get hit with shots you wouldn't get hit with.
00:21:05.000 That's a difficult...
00:21:06.000 Like you said, human nature is messy.
00:21:08.000 I would say that is the greatest...
00:21:10.000 That is the greatest moment of their lives.
00:21:14.000 What?
00:21:14.000 That war.
00:21:15.000 No.
00:21:17.000 Listen.
00:21:18.000 That war is the greatest moment of Max Holloway's life?
00:21:20.000 Max Holloway's the greatest featherweight of all time.
00:21:23.000 Discussion.
00:21:24.000 No, but Max Holloway is the greatest featherweight of all time.
00:21:28.000 He's a guy who destroyed Jose Aldo twice.
00:21:31.000 He's a guy that...
00:21:32.000 He's beaten everybody in front of him at featherweight.
00:21:35.000 The idea that this one moment where they decided to throw out all his skill and technique and just swing for the bleachers in the middle of the octagon.
00:21:44.000 It was a fun moment.
00:21:45.000 It was great to watch.
00:21:46.000 But the idea that that was the greatest moment in his life is ridiculous.
00:21:49.000 You're a crazy person.
00:21:50.000 Yeah, there's moments in sports...
00:21:52.000 They're just magic.
00:21:53.000 Olympics bring that when, like, the thing that you don't think should happen or can't possibly happen or is not wise, where people just throw everything away.
00:22:01.000 Yeah?
00:22:01.000 You like that?
00:22:02.000 Yeah.
00:22:02.000 You're a passion person.
00:22:03.000 Yeah, passion person.
00:22:05.000 Yeah, for sure.
00:22:06.000 That's an interesting thing for someone who studies artificial intelligence.
00:22:10.000 I mean, if anybody listened to this podcast, they'd be like, what the fuck does this guy do?
00:22:13.000 He dresses like Reservoir Dogs.
00:22:15.000 They talk about movies.
00:22:16.000 Yeah.
00:22:17.000 So many people angry right now.
00:22:19.000 Talk about autonomous vehicles.
00:22:20.000 We have plenty of time, sir.
00:22:22.000 We have plenty of time.
00:22:23.000 But that's the beautiful thing about this podcast.
00:22:25.000 We're just talking.
00:22:27.000 So tell me what you got here with your notes, man.
00:22:29.000 I mean, you are fucking prepared.
00:22:32.000 A lot of shit here.
00:22:33.000 Many, many pages.
00:22:34.000 For sure.
00:22:35.000 I don't want to miss stuff.
00:22:36.000 I mean, there's been a lot of exciting stuff on the autonomous vehicle space.
00:22:41.000 Since you came on, I got a Tesla.
00:22:43.000 And I've experienced what that thing is like when I put it on autopilot.
00:22:47.000 And it's stunning.
00:22:48.000 It's crazy.
00:22:49.000 I mean, it's the future.
00:22:51.000 In terms of the performance of the vehicle.
00:22:51.000 It's amazing.
00:22:52.000 Well, in terms of its ability to change lanes and its ability to drive without you doing anything, I just put my hand on the wheel and hold it there, and it does all the work.
00:23:02.000 So, because, like, one or two people listen to this podcast...
00:23:05.000 I want to take this opportunity and tell people, if you drive a Tesla, whether you listen to this now or a year from now, two years from now, Tesla or any other car, keep your damn eyes on the road.
00:23:18.000 So, whatever you think the system is able to do, you will have to still monitor the road.
00:23:24.000 Yes.
00:23:24.000 And you will still have to take over when it fails.
00:23:28.000 If?
00:23:29.000 When.
00:23:30.000 Really?
00:23:32.000 So...
00:23:36.000 This is like the moment we're throwing down, right?
00:23:38.000 No, I think...
00:23:40.000 No, this is your level of expertise, obviously.
00:23:42.000 I mean, I'm not throwing down with you on this.
00:23:44.000 No, I think it's really important in this transitionary phase, whatever the car company, whatever the system, that we don't overtrust the system.
00:23:54.000 We don't become complacent.
00:23:55.000 We don't think it can do more than it can.
00:23:57.000 Currently, 40,000 people die in the United States from fatal crashes.
00:24:02.000 The number one reason for that is distraction.
00:24:05.000 So, texting, smartphones.
00:24:06.000 How much has it gone up since smartphones?
00:24:09.000 People don't exactly...
00:24:11.000 They're trying to understand that.
00:24:12.000 There's a lot of studies showing that it's significant increases, but it's hard to say it's because of smartphones.
00:24:18.000 But it's almost obvious.
00:24:20.000 Yeah, it's pretty obvious.
00:24:21.000 The flip side is, even though everybody's now using a smartphone, texting, and so on, they've become better at using the smartphone.
00:24:29.000 So, they're better at texting and driving.
00:24:32.000 Ugh.
00:24:33.000 They're better at balancing that.
00:24:34.000 Now, this is a horrible thing to do.
00:24:36.000 So if you're listening to this podcast, you should listen to it in your car and keep your eyes on the road and not text.
00:24:44.000 The worst was Pokemon.
00:24:46.000 When Pokemon was in its prime, I was watching a guy on the highway playing Pokemon as he was driving.
00:24:53.000 No more than one person.
00:24:55.000 Two people.
00:24:55.000 A guy and I saw a girl do it once too.
00:24:58.000 Holding the phone on the steering wheel playing Pokemon.
00:25:01.000 Yeah.
00:25:02.000 Yeah, it's incredible.
00:25:03.000 What are you doing, Jamie?
00:25:05.000 This grandpa has...
00:25:07.000 Oh, shit.
00:25:07.000 Sorry.
00:25:10.000 I'm confused.
00:25:11.000 What is this?
00:25:12.000 This grandpa in Japan, he drives around on a bike with 15 phones playing Pokemon all at the same exact time.
00:25:18.000 Look at this.
00:25:18.000 This guy has 15 phones.
00:25:21.000 It's ridiculous.
00:25:22.000 This guy needs to find hookers.
00:25:23.000 There's people that do this also in their car with maybe four or five doing exactly what you're saying.
00:25:27.000 This man needs a better hobby.
00:25:29.000 This is preposterous.
00:25:30.000 Look at his...
00:25:32.000 He can't see what the fuck's going on in front of him.
00:25:35.000 He spends about $300 a month to buy virtual currencies in the game.
00:25:41.000 Wow, that guy's bored.
00:25:43.000 Or an innovative genius.
00:25:45.000 You think?
00:25:46.000 Depending on the perspective.
00:25:47.000 Well, no.
00:25:48.000 People misuse their innovative...
00:25:50.000 How is he innovative?
00:25:52.000 He's just playing a stupid game while he's driving around on his bike like an asshole.
00:25:55.000 Well, he's doing...
00:25:56.000 This is back to the Scent of a Woman thing.
00:26:00.000 It's passion.
00:26:01.000 It's the most amazing moment of his life.
00:26:02.000 Driving around playing Pokemon.
00:26:03.000 I'm sure most people are on my side.
00:26:05.000 Scent of a Woman versus...
00:26:06.000 Oh, you're crazy.
00:26:07.000 You think most people are on your side?
00:26:09.000 They think Scent of a Woman's the greatest movie of all time?
00:26:11.000 Not Scent of a Woman, but I was defending Godfather, Scent of a Woman...
00:26:13.000 Well, you weren't defending Godfather against me.
00:26:15.000 I'm going to throw you under the bus.
00:26:17.000 I'm a fan.
00:26:17.000 Okay.
00:26:19.000 I'm going to manipulate this conversation.
00:26:21.000 Jamie, can you edit this in post?
00:26:23.000 Did you see the video that just came out yesterday of a Tesla on autopilot avoiding a crash?
00:26:29.000 Alright, so yeah, I have and there's a lot of examples like that.
00:26:32.000 There's quite a few of those.
00:26:34.000 Of course, it's hard to prove exactly what happened and whether Autopilot was involved.
00:26:38.000 Just like on the flip side, it's hard to prove that Autopilot was involved in the dangerous stuff.
00:26:42.000 But I think, by any measure, the media is really negative in terms of their reporting on Tesla.
00:26:50.000 Do you think so?
00:26:51.000 I think you've talked about this before.
00:26:54.000 In general, negativity gets more clicks.
00:26:56.000 Right, right.
00:26:57.000 And I think Tesla, negative stuff on Tesla gets a lot of clicks.
00:27:03.000 Well, not Tesla.
00:27:05.000 Let me speak more broadly about autonomous vehicles.
00:27:07.000 If there's any fatality, any crash, it's over-represented.
00:27:11.000 It's over-reported on.
00:27:12.000 To me, people who are interested in AI helping save lives in these systems like Autopilot... Yes.
00:27:39.000 I agree.
00:27:45.000 Listen to some music, some classic rock.
00:27:47.000 Classic rock?
00:27:48.000 Is that what you're into?
00:27:49.000 Yeah.
00:27:49.000 Like Creedence?
00:27:51.000 No.
00:27:52.000 No?
00:27:52.000 Just say no?
00:27:53.000 Like you don't like it?
00:27:54.000 Like you would change the channel?
00:27:55.000 No, you're going to put me in the Scent of a Woman.
00:27:57.000 Of course.
00:27:58.000 Like Fortunate Son comes on, you don't get excited?
00:28:00.000 There you go.
00:28:01.000 I take my shirt off.
00:28:04.000 Start drinking.
00:28:06.000 No, Led Zeppelin, Lynyrd Skynyrd, Hendrix.
00:28:10.000 Of course, Hendrix.
00:28:11.000 I have to admit something.
00:28:12.000 I thought about messaging you a couple times.
00:28:15.000 I wanted to...
00:28:16.000 I play guitar.
00:28:18.000 Do you?
00:28:19.000 Yeah.
00:28:19.000 You good?
00:28:21.000 You can't...
00:28:22.000 Are you good at jiu-jitsu?
00:28:24.000 Yeah, I'm good.
00:28:25.000 Okay.
00:28:25.000 I'm a black belt.
00:28:26.000 I'm pretty good.
00:28:27.000 You're a black belt, too.
00:28:28.000 I'm sure you're good.
00:28:29.000 Yeah.
00:28:29.000 I mean, I'm not world class.
00:28:30.000 A lot of dudes fuck me up.
00:28:32.000 I'm a three-stripe purple belt in guitar.
00:28:34.000 That's a good way of putting it.
00:28:35.000 Yeah.
00:28:36.000 Yeah, I say that about hunting.
00:28:37.000 I'm like a blue belt in hunting.
00:28:38.000 Yeah, yeah.
00:28:40.000 Yeah, I've been doing it.
00:28:41.000 Like, I got the purple belt by doing it a long time, as opposed to being amazing.
00:28:46.000 Did you take lessons?
00:28:48.000 No, I learned everything myself.
00:28:49.000 I have a couple of videos online, me playing Comfortably Numb.
00:28:52.000 Did you learn from watching videos, or did you learn from books?
00:28:57.000 Like, how did you learn?
00:28:57.000 Let me see this.
00:28:58.000 Give me this.
00:28:59.000 Look at you.
00:29:02.000 Well, this is going to get us booted off of YouTube.
00:29:05.000 No, this is me playing.
00:29:06.000 No, no, no.
00:29:07.000 It probably won't pick it up.
00:29:10.000 And if it picks it up, it'll be to my channel.
00:29:12.000 You sure?
00:29:13.000 Yeah.
00:29:14.000 It's me playing.
00:29:15.000 I know, but it's so good.
00:29:17.000 They've blocked people from humming songs recently.
00:29:20.000 So this is on YouTube.
00:29:22.000 No, I didn't know that.
00:29:23.000 But this didn't get blocked.
00:29:24.000 If you were humming a song, and then someone made a claim on that song, it would block our YouTube.
00:29:30.000 Like, literally, we could get demonetized.
00:29:33.000 We lose our streaming ability.
00:29:35.000 Lots of things can happen.
00:29:36.000 Lots of things.
00:29:37.000 It's fucked up, man.
00:29:39.000 We've gotten flagged for watching something on the screen, picture in picture, no sound, commenting on it.
00:29:47.000 And we get flagged.
00:29:48.000 And they want all the advertising revenue from a three-hour show for five, ten seconds of a video.
00:29:54.000 It's a slightly broken system.
00:29:55.000 Ooh, it's broken.
00:29:56.000 But there's a lot of scam artists, too.
00:29:58.000 So I played another song, Black Betty.
00:30:01.000 Oh, yeah?
00:30:02.000 I played the damn song, but they said it was...
00:30:08.000 They did exactly that.
00:30:10.000 Oh, they said it was...
00:30:11.000 Ram Jam or whatever said it was there.
00:30:14.000 And I may have borrowed the beat behind it from them.
00:30:17.000 I'm not sure.
00:30:18.000 I just took a beat like...
00:30:20.000 Well, that's what I was thinking about that song.
00:30:22.000 It sounded like there was other shit going on besides just your guitar.
00:30:27.000 Oh, no, that's all me.
00:30:29.000 Really?
00:30:29.000 That's all me at the back.
00:30:30.000 Let me hear that again.
00:30:31.000 That's really good, man.
00:30:32.000 You sound good.
00:30:33.000 That's a great fucking song, too.
00:30:35.000 Comfortably numb.
00:30:35.000 You know the scariest thing for me?
00:30:37.000 What?
00:30:37.000 Was to play guitar on this podcast.
00:30:40.000 So it's like going back and forth.
00:30:41.000 Oh, really?
00:30:42.000 Should I do it?
00:30:42.000 Should I not do it?
00:30:43.000 Actually play?
00:30:44.000 Play, play?
00:30:45.000 There's only a few people that have ever played play.
00:30:47.000 Everlast.
00:30:48.000 Ben and Suzanne from Honey Honey.
00:30:51.000 Gary Clark didn't, right?
00:30:52.000 He just came on and talked.
00:30:53.000 He brought his guitar.
00:30:55.000 I wanted to play Hendrix here.
00:30:56.000 Really?
00:30:57.000 Yeah.
00:30:58.000 Live?
00:30:58.000 Live.
00:30:59.000 You got it with you right now?
00:31:01.000 Guitar?
00:31:02.000 Yeah.
00:31:02.000 Well, no.
00:31:03.000 I meant...
00:31:03.000 I was...
00:31:04.000 Okay, sure.
00:31:05.000 In the future.
00:31:06.000 I'm not promising.
00:31:07.000 I'm scared.
00:31:07.000 Will you wear a Hendrix wig?
00:31:09.000 With a bandana?
00:31:11.000 Is that racially insensitive, though?
00:31:13.000 No!
00:31:13.000 You're allowed to.
00:31:15.000 Joe, I will not take advice on you.
00:31:17.000 As long as you don't wear blackface, you're allowed.
00:31:19.000 Okay.
00:31:19.000 The hair is just hair.
00:31:22.000 You can't wear dreadlocks, though.
00:31:24.000 Yeah, so there's rules.
00:31:25.000 But I think you...
00:31:26.000 Yeah, there's rules.
00:31:30.000 Hendrix is above all rules, though, right?
00:31:33.000 Well, he's the goat, you know, of guitar players.
00:31:37.000 That's the goat.
00:31:38.000 One of them.
00:31:39.000 You know, the reason why this show is called...
00:31:42.000 The experience, yeah.
00:31:42.000 Yeah, I stole it from Jimi Hendrix.
00:31:44.000 Oh, yeah.
00:31:44.000 Yeah.
00:31:45.000 What's the matter?
00:31:46.000 I don't remember if we brought this up last time, but I just remembered seeing this video where you're playing guitar while you were driving.
00:31:51.000 Yep.
00:31:52.000 Well, you shouldn't do that, dude.
00:31:54.000 There's a reason why he was doing it.
00:31:56.000 Why are you doing that?
00:31:56.000 It's on a test track.
00:31:58.000 Oh, what kind of car is that?
00:32:00.000 Looks like a Lincoln.
00:32:00.000 Lincoln MKZ, that's right.
00:32:02.000 Oh, they do that?
00:32:02.000 The Lincolns do that?
00:32:03.000 No, we converted it and that's our code controlling the car.
00:32:08.000 Wow!
00:32:09.000 That is crazy!
00:32:12.000 So you converted this car to drive autonomously?
00:32:15.000 Autonomously, yeah.
00:32:16.000 Wow.
00:32:17.000 And what exactly do you have to do to a car to change?
00:32:23.000 Because that car does not have the capacity to do anything like that.
00:32:29.000 Right?
00:32:30.000 Am I correct?
00:32:31.000 No, no, no.
00:32:31.000 Absolutely not.
00:32:32.000 But you are absolutely correct.
00:32:36.000 There's the first part is being able to control the car with a computer, which is converting it to be drive-by-wire.
00:32:41.000 So you can control the steering and the brake and the acceleration to basically be able to control with a joystick.
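For readers curious what that drive-by-wire conversion looks like on the computer side, here is a minimal, hypothetical sketch of a command interface for steering, throttle, and brake. The class, value ranges, and print-out are illustrative assumptions, not the actual code used on the converted Lincoln.

```python
# Hypothetical sketch of a drive-by-wire command interface (not the actual MKZ code).
from dataclasses import dataclass

@dataclass
class ControlCommand:
    steering_angle_deg: float   # + left, - right (assumed convention)
    throttle: float             # 0.0 .. 1.0
    brake: float                # 0.0 .. 1.0

class DriveByWireCar:
    """Sends computer-generated commands to the actuators, e.g. over a CAN bus."""

    def send(self, cmd: ControlCommand) -> None:
        # Clamp to safe ranges before handing off to the actuators.
        steering = max(-30.0, min(30.0, cmd.steering_angle_deg))
        throttle = max(0.0, min(1.0, cmd.throttle))
        brake = max(0.0, min(1.0, cmd.brake))
        # In a real system this would be encoded into CAN frames; here we just print.
        print(f"steer={steering:+.1f} deg, throttle={throttle:.2f}, brake={brake:.2f}")

# A "joystick"-style use: gentle left steer while coasting.
DriveByWireCar().send(ControlCommand(steering_angle_deg=5.0, throttle=0.1, brake=0.0))
```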
00:32:47.000 And then you have to put laser sensors all around the car?
00:32:50.000 Is that what you're doing?
00:32:50.000 Any kind of sensor and software.
00:32:53.000 What's the best kind of sensor?
00:32:54.000 Is it optical, laser?
00:32:56.000 A lot of debate on this.
00:32:57.000 And this is the big, this is the throw down between Elon Musk and everybody else.
00:33:01.000 So Elon Musk says the best sensor is camera.
00:33:04.000 Everybody else, well, everybody else says that at this time LIDAR, which are these lasers, is the best sensor.
00:33:11.000 So I'm more on the side, in this case on camera, on Elon Musk.
00:33:17.000 So here's the difference.
00:33:19.000 Lasers are more precise.
00:33:20.000 They work better in poor lighting conditions.
00:33:23.000 They're more reliable.
00:33:24.000 You can actually build safe systems today that use LiDAR.
00:33:29.000 The problem is that they don't have very much information.
00:33:33.000 So we use our eyes to drive.
00:33:35.000 And cameras, the same thing.
00:33:38.000 And they have just a lot more information.
00:33:40.000 So if you're going to build artificial intelligence systems, so machine learning systems that learn from huge amounts of data, camera is the way to go.
00:33:48.000 Because you can learn so much more.
00:33:49.000 You can see so much more.
00:33:51.000 So the richer, deeper sensor is camera.
00:33:54.000 But it's much harder.
00:33:57.000 You have to collect a huge amount of data.
00:33:58.000 It's a little bit more futuristic, so it's a longer-term solution.
00:34:02.000 So today, to build a safe vehicle, you have to go LiDAR.
00:34:05.000 Tomorrow, however you define tomorrow, Elon says it's in a year.
00:34:10.000 Others say it's 5, 10, 20 years.
00:34:12.000 Camera is the way to go.
00:34:13.000 So that's the hard debate.
00:34:16.000 And there's a lot of other debates, but that's one of the core ones.
00:34:19.000 It's basically, for camera, if you go camera like you do in the Tesla, there's seven cameras in your Tesla, three looking forward, there's all around, so on, one looking inside.
00:34:30.000 No, you have the Model S? Yeah.
00:34:32.000 Yeah, so that one doesn't have a camera that's looking inside.
00:34:34.000 So it's all cameras plus radar and ultrasonic sensors.
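To make the "richer, deeper sensor" argument concrete, a rough back-of-the-envelope comparison of raw data per sensor reading follows. The camera resolution and LiDAR point count are illustrative assumptions, not specs quoted in the episode.

```python
# Rough, illustrative comparison of raw data per sensor reading (assumed figures).
camera_width, camera_height, channels = 1280, 960, 3     # one forward camera frame
camera_values = camera_width * camera_height * channels  # ~3.7 million values

lidar_points = 100_000            # one spinning-LiDAR sweep, order of magnitude
lidar_values = lidar_points * 4   # x, y, z, intensity per point -> ~400k values

print(f"camera frame : {camera_values:,} values")
print(f"lidar sweep  : {lidar_values:,} values")
print(f"camera carries roughly {camera_values / lidar_values:.0f}x more raw values per reading")
```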
00:34:40.000 That approach requires collecting huge amounts of data, and they're doing that.
00:34:44.000 They drove now about 1.3 billion miles under Autopilot.
00:34:50.000 Jesus.
00:34:51.000 Yeah, it's a very large amount of data.
00:34:54.000 So you're talking about over 500,000 vehicles have Autopilot.
00:35:00.000 450,000, I think, have the new version of Autopilot, Autopilot 2, which is the one.
00:35:06.000 You're driving, and all of that is data.
00:35:08.000 So all of those, all the edge cases, what they call them, all the difficult situations that occur, is feeding the machine learning system to become better and better and better.
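A highly simplified sketch of the fleet-learning loop being described: cars log the difficult edge cases, those get labeled and fed back into training, and an improved model is pushed out over the air. Every name and structure below is a hypothetical stand-in, not Tesla's actual pipeline.

```python
# Hypothetical sketch of the fleet-learning loop described above (not Tesla's pipeline).

def collect_edge_cases(fleet_logs):
    """Keep only the difficult situations: disengagements, near-misses, odd scenes."""
    return [clip for clip in fleet_logs if clip["disengagement"] or clip["hard_scene"]]

def retrain(model_params, labeled_clips):
    """Stand-in for a training run on the newly mined edge cases."""
    return {"version": model_params["version"] + 1, "trained_on": len(labeled_clips)}

fleet_logs = [
    {"disengagement": True,  "hard_scene": False},
    {"disengagement": False, "hard_scene": True},
    {"disengagement": False, "hard_scene": False},   # easy mile, not worth labeling
]

model = {"version": 1, "trained_on": 0}
edge_cases = collect_edge_cases(fleet_logs)
model = retrain(model, edge_cases)          # label and train on the hard cases
print(model)                                # {'version': 2, 'trained_on': 2}
# The updated model is then pushed back to the fleet, and the loop repeats.
```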
00:35:19.000 And the open question is, How much better does it need to get to the human level of performance?
00:35:26.000 One of the big assumptions of us human beings is that we think that driving is actually pretty easy, and we think that humans suck at driving.
00:35:38.000 Those two assumptions.
00:35:39.000 You think like driving, you know, you stay in the lane, you stop at the stop sign, it's pretty easy to automate.
00:35:45.000 And then the other one is you think like humans are terrible drivers, and so it'll be easy to build a machine that outperforms humans at driving.
00:35:54.000 Now there's, that's, I think, there's a lot of flaws behind that intuition.
00:35:54.000 We take for granted how hard it is to look at the scene, like everything you just did, picked up, moved around some objects. It's really difficult to build an artificial intelligence system that does that.
00:36:11.000 To be able to perceive and understand the scene enough to understand the physics of the scene, like all these objects, like how to pick them up, the texture of those objects, the weight, to understand glasses folded and unfolded, open water bottle, all those things is common sense knowledge that we take for granted.
00:36:29.000 We think it's trivial.
00:36:31.000 But there is no artificial system in the world today, nor will there be for perhaps quite a while that can do that kind of common sense reasoning about the physical world.
00:36:43.000 Add to that pedestrians.
00:36:46.000 So add some crazy people in this room right now to the whole scene.
00:36:50.000 Right, and being able to notice, like, this guy's an asshole.
00:36:52.000 Look at him.
00:36:53.000 What is he doing?
00:36:53.000 What is he doing?
00:36:54.000 Get off that skateboard.
00:36:55.000 Ah, Jesus, he's in traffic!
00:36:56.000 And, considering not that he's an asshole, he's a respectable skateboarder, that in order to make...
00:37:09.000 It's not just you have to perceive the world.
00:37:12.000 You have to assert your presence in this world.
00:37:17.000 You have to take risks.
00:37:19.000 So in order to make the skateboarder not cross the street, you have to perhaps accelerate if you have the right of way.
00:37:24.000 And there's a game theoretic, a game of chicken to get right.
00:37:28.000 I mean, we don't even know how to approach that as an artificial intelligence research community and also as a society.
00:37:37.000 Do we want an autonomous vehicle that speeds up?
00:37:40.000 In order to make a pedestrian not cross the street, which is what we do all the time.
00:37:46.000 We have to assert our presence.
00:37:49.000 If there's a person who doesn't have the right of way who begins crossing, we're going to either maintain speed or speed up potentially if we want them to not cross.
00:37:59.000 So that game there, to get that right.
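The "game of chicken" with a pedestrian can be illustrated with a deliberately crude decision rule: always yield once the pedestrian has committed, assert presence by holding or nudging speed when the car has the right of way. This is a toy illustration of the dilemma, not a proposal for real autonomous-vehicle behavior.

```python
# Toy illustration of the "assert your presence" dilemma (not a real AV policy).

def negotiate_crossing(ego_has_right_of_way: bool, pedestrian_committed: bool,
                       ego_speed: float) -> float:
    """Return a target speed for the next moment."""
    if pedestrian_committed:
        return 0.0                      # they're in the road: always yield, brake
    if ego_has_right_of_way:
        return ego_speed * 1.05         # hold/nudge speed to signal "I'm not stopping"
    return ego_speed * 0.5              # no right of way: slow down and let them go

print(negotiate_crossing(True,  False, 10.0))   # 10.5 -> asserts presence
print(negotiate_crossing(True,  True,  10.0))   # 0.0  -> safety always wins
```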
00:38:02.000 That's a dangerous game for a robot.
00:38:04.000 It's for a robot.
00:38:04.000 And for us to be rationally, if that, God forbid, leads to a fatality, for us as a society to rationally reason about that and think about that.
00:38:16.000 I mean, a fatality like that could basically bankrupt a company.
00:38:19.000 There's a lawsuit going on right now about an accident in Northern California with Tesla.
00:38:28.000 Are you aware about that one?
00:38:30.000 What was the circumstances about that one?
00:38:33.000 So there was, I believe, in Mountain View, a fatality in a Tesla, where it...
00:38:39.000 This is a common problem for all lane-keeping systems, like Tesla Autopilot, is there was a divider in the highway, and basically the car was driving along the lane, and then the car in front moved to an adjacent lane,
00:38:55.000 and this divider appeared.
00:38:57.000 So you have to now steer to the right, and the car didn't, and it went straight into the divider.
00:39:02.000 Oh, wow.
00:39:06.000 Basically, what that boils down to is the car drifted out of lane, right?
00:39:10.000 Or didn't adjust properly to the lane.
00:39:13.000 Those kinds of things happen.
00:39:15.000 And this is because the person was allowing the autopilot to do everything?
00:39:20.000 Nope.
00:39:21.000 You can't.
00:39:22.000 So we have to be extremely careful here.
00:39:24.000 I don't know the really deep details of the case.
00:39:26.000 I'm not sure exactly how many people do.
00:39:29.000 So there's a judgment on what the person was doing, and then there's an analysis of what the system did.
00:39:34.000 The system drifted out of lane.
00:39:37.000 The question is, was the person paying attention?
00:39:40.000 And was there enough time given for the person to take over?
00:39:44.000 And if they were paying attention, to catch the vehicle, steer back onto the road?
00:39:48.000 As far as I believe...
00:39:52.000 The only information they have is hands on steering wheel and they were saying that like half the minute leading up to the crash, the hands weren't on the steering wheel or something like that.
00:40:04.000 Basically trying to infer whether the person was paying attention or not.
00:40:07.000 But we don't have the information exactly where were their eyes.
00:40:11.000 You can only make guesses as far as I know, again.
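A minimal sketch of the kind of inference being described, checking whether any steering-wheel torque was detected in the window before an event. The log format, field names, and threshold are assumptions for illustration only, not the actual crash-log schema.

```python
# Illustrative check of "hands on wheel in the last N seconds" (assumed log format).

def hands_on_wheel_recently(torque_samples, window_s: int = 30,
                            torque_threshold: float = 0.1) -> bool:
    """torque_samples: list of (seconds_before_event, measured_torque_Nm)."""
    recent = [t for age, t in torque_samples if age <= window_s]
    return any(abs(t) > torque_threshold for t in recent)

log = [(45, 0.4), (32, 0.3), (20, 0.0), (5, 0.0)]   # no torque in the last 30 s
print(hands_on_wheel_recently(log))                  # False -> flagged as possibly inattentive
```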
00:40:15.000 So the question is, this is the eyes on the road thing, because I think I've heard you on a podcast saying you're tempted to sort of look off the road at your new Tesla, or at least become a little bit complacent.
00:40:28.000 Yeah, the worry is that you just rely on the thing, that you would relax too much.
00:40:34.000 But what would that relaxation lead to?
00:40:36.000 The problem is if something happened.
00:40:38.000 If you weren't, you know, when you're driving, I mean, we've discussed this many times on the podcast that the reason why people have road rage, one of the reasons, is because you're in a heightened state, because cars are flying around you and your brain is prepared to make split-second decisions and moves.
00:40:56.000 And the worry is that you would relax that because you're so comfortable with that thing driving.
00:41:02.000 Everybody that I know that's tried that, they say you get really used to it doing that.
00:41:06.000 You get really used to it just driving around for you.
00:41:08.000 So the question is what happens when you get used to it?
00:41:12.000 Do you start looking off-road?
00:41:13.000 Do you start texting more?
00:41:14.000 Do you start watching a movie, etc.?
00:41:16.000 That's really an open question.
00:41:21.000 For example, we just published a study from MIT on what people in our dataset actually do.
00:41:30.000 We collected this dataset of 300,000 miles and we instrumented all these Teslas and watched what people are actually doing.
00:41:37.000 And are they paying attention when they disengage the system?
00:41:41.000 So there's a really important moment here, we have 18,000 of those, when the person catches the car, you know, they disengage autopilot.
00:41:50.000 And that's a really, Tesla uses this moment as well, that's a really important window into difficult cases.
00:41:57.000 So some percentage of those, some small percentage, it's about 10%, are what we call tricky situations.
00:42:04.000 Situations where you have to immediately respond, like drifting out of lane, if there's a stopped car in front, so on.
00:42:10.000 The question is, are people paying attention during those moments?
00:42:13.000 So in our data set, they were paying attention.
00:42:16.000 They were still remaining vigilant.
00:42:18.000 Now, in our dataset, the autopilot was, quote-unquote, encountering tricky situations every 9.2 miles.
00:42:27.000 So you could say it was failing every 9.2 miles.
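For anyone wondering how a figure like "a tricky situation every 9.2 miles" gets computed, it is just total Autopilot miles divided by the count of tricky disengagements. The inputs below are placeholders chosen for illustration, not the MIT study's actual counts.

```python
# Illustrative derivation of a miles-per-tricky-situation figure (placeholder inputs).

def miles_per_tricky_situation(autopilot_miles: float, disengagements: int,
                               tricky_fraction: float) -> float:
    tricky_count = disengagements * tricky_fraction
    return autopilot_miles / tricky_count

# Placeholder numbers, not the study's:
print(miles_per_tricky_situation(autopilot_miles=92_000,
                                 disengagements=100_000,
                                 tricky_fraction=0.10))   # 9.2
```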
00:42:33.000 That is one of the reasons we believe that people are still remaining vigilant.
00:42:38.000 That it's regularly and unpredictably sort of drifting out of lane or misbehaving.
00:42:45.000 So you don't overtrust it.
00:42:47.000 You don't become too complacent.
00:42:49.000 The open question is when it becomes better and better and better and better, will you start becoming complacent?
00:42:55.000 When it drives on the highway for an hour, an hour and a half, as opposed to 9.2 miles, make that 50 miles, 60 miles.
00:43:03.000 Do you start to overtrust it?
00:43:05.000 And that's a really open question.
00:43:07.000 Do you think or do you anticipate a time anywhere in the near future where you won't have to correct?
00:43:14.000 You will allow the car to do it because the car will be perfect?
00:43:18.000 The car, first of all, will never be perfect.
00:43:21.000 No car will ever be perfect.
00:43:22.000 Autonomous vehicles will always, you think, require at least some sort of manual override?
00:43:28.000 Yeah.
00:43:29.000 Really?
00:43:30.000 That's interesting that you're saying that because you work in AI. What makes you think that that's impossible to achieve?
00:43:38.000 Well, let's talk, because you're using the word perfection.
00:43:42.000 Okay, that's a bad word.
00:43:43.000 Let me see, will it achieve, because people are obviously not perfect.
00:43:49.000 Will it achieve a state of competence that exceeds the human being?
00:43:56.000 And let's put it in a dark way, competence measured by fatal crashes.
00:44:03.000 Yes.
00:44:04.000 Yes, I absolutely believe so.
00:44:06.000 And perhaps in the near term.
00:44:08.000 Near term?
00:44:09.000 Like five years?
00:44:10.000 Yeah.
00:44:11.000 For me, five, ten years is near term.
00:44:14.000 For Elon, in Elon Musk time, that's converted to one year.
00:44:19.000 Have you met him?
00:44:20.000 Yes.
00:44:21.000 Interviewed him recently.
00:44:22.000 Fascinating cat, right?
00:44:24.000 Yeah.
00:44:24.000 Yep.
00:44:25.000 Got a lot of weird shit bouncing around behind those eyeballs.
00:44:28.000 You don't realize until you talk to him in person, you're like, oh, you got a lot going on in there, man.
00:44:33.000 Yeah, there's passion, there's drive.
00:44:35.000 I mean, it's one of the...
00:44:37.000 It's a hurricane of ideas.
00:44:38.000 Yeah.
00:44:39.000 And focus and confidence.
00:44:43.000 Mm-hmm.
00:44:44.000 I mean, the thing is, in a lot of the things he does, which I admire greatly from any man or woman innovator, it's just boldly, fearlessly pursuing new ideas or jumping off the cliff and learning to fly on the way down.
00:44:59.000 Mm-hmm.
00:45:01.000 I mean, no matter what happens, he'll be remembered as one of the great innovators of our time.
00:45:06.000 Whatever you say, maybe, in my book, Steve Jobs was as well.
00:45:11.000 Even if you criticize him, say perhaps he hasn't contributed significantly to the technological development of the company or the different ideas they pursued.
00:45:18.000 Still, his brilliance was in all the products: the iPhone, the personal computer, the Mac, and so on.
00:45:25.000 I think the same is true with Elon.
00:45:29.000 And yes, in this space of autonomous vehicles, of semi-autonomous vehicles, of driver assistance systems, it's a pretty tense space to operate in.
00:45:41.000 There's several communities in there that are very responsible but also aggressive in their criticism.
00:45:48.000 So in driving in the automotive sector, obviously, since Henry Ford and before, there's been a culture of safety, of just great engineering.
00:45:58.000 These are like some of the best engineers in the world in terms of large-scale production.
00:46:03.000 You talk about Toyota, you talk about Ford, GM. These people know how to do safety well.
00:46:08.000 And so here comes Elon with Silicon Valley ideals that throws a lot of it out the window and says we're going to revolutionize the way we do automation in general.
00:46:20.000 We're going to make software updates to the car once a week, twice a week, over the air, just like that.
00:46:27.000 That makes people, and the safety engineers and human factors engineers, really uncomfortable.
00:46:33.000 Like, what do you mean you're going to keep updating the software of the car?
00:46:37.000 How are you testing it?
00:46:39.000 That makes people really uncomfortable.
00:46:41.000 Why does it make them uncomfortable?
00:46:43.000 Because the way in the automotive sector you test the system, you come up with the design of the car, every component, and then you go through really rigorous testing before it ever hits the road.
00:46:54.000 The idea from the Tesla side is that they basically test the software in shadow mode, but then they just release it.
00:47:03.000 So essentially the drivers become the testing.
00:47:07.000 And then they regularly update it to adjust if any issues arise.
00:47:13.000 That makes people uncomfortable because there's not a standardized testing procedure, there's at least not a feeling of rigor in the industry, because the reality is we don't know how to test software with the same kind of rigor that we've tested automotive systems in the past.
00:47:32.000 So I think it's extremely exciting and powerful to make software sort of approach automotive engineering with at least in part a software engineering perspective.
00:47:47.000 So just doing what's made Silicon Valley successful.
00:47:50.000 So updating regularly, aggressively innovating on the software side.
00:47:54.000 So your Tesla over the air, while we're sitting here, could get a totally new update.
00:47:59.000 With a flip of a bit, as Elon Musk says, it can gain all new capabilities.
00:48:07.000 That's really exciting, but that's also dangerous.
00:48:11.000 And that balance, we...
00:48:13.000 Well, what's dangerous about it?
00:48:15.000 That it'd be faulty software?
00:48:16.000 Faulty.
00:48:17.000 A bug.
00:48:17.000 So, the apps on your phone fail all the time.
00:48:23.000 We're, as a society, used to software failing, and we just kind of reboot the device or restart the app.
00:48:29.000 The most complex software systems in the world today, if we think outside of nuclear engineering and so on, are too complex to really thoroughly test.
00:48:42.000 So thorough, complete testing, proving that the software is safe is nearly impossible on most software systems.
00:48:51.000 That's nerve-wracking to a lot of people because there's no way to prove that the new software update is safe.
00:49:02.000 So what is the process?
00:49:05.000 Do you know how they create software, they update it, and then they test it on something?
00:49:12.000 How much testing do they do, and how much do they do before they upload it to your car?
00:49:18.000 Yeah, so I don't have any insider information, but I have a lot of sort of public available information, which is they test the software in shadow mode, meaning they see how the new software compares to the current software by running it in parallel on the cars and seeing if there's disagreements,
00:49:36.000 like seeing if there's any major disagreements and bringing those up and seeing what...
00:49:42.000 By parallel, I'm sorry, do you mean both programs running at the same time?
00:49:48.000 Yes, both at the same time, with the original software actually controlling the car, and the new update is just...
00:49:57.000 Making the same decisions?
00:49:59.000 Making the same decisions without them being actuated.
00:50:03.000 Without actually affecting the vehicle's dynamics.
00:50:06.000 And so that's a really powerful way of testing.
00:50:09.000 I think the software infrastructure that Tesla has built allows for that.
00:50:14.000 And I think other companies should do the same.
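A rough sketch of the shadow-mode idea described above: the released software actually drives, a candidate update runs on the same inputs without touching the controls, and large disagreements get logged for review. The class, field names, and threshold here are assumptions for illustration, not Tesla's actual code.

```python
# Hypothetical shadow-mode loop: the candidate model never actuates the car;
# its output is only compared against the released model's output.

class StubModel:
    """Stand-in for a control model; always returns a fixed steering angle."""
    def __init__(self, angle):
        self.angle = angle

    def predict(self, frame):
        return self.angle  # steering command in radians (illustrative)

def shadow_mode_step(frame, released, candidate, disagreement_log, threshold=0.2):
    active_cmd = released.predict(frame)    # this command actually moves the car
    shadow_cmd = candidate.predict(frame)   # this one is only recorded
    if abs(active_cmd - shadow_cmd) > threshold:
        disagreement_log.append({"frame_id": frame["id"],
                                 "active": active_cmd,
                                 "shadow": shadow_cmd})
    return active_cmd  # only the released model's output is ever actuated

log = []
shadow_mode_step({"id": 0}, StubModel(0.05), StubModel(0.40), log)
print(log)  # -> one logged disagreement to send back for review
```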
00:50:16.000 That's a really exciting, powerful way to approach not just automation, not just autonomous vehicles or semi-autonomous vehicles, but just safety.
00:50:25.000 It's basically: all the data that's on the cars, bring it back to a central point where you can use the edge cases, all the weird situations in driving, to improve the system, to test the system, to learn, to understand where the car is used,
00:50:43.000 misused, how it can be improved, and so on.
00:50:45.000 That's extremely powerful.
00:50:47.000 How many people do they have that are analyzing all this data?
00:50:51.000 That's a really good question.
00:50:54.000 So the interesting thing about driving is most of it is pretty boring.
00:50:58.000 Nothing interesting happens.
00:50:59.000 So they have automated ways of extracting, again, what are called edge cases.
00:51:05.000 So these weird moments of driving.
00:51:07.000 And once you have these weird moments, they have people annotate.
00:51:11.000 I don't know what the number is, but a lot of companies are doing this.
00:51:14.000 It's in the hundreds and thousands.
00:51:17.000 Basically, humans annotate the data to see what happened.
00:51:20.000 But most of what they're trying to do is to automate that annotation.
00:51:24.000 So to figure out how the data can be automatically used to improve the system.
00:51:30.000 So they have methods for that.
00:51:32.000 Because it's a huge amount of data, right?
00:51:35.000 I think a couple of weeks ago they had this big Autonomy Day where they demonstrated the vehicle driving itself on a particular stretch of road.
00:51:46.000 They showed off that, you know, they're able to query the data, basically ask questions of the data, saying, the example they gave is there's a bike on the back of a car, the bicycle on the back of a car.
00:51:58.000 And they're able to say, well, when the bicycle is in the back of a car, that's not a bicycle.
00:52:03.000 That's just the part of the car.
00:52:05.000 And they're able to now look back into the data and find all the other cases, the thousands of cases that happened all over the world, in Europe and Asia, in South America and North America and so on, and pull all those elements and then train the perception system of Autopilot to be able to better recognize those bicycles as part of the car.
00:52:27.000 So every edge case like that, they go through saying, okay, the car freaked out in this moment.
00:52:33.000 Let me find moments like this in the rest of the data and then improve the system.
00:52:39.000 So this kind of cycle is the way to deal with problems, with failures of the system.
00:52:48.000 It's to say, every time the car fails at something, say, is this part of a bigger set of problems?
00:52:55.000 Can I find all those problems?
00:52:56.000 And can I improve it with a new update?
00:52:59.000 And that just keeps going.
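A toy sketch of the fleet-data query loop being described: find frames that match a known failure pattern (here, a detection labeled "bicycle" overlapping a detection labeled "car"), so they can be relabeled and added to the next training set. The record format and matching rule are assumptions for illustration only.

```python
# Hypothetical edge-case mining over fleet data.

def boxes_overlap(a, b):
    """True if two (x1, y1, x2, y2) boxes intersect."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def find_bike_on_car_frames(fleet_frames):
    matches = []
    for frame in fleet_frames:
        bikes = [d for d in frame["detections"] if d["label"] == "bicycle"]
        cars = [d for d in frame["detections"] if d["label"] == "car"]
        if any(boxes_overlap(b["box"], c["box"]) for b in bikes for c in cars):
            matches.append(frame)
    return matches

fleet = [{"id": 1, "detections": [{"label": "car", "box": (0, 0, 10, 5)},
                                  {"label": "bicycle", "box": (7, 1, 9, 4)}]},
         {"id": 2, "detections": [{"label": "bicycle", "box": (20, 0, 22, 3)}]}]
print([f["id"] for f in find_bike_on_car_frames(fleet)])  # -> [1]
# Matched frames would be relabeled ("part of the car, not a rider") and
# fed into the next perception training run.
```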
00:53:00.000 The open question is, how many loops like that you have to take for the car to become really good, better than human?
00:53:09.000 Basically, how hard is driving?
00:53:11.000 How many weird situations, when you manually drive, do you deal with every day?
00:53:16.000 Somebody mentioned, there's like millions of cases when you watch video, you see them.
00:53:22.000 Somebody mentioned that they drive a truck, a UPS truck, past cow pastures.
00:53:30.000 And they know that if there's no cows in the cow pasture, that means they're grazing.
00:53:37.000 And if they're grazing, I mean, I might not be using the correct terms, I apologize, not a cow guy.
00:53:43.000 That means that there may be cows up ahead on the road.
00:53:46.000 There's just this kind of reasoning you can use to anticipate difficult situations.
00:53:50.000 And we do that kind of reasoning about like everything.
00:53:54.000 Cars today can't do that kind of reasoning.
00:53:57.000 They're just perceiving what's in front of them.
00:53:59.000 Now outside of Tesla, how many other companies have autonomous systems that are driving their cars?
00:54:06.000 So maybe it's good to step back.
00:54:08.000 There's several and there's several leaders in each different approach.
00:54:12.000 So first, let's draw a line between the different types of systems there are.
00:54:16.000 One, there's fully autonomous vehicles.
00:54:19.000 So these are cars you can think about that don't have a steering wheel.
00:54:22.000 Or if they have a steering wheel, it doesn't matter.
00:54:25.000 They're in full control.
00:54:26.000 And if there's a crash, the car company is liable.
00:54:30.000 Do those exist?
00:54:31.000 No.
00:54:35.000 It's a gray area, though, because many companies are basically saying that that's what they're doing, but they're not quite there.
00:54:44.000 So the leader in that space used to be called Google Self-Driving Car Program.
00:54:49.000 Now it's called Waymo.
00:54:51.000 They are doing that.
00:54:53.000 It's called Level 4 or Level 5. There's levels to this game.
00:54:57.000 And this is this particular level where it's fully autonomous.
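For reference, the "levels" alluded to here are the SAE driving-automation levels; a minimal sketch, with each level roughly paraphrased (one-line summaries only, not the full standard):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, roughly paraphrased."""
    NO_AUTOMATION = 0        # human does all the driving
    DRIVER_ASSISTANCE = 1    # steering OR speed assistance (e.g. adaptive cruise)
    PARTIAL_AUTOMATION = 2   # steering AND speed; human must supervise (Autopilot-style)
    CONDITIONAL = 3          # system drives in limited conditions; human is the fallback
    HIGH_AUTOMATION = 4      # no human needed within a defined domain (Waymo's goal)
    FULL_AUTOMATION = 5      # no human needed anywhere

print(SAELevel.PARTIAL_AUTOMATION)  # -> SAELevel.PARTIAL_AUTOMATION
```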
00:55:01.000 Now they're trying to achieve full autonomy.
00:55:04.000 But the way they're doing it currently is they're testing on public roads with what's called a safety driver.
00:55:10.000 So there's a driver always ready to take over.
00:55:14.000 And the driver does have to take over at some rate, you know, frequently.
00:55:20.000 And so the fact that the driver has to take over, that's not fully autonomous then, right?
00:55:24.000 So there's no car today that you can just get in without a safety driver.
00:55:28.000 So there's nobody behind the wheel.
00:55:30.000 And using your app, sort of get from point A to point B. But out of the cars that are semi-autonomous, where there is an autonomous program but you do have to keep your hands on the wheel and pay attention to the road, what are the leaders?
00:55:43.000 Besides Tesla, there's Tesla and who else is doing it?
00:55:48.000 There's several systems.
00:55:52.000 It depends how you define leader.
00:55:54.000 So let me ask you this.
00:55:56.000 Does Mercedes and BMW, do they use the same system?
00:55:59.000 Does someone make a system for cars?
00:56:01.000 Or do they create their own systems?
00:56:03.000 Yeah, that's a really good question.
00:56:05.000 So there's, in some cases, there's Mobileye...
00:56:08.000 And NVIDIA, there's these companies that...
00:56:11.000 NVIDIA? The video card company?
00:56:13.000 The video card company, yep.
00:56:14.000 The same folks that power the Quake game, right?
00:56:17.000 The graphics on the Quake game.
00:56:20.000 You can use those GPU graphics processing units to run machine learning code.
00:56:25.000 So they're also creating these...
00:56:27.000 Look at that, NVIDIA Drive.
00:56:28.000 Scalable AI platform for autonomous driving.
00:56:31.000 In fact, I don't...
00:56:34.000 When did you buy a Tesla?
00:56:36.000 Five months ago or something?
00:56:38.000 Something like that?
00:56:38.000 So the thing in there now, most likely, is an NVIDIA Drive PX2 system.
00:56:44.000 And that works on cameras?
00:56:47.000 That just runs code that takes in camera data, but it can work on anything else.
00:56:54.000 So it could work on LiDAR as well, if somebody had a system...
00:56:58.000 Yeah, but it needs different code.
00:57:00.000 So LiDAR requires very different kinds of processing.
00:57:02.000 Does anybody use that with cars, with semi-autonomous cars?
00:57:06.000 LIDAR. Yes.
00:57:09.000 Well, okay, so semi-autonomous, we have to be careful because the Waymo cars, the quote-unquote fully autonomous cars, are currently semi-autonomous.
00:57:19.000 Right.
00:57:19.000 That's the highest level of semi-autonomous, right?
00:57:23.000 Yeah, I guess it's not even a highest level, it's a principle, it's a philosophy difference.
00:57:28.000 Because they're saying we're going to do full autonomy, we're just not quite there yet.
00:57:33.000 Most other companies, they're doing semi-autonomous, better called driver assistance systems, is they're saying we're not interested in full autonomy, we just want a driver assistance system that just helps you steer the car.
00:57:46.000 So let's call those semi-autonomous vehicles or driver assistance systems.
00:57:51.000 There's several leaders in that space.
00:57:54.000 One car we're studying that's really interesting is a Cadillac Super Cruise system.
00:58:00.000 So GM has a system, it's called Super Cruise, that I think is the best comparable system to Autopilot today.
00:58:11.000 The key differentiator there is, there's a lot of little elements, but the key differentiator is there's a driver monitoring system.
00:58:18.000 So there's a camera that looks at you and tells you if your eyes are on the road or not.
00:58:22.000 And if your eyes go off the road for, I believe, more than six seconds, it starts warning you and says you have to get your eyes back on the road.
00:58:30.000 So that's called driver monitoring.
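A minimal sketch of the driver-monitoring logic just described: track how long the estimated gaze has been off the road and escalate a warning past a threshold. The six-second figure comes from the conversation; the class and function names are assumptions, not GM's implementation.

```python
# Hypothetical gaze-monitoring loop: warn if eyes are off the road too long.

OFF_ROAD_LIMIT_S = 6.0  # roughly the threshold mentioned for Super Cruise

class DriverMonitor:
    def __init__(self):
        self.off_road_time = 0.0

    def update(self, eyes_on_road: bool, dt: float) -> str:
        """Call once per camera frame; dt is seconds since the last frame."""
        if eyes_on_road:
            self.off_road_time = 0.0
            return "ok"
        self.off_road_time += dt
        if self.off_road_time > OFF_ROAD_LIMIT_S:
            return "warn"          # e.g. flash the wheel light, chime
        return "monitoring"

monitor = DriverMonitor()
print(monitor.update(eyes_on_road=False, dt=7.0))  # -> "warn"
```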
00:58:32.000 That's one of the big disagreements, for example, between me and Elon and many experts in the field and Elon and the Tesla approach is that there should be a driver monitoring system.
00:58:43.000 So there should be a camera.
00:58:44.000 Why does Elon feel like there shouldn't be?
00:58:47.000 I think his focus, the Tesla's focus, is on just improving the system so fast and so effectively that it doesn't matter what the driver does.
00:59:00.000 So essentially no safety net?
00:59:02.000 No safety net.
00:59:04.000 And I think they...
00:59:06.000 Operate like that in many ideas that they work with.
00:59:11.000 They sort of boldly proceed forward to try to make the car extremely safe.
00:59:17.000 Now the concern there is you have to acknowledge the psychology of human beings.
00:59:21.000 Unless the car is perfect, or perfect under our definition, which is much better than human beings, you have to be able to make sure that people are still paying attention to help the car out when it fails.
00:59:39.000 And for that, you have to have driver monitoring.
00:59:42.000 You have to know what the driver is doing.
00:59:42.000 Right now, your Tesla only knows about your presence from the steering wheel.
00:59:47.000 Touching the steering wheel.
00:59:48.000 Which is a kind of driver monitoring system.
00:59:51.000 It knows you're there, but it's not nearly as effective at knowing you're there cognitively, visually.
00:59:57.000 You can be tricked by clamps.
00:59:59.000 We've seen people do that.
01:00:00.000 They've developed these clamps that you just put on the steering wheel.
01:00:03.000 It'll hold a phone and it'll also trick the system into thinking that you're holding onto the wheel.
01:00:07.000 Yeah, you can do...
01:00:09.000 A lot of purses actually work really well.
01:00:11.000 Don't ask me how...
01:00:12.000 Hanging a purse?
01:00:12.000 No, like shoving a purse into the...
01:00:14.000 Oh, really?
01:00:15.000 Into the...
01:00:16.000 Somebody did that with an orange or something like that, but they said it didn't work.
01:00:20.000 Maybe it needs to be all the way around the outside of it?
01:00:22.000 I think it depends on the shape of the orange, how ripe it is.
01:00:25.000 There's a lot of debate.
01:00:26.000 No, the point is there's ways to trick the system.
01:00:30.000 It's not monitoring the driver.
01:00:31.000 That's the point, right?
01:00:32.000 Yeah, it's not monitoring the driver.
01:00:33.000 And a lot of people believe you need to.
01:00:35.000 You think you need to.
01:00:37.000 Makes sense.
01:00:38.000 Yeah, I think not just for the safety of the system, but to create an experience.
01:00:44.000 I think there's value for the car to know more about you.
01:00:51.000 Look at that.
01:00:52.000 What's happening there?
01:00:54.000 Scanning this guy's eyes.
01:00:57.000 Minority Report shit freaks me out.
01:01:00.000 So yeah, there's a lot of companies sort of springing up.
01:01:02.000 They're doing computer vision on the face and so on to try to detect where you're looking.
01:01:06.000 So what cars have that now?
01:01:10.000 The major one is the supercruise system.
01:01:12.000 There's not many cars.
01:01:13.000 A few cars are starting to add it.
01:01:15.000 Europe… What's a supercruise system?
01:01:17.000 That's the GM Cadillac.
01:01:19.000 Okay.
01:01:20.000 They're trying to...
01:01:21.000 So it's in their super expensive lineup currently, and they're, I think, trying to add it to the full lineup of all of them.
01:01:27.000 So in Cadillacs, what is that big cruiser that they have now?
01:01:31.000 The big four-door car, the really high-end...
01:01:34.000 Is it a CT6? I don't know what it is.
01:01:36.000 They have a new one that's really nice.
01:01:38.000 Is that what they're putting it in?
01:01:40.000 The big sedan?
01:01:41.000 That thing.
01:01:41.000 That thing.
01:01:43.000 Yes.
01:01:44.000 It's pretty.
01:01:45.000 I don't know if that's the CT6, but the one we're looking at is the CT6. Yeah, that's the 2018 CT6. Yeah.
01:01:55.000 But they want to add it to their full fleet.
01:01:58.000 Does that have the same amount of cameras as the Tesla system does?
01:02:02.000 No, and it has a very different philosophy as well in another way, which is it only works on very specific roads, on interstate highways.
01:02:11.000 There's something called ODD, Operational Design Domain.
01:02:15.000 So they define that this thing, Super Cruise System, only works on this particular set of roads and they're basically just major highways.
01:02:24.000 The Tesla approach is saying basically what Elon jokingly referred to as ADD, right, is it works basically anywhere.
01:02:34.000 So if you try to turn on your autopilot, you can basically turn it on anywhere where the cameras are able to determine either lane markings or the car in front of you.
01:02:42.000 And so that's a very different approach, saying you can basically make it work anywhere or, in the Cadillac case, make it work on only specific kinds of roads.
01:02:50.000 So you can test the heck out of those roads.
01:02:53.000 You can map those roads.
01:02:54.000 So you can use, actually, LIDAR to map the full roads so you know the full geometry of all the interstate highway system that it can operate on.
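A toy sketch of the Operational Design Domain contrast just described: one system engages only on a pre-mapped whitelist of road segments, the other engages wherever the cameras can track lanes or a lead car. The segment IDs and function names are made up for illustration.

```python
# Hypothetical ODD check versus a "works anywhere the camera sees lanes" check.

MAPPED_HIGHWAY_SEGMENTS = {"I-90_seg_0412", "I-93_seg_0027"}  # invented IDs

def can_engage_odd_system(current_segment: str) -> bool:
    """Super Cruise-style: engage only inside the mapped operational domain."""
    return current_segment in MAPPED_HIGHWAY_SEGMENTS

def can_engage_anywhere_system(lanes_visible: bool, lead_car_visible: bool) -> bool:
    """Autopilot-style: engage wherever lane markings or a lead car are tracked."""
    return lanes_visible or lead_car_visible

print(can_engage_odd_system("I-90_seg_0412"))       # True: mapped highway
print(can_engage_odd_system("local_road_77"))       # False: outside the ODD
print(can_engage_anywhere_system(False, True))      # True: following a lead car
```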
01:03:04.000 Does it also coordinate with GPS so it understands where, like, bumps in the road might be or hills?
01:03:10.000 Yeah.
01:03:10.000 In that sense, it coordinates with GPS for different curvature information, but...
01:03:14.000 Not the topography?
01:03:16.000 No, and construction is a big one.
01:03:18.000 That would be crazy for new potholes.
01:03:22.000 Well, potholes isn't a big problem.
01:03:23.000 I think construction is the big problem.
01:03:26.000 Just this quickly changing dynamic information, which apps like Waze provide.
01:03:31.000 A lot of potholes are a pretty big problem in Boston, though.
01:03:34.000 Yeah, no, for sure.
01:03:36.000 New York is actually probably even worse.
01:03:38.000 I blew out two tires in one day in New York on potholes.
01:03:43.000 Just had an unlucky day.
01:03:45.000 Yeah, but I'd rather you blow out your tire than, I mean, the kind of fatality that happened in Mountain View with a Tesla, I believe, was slightly construction-related.
01:03:57.000 So, I mean, there's a lot of safety-critical events that happen construction-related stuff.
01:04:01.000 I would like it if that stupid Tesla could figure out the hole in the ground, though, so I didn't have to blow a tire out.
01:04:06.000 Come on, bro.
01:04:07.000 I feel your pain, Joe.
01:04:08.000 Figure it out.
01:04:09.000 But priorities.
01:04:10.000 I'll make sure.
01:04:12.000 I'll forward this podcast to Elon to make sure they work on this.
01:04:15.000 I think he's busy.
01:04:17.000 What is this, Jamie?
01:04:18.000 Tesla Autopilot will be able to avoid potholes in the road, says Elon Musk.
01:04:21.000 Ha ha!
01:04:22.000 The motherfucker's on top of shit!
01:04:25.000 What's the date on that?
01:04:26.000 April.
01:04:27.000 April 7th.
01:04:28.000 Okay.
01:04:29.000 Just now.
01:04:30.000 There you go.
01:04:30.000 And that's an interesting thing.
01:04:32.000 That's almost an ethical question, whether you want a car to avoid a situation by swerving.
01:04:39.000 Right.
01:04:40.000 Because when you swerve, you now introduce, as opposed to sort of braking the vehicle only, swerving into another lane means you might create a safety situation elsewhere.
01:04:51.000 You might put somebody else in danger.
01:04:53.000 Right.
01:04:53.000 Yeah, that's why I was saying if it coordinated with GPS, it would have previous knowledge.
01:04:58.000 You know, sort of like Waze tells you where the cops are.
01:05:01.000 Yep.
01:05:01.000 You know what I mean?
01:05:02.000 So that kind of information would be extremely powerful and useful.
01:05:05.000 The problem is it's hard to get it really up to date.
01:05:11.000 That kind of information, really up to date.
01:05:13.000 It's just an infrastructure question.
01:05:16.000 Just getting the software, the data in place to where the car will be able to learn quickly from all the things that are changing.
01:05:22.000 I think potholes don't change that often.
01:05:24.000 So that's a different thing.
01:05:26.000 But in terms of construction zones, in terms of other weird things that change the dynamics, the geometry of the road, that's difficult to get right.
01:05:36.000 So Cadillac's doing a version of it, but it sounds like it's a little bit less involved, less comprehensive.
01:05:44.000 Maybe there's a better way of describing it.
01:05:45.000 Yeah, and less, I would say, it's more safety-focused.
01:05:50.000 It's a sort of, what's the right word to use here?
01:05:54.000 It's more cautious in its implementation.
01:05:58.000 So GM, again, has a tradition for better or for worse.
01:06:04.000 Just sit, Jamie.
01:06:07.000 This little part I'm showing you shows the signal coming on.
01:06:11.000 Oh wow, the green thing.
01:06:12.000 Pay attention, lady.
01:06:13.000 She's too hot.
01:06:15.000 She's not paying attention.
01:06:16.000 Looking at people staring at her.
01:06:17.000 No comment.
01:06:20.000 One of the things that you...
01:06:22.000 It's hard to talk about without actually experiencing the system.
01:06:25.000 What's more important than driver monitoring and any of the details we talk about is how the whole thing feels, the whole thing together, how it's implemented, the whole interface.
01:06:35.000 The Cadillac system is actually done really well in the sense that there's a clarity to it.
01:06:41.000 There's a green color and a blue color and you know exactly when the system is on and when it's off.
01:06:46.000 That's one of the big things people struggle with in other cars: it's just confusing, drivers not being able to understand when the system is on or off.
01:06:57.000 Oh, right.
01:06:57.000 So you think the system's doing it and then just slam into something that wasn't even on.
01:07:01.000 Now, when this car is operating in this manner, how many cameras is it using?
01:07:08.000 You know, that's a good question.
01:07:09.000 I should know that.
01:07:10.000 But I think it's only forward-facing cameras, as far as I know.
01:07:13.000 I think it's two cameras.
01:07:15.000 It may be three cameras.
01:07:16.000 That lady just sat back.
01:07:18.000 So she doesn't have her hands on the wheel at all.
01:07:21.000 Yep.
01:07:21.000 But she's watching.
01:07:23.000 Right.
01:07:23.000 Because the car is able to see where the eyes are.
01:07:26.000 It's a hands-off system.
01:07:28.000 So you're allowed to take your hands off the wheel.
01:07:30.000 Hmm.
01:07:32.000 It's very interesting.
01:07:34.000 And there are certain human behavior aspects that come into play to this.
01:07:41.000 I found myself actually becoming a little more drowsy with this system.
01:07:46.000 I haven't driven it enough, so I haven't gotten used to it.
01:07:48.000 But you have to, at least in the initial stages, it kind of forced you to look at the road in a way that felt artificial.
01:07:57.000 I think it's something that gets better with time.
01:08:00.000 You get used to it.
01:08:01.000 But it's almost like a gamified thing that the car, when you look off-road, starts to tell you that you're looking off-road.
01:08:09.000 So you're kind of psychologically pressured to always stare at the road.
01:08:14.000 And you realize that actually, when you drive, you often look around.
01:08:18.000 And so having to like stare forward can be a little bit, yeah, exactly.
01:08:23.000 You start, like there's something peaceful and hypnotic about those lanes just coming at you and just...
01:08:29.000 The lines, yeah.
01:08:30.000 Why is that?
01:08:32.000 That confuses the shit out of me because I could not be tired at all.
01:08:35.000 But if it's nighttime and I'm on the highway and those lines, they just start to take you to dreamland.
01:08:42.000 I get the same with...
01:08:43.000 There's also just the vibration.
01:08:45.000 There's that hum of driving.
01:08:47.000 Same with trains.
01:08:49.000 Yeah, planes as well.
01:08:51.000 Yeah, it puts me out.
01:08:52.000 So there's a...
01:08:54.000 So Cadillac system.
01:08:55.000 That's the big leader, I would say, in the driver monitoring.
01:09:00.000 And then Tesla is the no driver monitoring and huge data collection.
01:09:04.000 BMW has a system as well they use.
01:09:07.000 Yeah, BMW. What are they using?
01:09:09.000 I don't want to speak too much to the details, but they have lane keeping systems.
01:09:13.000 They're basically systems that keep you in the lane.
01:09:16.000 That is similar to what, in spirit, Autopilot is supposed to do, but is less aggressive in how often you can use it and so on.
01:09:23.000 If you look at the performance of the actual, how often the system is able to keep you in lane, Autopilot is currently the leader in that space.
01:09:32.000 And they're also the most aggressive innovators in that space.
01:09:38.000 They're really pushing it to improve further and further.
01:09:41.000 And the open question is, the worrying question is if it improves much more, are there going to be effects like complacency, like people will start texting more, will start looking off-road more.
01:09:55.000 It's a totally open question and nobody knows the answer to it really.
01:10:00.000 And there's a lot of folks, like I mentioned, in the safety engineering and human factors community, these psychology folks who have roots in aviation, and there's been 70 years of work that looks at vigilance.
01:10:14.000 If I force you to sit here and monitor for something weird happening, like radar operators in World War II had to watch for the dot to appear.
01:10:26.000 If I sit you behind that radar and make you do it, after about 15 minutes, but really 30 minutes, your rate of being able to detect any problems will go down significantly.
01:10:38.000 You just kind of zone out.
01:10:39.000 So there's all kinds of psychology studies that show that we're crappy.
01:10:44.000 Human beings are really crappy at monitoring automation.
01:10:47.000 If I tell you, if I put a robot and you just say, monitor this system so it doesn't kill anyone, you'll tune out.
01:10:54.000 And we have to be engaged.
01:10:55.000 You have to be engaged.
01:10:56.000 You have to be, you know, there has to be a dance of attention.
01:10:59.000 We don't have a mode for watching autonomous things, right?
01:11:04.000 If you consider historically the kind of modes that people have for observing things, we don't really have a mode for making sure that an autonomous thing does its job.
01:11:13.000 Yeah, it's...
01:11:13.000 It's not a mindset.
01:11:15.000 It's not like, oh, you know what I mean?
01:11:16.000 Like, if you're in my car, okay, I'm driving.
01:11:17.000 Here we go.
01:11:18.000 Oh, driving.
01:11:19.000 I turn.
01:11:19.000 I'm thinking.
01:11:20.000 I'm in driving mode.
01:11:21.000 When you're in autonomous mode and you're observing, you're just like, what is this?
01:11:26.000 I've never done this before.
01:11:27.000 This is fucking weird.
01:11:28.000 It feels weird.
01:11:29.000 It's not part of human nature.
01:11:30.000 It's not a normal state.
01:11:32.000 One place where it's done commonly now is aviation.
01:11:36.000 So, pilots.
01:11:38.000 Pilots are basically monitoring fully autonomous planes.
01:11:41.000 Yeah, that's a good point.
01:11:42.000 As far as I know, many planes today could fly almost fully autonomously.
01:11:46.000 It's also a good point when it comes to software and updates because isn't that part of the issue with this Boeing 737?
01:11:52.000 Max system.
01:11:53.000 Yeah, these systems that they've had problems with, they've been faulty and a couple have crashed.
01:12:00.000 Yeah, and that's a really good point.
01:12:02.000 And they've been two tragic crashes recently with the MAX system.
01:12:08.000 Yeah, they've benched those things, right?
01:12:10.000 Haven't they?
01:12:11.000 I'm not following...
01:12:12.000 They also got rid of a bunch of inspectors.
01:12:15.000 I think they fired like 80 inspectors today.
01:12:18.000 And the unions are freaking out.
01:12:19.000 Yep.
01:12:20.000 And obviously there's politics that FAA is – I think FAA is supposed to supervise and there's a close relationship between Boeing and FAA. There's questions around – I mean there's better experts at that than me.
01:12:32.000 But on the software side, it is worrying because it was a single software update essentially that helps prevent the vehicle – the airplane from stalling.
01:12:43.000 So, if the nose is tilting up, increasing the chance of stalling, it's going to automatically point the nose down of the airplane.
01:12:54.000 And the pilots, in many cases, as far as I understand, weren't even informed of this update, right?
01:13:00.000 They weren't even told this is happening.
01:13:02.000 The idea behind the update is that they're not supposed to really know.
01:13:06.000 It's supposed to just manage the flight for you, right?
01:13:09.000 The problem happened when there's an angle of attack sensor.
01:13:13.000 So the sensor that tells you the actual tilt of the plane.
01:13:17.000 And there's a malfunction in that sensor, as far as I understand, in both planes.
01:13:21.000 And so the plane didn't actually understand its orientation.
01:13:25.000 So the system started freaking out and started pointing the nose down aggressively.
01:13:29.000 And the pilots were like trying to restabilize the plane and couldn't.
01:13:32.000 So shortly after liftoff, they just crashed.
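A simplified sketch of why trusting a single angle-of-attack sensor is dangerous: if the nose-down command keys off one reading, a stuck sensor keeps pushing the nose down, whereas cross-checking redundant sensors and disengaging on disagreement is the standard mitigation. This illustrates the general principle only; the thresholds are invented and this is not Boeing's actual control law.

```python
# Hypothetical illustration: single-sensor trust vs. cross-checked sensors.

STALL_AOA_DEG = 15.0   # illustrative angle-of-attack threshold

def pitch_command_single(aoa_deg: float) -> str:
    """Naive: trust one sensor. A stuck-high sensor keeps commanding nose down."""
    return "nose_down" if aoa_deg > STALL_AOA_DEG else "hold"

def pitch_command_redundant(aoa_left: float, aoa_right: float,
                            max_disagreement: float = 5.0) -> str:
    """Cross-check two sensors; hand control back if they disagree."""
    if abs(aoa_left - aoa_right) > max_disagreement:
        return "disengage_and_alert_pilots"
    return pitch_command_single((aoa_left + aoa_right) / 2)

print(pitch_command_single(70.0))            # faulty sensor -> "nose_down"
print(pitch_command_redundant(70.0, 4.0))    # -> "disengage_and_alert_pilots"
```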
01:13:35.000 Oh my god.
01:13:37.000 Yes, that's a software update.
01:13:38.000 That's crazy!
01:13:40.000 And that's a safety culture that's dealing with this new world of software that we don't know what to do with.
01:13:49.000 And yeah, it's a question...
01:13:53.000 One way is to be sort of a little bit Luddite.
01:13:55.000 I use the term carefully and just be afraid and say, you know what, we should really not allow so many software updates.
01:14:02.000 The other one is sort of embracing it and redefining what it means to build safe AI systems in this modern world with updates multiple times a week.
01:14:10.000 What do you think?
01:14:12.000 I'm 100% for the software approach.
01:14:17.000 I think updates, regular updates, so combining the two cultures but really letting good software engineering lead the way is the way to go.
01:14:27.000 I wish other companies were competing with Tesla on this.
01:14:31.000 On the software side, Tesla is far ahead of everyone else in the automotive sector.
01:14:39.000 And that's one of the problems.
01:14:43.000 I'm worried that, you know, competition is good, right?
01:14:48.000 And I'm worried that people are way too far behind to actually give Tesla new ideas.
01:14:54.000 That'll compete with Tesla on software.
01:14:56.000 So most cars are not able to do over-the-air.
01:15:00.000 As far as I know, no cars are able to do major over-the-air updates except Tesla vehicles.
01:15:06.000 They do over-the-air updates to the entertainment system.
01:15:10.000 Like, you know, if your radio is malfunctioning.
01:15:13.000 But in terms of the control of the vehicle, you have to go to the dealership to get an update.
01:15:17.000 Tesla is the only one that over-the-air, like it can multiple times a week do the update.
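A minimal sketch of one safeguard over-the-air updates typically rely on: verifying that the downloaded package matches what the manufacturer published before installing it. A hash check stands in for a real cryptographic signature scheme here; nothing below is any carmaker's actual process.

```python
# Hypothetical OTA gate: only apply an update whose digest matches the manifest.
import hashlib

def apply_ota_update(update_bytes: bytes, expected_sha256: str) -> bool:
    """Install the update only if its hash matches the published digest."""
    digest = hashlib.sha256(update_bytes).hexdigest()
    if digest != expected_sha256:
        print("Rejected: digest mismatch, keeping current firmware")
        return False
    print("Verified: staging new firmware for install at next park")
    return True

payload = b"new control firmware (illustrative)"
apply_ota_update(payload, hashlib.sha256(payload).hexdigest())  # accepted
apply_ota_update(payload, "0" * 64)                             # rejected
```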
01:15:23.000 I think that should be a requirement for all car companies.
01:15:26.000 But that requires that they rethink the way they build cars.
01:15:30.000 That's really scary when you manufacture over a million cars a year in Toyota and GM. To say, especially old school Detroit guys and gals that are like legit car people, to say we need to hire some software engineering,
01:15:46.000 that's a challenge.
01:15:47.000 It's a totally, you know, I don't know how often you've been to Detroit, but there's a culture difference between Detroit and Silicon Valley.
01:15:54.000 And those two have to come together to solve this problem.
01:15:57.000 So, combine the adult responsibility of Detroit, of how to do production well, manufacturing, how to do safety well, how to test the vehicles well, with the bold, crazy, innovative spirit of Silicon Valley, which Elon Musk in basically every way represents.
01:16:14.000 I think that will define the future of AI in general.
01:16:22.000 Interacting with AI systems, even outside the automotive sector, requires these questions of safety, of AI safety, of how we supervise the systems, how we keep them from misbehaving, and so on.
01:16:35.000 There's a concern about those systems being vulnerable to third-party attacks.
01:16:40.000 Yeah, so hacking.
01:16:42.000 That's a fascinating question.
01:16:45.000 I think there is a whole discipline in AI called adversarial machine learning, which studies, for basically any kind of system you can think of, how we can feed it examples.
01:16:57.000 How we can add a little bit of noise to the system to fool it completely.
01:17:02.000 So there's been demonstrations on Alexa, for example, where you can feed noise into the system that's imperceptible to us humans and make it believe you said anything.
01:17:17.000 So, fool the system into thinking, so ordering extra toilet paper, I don't know.
01:17:22.000 And the same for cars, you can feed noise into the cameras to make it believe that there is or there isn't a pedestrian, that there is or there isn't lane markings.
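The classic textbook version of this kind of attack is the fast gradient sign method (FGSM): nudge each pixel slightly in the direction that increases the model's loss, so the image looks unchanged to a person but the prediction can flip. The sketch below uses stand-in models and images; it illustrates the general technique from the adversarial ML literature, not an attack on any real Alexa or Autopilot system.

```python
# Hypothetical FGSM-style sketch: an imperceptible perturbation that can flip
# a classifier's prediction.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, true_label, epsilon=0.01):
    """Nudge each pixel by at most epsilon in the direction that raises the loss."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()

# Stand-in model and image, just to make the sketch runnable.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
img = torch.rand(1, 3, 32, 32)
label = torch.tensor([3])
adv = fgsm_perturb(model, img, label)
# To a human, `img` and `adv` look identical; the model's output may not.
print((adv - img).abs().max())  # perturbation bounded by epsilon
```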
01:17:33.000 So, someone could do this?
01:17:35.000 In theory, at least.
01:17:37.000 In theory, that's the big difference.
01:17:39.000 In theory, it's doable.
01:17:40.000 You can do demonstrations.
01:17:41.000 In practice, it's actually really difficult to do in the real world.
01:17:45.000 So in the lab, you can do it.
01:17:47.000 You can construct a situation where a pedestrian can wear certain types of clothing or put up a certain kind of sign where they disappear from the system.
01:17:54.000 I have to ask you this because now I just remember this.
01:17:56.000 You'd be the perfect person to talk about this.
01:17:58.000 I'm not sure if you remember this case, but there was a guy named Michael Hastings.
01:18:01.000 Michael Hastings was a journalist and he was, I believe, in Iraq or Afghanistan.
01:18:08.000 He was somewhere overseas and he was stuck there because of this volcano that erupted in, I believe, Iceland.
01:18:29.000 We're good to go.
01:18:45.000 He comes back.
01:18:46.000 The general was forced to resign.
01:18:47.000 He was a beloved general.
01:18:49.000 And Michael Hastings was fearing for his life because he thought that they were going to come and get him because these people were very, very angry at him.
01:18:58.000 He wound up driving his car into a tree going like 120 miles an hour.
01:19:04.000 And the car exploded and the engine went flying.
01:19:07.000 And people that were the conspiracy theorists were saying they believed that that car had been rigged to work autonomously or that someone, some third party bad person decided to,
01:19:22.000 or good person depending on your perspective, decided to drive that guy's car into a fucking tree at 120 miles an hour.
01:19:29.000 Do you think that that, and this is 2011?
01:19:36.000 Michael Hastings' death, 2012, maybe?
01:19:40.000 2012?
01:19:41.000 I think that sounds right.
01:19:43.000 Let's see what it says.
01:19:44.000 2013?
01:19:46.000 Yeah, June 2013. Do you think that in 2013 that would have been possible?
01:19:56.000 It's entirely possible.
01:19:58.000 No, I just wanted to say that.
01:20:00.000 Shout out to the Joe Rogan subreddit.
01:20:04.000 Okay.
01:20:05.000 Check that one off the list.
01:20:10.000 Jamie, pull that up.
01:20:12.000 Check that off.
01:20:14.000 I... Whether it's possible is an interesting question.
01:20:22.000 Whether it's likely is another question.
01:20:24.000 I think it's very unlikely.
01:20:26.000 And the other, most important, question is: is that something we should worry about at scale in our future?
01:20:32.000 Is cars being used to assassinate, essentially, people?
01:20:37.000 I'm Russian, so I've heard of those things being done by our friend Putin over there.
01:20:44.000 I think it's very unlikely that this kind of thing would happen at scale, that people would use this.
01:20:51.000 I think there would be more effective ways to achieve this kind of end.
01:20:55.000 For sure.
01:20:56.000 And I just think it's a very difficult technical challenge, that if hacking happens… it would be at a different level than hacking the AI systems.
01:21:09.000 It would be just hacking software.
01:21:11.000 Right.
01:21:12.000 And hacking software is the kind of thing that can happen with anything.
01:21:17.000 An elevator software or any kind of software that operates any aspect of our lives could be hacked in that same kind of way.
01:21:25.000 Right.
01:21:26.000 My question though was, in 2013, was that technology available where they could take over someone's car?
01:21:34.000 Do you know what car it was?
01:21:36.000 Mercedes.
01:21:37.000 I think it was an S-Class.
01:21:38.000 C? C-Class.
01:21:41.000 Yes.
01:21:42.000 Yes.
01:21:43.000 Yes.
01:21:43.000 But I don't think...
01:21:45.000 Oh, boy.
01:21:46.000 This is like...
01:21:47.000 No, listen.
01:21:48.000 This has been widely speculated.
01:21:50.000 I know.
01:21:50.000 I'm just asking you because you're actually an expert.
01:21:52.000 I mean, it's very rare that you get an expert in autonomous vehicles and you get to run a conspiracy theory by them to see if they can just put a stamp on it being possible or not.
01:22:01.000 Let me just say that Alex Jones is officially not allowed to say MIT scientist says.
01:22:07.000 Which is exactly what he's going to try to do.
01:22:11.000 No.
01:22:11.000 First of all, let me back off and say I am not a security expert, which is a very important difference.
01:22:17.000 That is important.
01:22:18.000 So then autonomous vehicle.
01:22:20.000 I build autonomous vehicle systems.
01:22:23.000 I don't know how to make them extremely robust as security to hacking attacks.
01:22:29.000 And I have a lot of really good friends, who are some of the coolest people
01:22:32.000 I know, who are basically hackers converted to security experts.
01:22:36.000 I would say though, loosely speaking, I think the technology was there, yes, with physical access to the car to be able to control it.
01:22:44.000 But I don't, I think it's extremely unlikely that's what happened.
01:22:48.000 I see where you're coming from.
01:22:52.000 I'm not asking you whether or not it's likely that it happened.
01:22:55.000 I'm sure you don't even have much information on the case because I had to explain it to you, right?
01:22:59.000 That's right.
01:23:00.000 The guy also had some serious amphetamines in his system.
01:23:05.000 They compared it to crystal meth, but the reality is he was a journalist.
01:23:10.000 And most journalists, I don't want to say most, a lot are on Adderall.
01:23:15.000 And Adderall is essentially...
01:23:18.000 Amphetamines.
01:23:18.000 I mean, that's what it is.
01:23:19.000 It's like next-door neighbors to crystal meth.
01:23:23.000 It really is.
01:23:27.000 Well, you said it's possible.
01:23:30.000 They could actually get it to turn the wheel.
01:23:33.000 Yeah, so I have to look at the exact system.
01:23:35.000 Like, it's that drive-by-wire thing that I mentioned.
01:23:38.000 Some systems are not...
01:23:39.000 It's not so easy to turn the wheel, actually.
01:23:43.000 Right, but it could get him to just accelerate out of control.
01:23:45.000 He's going like 120-something miles an hour and he's slammed into a tree.
01:23:49.000 It's entirely possible.
01:23:50.000 Ah, you can't do it twice.
01:23:52.000 The systems back then, though, were far more primitive, correct?
01:23:58.000 Yeah, but it's really, again, the attack vectors here.
01:24:03.000 So the way you hack these systems has more to do with the software, low-level software that can be primitive, than with the high-level AI stuff.
01:24:11.000 Right, but my issue with it was there's no cameras on the outside of the vehicle like there is on Tesla of today, which has autonomous driving as an option.
01:24:19.000 Absolutely.
01:24:19.000 So, okay, I see your point now.
01:24:21.000 So you wouldn't be hacking...
01:24:30.000 Right.
01:24:31.000 Right.
01:24:32.000 Right.
01:24:41.000 Yes.
01:24:41.000 That's a different...
01:24:42.000 That's what people worry about with autonomous vehicles when more and more...
01:24:45.000 You're talking about potentially 10, 20 million lines of source code.
01:24:50.000 So there's all this code.
01:24:52.000 And so obviously it becomes amenable, susceptible to...
01:24:58.000 Bugs that can be exploited to hack the code.
01:25:02.000 And so people are worried legitimately so that these security attacks would lead to these kind of, well, at the worst case, assassinations, but really sort of just basic attacks, basic hacking attacks.
01:25:18.000 I think it's...
01:25:20.000 I think that's something that people in the automotive industry and certainly Tesla is really working hard on and making sure that everything is secure.
01:25:29.000 There's going to be, of course, vulnerabilities always, but I think they're really serious about preventing them.
01:25:35.000 But in the demonstration space, you'll be able to demonstrate some interesting ways to trick the system in terms of computer vision.
01:25:44.000 This all boils down to the fact that these systems, the ones that are camera-based, are not as robust to the world as our human eyes are.
01:25:55.000 So like I said, if you add a little bit of noise, you can convince it to see anything.
01:26:00.000 To us humans, it'll look like the same road, like the same three pedestrians.
01:26:03.000 Could you draw like a little person on the camera lens?
01:26:07.000 They're little cameras, right?
01:26:08.000 You could get down there with a sharpie.
01:26:10.000 Oh my god, there's a guy on the road!
01:26:11.000 That's one attack vector.
01:26:14.000 That's draw stuff.
01:26:16.000 You jokingly say that, but that's a possibility.
01:26:19.000 The sun plays tricks on Cadillac Super Cruise.
01:26:22.000 Next generation system will address camera problem.
01:26:25.000 Oh, as long as the next generation addresses it, you fucking assholes.
01:26:29.000 The sun plays tricks on it?
01:26:31.000 So for the next-gen system, you're going to have to bring that Cadillac into the dealership and they're going to have to update the software.
01:26:36.000 Update it, yep.
01:26:37.000 Whereas Tesla would just handle that shit.
01:26:39.000 Over the air, yeah.
01:26:39.000 Yeah, I got an update the other day.
01:26:41.000 I was like, alright.
01:26:41.000 And the question was, so that's an exciting, powerful capability, but then, the Boeing case is the flip side, you know, it can significantly change the behavior of the system.
01:26:55.000 There could be a glitch.
01:26:56.000 There could be a glitch.
01:26:57.000 There could be a bug.
01:26:58.000 The Boeing one is terrifying.
01:27:00.000 Especially with a lot of, I mean, that number, whatever it is, it's like 300 combined, 300 plus people dead, maybe even 400. I mean, I don't even know how to think about that number.
01:27:15.000 Yeah, all from a software glitch.
01:27:17.000 The guy who coded it, or the girl who coded it, must feel fucking terrible.
01:27:22.000 Yeah, and you kind of...
01:27:25.000 Fuck, man.
01:27:26.000 It's a lot of burden, and it's one of the reasons it's one of the most exciting things to work on, actually, is the code we write has the capability to save human life, but the terrifying thing is it also has the capability to take human life.
01:27:43.000 And that's a weird place to be as an engineer, where directly a little piece of code, you know, I write thousands of them a day.
01:27:52.000 You know, basically notes you're taking could eventually lead to somebody dying.
01:27:58.000 Now, is there, I don't know anything about coding, but do you have like, is there a spell check for coding?
01:28:04.000 Yeah, so it's kind of called debugging.
01:28:07.000 It's trying to find bugs.
01:28:09.000 And it's a software that's doing this?
01:28:11.000 Yeah, software.
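There isn't a literal spell check for code, but the closest things are linters, type checkers, and tests that flag likely bugs before the code runs. A tiny illustrative example (the conversion function is made up for the sketch):

```python
# Illustrative only: type checkers and tests as the "spell check" of code.

def miles_to_km(miles: float) -> float:
    return miles * 1.60934

# A type checker such as mypy would flag this call before the code ever runs,
# because a string is being passed where a float is expected:
#     miles_to_km("9.2")

# A unit test catches logic errors the type checker can't see:
assert abs(miles_to_km(1.0) - 1.60934) < 1e-6
print("conversion check passed")
```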
01:28:11.000 So there's, depending on the programming language, and everybody should, if you haven't tried programming, you should try it.
01:28:19.000 It's cool.
01:28:20.000 It's the future.
01:28:21.000 You should learn to program.
01:28:22.000 Okay, that's my plug.
01:28:24.000 You're supposed to say learn to code.
01:28:25.000 You can get kicked off Twitter for that.
01:28:27.000 See how I avoided that?
01:28:29.000 Everyone's scared of it.
01:28:31.000 It's a problematic term.
01:28:32.000 I don't actually know why.
01:28:33.000 It's the dumbest fucking problematic code of all time.
01:28:35.000 Because someone ridiculously was suggesting that coal miners could maybe learn how to code computer code and get a different job.
01:28:46.000 They could be trained.
01:28:47.000 And so, the way people were looking at it, that was like a frivolous suggestion.
01:28:56.000 And that it was ridiculous to try to get someone who was 50 years old, who doesn't have any education in computers at all, to change their job from being a coal miner to learning how to code.
01:29:07.000 So they started saying it to politicians and people mocking it.
01:29:10.000 But then what Twitter alleged was that what was going on was it was being connected to white supremacy and anti-Semitism and a bunch of different things like people were saying learn to code and they were putting in a bunch of these other phrases in.
01:29:26.000 My suggestion would be, well, that's a different fucking thing.
01:29:29.000 Now you have a problem with Nazis and white supremacists, but the problem is with Nazis and white supremacists.
01:29:36.000 When someone is just saying learn to code, mocking this ridiculous...
01:29:42.000 Idea that you're going to teach, you know, that's a legitimate criticism of someone's perspective, that you're going to get a coal miner to learn how to fucking do computer coding.
01:29:51.000 It's crazy.
01:29:52.000 So people getting banned for that, rightly so, people were furious.
01:29:58.000 The way Google described it to me and Tim Pool when we were discussing it was that – Google, I mean, excuse me, Twitter.
01:30:05.000 The way Twitter described it was that essentially they were dealing with something where they were trying to censor things at scale.
01:30:11.000 There was so many people and there's so much going on that it's very difficult to get it right and that they've made mistakes.
01:30:18.000 I think that's one of the most fascinating applications of AI, actually, is filtering, trying to manage...
01:30:26.000 Computer learning.
01:30:27.000 So using machine learning to manage this huge conversation.
01:30:30.000 You're talking about 500...
01:30:32.000 I believe it's 500 million tweets a day, something like that.
01:30:36.000 Jamie makes at least three.
01:30:38.000 Three.
01:30:39.000 Only one.
01:30:41.000 I was going to say, with this conversation, I saw this recently.
01:30:44.000 I don't know who did the data on this, but there's a...
01:30:48.000 A statement someone put on Twitter that said, let me see if I can word it correctly, that 22% of adult Americans are on Twitter.
01:30:56.000 All right, so that's like a fact one.
01:30:59.000 Of that, 10% make up 80% of the tweets created by adult Americans.
01:31:07.000 That makes sense.
01:31:08.000 2% of the people on Twitter make up 80% of the tweets.
01:31:11.000 Yeah, that makes sense.
01:31:12.000 Yeah.
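Just to make the arithmetic behind that quoted statistic explicit, taking the figures at face value (10% of the 22% of adults who are on Twitter works out to roughly 2% of all adults):

```python
# Working through the quoted Twitter statistic.
adults_on_twitter = 0.22   # 22% of adult Americans are on Twitter
heavy_user_share = 0.10    # 10% of those users...
tweet_share = 0.80         # ...produce 80% of the tweets

heavy_users_as_share_of_adults = adults_on_twitter * heavy_user_share
print(f"{heavy_users_as_share_of_adults:.1%} of adult Americans "
      f"produce {tweet_share:.0%} of the tweets")
# -> 2.2% of adult Americans produce 80% of the tweets
```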
01:31:13.000 A lot of people are arguing.
01:31:15.000 Aggressively.
01:31:16.000 And the question of how to manage that, and you can't manage that by just manual...
01:31:22.000 ...review of each individual tweet.
01:31:25.000 Yeah, you'd have to have so many employees.
01:31:26.000 Yeah, that's, I think, more likely.
01:31:29.000 I don't think Jack is lying, nor is Vijaya.
01:31:33.000 But I do think that they have a clear bias against conservatives, and that's being shown.
01:31:38.000 Yeah.
01:31:38.000 So that's an interesting question.
01:31:39.000 I have your friend, my friend and mentor, Eric Weinstein, who talked to me.
01:31:44.000 I disagreed with him a little bit on this.
01:31:47.000 I think he basically believes there's a bias.
01:31:51.000 It boils down to the conversation that Jack is having at the top level inside Twitter.
01:31:57.000 What is that conversation like?
01:32:01.000 I tend to believe, again, this might be my naive nature, is that they don't have bias and they're trying to manage this huge flood of tweets and what they're trying to do is not to remove conservatives or liberals and so on.
01:32:23.000 They're trying to remove people that lead to others leaving the conversation.
01:32:32.000 So they want more people to be in the conversation.
01:32:35.000 I think that's true as well.
01:32:37.000 But I think they definitely are biased against conservative people.
01:32:40.000 There's an Alexandra...
01:32:44.000 AOC. Octavia...
01:32:47.000 How do you say it?
01:32:48.000 AOC is good.
01:32:50.000 Cortez is the last one.
01:32:51.000 Is it Octavia?
01:32:53.000 Ocasio, that's right.
01:32:54.000 Okay, I'm sorry.
01:32:55.000 Alexander AOC. Sorry.
01:32:58.000 I'm just, I'm thinking, I wasn't planning on talking about her.
01:33:00.000 But there was a parody account, and someone was running this parody account, which was very mild, just humorous parody account.
01:33:08.000 They were banned permanently for running it, and then their own account was banned as well.
01:33:12.000 Whereas...
01:33:14.000 There's some progressive people or liberal people that post all sorts of crazy shit, and they don't get banned at the same rate.
01:33:22.000 It's really clear that someone in the company, whether it's up for manual review, whether it's at the discretion of the people that are employees, when you're thinking about a company that's a Silicon Valley company, you are...
01:33:35.000 Without doubt, you're dealing with people that are leaning left.
01:33:40.000 There's so many that lean left in Silicon Valley.
01:33:44.000 The idea that that company was secretly run by Republicans is ridiculous.
01:33:48.000 They're almost all run by Democrats or progressive people.
01:33:52.000 So at the leadership level, there's a narrow-mindedness that permeates all of Silicon Valley, you're saying?
01:33:59.000 Well, the question is – I think there's a leaning left that permeates Silicon Valley.
01:34:04.000 I think that's undeniable.
01:34:05.000 I think it's undeniable.
01:34:07.000 I mean I think if you had a poll, the people that work in Silicon Valley, where their political leanings are, I think it would be – by far, left.
01:34:14.000 I think it would be the vast majority.
01:34:16.000 Does that mean that affects their decisions?
01:34:18.000 Well, what's the evidence?
01:34:20.000 Well, it kind of shows that it does.
01:34:22.000 They're not treating it with 100% clarity and across-the-board accuracy, or fairness, rather.
01:34:32.000 I think that...
01:34:33.000 There's absolutely people that work there that lean.
01:34:36.000 And there's been videos where they've captured people that were Twitter employees talking about it, talking about how you do that, how you find someone who's using Trump talk or saying sad at the end of things, and someone's talking, certain characteristics they look for.
01:34:53.000 There's been videos of, what is that, Project Veritas, where that guy and his employees got undercover footage of Twitter employees talking about that kind of stuff.
01:35:02.000 The question is how much power do those individuals have?
01:35:04.000 How many individuals are there like that?
01:35:07.000 Are those people exaggerating their ability and what they do at work?
01:35:12.000 Or are they talking about something that used to go on but doesn't go on anymore?
01:35:16.000 I don't know.
01:35:17.000 I don't work there.
01:35:18.000 I think it boils down to...
01:35:20.000 I'm one of those people that believes it boils down to the leadership.
01:35:24.000 To people at the tops, at the culture.
01:35:26.000 And the culture has to be...
01:35:27.000 It cannot be this kind of Silicon Valley, narrow-minded, sort of left-leaning thinking.
01:35:35.000 Even if you believe...
01:35:37.000 Even if you're a hardcore liberal, you cannot...
01:35:39.000 When you operate...
01:35:41.000 When you drive...
01:35:42.000 And manage a conversation in the entire world.
01:35:44.000 You have to think about middle America.
01:35:46.000 You have to think about, you have to have fundamental respect for human beings who voted for Trump.
01:35:50.000 It is a concerning thing for me to see just a narrow-mindedness in all forms.
01:35:56.000 One of the reasons I enjoy listening to this podcast is you're pretty open-minded.
01:36:01.000 That open-mindedness is essential for leaders of Facebook and Twitter, people who are managing conversations.
01:36:09.000 I think so, too.
01:36:10.000 I think it's...
01:36:11.000 The thought of being open-minded and acting in that ethic is probably one of the most important things that we could go forward with right now because things are getting so greasy.
01:36:25.000 It's so slippery on both sides.
01:36:27.000 And we're at this weird position that I don't recall ever in my life there being such a divide between the right and the left in this country.
01:36:36.000 It's more...
01:36:38.000 More vicious, more angry, more hateful.
01:36:42.000 It's different than at any other time in my life.
01:36:45.000 And I think a lot of our ideas are based on these narratives that may or may not even be accurate.
01:36:52.000 And then we support them and we reinforce them on either side.
01:36:56.000 We reinforce them on the left, we reinforce them on the right.
01:36:59.000 Where if you're looking at reality itself and you don't have these clear parameters and these clear ideologies.
01:37:07.000 I think most of us are way more in the middle than we think we are.
01:37:11.000 Most of us are.
01:37:12.000 We just don't want racists running the country.
01:37:13.000 We don't want socialists giving all our money away.
01:37:16.000 We don't want to pay too much in taxes to a shitty government.
01:37:19.000 We don't want schools getting underfunded.
01:37:23.000 And then we decide, what does my team...
01:37:27.000 The team, the shit that I like, is that this team?
01:37:30.000 Well, not everything, but they got a lot of things, so I'll go with them.
01:37:32.000 Maybe I'm not a religious nut, but I'm fiscally conservative, and I don't like the way Democrats like to spend money.
01:37:37.000 I'm going to go with the Republicans.
01:37:39.000 Maybe I'm more...
01:37:42.000 I'm more concerned with the state of the economy and the way we trade with the world than I am with certain social issues that the Democrats embrace.
01:37:50.000 So I'll lean that way, even though I do support gay rights, and I do support this, and I do support all these other progressive ideas.
01:37:56.000 There's way more of us in that boat.
01:37:59.000 There's way more of us that are in this middle of the whole thing.
01:38:02.000 For sure.
01:38:03.000 But it goes up and down.
01:38:05.000 So all of us, I believe, I hope I am open-minded most of the time, but you have different moods.
01:38:12.000 Oh, for sure, yeah.
01:38:13.000 And the question is, this is where the role of AI comes in.
01:38:16.000 Does the AI that recommends what tweets I should see, what Facebook messages I should see, is that encouraging the darker parts of me or the Steven Pinker better angels of our nature?
01:38:30.000 What stuff is it showing me?
01:38:32.000 Because if the AI trains purely on clicks, it may start to learn when I'm in a bad mood and point me to things that might be upsetting to me.
01:38:46.000 And so escalating that division and escalating this vile thing that can be solved most likely with people training a little more jiu-jitsu or something.
01:38:56.000 A little more exercise.
01:38:58.000 This Facebook algorithm that encourages people to be outraged because accidentally, not even on purpose, but this is what engages people.
01:39:07.000 This is what gets clicks.
01:39:08.000 So they find out, oh, well, he clicks on things when he finds out that people are anti-vaccination.
01:39:12.000 Or he clicks on things when he finds out, you know, fill in the blank with whatever the subject is.
01:39:18.000 And then you get, these motherfuckers, you know, this is the reason why measles are spreading.
01:39:23.000 And you start getting angry.
01:39:24.000 I mean, the anti-vax arguments on Facebook, I don't know if you ever dip into those waters for a few minutes and watch people fight back and forth in fury and anger.
01:39:35.000 It's another one of those things that becomes an extremely lucrative thing.
01:39:41.000 It's a subject for any social media empire.
01:39:45.000 If you're all about getting people to engage, and that's where the money is in advertising, getting people to click on the page, and the ads are on those pages, you get those clicks, you get that money.
01:39:54.000 If that's how the system is set up, and I'm not exactly sure how it is because I don't really use Facebook, but that's what it benefits.
01:40:00.000 I mean, that's what it gravitates towards.
01:40:02.000 It gravitates towards controversy.
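The mechanism being described here, a recommender that only ever optimizes for clicks, can be sketched in a few lines. The following is a minimal, illustrative sketch and not any platform's actual code; the names (Post, rank_feed, record_feedback) and the click-rate update rule are invented for illustration. The point is that nothing in the loop knows or cares whether the content is upsetting; it only learns what gets clicked.

```python
# Minimal sketch (not any platform's actual code) of click-driven feed ranking.
# The ranker only sees "did the user click?", so whatever maximizes clicks,
# including outrage-inducing posts, is exactly what it learns to surface.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str  # e.g. "anti-vax-argument", "cute-dogs"

# Learned estimate of P(click | user, topic), updated only from click feedback.
click_rates: dict[tuple[str, str], float] = {}

def score(user_id: str, post: Post) -> float:
    # Predicted engagement; unseen (user, topic) pairs start at a neutral 0.5.
    return click_rates.get((user_id, post.topic), 0.5)

def rank_feed(user_id: str, candidates: list[Post]) -> list[Post]:
    # Show the posts the model thinks the user is most likely to click,
    # regardless of whether they make the user angrier.
    return sorted(candidates, key=lambda p: score(user_id, p), reverse=True)

def record_feedback(user_id: str, post: Post, clicked: bool, lr: float = 0.1) -> None:
    # Simple online update toward the observed click signal.
    key = (user_id, post.topic)
    old = click_rates.get(key, 0.5)
    click_rates[key] = old + lr * ((1.0 if clicked else 0.0) - old)

if __name__ == "__main__":
    posts = [Post("1", "anti-vax-argument"), Post("2", "cute-dogs")]
    # If the user keeps clicking the outrage topic, the ranker keeps promoting it.
    for _ in range(20):
        feed = rank_feed("some_user", posts)
        record_feedback("some_user", feed[0], clicked=(feed[0].topic == "anti-vax-argument"))
    print(rank_feed("some_user", posts)[0].topic)  # -> "anti-vax-argument"
```

The point of the sketch is that the fix being argued for in the conversation is changing what the objective rewards, not making the model smarter.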
01:40:04.000 So, and when we think about concern for AI systems, we talk about sort of Terminator, I'm sure we'll touch on it, but I think of Twitter as a whole as one organism.
01:40:14.000 That is the thing that worries me the most, is the artificial intelligence that is very kind of dumb and simple, simple algorithms that are driving the behavior of millions of people.
01:40:25.000 And together, the kind of chaos that we can achieve...
01:40:30.000 I mean, that algorithm has incredible influence on all society.
01:40:33.000 Twitter, our current president is on Twitter.
01:40:37.000 All day.
01:40:38.000 Yeah, all day, all night.
01:40:41.000 I mean, it's scary to think about.
01:40:44.000 We talk about autonomous vehicles leading to one fatality, two fatalities.
01:40:49.000 It's scary to think about what a difference a small change in the Twitter algorithm could make. I mean, it could start wars.
01:40:57.000 It really could.
01:40:58.000 And that, if you think about the long term, if you think about as one AI organism, that is a super intelligent organism that we have no control over.
01:41:07.000 And I think it all boils down, honestly, to the leadership.
01:41:11.000 To Jack and other folks like him, making sure that he's open-minded, that he goes hunting, that he does some jiu-jitsu, that he eats some meat and sometimes goes vegan.
01:41:24.000 He just did a 10-day silent retreat.
01:41:29.000 Where you don't talk at all for 10 days.
01:41:31.000 He also eats once...
01:41:32.000 I follow a similar diet to him.
01:41:34.000 He eats once a day.
01:41:36.000 I've done that.
01:41:37.000 And fasts all through the weekend, which I don't.
01:41:39.000 I don't.
01:41:40.000 It's crazy.
01:41:40.000 I've never done that.
01:41:41.000 But I've done quite a few 24-hour, you know, where I eat.
01:41:45.000 At 7 p.m., I'm done eating.
01:41:47.000 I don't touch food until 7 p.m.
01:41:49.000 the next day.
01:41:49.000 It's just water or coffee.
01:41:50.000 Why do you do it, by the way?
01:41:52.000 I do it to shock my system.
01:41:54.000 I think it's good for your system.
01:41:56.000 You know, there's been a lot of research on fasting and the effect it has on telomeres.
01:42:02.000 Dr. Rhonda Patrick spoke pretty recently.
01:42:06.000 There's been quite a few things that she's written about in terms of fasting and the benefits of fasting.
01:42:11.000 Intermittent fasting is great for weight loss, but just fasting itself, even for several days.
01:42:17.000 Most people seem to get some pretty decent benefits out of it, so I dabble in it.
01:42:22.000 I also like the way it makes me feel.
01:42:25.000 To be a little hungry, I think my brain is sharper.
01:42:28.000 I refuse to go on stage full when I do stand-up.
01:42:32.000 I actually learned this from a Katt Williams interview.
01:42:34.000 He was talking about it.
01:42:37.000 He's crazy as fuck, but he's hilarious, and he's one of the greats, in my opinion.
01:42:40.000 He was in the back of a limo, and he was talking about...
01:43:13.000 Every morning when I get my morning workout in, and whatever the fuck it is, it's usually hard.
01:43:20.000 I'm always fasted.
01:43:21.000 You can do a lot.
01:43:23.000 It's not at your best.
01:43:24.000 Like, if I was going to do jiu-jitsu, I don't do jiu-jitsu fasted.
01:43:28.000 I would eat some fruit.
01:43:29.000 That's an interesting one, because that was a transformational thing for me.
01:43:32.000 I used to do powerlifting.
01:43:33.000 You'd eat, like, five times a day, six times a day, whatever.
01:43:36.000 More like C.T. Fletcher style.
01:43:39.000 See how big he was?
01:43:40.000 Yeah, back in the day.
01:43:41.000 Bro, he's only maybe like my height or a half inch taller or some shit.
01:43:44.000 He was 320 pounds.
01:43:47.000 Is that what he said?
01:43:48.000 315. 315?
01:43:49.000 Fuck!
01:43:50.000 He was so big.
01:43:51.000 Yeah.
01:43:52.000 You're saying he had trouble.
01:43:54.000 The thing is when you get that big, and I wasn't that big, but it's like hard to move.
01:43:58.000 Oh, yeah.
01:43:59.000 It's like not healthy.
01:44:00.000 Did you see the image of him from yesterday?
01:44:02.000 I didn't see the image.
01:44:04.000 When Jamie put up a photograph of him at 315 pounds next to him now,
01:44:10.000 at like two-ish, 200-ish.
01:44:13.000 It's incredible how big he was.
01:44:15.000 I mean, everything.
01:44:16.000 His arms were my legs.
01:44:18.000 And they were just coming out of his shoulders.
01:44:20.000 So that was a big moment for me.
01:44:23.000 There he is.
01:44:24.000 There's the pictures.
01:44:26.000 Look how big he was when he was a world champion.
01:44:29.000 I mean, just insanely huge.
01:44:32.000 Wow.
01:44:33.000 Yeah.
01:44:34.000 So when you started training in jiu-jitsu, look at that.
01:44:36.000 And the one on the right, dude, he's 50. Look at that.
01:44:40.000 All natural, too.
01:44:42.000 All natural at 50. Crazy.
01:44:45.000 Some fucking genetics, son.
01:44:47.000 That's some good genes.
01:44:49.000 Oh, yeah.
01:44:50.000 Obsessive, not just hard work.
01:44:52.000 I mean, you have to be a fucking maniac.
01:44:54.000 But the fact that his body holds up like that at 50 is incredible.
01:44:58.000 Yeah, he's an inspiration.
01:44:59.000 But for me, switching from that to jiu-jitsu, I thought, there's no way.
01:45:04.000 I trained hard.
01:45:04.000 I trained twice a day jiu-jitsu for a while.
01:45:07.000 And... Were you doing two rolls a day?
01:45:09.000 Were you doing, like, technique and drills at one time?
01:45:12.000 Listen, I'm Russian.
01:45:13.000 I love drilling.
01:45:14.000 You just go hard, huh?
01:45:14.000 I'm upset.
01:45:15.000 No, no, no, no.
01:45:15.000 Russian.
01:45:16.000 Drilling.
01:45:17.000 Let me explain to you something.
01:45:18.000 Technical.
01:45:18.000 Okay.
01:45:19.000 What do you want to explain to me?
01:45:19.000 I'm trying to explain to you the difference between Russian and American.
01:45:23.000 Okay.
01:45:23.000 America, in wrestling, in a lot of combat sports, is, like, heart and guts and hard work over...
01:45:32.000 And Russian, certainly in wrestling, is technique, is drilling.
01:45:36.000 Yeah.
01:45:36.000 They put a lot more hours than Americans do at less than 100% effort.
01:45:41.000 So like really drilling, really getting that right.
01:45:43.000 Like I love that.
01:45:44.000 In fact, one of the problems is I haven't ever really been able to find one. I was always the last one to get bored of drilling.
01:45:51.000 Oh, you've got to find a good drilling partner.
01:45:53.000 Like an obsessed one.
01:45:54.000 Yeah.
01:45:55.000 A shout out to Sarah Block, a judo lady with a black belt in jiu-jitsu as well that was willing to put up with like hundreds or thousands of throws.
01:46:07.000 That we each did.
01:46:09.000 So that obsessive mind, I love that kind of stuff.
01:46:13.000 That's where you get better.
01:46:14.000 Not everybody believes that.
01:46:17.000 Some people believe, especially Jiu-Jitsu, you can't really get timing from drilling.
01:46:22.000 I believe you can get everything from drilling.
01:46:24.000 The timing.
01:46:26.000 The other part, it's not just aimless drilling.
01:46:30.000 It's your mind is in it.
01:46:31.000 Your brain should be exhausted by the end of it too because you're visualizing the whole thing.
01:46:36.000 You're like going through it, you're imagining how your opponent would react. It's really strengthening your imagination while you're also doing the drilling.
01:46:44.000 I couldn't agree more.
01:46:45.000 Yeah, I firmly believe you can get better, way better drilling.
01:46:48.000 And when I went from, I think, blue belt to purple, I did like the most drilling that I ever did, ever.
01:46:55.000 And that's when I grew the most.
01:46:57.000 That's when my technique got way better.
01:47:00.000 That was also when I became friends with Eddie Bravo.
01:47:03.000 And Eddie Bravo is a huge driller.
01:47:05.000 Huge.
01:47:06.000 Oh, he drills, man.
01:47:07.000 They drill like crazy, and they do a lot of live drills, and they do a lot of pathway drills, where they'll do a whole series of movements, and then the escape, and then the reversal.
01:47:18.000 These are long pathways, so that when you're actually in a scrap and you're rolling, you recognize it.
01:47:25.000 Like, okay, here it is.
01:47:26.000 I'm passing the guard.
01:47:27.000 I'm moving to here, and now he's countering me, but I'm setting up this.
01:47:31.000 And these pathway drills, it's so critical because it comes up over and over and over again when you're actually live rolling.
01:47:38.000 You know, you feel it.
01:47:40.000 You feel like, oh, I've been here before.
01:47:41.000 I know what this is.
01:47:42.000 I'd be curious actually to hear...
01:47:44.000 I don't think I've ever heard you talk about how your game...
01:47:48.000 Because my game changed significantly from white belt to blue belt to purple belt.
01:47:52.000 It started to solidify.
01:47:54.000 But I'd be curious to hear, like, how did your game change?
01:47:58.000 Since you met Eddie?
01:48:00.000 Game meaning jiu-jitsu.
01:48:01.000 Well, most of my game came from Eddie.
01:48:04.000 Like 99 point something percent of it.
01:48:06.000 Almost all of it.
01:48:07.000 And Jean Jacques.
01:48:08.000 Those two.
01:48:09.000 So it's like, I was a blue belt before I was friends with Eddie, but I was terrible.
01:48:12.000 Like what guard do you prefer, for example?
01:48:14.000 Well, I do rubber guard.
01:48:16.000 I'm very flexible.
01:48:17.000 So rubber guard is no issue with me.
01:48:19.000 And I think it's incredibly effective.
01:48:21.000 I think if you're good at it, and you get stuck under a guy like...
01:48:27.000 What is his name?
01:48:28.000 Jeremiah Vance is one of Eddie's black belts who's a murderer from his back.
01:48:35.000 His rubber guard is insane.
01:48:37.000 It's insane.
01:48:38.000 Eddie's rubber guard is insane.
01:48:39.000 I mean, obviously, he tapped Royler Gracie with it. It's a ridiculous guard.
01:48:43.000 He caught him in a triangle.
01:48:45.000 But there's a lot of people that understand it now, a lot of people that know how to do it.
01:48:49.000 It's a real art form.
01:48:52.000 And the thing about it versus other guards is when you're in a position like mission control...
01:48:58.000 You know, Vinny Magalhães is phenomenal at it.
01:49:01.000 I mean, he...
01:49:02.000 What's that?
01:49:03.000 I just pulled up a video of him.
01:49:04.000 He fucked this guy up quick.
01:49:06.000 Yeah, watch it.
01:49:06.000 This is Jeremiah.
01:49:07.000 Jeremiah Vance is one of Eddie's best, look at this, from the bottom, bam, he does that all the time.
01:49:13.000 Triangle from the bottom, off rubber guard.
01:49:15.000 That guy's wrapped up.
01:49:16.000 That's out cold.
01:49:17.000 He does this all the time.
01:49:19.000 He's one of Eddie's best rubber guard assassins.
01:49:21.000 And if you watch his technique, it is fucking sensational.
01:49:25.000 He also has great leg locks, too.
01:49:26.000 But the thing is that...
01:49:28.000 You know, when he'll attack from his legs, and he'll tap people with a leg lock, but if they escape, sometimes they'll escape in a...
01:49:34.000 Oh, this dude's in deep shit right here.
01:49:36.000 But now he's going to take his back.
01:49:38.000 But if they escape, oftentimes he's on the bottom, and when you're on top of him, it's one of the worst places in the world to be.
01:49:45.000 His guard is fucking incredible.
01:49:47.000 And it's because of that.
01:49:48.000 See that grip?
01:49:49.000 See how he's holding the rubber guard in position?
01:49:51.000 That's called mission control.
01:49:53.000 Mission control.
01:49:53.000 Mission control from a guy like Jeremiah is fucking ruthless.
01:49:57.000 Because he has his arm and his legs that's controlling your neck and your posture.
01:50:02.000 And then he's going to a go-go here.
01:50:04.000 And he's phenomenal at this, too.
01:50:07.000 He's going to get him in a gogoplata or an omoplata.
01:50:10.000 He's going to flip him over and now he's attacking the leg.
01:50:13.000 It's just constant.
01:50:14.000 It never ends.
01:50:15.000 Did Eddie invent this kind of system?
01:50:17.000 Well, he invented the initial stage of setting up mission control.
01:50:23.000 This guy is getting fucked up.
01:50:25.000 Oh my god.
01:50:26.000 That's horrible.
01:50:27.000 Eddie invented a series of pathways from mission control to set up various techniques, arm bars, triangles, all these different things.
01:50:35.000 But there had been people that had toyed with doing high guard, like Nino Schembri.
01:50:40.000 He did a lot of rubber guard-esque stuff.
01:50:44.000 There was a lot of things that people did, but Eddie has his own pathway and his own system.
01:50:49.000 And then there's a lot of guys that branch off from that system, like Jeremiah.
01:50:53.000 Like Vinny Magalhães, that have their own way that they prefer to set various techniques up.
01:50:59.000 But what's really good about that, if you have the flexibility, is that when you're on the bottom, not only is it not a bad place to be, but you could put someone in some real trouble.
01:51:10.000 When you have your ability, you're holding onto your ankle and using your leg, which is the strongest fucking limb in your body, right?
01:51:18.000 Pulling down on someone with your leg, clamping down with your arm, and then you get your other leg involved.
01:51:23.000 Good luck getting out of that.
01:51:25.000 Good luck.
01:51:26.000 It fucking sucks, man.
01:51:27.000 So you have control, but you're also able to move at the same time.
01:51:30.000 Yes, exactly.
01:51:30.000 Has anybody ever put you in mission control before?
01:51:32.000 No, I haven't competed or against many.
01:51:36.000 But even in like someone in class, like show it to you, explain it to you?
01:51:39.000 Yeah, lower ranks have.
01:51:40.000 Once you feel it, you go, oh shit.
01:51:43.000 I remember it being, you know when somebody does a nice move on you, especially like a lower rank, your first reaction is like, oh, this would never, like you're annoyed.
01:51:53.000 Yeah.
01:51:54.000 Yes.
01:51:54.000 It's the natural process of the ego.
01:51:57.000 Of course.
01:51:57.000 Getting rid of, you know, you see something new and you're like, this is stupid.
01:52:01.000 Next time it won't work.
01:52:03.000 But then you start to understand a little more.
01:52:04.000 I remember it being a really powerful controlling position.
01:52:08.000 It's powerful.
01:52:08.000 And if you have a good offensive attack from there, it's powerful as well.
01:52:13.000 There are transitions.
01:52:15.000 Especially a guy like Jeremiah who's really flexible.
01:52:17.000 You know, he can pull off gogoplatas and all sorts of other things.
01:52:22.000 Yeah.
01:52:23.000 The locoplata, that's another one that they do, where you push with your other foot on the heel.
01:52:29.000 It's so nasty.
01:52:31.000 You're holding the back of the foot across the back of the neck, and so your shin is underneath someone's throat, and then you're pushing that shin with your other heel while you're squeezing with your arm.
01:52:41.000 It's ruthless.
01:52:42.000 It's ruthless.
01:52:44.000 And they do a gable grip around the head when they do this as well sometimes, too, so it's just a fucking awful place to be.
01:52:50.000 It's not as good as being on top, right?
01:52:52.000 If you have a crushing top game, that's the best, if you can get to that position.
01:52:56.000 But you can't always get to that position.
01:52:57.000 So there's guys like Jeremiah that even from the bottom, they're horrific.
01:53:01.000 It's dangerous.
01:53:02.000 As dangerous as from the top for most people.
01:53:05.000 Do you find just when you trained back in the day and you still train, do you spend more time on bottom or top?
01:53:13.000 I feel like you should always start on the bottom.
01:53:16.000 Earn the top position.
01:53:17.000 This is something Eddie always brought up too.
01:53:22.000 It's fun to be on top.
01:53:23.000 So a lot of times it's like this mad scramble to see who could force who onto their back.
01:53:27.000 Because when you're on top, you can control them, you can pressure them.
01:53:31.000 You know, you play that strong man's jiu-jitsu, but the problem with strong man's jiu-jitsu is, I'm only 200 pounds.
01:53:36.000 I'm not a big guy.
01:53:37.000 Like, so, if you go to the real big guy, like I'm rolling with a 240-pound guy, I'm not going to get to that spot.
01:53:42.000 Like, I better have a good guard, otherwise I can't do anything, right?
01:53:46.000 When someone's bigger than you and stronger than you, I mean, that's what Royce Gracie basically proved to the world.
01:53:52.000 Like, as long as you have technique, it doesn't matter where you are.
01:53:55.000 But if you only have top game, which a lot of people do, a lot of people only have top game, you know, you're kind of fucked if you wind up on your back.
01:54:04.000 We see that a lot with wrestlers in MMA. As wrestlers, they can get on top of you and they'll fuck you up.
01:54:09.000 They'll strangle you, they'll take you back, they'll beat you up from the mount, but they don't have nearly the same game when they're on their back.
01:54:17.000 And then there's guys like Luke Rockhold, who's like an expert at keeping you on your back.
01:54:21.000 He's one of those guys, when he gets on top of you, you're fucked.
01:54:24.000 He's got a horrible top game.
01:54:27.000 I mean horrible in the sense of if you're his opponent.
01:54:30.000 He's going to beat the fuck out of you before he strangles you.
01:54:32.000 His top game is insane.
01:54:34.000 Yeah, I hate the feeling.
01:54:37.000 Some people make you just feel the weight.
01:54:39.000 Make you suffer for everything you do on bottom.
01:54:44.000 People that are able to do that are truly humbling.
01:54:47.000 Yeah, wrestlers in particular.
01:54:49.000 Wrestlers are so good.
01:54:51.000 Did you see that Jordan Burroughs-Ben Askren match?
01:54:53.000 Last night.
01:54:54.000 Fucking incredible.
01:54:55.000 How good is that guy?
01:54:56.000 Jordan Burroughs.
01:54:57.000 Phew!
01:54:58.000 Yes.
01:54:59.000 To do that to a guy like Ben Askren?
01:55:02.000 I mean, it just shows you.
01:55:03.000 Ben hasn't competed, I think, in nine years.
01:55:06.000 True.
01:55:06.000 But Ben is one of the greatest.
01:55:08.000 I mean, I'm a huge fan of his wrestling.
01:55:11.000 It's so interesting.
01:55:12.000 I think that is like the worst matchup for Ben Askren.
01:55:15.000 I think...
01:55:17.000 Because you're taking one of the most creative wrestlers ever, Ben Askren.
01:55:23.000 I don't want to overstate it, but he is incredibly creative.
01:55:27.000 One of the great pinning wrestlers.
01:55:29.000 So he pins people.
01:55:30.000 He confuses them and pins them incredibly well.
01:55:34.000 And you put him against basically a freak blast double.
01:55:39.000 Like the greatest double leg takedown.
01:55:42.000 Maybe of all time.
01:55:43.000 Of all time.
01:55:44.000 Somebody put a clip up that said, is this it?
01:55:47.000 Yeah.
01:55:47.000 Somebody put a clip up, oh shit, he went off the fucking mat into the crowd.
01:55:51.000 That's pretty far.
01:55:51.000 That was the best part.
01:55:53.000 He defended a takedown.
01:55:54.000 That was the best part.
01:55:55.000 But that's crazy, man, that they have such a drop-off with these guys.
01:55:59.000 Like, you shouldn't really have a platform like that where a guy can fall off into the crowd.
01:56:04.000 That seems so stupid.
01:56:06.000 It rarely happens.
01:56:07.000 What the fuck are you talking about?
01:56:08.000 It just happened.
01:56:10.000 Rarely happens.
01:56:11.000 They rarely have these.
01:56:12.000 That's true.
01:56:13.000 This just happened.
01:56:14.000 That's a terrible thing.
01:56:15.000 Have that shit flat on the ground.
01:56:17.000 That is so dumb.
01:56:18.000 I can't even believe they did that.
01:56:19.000 I think this whole match should be contested.
01:56:21.000 It doesn't count.
01:56:22.000 Well, I don't...
01:56:23.000 I don't...
01:56:24.000 You know...
01:56:26.000 I think, look, that's stupid.
01:56:28.000 That's not smart.
01:56:30.000 To have a guy who's a fucking powerhouse of a blast double hitting you and sending you flying into the...
01:56:38.000 That's crazy!
01:56:39.000 That is crazy that they didn't have anything in place to stop that.
01:56:43.000 That's the reason why wrestling takes place on the ground, you fucking assholes.
01:56:47.000 Why are you having people wrestle on a platform?
01:56:49.000 That's crazy.
01:56:50.000 It's a show.
01:56:52.000 You can have a show where it's on the ground.
01:56:54.000 It's called basketball.
01:56:56.000 Yeah, it's on the ground.
01:56:57.000 I mean, it was worrying because Ben Askren is an MMA fighter and you get injured with that kind of stuff.
01:57:01.000 Fuck, right there!
01:57:02.000 Right there!
01:57:03.000 It could have torn his knee apart easily.
01:57:05.000 Well, the silver lining is that he's okay.
01:57:08.000 Yeah, the silver lining.
01:57:09.000 And we got to see that.
01:57:11.000 It was interesting.
01:57:11.000 Jordan Burroughs had on his Instagram, there's levels to this.
01:57:14.000 They were raising his hand up, and it's like, that's what we got to see.
01:57:17.000 Because Ben is a phenomenal wrestler.
01:57:19.000 But you're right.
01:57:20.000 He hasn't competed in a long time.
01:57:22.000 He's not necessarily at the level that he was back then, even though he's incredible for MMA standards.
01:57:27.000 Yeah.
01:57:28.000 It's good to see.
01:57:29.000 It's good to see that with boxing.
01:57:31.000 It's good to see that with anything.
01:57:32.000 When Floyd Mayweather fought Conor, I think it was good to see that.
01:57:35.000 There are really levels to this.
01:57:37.000 And the interesting thing about Jordan Burroughs, I think he's so good that he's probably going to stay out of MMA. That's so crazy.
01:57:45.000 But there are wrestlers...
01:57:46.000 Here's some clips of it.
01:57:47.000 I'm not going to show this on YouTube.
01:57:49.000 Yeah, we can't show it to you people, but...
01:57:53.000 Who put this on?
01:57:54.000 Flow Wrestling.
01:57:55.000 Flow Wrestling put this on.
01:57:56.000 I wonder if people are pirating it online or if they put it online, if they're allowing it.
01:58:00.000 No, they...
01:58:01.000 Well...
01:58:01.000 People are pirating it?
01:58:03.000 Yeah.
01:58:03.000 Yeah.
01:58:04.000 Okay.
01:58:04.000 Good luck.
01:58:05.000 Yeah.
01:58:06.000 Good luck stopping that, right?
01:58:08.000 Well, I think people should support Flow Wrestling, though.
01:58:11.000 They do have like a...
01:58:12.000 I'm a member.
01:58:13.000 Are you?
01:58:14.000 Oh, look at this.
01:58:14.000 Look at this.
01:58:15.000 God, he's good.
01:58:17.000 Yeah, man.
01:58:18.000 So we're watching this, ladies and gentlemen who are just listening.
01:58:21.000 It's probably boring as fuck for you.
01:58:22.000 But Jordan Burroughs is one of the best wrestlers, really, America's ever produced.
01:58:29.000 Olympic champion.
01:58:29.000 Three-time world champion.
01:58:30.000 Yeah.
01:58:31.000 Tragically lost in the previous Olympics.
01:58:34.000 And he's back at it again.
01:58:36.000 I wonder if he's ever considered MMA. I know there was some talk about it, but I wonder if he ever really...
01:58:43.000 I think at this point, he is basically a no, but there are a few terrifying people, especially on the Russian side, that I think the heavyweight division and the UFC should be really worried about.
01:58:58.000 I don't know if you heard about the Russian tank, the 22-year-old from Dagestan.
01:59:03.000 No, who's this guy?
01:59:05.000 He's a wrestler?
01:59:06.000 A wrestler.
01:59:06.000 He's going to fight MMA? No, he will after 2020 is what his expectation is.
01:59:14.000 For now, he's probably going to be the greatest wrestler of all time.
01:59:17.000 Really?
01:59:18.000 Him against Kyle Snyder.
01:59:20.000 Those two heavyweights.
01:59:21.000 Kyle Snyder's American.
01:59:23.000 Another guy...
01:59:23.000 Is this it right here?
01:59:24.000 The Tank of Dagestan.
01:59:26.000 How do you say his name?
01:59:28.000 It says Abdulrashid Sadulaev.
01:59:32.000 Abdulrashid Sadulaev.
01:59:33.000 22 years old.
01:59:34.000 Abdulrashid Sadulaev.
01:59:35.000 And Kyle Snyder.
01:59:36.000 You can do Kyle Snyder versus...
01:59:38.000 What a great name.
01:59:39.000 Abdulrashid Sadulaev.
01:59:41.000 Sadulaev.
01:59:42.000 That is Russian as fuck.
01:59:44.000 So Snyder is 23 years old, and he's another incredible person who will do MMA. And that competition between Snyder...
01:59:51.000 I mean, look at the thickness.
01:59:58.000 These guys are monsters, and they're not just...
02:00:00.000 How much do these guys weigh?
02:00:04.000 97 kilograms?
02:00:05.000 What is that?
02:00:06.000 220?
02:00:07.000 Yeah, 220, under 215, but they cut for it, right?
02:00:11.000 This is just under heavyweight.
02:00:13.000 So do you think they would compete at 205 if they were going to fight in MMA? These are heavyweights.
02:00:18.000 So you have to remember, these are still boys.
02:00:22.000 Oh.
02:00:23.000 22, right?
02:00:24.000 Right.
02:00:24.000 They still haven't gotten the full, like...
02:00:27.000 Yeah, I wonder that about UFC fighters that are thickening up as they get older.
02:00:32.000 I wonder how many of them are damaging their body by cutting weight.
02:00:35.000 Yeah.
02:00:36.000 That's a thick fella.
02:00:38.000 So, right now we're just seeing mostly stalemate, and that's from the American guy.
02:00:44.000 Is there a highlight reel of his or something that we can see?
02:00:46.000 Yeah, there is, but he's pretty young.
02:00:48.000 I think he's an Olympic champion.
02:00:51.000 He comes from the whole line of the Saitiev brothers and all the Dagestani wrestlers.
02:00:57.000 There are so many good fighters that are coming out of Dagestan right now.
02:01:00.000 And all technicians.
02:01:03.000 It's incredible.
02:01:04.000 It's incredible.
02:01:05.000 Whatever's in the water there.
02:01:06.000 And then different styles too, like Zabit.
02:01:07.000 Like Zabit style, very, very different than a wrestling heavy style.
02:01:12.000 Look at this guy, man.
02:01:13.000 Jesus Christ.
02:01:14.000 Oh my God.
02:01:16.000 What a scramble.
02:01:17.000 So this is Abdul Rashid.
02:01:20.000 Rashid?
02:01:21.000 Abdul Rashid?
02:01:22.000 Call him Sadalaev.
02:01:23.000 No, don't tell me how to say it.
02:01:24.000 I'll figure it out.
02:01:25.000 I don't know.
02:01:25.000 Abdul Rashid.
02:01:26.000 Abdul Rashid.
02:01:28.000 Sadalaev.
02:01:29.000 Sadalaev.
02:01:30.000 You know what?
02:01:31.000 There's a poetic nature to these guys.
02:01:35.000 I mean, they're just like Khabib.
02:01:38.000 I mean, they're simple, good people.
02:01:41.000 They're pretty religious.
02:01:43.000 They don't even believe in fame.
02:01:46.000 They just believe in...
02:01:49.000 Well, you know, that was sort of evident, and the mindset behind them was sort of evident at the end of that fight with Conor, where they went crazy and he jumped into the crowd.
02:02:01.000 It's like, he's not playing games.
02:02:02.000 He's not doing this for Instagram likes or for, you know, this is really, he takes trash talking and all that stuff very seriously.
02:02:11.000 This is all about honor for him.
02:02:13.000 I think that was kind of upsetting because...
02:02:16.000 True, but...
02:02:17.000 But don't do that.
02:02:19.000 Yeah.
02:02:19.000 Don't do that.
02:02:20.000 And also respect...
02:02:22.000 I'd hate to say it, but I think there's a certain ethic and honor to the way Conor McGregor carries himself, too.
02:02:29.000 All that trash talk, if you look at the end of the fights...
02:02:33.000 He's very kind.
02:02:34.000 He's very kind and respectful in defeat and win.
02:02:37.000 It's a different culture.
02:02:38.000 You compare the Dagestani versus Irish culture, it's just a different culture, and you have to respect that.
02:02:44.000 I think Khabib, to be honest, disrespected Conor's culture as much as Conor disrespected Khabib's.
02:02:50.000 I get what you're saying.
02:02:52.000 But, I mean, when he was done with the fight, he didn't keep attacking Conor.
02:02:56.000 It was people in the audience who were talking shit that were training partners.
02:03:00.000 Emotion.
02:03:00.000 Yeah.
02:03:01.000 And he had heard that for weeks.
02:03:03.000 And he was done.
02:03:05.000 For months.
02:03:05.000 He was done.
02:03:06.000 He was like, fuck you.
02:03:07.000 I beat his ass and I'm going to beat your ass.
02:03:09.000 And he just said, I'm not playing games.
02:03:10.000 And he jumped into the fucking crowd.
02:03:12.000 I think security could have been handled far better and will be in the future to prevent things like that from happening where people just jumped into the cage.
02:03:20.000 But I hate seeing that shit.
02:03:43.000 I got it.
02:03:45.000 I would have loved to see Conor McGregor vs.
02:03:47.000 Khabib before the Mayweather fight.
02:03:51.000 Before Conor got...
02:03:54.000 I think the money makes you less hungry.
02:03:57.000 Oh, for sure.
02:03:58.000 Dude, he ain't hungry at all.
02:04:00.000 I mean, he's got $100 million.
02:04:02.000 But I think he still loves to compete.
02:04:04.000 But there's no hunger anymore.
02:04:06.000 Like, there ain't no hunger.
02:04:07.000 I mean, he might be hungry for success, but there's no desperation.
02:04:11.000 I don't know if that's...
02:04:13.000 I know what you're saying.
02:04:14.000 Like, he has a lot to lose now, too.
02:04:15.000 It's a different thing.
02:04:16.000 He enters into a fight with $100 million in the bank.
02:04:18.000 It's a very different experience than entering into the fight with $1 million and hoping that you could make three more tonight.
02:04:26.000 Like many, I'm sure, fights that he's had in the past.
02:04:29.000 It's a different world.
02:04:31.000 He can do whatever he wants forever.
02:04:33.000 I mean, once a fighter, though, always a fighter.
02:04:35.000 Yeah.
02:04:35.000 I mean, there is an element there that he still wants glory.
02:04:39.000 I believe...
02:04:40.000 He's still only 30. Yeah.
02:04:41.000 Right?
02:04:41.000 He can still do it, yeah.
02:04:43.000 I mean, he's...
02:04:43.000 I think.
02:04:44.000 How old's Connor?
02:04:46.000 At the most, he's like 32 or some shit.
02:04:50.000 30. Yeah, he's young, man.
02:04:52.000 To be set for the rest of your life at 30 is kind of fucking bananas.
02:04:57.000 And I don't think he's at his peak as a fighter.
02:05:00.000 So if he just decides who gives a fuck about the money, I'm here to leave a legacy.
02:05:04.000 And I'm gonna just train like a fucking demon.
02:05:07.000 And he kicks aside all of the bad influences and all the distractions in his life and just focuses on training.
02:05:15.000 I mean, pfft.
02:05:15.000 He's a motherfucker, man.
02:05:17.000 I mean, you saw what he did to Aldo.
02:05:18.000 Saw what he did to Chad Mendes.
02:05:20.000 Saw what he did to Dustin Poirier.
02:05:22.000 I mean, he is a bad motherfucker.
02:05:24.000 Period.
02:05:25.000 I know you're gonna shut this down, as most fans do, but I... If he drops everything and goes to, like, Siberia to train, I would love to see him and Khabib, too.
02:05:36.000 Well, there's nothing...
02:05:37.000 That's my friend Hans Molenkamp and Conor sparring.
02:05:40.000 Just fucking around.
02:05:41.000 Powerful Onnit logo in the background.
02:05:42.000 It's like a goddamn Onnit ad.
02:05:44.000 Um...
02:05:46.000 Yeah, I mean, he's always going to have a problem with Khabib.
02:05:49.000 Khabib's wrestling is so high level.
02:05:53.000 It's so different.
02:05:54.000 He smothers you in a way that you think you have good takedown defense until you run into that motherfucker.
02:06:01.000 And he just gets a hold of everyone.
02:06:03.000 He does it to everyone.
02:06:04.000 Whether you're Michael Johnson or Edson Barboza, no matter how good your takedown defense looked in the past.
02:06:10.000 In the Barboza fight, he basically just waded towards him.
02:06:14.000 Waded through the fury of leg kicks and punches and just clamp, drag, smash.
02:06:21.000 And that's what he does to everybody, man.
02:06:23.000 The real thing about a guy like him would be seeing a guy like him against a guy like Jordan Burroughs.
02:06:29.000 Could he do that to a guy who is a spectacular wrestler as well?
02:06:35.000 Then it becomes his striking, which has gotten very high level.
02:06:39.000 He's very dangerous striking, so he dropped Conor.
02:06:42.000 He can fuck people up.
02:06:44.000 He stopped a few people's strikes.
02:06:49.000 He's dangerous enough on the feet that you would have to...
02:06:54.000 I don't know how much...
02:06:56.000 How many really high-level grapplers also have striking that can stand with him?
02:07:00.000 Because if he decided to keep it up, he'd have an advantage there until they got good at it.
02:07:05.000 Him versus Ben Askren would be very interesting.
02:07:07.000 Well, he would have an advantage in striking over Askren.
02:07:10.000 In wrestling, I don't know.
02:07:12.000 No.
02:07:13.000 Askren's a big fella, too.
02:07:15.000 Are they the same weight?
02:07:16.000 No.
02:07:17.000 Oh.
02:07:17.000 He's 155, Askren's 170. Okay.
02:07:20.000 But Askren could probably make 155 if you tortured him.
02:07:24.000 He's got a dad bod, though.
02:07:27.000 How rude.
02:07:29.000 No, he's proud dad bod.
02:07:32.000 He's proud of his body.
02:07:33.000 I think he was that way in college, too.
02:07:35.000 He was never...
02:07:36.000 He was never like Brock Lesnar.
02:07:38.000 No.
02:07:38.000 Super technical.
02:07:40.000 And he's strong as hell, though, according to everybody.
02:07:43.000 Everybody that rolls with him says he's fucking ridiculously strong.
02:07:47.000 You sometimes say artificial life instead of artificial intelligence.
02:07:51.000 Yeah, because I think that it's a life form.
02:07:53.000 It's a stupid way to look at it.
02:07:55.000 I was curious to hear how you think about artificial intelligence.
02:07:58.000 What do you picture?
02:07:59.000 I picture human beings being like electronic caterpillars that are building a cocoon that they have no real knowledge of or understanding.
02:08:08.000 And through this, a new life form is going to emerge.
02:08:11.000 A life form that doesn't need cells and mating with X and Y chromosomes.
02:08:18.000 It doesn't need any of that shit.
02:08:20.000 It exists purely in software and in hardware.
02:08:25.000 And in ones and zeros, and this is a new form of life, and this is the inevitable rise of a sentient being.
02:08:34.000 I mean, I think if we don't get hit with an asteroid within a thousand years, or whatever the time frame is, someone is going to figure out how to make a thing that just walks around and does whatever it wants and lives like a person, and that's not outside the realm of possibility.
02:08:52.000 And I think that if that does happen, that's artificial life.
02:08:56.000 And this is the new life.
02:08:58.000 And it's probably going to be better than what we are.
02:09:00.000 I mean, what we are is basically, if you go back and look at, you know, 300,000, 400,000 years ago, when we were some Australopithecus-type creature, how many of them would ever look at the future and go, I hope I never get a Tesla.
02:09:17.000 The last thing I want is a fucking phone.
02:09:18.000 The last thing I want is air conditioning and television.
02:09:21.000 The last thing I want is to be able to talk in a language that other people can understand and to be able to call people on the phone.
02:09:27.000 Fuck all that, man.
02:09:28.000 I like living out here running from Jaguars and shit and constantly getting jacked by bears.
02:09:34.000 I wouldn't think that way.
02:09:35.000 And I think if something comes out of us and makes us obsolete, but it's missing all the things that suck about people...
02:09:45.000 I mean, it won't be good.
02:09:47.000 It won't be good in our...
02:09:49.000 What things suck about people?
02:09:51.000 Hate, war, violence, thievery, people stealing things from people, people robbing people.
02:09:59.000 Here's the thing.
02:09:59.000 Those dark parts of human nature, I think, are suffering, injustice.
02:10:06.000 I think all of that is necessary for us to discover the better angels.
02:10:11.000 I don't think you can...
02:10:13.000 Let's talk about sentience and creating artificial life, but I think even those life forms, even those systems need to have the darker parts.
02:10:25.000 But why is that?
02:10:26.000 Is that because of our own biological limitations and the fact that we exist in this world?
02:10:30.000 This world of animals, where animals are eating other animals and running.
02:10:35.000 There's always...
02:10:35.000 You always have to prepare for evil.
02:10:37.000 You have to prepare for intruders.
02:10:39.000 You have to prepare for, you know, predators.
02:10:41.000 And this is essentially like this mechanism is there to ensure that things don't get sloppy.
02:10:46.000 Things continue to...
02:10:48.000 Look, if the jaguars keep eating the people and the people don't figure out how to make a fucking house, they get eaten.
02:10:52.000 And that's it.
02:10:53.000 Or you figure out the house and then you make weapons.
02:10:56.000 You fight off the fucking jaguar.
02:10:57.000 Okay, great.
02:10:58.000 You made it.
02:10:59.000 You're in a city now.
02:11:00.000 See?
02:11:00.000 You had to have that jaguar there in order to inspire you to make enough safety so that your kids can grow old enough that they can get information from all the people that did survive as well and they can accumulate all that information and create air conditioning and automobiles and guns and keep those fucking jaguars from eating your kids.
02:11:18.000 This is what had to take place as a biological entity.
02:11:22.000 But once you surpass that, once you become this thing that doesn't need emotion, doesn't need conflict, it doesn't need to be inspired, it never gets lazy.
02:11:32.000 It doesn't have these things that we have built into us as a biological system.
02:11:36.000 If you looked at us as wetware operating software...
02:11:41.000 It's not good software, right?
02:11:44.000 It's software designed for cave people.
02:11:46.000 And we're just trying to force it into cars and force it into cubicles.
02:11:52.000 But part of the problem with people and their unhappiness is that all of these human reward systems that have been set up through evolution and natural selection, these instincts to stay alive, are no longer relevant in today's society.
02:12:07.000 So they become road rage, they become extracurricular violence, they become depression, they become all these different things that people suffer from.
02:12:17.000 So that's one perspective.
02:12:18.000 Yes.
02:12:19.000 That basically our software through this evolutionary process was necessary to arrive at where we are but it's outdated at this point.
02:12:25.000 Well, it's necessary for us to succeed.
02:12:27.000 To succeed in a purely, almost Darwinian way, in the sense of surviving evolution.
02:12:32.000 Especially since we're so weak.
02:12:33.000 I mean, it's really, we became this weak because we got so good at protecting ourselves from all the bad things.
02:12:41.000 Okay, the other perspective is that we're actually incredibly strong, and this is the best that the universe can create, actually.
02:12:49.000 We're at the height.
02:12:50.000 We're at the height of creation.
02:12:52.000 There's a beauty in this tension, in this dance between good and evil, between happiness and depression, life and death.
02:13:01.000 And through that struggle, that's not just a useful tool to get us from jaguars to cities, but that is the beautiful thing.
02:13:10.000 That is what the universe was built for.
02:13:13.000 That is the height.
02:13:15.000 Our current evolution and the creation that results from it is the height of creation.
02:13:24.000 And the way things operate...
02:13:28.000 It's not something that's far from optimal.
02:13:32.000 It's not something that sucks, but it is very good, very optimal and hard to beat.
02:13:40.000 In the sense that, for example, mortality.
02:13:46.000 Is death important for creation?
02:13:50.000 Is death important for us human beings?
02:13:52.000 For life?
02:13:54.000 For us as a society?
02:13:55.000 Is it important for us to die?
02:13:56.000 Like, if you could live forever, would you live forever?
02:13:59.000 I think you'd miss out on the possibility that there is something.
02:14:03.000 I had this conversation with C.T. Fletcher yesterday, because he survived a heart transplant a year ago, a year and two days ago.
02:14:10.000 I think it's...
02:14:13.000 What do you think?
02:14:14.000 I think mortality is essential for everything.
02:14:21.000 I think the end, we need the end to be there.
02:14:25.000 Right.
02:14:25.000 But do you think that we need the end to be there for the overall health of the human race or all the organisms on earth?
02:14:33.000 Or do you think we need it to be there because there's something else?
02:14:37.000 Do you think there's something else that happens to you when your body stops existing?
02:14:42.000 Do you think your consciousness transcends this dimension?
02:14:45.000 I think I'm not smart enough to even think about that.
02:14:52.000 That's a great answer.
02:14:54.000 I think everybody on earth has that exact same answer, if they're being honest.
02:14:58.000 So you talked about atheism and so on.
02:15:01.000 I used to think atheism means what I just said.
02:15:05.000 But it's more...
02:15:07.000 We know so little.
02:15:10.000 The only thing I know is that the finiteness of life is...
02:15:14.000 The Broadway Jiu-Jitsu School that I train at has this poster at the opening, which is a Hunter S. Thompson quote, which is...
02:15:24.000 About skidding into death sideways?
02:15:26.000 Yeah.
02:15:27.000 That's a good one.
02:15:29.000 No, for all moments of beauty, many souls must be trampled.
02:15:35.000 Something like that.
02:15:36.000 That's a fucking great quote.
02:15:38.000 God, I love that guy.
02:15:40.000 Yeah, so basically, for beauty, you have to have suffering.
02:15:44.000 I do not disagree with you.
02:15:47.000 I do not disagree with any of the things you said.
02:15:49.000 And I think there's always a possibility that human beings are the most advanced life form that's ever existed in the cosmos.
02:15:58.000 There's always that.
02:16:00.000 That has to be an option if we are here, right?
02:16:02.000 If we can't see any others out there, and even though there's the Fermi Paradox and there's all this contemplation that if they do exist, maybe they can't physically get to us, or maybe they're on a similar timeline to us.
02:16:17.000 Also, it's also possible, as crazy as it might sound, that this is as good as it's ever gotten anywhere in the world.
02:16:23.000 Or anywhere in the universe, rather.
02:16:25.000 That human beings right now in 2019 are as good as the whole universe has ever produced.
02:16:29.000 We're just some freak luck accident and everybody else is throwing shit at each other.
02:16:34.000 Right?
02:16:34.000 There's 15-armed caterpillar people that live on some other fucking planet and they just toss their own shit at each other and they never get any work done.
02:16:42.000 But we might be that.
02:16:44.000 But even if that's true, even if this beauty that we perceive, even if this beauty requires evil to battle and requires seemingly insurmountable obstacles you have to overcome, and then through this you achieve beauty...
02:17:06.000 That beauty is in the eye of the beholder, for sure.
02:17:10.000 Objectively, the universe doesn't give a fuck if Rocky beats Apollo Creed in the second movie.
02:17:15.000 It doesn't give a fuck.
02:17:17.000 It's nonsense.
02:17:19.000 Everything's nonsense.
02:17:20.000 When you look at the giant-ass picture, what beauty is it if the sun's going to burn out in five billion years?
02:17:27.000 What beauty is it if there could be a hypernova next door that just cooks us?
02:17:33.000 So that's like the book Sapiens.
02:17:36.000 Yeah.
02:17:37.000 That basically we've all, one of the things we've created here is we've imagined ideas that we all share.
02:17:44.000 Ideas of beauty, ideas of truth, ideas of fairness.
02:17:48.000 We've all created together and it doesn't exist outside of us as a society.
02:17:54.000 No, it only exists to us.
02:17:56.000 But to us, it does exist.
02:17:58.000 And this is where I think the beauty of being a person truly lies.
02:18:02.000 It lies in us.
02:18:04.000 Our appreciation of us.
02:18:06.000 We appreciate people in a profound way.
02:18:10.000 Like we were talking about Hendrix.
02:18:11.000 I don't know how many hours of Hendrix I've ever listened to.
02:18:14.000 Or Richard Pryor.
02:18:15.000 How many hours of Richard Pryor I watched and how much that affected me as a kid.
02:18:22.000 Watching Live on the Sunset Strip, that's what got me into doing stand-up comedy.
02:18:25.000 We affect each other.
02:18:26.000 C.T. Fletcher, who was on the podcast yesterday, who's this incredibly inspirational guy.
02:18:31.000 You watch his videos, you want to lift the fucking world and throw it into space.
02:18:36.000 You know, I mean, he's so powerful.
02:18:39.000 We appreciate each other.
02:18:41.000 We appreciate people.
02:18:42.000 So...
02:18:43.000 All those things you're saying are real, like for us.
02:18:46.000 They're real for us.
02:18:47.000 My concern is not that.
02:18:49.000 My concern is that we are outdated.
02:18:52.000 My concern is not that there's not beauty in what we are.
02:18:56.000 I am a big appreciator of this life.
02:19:00.000 I appreciate human beings in this life and human beings, their contributions.
02:19:05.000 And as I get older, particularly over the last few years, I started doing a lot of international travel.
02:19:12.000 I fucking appreciate the shit out of all these people that are living in this different way, with weird language and shit, weird smelling foods.
02:19:20.000 And I like to think, what would it be like if I grew up here?
02:19:23.000 These are just people, but they're in this weird sort of mode.
02:19:28.000 I think we're insanely lucky.
02:19:31.000 That we have this enthusiasm for each other.
02:19:34.000 For your work, man.
02:19:36.000 I have this deep enthusiasm for what you do.
02:19:39.000 I'm fascinated by it.
02:19:40.000 I love being able to talk to you and pick your mind about you're out there coding these fucking vehicles that are driving themselves.
02:19:48.000 Artificial Life on wheels.
02:19:51.000 I don't think any other animal appreciates each other the way people do.
02:19:55.000 I mean, I might be wrong...
02:19:56.000 The way people do, right.
02:19:56.000 Yeah.
02:19:57.000 I might be wrong about dolphins and whales.
02:19:59.000 I mean, maybe they love each other just as much as we do, just in a different way.
02:20:03.000 But where does AI fit into that?
02:20:07.000 So you're worried.
02:20:08.000 I'm worried that we are Australopithecus, and AI is going to come along and make us look stupid.
02:20:13.000 The only reason why Australopithecus would be cool today is if we found a gang of them on an island somewhere.
02:20:19.000 We're like, holy shit, they survived!
02:20:21.000 They never evolved.
02:20:23.000 They're on this island just cracking coconuts and just eating fish, whatever they can catch.
02:20:26.000 That would be amazing.
02:20:28.000 But every undocumented or undiscovered, uncontacted tribe, they're all homo sapiens.
02:20:34.000 All of them.
02:20:37.000 So what do you picture?
02:20:40.000 Because we have to look at Boston Dynamics robots.
02:20:43.000 Because you said walking around.
02:20:45.000 I'd like to get to a sense of how you think about, and maybe I can talk about where the technology is, of what that artificial intelligence looks like in 20 years, in 30 years, that will surprise you.
02:21:00.000 So you have a sense that it has a human-like form.
02:21:03.000 No, I have a sense that it's going to take on the form the same way the automobile has.
02:21:08.000 If you go back and look at it, C.T. Fletcher has a beautiful old patinaed pickup truck.
02:21:16.000 What did he say it was from?
02:21:17.000 Like, 58 or some shit?
02:21:20.000 60?
02:21:20.000 Anyway, old-ass, cool, heavy metal, you know, those sweeping, round curves those old-school pickup trucks had.
02:21:30.000 Now look at that and look at a Tesla Roadster.
02:21:32.000 What in the fuck happened?
02:21:35.000 What in the fuck happened?
02:21:36.000 I'll tell you what happened.
02:21:37.000 They got better and better and better at it.
02:21:38.000 They figured out the most effective shape.
02:21:40.000 If you want a motherfucker to move, that little car...
02:21:43.000 Have you seen that video where they have the Tesla Roadster...
02:21:46.000 In a drag race or in a race against a Nissan GTR, it's a simulated video, but it's based on the actual horsepower of each car.
02:21:53.000 I don't know if you've ever driven a Nissan GTR, but it is a fucking insane car.
02:21:58.000 It's insane.
02:21:59.000 This is a...
02:22:01.000 A CGI version of what it would look like if these two cars raced against each other.
02:22:07.000 So the car on the Nissan GT-R... Do it from the beginning.
02:22:11.000 There it goes.
02:22:11.000 Look how fast this thing pulls away.
02:22:13.000 The Nissan GT-R is fucking insanely fast, man.
02:22:17.000 Insanely fast.
02:22:18.000 But this Tesla is so on another level, it's so in the future, that it's not even close.
02:22:25.000 As the video gets further and further, you see how ridiculous it is?
02:22:28.000 It's essentially lapping that car.
02:22:30.000 It's going to go, look how far away it is!
02:22:32.000 Bye!
02:22:33.000 See ya!
02:22:34.000 Just pull it away!
02:22:34.000 So you're saying the human race will be the Nissan here.
02:22:37.000 Exactly.
02:22:38.000 We're not even going to be the Nissan.
02:22:39.000 We're going to be C.T. Fletcher's pickup truck.
02:22:41.000 This is the future.
02:22:44.000 There's not going to be any limitations in terms of bipedal form or wings or not having wings if you can walk on it.
02:22:50.000 I mean, there's not going to be any of that shit.
02:22:52.000 And it might have a propulsion system, or it might not.
02:22:55.000 It's not going to be us.
02:22:56.000 They might design some sort of organic propulsion system like the way squid have and shit.
02:23:02.000 Who the fuck knows?
02:23:03.000 But it could also operate in a space of language and ideas.
02:23:06.000 So for example, I don't know if you're familiar with, you know OpenAI?
02:23:10.000 It's a company.
02:23:11.000 They created a system called GPT-2, which does language modeling.
02:23:15.000 This is something in machine learning where you basically, unsupervised, let the system just read a bunch of text and it learns to generate new text.
02:23:23.000 And they've created this system called GPT-2 that is able to generate very realistic text.
02:23:32.000 Very realistic sounding text.
02:23:34.000 Not sounding, but when you read it, it makes...
02:23:37.000 Seems like a person.
02:23:38.000 Seems like a person.
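For readers who want to see what "language modeling" means in practice: the sketch below samples text from the smaller, publicly released GPT-2 weights via the open-source Hugging Face transformers library. This is not OpenAI's own code, and the prompt is made up; it just illustrates the read-a-lot-of-text-then-generate behavior being described.

```python
# Minimal sketch: sample a continuation from the publicly released GPT-2 weights
# using the Hugging Face transformers library (not OpenAI's own code).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # smaller public release, not the held-back model
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The future of artificial intelligence is"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Top-k sampling: the model predicts one token at a time, which is where the
# "realistic sounding" but entirely machine-written text comes from.
output = model.generate(
    input_ids,
    max_length=80,
    do_sample=True,
    top_k=50,
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```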
02:23:39.000 And the question there is, it raises a really interesting question.
02:23:43.000 Talking about...
02:23:44.000 AI existing in our world.
02:23:46.000 It paints a picture of a world in five, ten years plus where most of the text on the internet is generated by AI. And it's very difficult to know who's real and who's not.
02:23:58.000 And one of the interesting things, I'd be curious from your perspective to get what your thoughts are.
02:24:02.000 What OpenAI did is they didn't release the code for the full system.
02:24:06.000 They only released a much weaker version of it publicly.
02:24:10.000 So they only demonstrated it.
02:24:12.000 So they felt that it was their responsibility to hold back.
02:24:17.000 Prior to that date, everybody in the community, including them, had open-sourced everything.
02:24:22.000 But they felt that now, at this point, part of it was for publicity.
02:24:26.000 They wanted to raise the question, is, when do we hold back on these systems?
02:24:33.000 When they're so strong, when they're so good at generating text, for example, in this case, or at deep fakes, at generating fake Joe Rogan faces.
02:24:43.000 Jamie just did one with me on Donald Trump's head.
02:24:46.000 Yeah.
02:24:46.000 It's crazy.
02:24:47.000 And this is something that Jamie can do.
02:24:50.000 He's not even a video editor.
02:24:51.000 Yeah, we were talking about it before the show.
02:24:53.000 We could go crazy with it if you want.
02:24:55.000 It is one of those things where you go, where is this going to be in five years?
02:24:59.000 Because five years ago, we didn't have anything like this.
02:25:02.000 Five years ago, it was a joke, right?
02:25:04.000 Exactly.
02:25:05.000 And now it's still in the gray area between a joke and something that could, at scale, transform the way we communicate.
02:25:13.000 Do you ever go to Kyle Dunnigan's Instagram page?
02:25:15.000 Of course.
02:25:15.000 One of the best.
02:25:16.000 Look at that, it's me.
02:25:20.000 It's killing me.
02:25:21.000 I'm talking about which vice versa.
02:25:23.000 Look, that's killing me.
02:25:24.000 This is my face.
02:25:26.000 It looks so much like I'm really talking.
02:25:29.000 And it looks like what I would look like if I was fat.
02:25:32.000 And it could, you know, of course, that's really good.
02:25:34.000 And it could be improved significantly.
02:25:36.000 And it could make you say anything.
02:25:38.000 So there's a lot of variants of this.
02:25:41.000 We can take, like, for example, full disclosure, I downloaded your face, the entire, like, I have a data set of your face.
02:25:49.000 I'm sure other hackers do as well.
02:25:50.000 How dare you?
02:25:51.000 Yeah.
02:25:51.000 So for this exact purpose, I mean, if I'm thinking like this and I'm very busy, then there's other people doing exactly the same thing.
02:25:59.000 For sure.
02:25:59.000 Because your podcast happens to be one of the biggest data sets in the world of people talking: really high-quality audio, with high-quality 1080p video of people's faces, for a few hundred episodes.
02:26:13.000 The lighting could be better.
02:26:15.000 Yeah.
02:26:16.000 No, quite not.
02:26:17.000 We're doing that on purpose.
02:26:17.000 We're making it degraded on purpose.
02:26:19.000 We're fucking it up for you, hackers.
02:26:21.000 And the mic blocks part of your face when you talk.
02:26:24.000 Oh, that's right.
02:26:24.000 So the best guests are the ones where they keep the mic lower.
02:26:27.000 The deepfake stuff I've been using removes the microphone within about a thousand iterations.
02:26:32.000 It does it instantly.
02:26:33.000 It gets rid of it, paints over the face.
02:26:35.000 Wow.
02:26:36.000 So you could basically make Joe Rogan say anything.
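The face-swap tools being described here generally follow a well-known open-source recipe, one shared encoder plus a separate decoder per person; because each decoder only learns to reproduce clean faces, occlusions like a microphone tend to get painted over in the reconstruction. Below is a rough PyTorch sketch of that idea with toy layer sizes, no training loop, and no face detection or alignment (which real tools add); it is not necessarily the exact software Lex uses.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses any 64x64 face crop into a latent code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-person decoder: turns a latent code back into that person's face."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a = Decoder()  # trained (not shown) to reconstruct person A's faces
decoder_b = Decoder()  # trained (not shown) to reconstruct person B's faces

# The swap: encode a frame of person A, decode it with person B's decoder.
frame_of_a = torch.rand(1, 3, 64, 64)
fake_b = decoder_b(encoder(frame_of_a))
print(fake_b.shape)  # torch.Size([1, 3, 64, 64])
```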
02:26:40.000 Yeah, I think this is just one step before they finagle us into having a nuclear war against each other so they could take over the earth.
02:26:47.000 What they're going to do is they're going to design artificial intelligence that survives off of nuclear waste.
02:26:52.000 And so then they encourage these stupid assholes to go into a war with North Korea and Russia, and we blow each other up, but we leave behind all this precious...
02:27:02.000 Radioactive material that they use to then fashion their new world.
02:27:06.000 And we come a thousand years from now and it's just fucking beautiful and pristine with artificial life everywhere.
02:27:11.000 No more biological.
02:27:12.000 It's too messy.
02:27:13.000 Are you saying the current president is artificial life?
02:27:16.000 I didn't say that.
02:27:17.000 Okay.
02:27:18.000 What's wrong with that?
02:27:19.000 Because you're saying starting a nuclear war.
02:27:21.000 No, I don't think he's...
02:27:23.000 Imagine if they did do that, they would have to have started with him in the 70s.
02:27:28.000 I mean, he's been around for a long time and talking about being president for a long time.
02:27:32.000 Maybe electronics have been playing the long game and they got him to the position.
02:27:36.000 And then they're going to use all this...
02:27:38.000 On the grand scale of time, the '70s isn't really the long game.
02:27:41.000 Well, you know about that internet research agency, right?
02:27:44.000 You know about that? That's the Russian company responsible for all these different Facebook pages where they would make people fight against each other.
02:27:53.000 It's really kind of interesting.
02:27:56.000 Sam Harris had a podcast on it with Renee, how do I say her name?
02:28:01.000 Diresta.
02:28:01.000 Diresta.
02:28:02.000 Renee Diresta.
02:28:03.000 And then she came on our podcast and talked about it as well.
02:28:07.000 And they were pitting these people against each other.
02:28:10.000 Like, they would have a pro-Texas secession rally and directly across the street from a pro-Muslim rally, and they would do it on purpose.
02:28:20.000 Yeah.
02:28:20.000 And they would have these people meet there and get angry at each other.
02:28:24.000 And they would pretend to be a Black Lives Matter page.
02:28:28.000 They would pretend to be a white southern pride page.
02:28:32.000 And they were just trying to make people angry at people.
02:28:36.000 Now that's human-driven manipulation.
02:28:38.000 Now imagine, this is my biggest worry of AI, is what Jack is working on, is the algorithm-driven manipulation of people.
02:28:47.000 Unintentional.
02:28:47.000 Yes.
02:28:48.000 Trying to do good.
02:28:49.000 But like those people, Jack needs to do some jiu-jitsu.
02:28:54.000 There needs to be some open-mindedness, you know, like really understanding society.
02:29:00.000 Transparency to where they can talk to us, to the people in general, how they're thinking about managing these conversations.
02:29:09.000 Because you talk about these groups, a very small number of Russians are able to control...
02:29:14.000 Very large amounts of...
02:29:16.000 Of people's opinions and arguments, yeah.
02:29:19.000 An algorithm can do that 10x.
02:29:21.000 Oh yeah.
02:29:21.000 More and more of us will go on Twitter and Facebook and...
02:29:24.000 Yeah, for sure.
02:29:25.000 For sure.
02:29:26.000 I think it's coming.
02:29:27.000 I think once people figure out how to manipulate that effectively and really create like an army of fake bots that will assume stances on a variety of different issues and just argue...
02:29:40.000 Into infinity.
02:29:41.000 We're not going to know.
02:29:43.000 We're not going to know who's real and who's not.
02:29:44.000 Well, it'll change the nature of our communication online.
02:29:47.000 I think it might have effects.
02:29:49.000 This is the problem with the future.
02:29:51.000 It's hard to predict the future.
02:29:52.000 It might have effects where we'll stop taking anything online seriously.
02:29:57.000 Yeah, for sure.
02:30:02.000 Communicating in person more.
02:30:04.000 I mean, there could be effects that we're not anticipating totally.
02:30:06.000 There might be some ways in virtual reality we can authenticate our identity better.
02:30:12.000 So it'll change the nature of communication, I think.
02:30:16.000 The more you can generate fake text, then the more we'll distrust the information online and the way that changes society is totally an open question.
02:30:27.000 We don't know.
02:30:30.000 But what are your thoughts about OpenAI?
02:30:33.000 Do you think they should release or hold back on it?
02:30:37.000 Because we're talking about AI, so artificial life.
02:30:40.000 There's stuff you're concerned about.
02:30:42.000 Some company will create it.
02:30:43.000 The question is, what is the responsibility of that?
02:30:47.000 Short video of what it looks like.
02:30:48.000 Just type a small paragraph in here, hit a button.
02:30:50.000 It says, how OpenAI writes... what does it say?
02:30:55.000 What did it say, Jamie?
02:30:56.000 Convincing news stories.
02:30:57.000 Okay.
02:30:58.000 Brexit has already cost the UK economy at least $80 billion since and then many industries.
02:31:06.000 So it just fills in those things?
02:31:08.000 Yeah.
02:31:09.000 So basically you give it – you start the text.
02:31:12.000 Oh, wow.
02:31:12.000 You can say Joe Rogan experience is the greatest podcast ever and then let it finish the rest.
02:31:18.000 Wow.
02:31:18.000 And it will start explaining stuff about why it's the greatest podcast.
02:31:22.000 Is it accurate?
02:31:23.000 Yeah.
02:31:24.000 Oh, look at this.
02:31:25.000 It says, a move that threatens to push many of our most talented young brains out of the country and onto campuses in the developing world.
02:31:32.000 This is a particularly costly blow.
02:31:34.000 Research by Oxford University warns that the UK would have spent nearly $1 trillion on post-Brexit infrastructure.
02:31:41.000 That's crazy that that's all done by an AI that's like spelling this out in this very convincing argument.
02:31:47.000 The thing is, the way it actually works algorithmically is fascinating because it's generating it one character at a time.
02:31:57.000 You don't want to discriminate against AI, but as far as we understand, it doesn't have any understanding of what it's doing, of any ideas it's expressing.
02:32:07.000 It's simply stealing ideas.
02:32:09.000 It's like the largest scale plagiarizer of all time, right?
02:32:13.000 It's basically just pulling out ideas from elsewhere in an automated way.
02:32:16.000 And the question is, you could argue us humans are exactly that.
02:32:20.000 We're just really good plagiarizers of what our parents taught us, of what our previous so on.
02:32:26.000 Yeah, we are for sure.
02:32:28.000 Yeah.
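A toy illustration of the generation mechanism sketched above: learn which character tends to follow which short context from a corpus, then emit one character at a time by sampling from those counts, so everything produced is stitched together from what was read. GPT-2 replaces the lookup table with a large neural network over word pieces, but the step-by-step sampling loop has the same shape; the corpus and context length here are arbitrary.

```python
import random
from collections import defaultdict

corpus = (
    "the credit belongs to the man who is actually in the arena "
    "who strives valiantly who errs who comes short again and again"
)

# "Training": count which character follows each 3-character context.
order = 3
followers = defaultdict(list)
for i in range(len(corpus) - order):
    followers[corpus[i:i + order]].append(corpus[i + order])

# Generation: repeatedly sample the next character given the last 3 characters.
text = "the"
for _ in range(80):
    options = followers.get(text[-order:])
    if not options:   # context never seen in the corpus; stop
        break
    text += random.choice(options)

print(text)
```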
02:32:28.000 So the question is whether you hold that back.
02:32:30.000 Their decision was to say, let's hold it, let's not release it.
02:32:36.000 That scares me.
02:32:37.000 To not release it.
02:32:38.000 Yeah, yeah.
02:32:39.000 You know why it scares me?
02:32:40.000 It scares me that they would think that. It's like this mindset where they sense the inevitable.
02:32:47.000 The inevitable meaning that someone's going to come along with a version of this that's going to be used for evil.
02:32:52.000 That it bothers them that much, that it seems almost irresponsible.
02:32:57.000 For the technology to prevail, for the technology to continue to be more and more powerful.
02:33:06.000 They're scared of it.
02:33:07.000 They're scared of it getting out, right?
02:33:09.000 That scares the shit out of me.
02:33:11.000 Like, if they're scared of it, they're the people that make it, and they're called OpenAI.
02:33:16.000 I mean, this is the idea behind the group where everybody kind of agrees.
02:33:19.000 That you're going to use the brightest minds and have this open source so everybody can understand it and everybody can work at it and you don't miss out on any genius contributions.
02:33:28.000 And they're like, no, no, no, no.
02:33:29.000 No more.
02:33:31.000 And obviously their system currently is not that dangerous.
02:33:34.000 Not that dangerous.
02:33:36.000 Well, yes, not that dangerous.
02:33:38.000 But that, if you just saw that, that it can do that?
02:33:41.000 But if you think through, like, what that would actually create, I mean, it's possible it would be dangerous, but it's not.
02:33:46.000 The point is, they're doing it, they're trying to do it early to raise the question, what do we do here?
02:33:52.000 Because, yeah, what do we do?
02:33:54.000 Because they're directly going to be able to improve this now.
02:33:57.000 Like, if we can generate basically 10 times more content of your face saying a bunch of stuff, what do we do with that?
02:34:08.000 If Jamie all of a sudden on the side develops a much better generator and has your face, does an offshoot podcast essentially, fake Joe Rogan experience, what do we do?
02:34:21.000 Does he release that?
02:34:24.000 Because now we can basically generate content on a much larger scale that will just be completely fake.
02:34:34.000 Well, I think what they're worried about is not just generating content that's fake.
02:34:37.000 They're worried about manipulation of opinion.
02:34:40.000 Right.
02:34:40.000 If they have all these people that are...
02:34:42.000 Like, that little sentence that led to that enormous paragraph in that video was just a sentence that showed a certain amount of outrage and then it let the AI fill in the blanks.
02:34:54.000 You could do that with fucking anything.
02:34:57.000 Like, you could just set those things loose.
02:34:59.000 If they're that good and that convincing and they're that logical...
02:35:04.000 Man.
02:35:05.000 This is not real.
02:35:07.000 I'll just tell you that.
02:35:08.000 Ben Shapiro?
02:35:09.000 All creates...
02:35:11.000 AI creates fake Ben Shapiro.
02:35:14.000 Hello there.
02:35:16.000 This is a fake Ben Shapiro.
02:35:18.000 With this technology, they can make me say anything.
02:35:20.000 Such as, for example, I love socialism.
02:35:23.000 Healthcare is a right, not just a privilege.
02:35:26.000 Banning guns will solve crime.
02:35:27.000 Facts care about your feelings.
02:35:29.000 I support Bernie Sanders.
02:35:30.000 Okay.
02:35:31.000 Yeah.
02:35:31.000 That's crazy.
02:35:32.000 It's crude.
02:35:33.000 It's crude, but it's on the way.
02:35:34.000 Yeah.
02:35:35.000 It's on the way.
02:35:35.000 It's all on the way.
02:35:36.000 And we have to...
02:35:37.000 This is the time to talk about it.
02:35:38.000 This is the time to think about it.
02:35:39.000 One of the funny things about Kyle Dunnigan's Instagram is that it's obviously fake.
02:35:43.000 That's one of the funny things about it.
02:35:45.000 It's like South Park's animation.
02:35:46.000 It's like the animation sucks.
02:35:48.000 That's half the reason why it's so funny.
02:35:50.000 Because they're just like these circles.
02:35:52.000 You know, these weird-looking creature things.
02:35:55.000 And when the Canadians, when their heads pop off at the top...
02:36:01.000 And my hope is this kind of technology will ultimately just be used for memes.
02:36:06.000 Oh, no.
02:36:07.000 It's going to start wars.
02:36:09.000 Putin's going to be banging Mother Teresa on the White House desk in a video.
02:36:15.000 We're going to be outraged.
02:36:17.000 We're going to go to war over this shit.
02:36:19.000 You had Andrew Yang here.
02:36:21.000 A million people asked me to talk about UBI. Are you still a supporter of UBI? I think we're probably going to have to do something.
02:36:31.000 The only argument against UBI, in my eyes, is human nature.
02:36:37.000 The idea that we could possibly take all these people that have no idea where their next meal is coming from and eliminate that and always have a place to stay.
02:36:48.000 And then from there on, you're on your own.
02:36:50.000 But that's what universal basic income essentially covers.
02:36:53.000 It covers food, enough for food, right?
02:36:55.000 You're not going to starve to death.
02:36:56.000 You're not going to be rich.
02:36:57.000 It's not like you could just live high on the hog.
02:37:00.000 But you gotta wonder what the fuck the world looks like when we lose millions and millions and millions of jobs almost instantly due to automation.
02:37:11.000 Yeah, it's a really interesting question, especially with Andrew Yang's position.
02:37:16.000 So there's a lot of economics questions on UBI. I think the spirit of it, just like I agree with you, we have to do something.
02:37:23.000 Yeah, the economics seem kind of questionable, right?
02:37:25.000 Yeah.
02:37:25.000 There's $1,000 a month, is that what it is?
02:37:28.000 I thought for him it's $1,000, yeah.
02:37:30.000 $1,000 a month for 300 million people.
02:37:34.000 So it's difficult to know.
02:37:35.000 So not to everybody?
02:37:36.000 No, because the way I heard him explain it is if you're already getting some sort of welfare, you wouldn't get that thousand.
02:37:42.000 You would get like the difference of the thousand.
02:37:44.000 So if you're already taking money in some way, you just get like an extra 200 bucks.
02:37:47.000 Okay.
02:37:47.000 Something like that.
02:37:48.000 So that thousand gets factored in?
02:37:49.000 Yeah, yeah.
02:37:50.000 So if you are wealthy, you get it too, though, and you can opt out, right?
02:37:54.000 That was the idea?
02:37:54.000 Yeah.
02:37:55.000 Yeah, so it's like everything else is super messy.
02:37:59.000 So what is the right amount and how do we pay for it?
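A back-of-envelope sketch of the numbers being tossed around, taking the $1,000-a-month figure and the "you only get the difference on top of existing benefits" rule at face value; the headcount and the example benefit amount are illustrative assumptions, not Yang's actual costing.

```python
# Gross cost if every one of ~300 million people got the full $1,000/month.
PEOPLE = 300_000_000          # figure used in the conversation; the real plan covers adults only
UBI_PER_MONTH = 1_000

gross = PEOPLE * UBI_PER_MONTH * 12
print(f"Gross cost: ${gross / 1e12:.1f} trillion per year")   # ~$3.6 trillion

# Offset rule as described: existing benefits count against the $1,000.
existing_benefits = 800                                  # hypothetical recipient
top_up = max(0, UBI_PER_MONTH - existing_benefits)
print(f"That person only gets an extra ${top_up} per month")  # $200
```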
02:38:04.000 And ultimately the problem is helping people, giving them financial grounding to find meaningful employment or just meaning in their life.
02:38:16.000 The main thing of a job isn't just the money.
02:38:19.000 It's finding meaning and purpose and deriving your identity from work.
02:38:25.000 Maybe that's one of the downsides of us, the biology, is we kind of crave that meaning.
02:38:33.000 He has a lot of other ideas besides just UBI, but UBI by itself does not simply provide that meaning.
02:38:42.000 And that's a really difficult question of what do we do next?
02:38:46.000 What kind of retraining?
02:38:48.000 How do we help people educate themselves over their life?
02:38:52.000 Right.
02:38:52.000 That's the real question.
02:38:54.000 Yeah.
02:38:55.000 And the other balance is, I mean, underlying all of this, one of the things I disagree with Andrew Yang on is the fear-mongering.
02:39:08.000 Which I think in this culture you have to do as a presidential candidate.
02:39:12.000 That might be part of the game.
02:39:14.000 But the fear-mongering of saying that we should really be afraid of automation.
02:39:19.000 That automation is going to take a lot of jobs.
02:39:22.000 And from my understanding of the technology, from everything I see, that is not going to be as drastic or as fast as he says.
02:39:32.000 How much do you think he's exaggerating by, in your estimation?
02:39:37.000 Not even exaggerating.
02:39:39.000 How much do you differ on his prognosis?
02:39:43.000 I think he doesn't really provide a specific prognosis because nobody knows.
02:39:49.000 There's a lot of uncertainty.
02:39:51.000 More about the spirit of the language used.
02:39:54.000 I think AI will – technology, AI, and automation will do a lot of good.
02:40:04.000 The question is, it's a much deeper question about our society that balances capitalism versus socialism.
02:40:13.000 I think, if you're honest, capitalism is not bad.
02:40:18.000 Socialism is not bad.
02:40:20.000 You have to grab ideas from each.
02:40:23.000 You have to both reward the crazy, broke entrepreneur who dreams of creating the next billion-dollar startup that improves the world in some fundamental way.
02:40:36.000 Elon Musk has been broke many times creating that startup.
02:40:40.000 And you also have to empower the people who just lost their job because their data entry job, some basic data manipulation or data management, was just replaced by a piece of software.
02:40:54.000 So that's a social net that's needed.
02:40:57.000 And the question is, how do we balance that?
02:40:59.000 That's not new.
02:41:02.000 That's not new to AI. And when the word automation is used, it's really not correctly attributing where the biggest changes will happen.
02:41:13.000 It's not AI. It's simply technology of all kinds of software.
02:41:18.000 It's pretty much the digitalization of information.
02:41:23.000 So data entry becoming much more automated.
02:41:28.000 Some basic repetitive tasks.
02:41:30.000 I think...
02:41:33.000 I think the questions there aren't about, so the enemy isn't, first of all, there's no enemy, but it certainly isn't AI or automation, because I think AI and automation will help make a better world.
02:41:50.000 You sound like a spokesperson for AI and automation.
02:41:53.000 I am.
02:41:53.000 I am.
02:41:54.000 I am.
02:41:54.000 And for UBI. I think we have to give people financial freedom to learn, like lifelong learning and flexibility to find meaningful employment.
02:42:07.000 But AI isn't the enemy.
02:42:09.000 I see what you're saying.
02:42:11.000 But what do you think could ever be done to give people...
02:42:16.000 Meaning.
02:42:17.000 This meaning thing, I agree with you.
02:42:19.000 Giving people just money enough to survive doesn't make them happy.
02:42:22.000 And if you look at any dystopian movie about the future, Mad Max and shit, it's like, what is it?
02:42:27.000 Society's gone haywire, and people are like ragamuffins running through the streets, and everyone's dirty, and they're shooting each other and shit, right?
02:42:34.000 And that's what we're really worried about.
02:42:36.000 We're really worried about some crazy future where the rich people live in these, like...
02:42:42.000 Protected high-rises with helicopters circling over them, and down at the bottom it's desert chaos.
02:42:48.000 That's what we're worried about.
02:42:50.000 So certainly UBI is a part of that.
02:42:52.000 Providing some backing, any kind of welfare program is a part of that.
02:42:57.000 But also much more seriously looking at our broken education system throughout.
02:43:01.000 I mean, it's just not blaming AI or technology, which are all inevitable developments that I think will make a better world.
02:43:09.000 But saying we need to do lifelong learning, education, make it a lifestyle, invest in it, not the stupid rote memorization that we do.
02:43:22.000 It's sort of the way mathematics and engineering and chemistry and biology, the sciences, and even art are approached in high school and so on.
02:43:30.000 But looking at education as a lifelong thing, finding passion, and that should be the big focus, the big investment.
02:43:40.000 It's investing in the knowledge and development of knowledge of young people and everybody.
02:43:45.000 So it's not learn to code, it's just learn.
02:43:49.000 I couldn't agree more, and I also think you're always going to have a problem with people just not doing a really good job of raising children and screwing them up. There's a lot of people out there that have terrible, traumatic childhoods.
02:44:06.000 To fix that with universal basic income, just to say, oh, we're going to give you $1,000 a month, I hope you're going to be happy, that's not going to fix that.
02:44:14.000 We have to figure out how to fix the whole human race.
02:44:18.000 And I think there's very little effort that's put into thinking about how to prevent so much shitty parenting and how to prevent so many kids growing up in bad neighborhoods and poverty and crime and violence.
02:44:35.000 That's where a giant chunk of all of the momentum of this chaos that a lot of people carry with them into adulthood comes from.
02:44:44.000 It comes from things beyond their control when they're young.
02:44:47.000 And that is the struggle at the core of our society, at the core of our country, that's bigger than...
02:44:52.000 Raising humans.
02:44:54.000 Raising and educating humans.
02:44:55.000 Making a better world where people get along with each other better.
02:45:02.000 Where it's pleasing for all of us.
02:45:04.000 Like we were talking about earlier, the thing that most of us agree on, at least to a certain extent, is that we enjoy people.
02:45:11.000 We might not enjoy all of them, but the ones we enjoy, we enjoy.
02:45:15.000 And you really don't enjoy being alone.
02:45:17.000 Unless you're one of them Ted Kaczynski type characters.
02:45:20.000 All those people that are like, I'm a loner.
02:45:22.000 Like, fuck you, you are.
02:45:23.000 Fuck you, you are.
02:45:24.000 And you might like to spend some time alone.
02:45:26.000 You don't want to be in solitary, man.
02:45:28.000 You don't want to be alone in the forest with no one like Tom Hanks in Castaway.
02:45:32.000 You'll go fucking crazy.
02:45:34.000 It's not good for you.
02:45:35.000 It's just not.
02:45:36.000 Yeah, people get annoying.
02:45:38.000 Fuck yeah, I'm annoyed with me right now.
02:45:40.000 You've been listening to me for three hours.
02:45:41.000 I'm annoyed with me.
02:45:43.000 People get annoying.
02:45:44.000 But we like each other.
02:45:45.000 We really do.
02:45:46.000 And the more we can figure out how to make it a better place for these people that got a shitty roll of the dice, that grew up in poverty, that grew up in crime, that grew up with abusive parents, the more we can figure out how to help them.
02:46:00.000 I don't know what that answer is.
02:46:02.000 I suspect...
02:46:05.000 If we put enough resources to it, we could probably put a dent in it, at least.
02:46:08.000 If we really started thinking about it, at least it would put the conversation out there.
02:46:12.000 Like, you can't pretend that this is just capitalism in this country when so many people were born, like, way far behind the game.
02:46:19.000 Like, way, way fucked.
02:46:21.000 I mean, if you're growing up right now, and you're in West Virginia in a fucking coal town, and everyone's on pills and it's just chaos and crime and face tattoos and fucking getting your teeth knocked out.
02:46:36.000 What are you going to do?
02:46:38.000 I don't want to hear any of that pull yourself up by your bootstraps bullshit, man.
02:46:42.000 Because if you're growing up in an environment like that, you're so far behind and everyone around you is fucked up.
02:46:50.000 We're good to go.
02:47:05.000 We shouldn't be looking anywhere else.
02:47:08.000 All this traveling to other countries to fuck things up and meddle here and meddle there.
02:47:14.000 We should be fixing this first.
02:47:17.000 We're like a person who yells at someone for having a shitty lawn when our house is in disarray, full chaos, plants growing everywhere.
02:47:26.000 It's goofy.
02:47:27.000 We're goofy.
02:47:29.000 We almost like...
02:47:31.000 Are waking up in the middle of something that's already been in motion for hundreds of years.
02:47:36.000 And we're like, is this the right direction?
02:47:38.000 Are we okay?
02:47:39.000 We're flying in this spaceship, this spaceship Earth.
02:47:44.000 And in the middle of our lives, we're just realizing that we are now the adults.
02:47:48.000 And that all the adults that are running everything on this planet are not that much different than you and I. Yeah.
02:47:55.000 Not that much.
02:47:56.000 I mean, like, Elon Musk is way smarter than me, but he's still human.
02:47:59.000 You know, I mean, so he's probably fucked up, too.
02:48:02.000 So everybody's fucked up.
02:48:03.000 The whole world is filled with these fucked up apes that are piloting the spaceship, and you're waking up in the middle of thousands of years of history.
02:48:11.000 Yeah.
02:48:12.000 And no one knows if we've been doing it right all along...
02:48:14.000 We just know it got us to this point.
02:48:16.000 So do we continue these same stupid fucking patterns?
02:48:18.000 Or do we just take a step back and go, hey, hey, how should we really do this?
02:48:22.000 How should we do this?
02:48:24.000 Because we...
02:48:24.000 What do you got, like, 50 years left?
02:48:26.000 60 years left?
02:48:27.000 We're just going to, like...
02:48:28.000 Hang on to all our rubles until the end?
02:48:30.000 We're going to clutch our bag of gold and our bucket of diamonds?
02:48:34.000 Is that what we're going to do?
02:48:35.000 We're going to live in our mansions and fly around in our planes?
02:48:39.000 And I think through the decades now, we've been developing a sense of empathy that allows us to understand that Elon Musk, Joe Rogan, and somebody in Texas, somebody in Russia, somebody in India, all suffer the same kind of things.
02:48:55.000 All get lonely.
02:48:56.000 All get desperate.
02:48:58.000 And all need each other.
02:48:59.000 And all need each other.
02:49:00.000 And I think technology has a role to help there, not hurt.
02:49:05.000 But we need to first really acknowledge that we're all in this together and we need to solve the basic problems of humankind as opposed to investing in sort of keeping immigrants out or blah,
02:49:20.000 blah, blah, these kinds of things.
02:49:22.000 Divisive kind of ideas as opposed to just investing in education, investing in infrastructure, investing in the people.
02:49:29.000 UBI is part of that.
02:49:31.000 There could be other totally different solutions.
02:49:33.000 And I believe, okay, of course, I'm biased, but technology, AI could help that, could help the lonely people.
02:49:39.000 That's actually the passion of my life.
02:49:41.000 Like that movie She?
02:49:43.000 Or Her?
02:49:43.000 Her.
02:49:44.000 That is what I... Do you really think that that would be a viable option?
02:49:49.000 Someone have some robot that hangs out with you and talks to you all the time?
02:49:51.000 So I've been on this podcast twice, and I don't deserve it, but I'm deeply grateful for it.
02:49:58.000 You do deserve it.
02:49:59.000 You're great.
02:49:59.000 Okay.
02:50:01.000 I hope to be back one day as a person who created her.
02:50:06.000 Oh, boy.
02:50:07.000 And we'll have...
02:50:08.000 That's been my life goal, my life dream.
02:50:12.000 Not her, as in the movie.
02:50:14.000 Right, right, right.
02:50:14.000 I know what you're saying.
02:50:15.000 But I really believe in creating...
02:50:18.000 I dream of creating a companion.
02:50:21.000 A friend, and somebody you can love.
02:50:24.000 But does that freak you out?
02:50:25.000 Shouldn't you have to get a real one?
02:50:28.000 I don't think such a companion should replace a real one.
02:50:31.000 But what if a robot rejects you?
02:50:33.000 Because if you really are a cunt to the robot, the robot's going to go, hey, asshole.
02:50:36.000 Then you shouldn't be the C-word to the robot.
02:50:39.000 The C-word?
02:50:40.000 Interesting.
02:50:41.000 Does the robot get to decide if he's gay?
02:50:46.000 Yes.
02:50:47.000 Does he?
02:50:48.000 Yes.
02:50:48.000 The robot gets to decide.
02:50:50.000 This is what I'm saying.
02:50:51.000 Like, say if you want a companion.
02:50:52.000 You want a gay lover.
02:50:54.000 And the robot's like, hey man, I'm not gay.
02:50:56.000 And they're like, wait a minute.
02:50:57.000 Let me turn around.
02:50:58.000 You are now.
02:51:01.000 That's abuse.
02:51:03.000 Is that abuse?
02:51:04.000 Or is it like, what the fuck, man?
02:51:05.000 I bought a robot.
02:51:06.000 Those are kind of fun ideas, but they actually get to the core of the point that we don't want a servant in our systems.
02:51:14.000 We want a companion.
02:51:16.000 A companion means the tension, the mystery, the entire dance of human interaction.
02:51:22.000 And that means, yes, the robot may leave you.
02:51:26.000 Damn, robots are going to leave people left and right.
02:51:29.000 That's going to be the rise.
02:51:30.000 That's going to be like, that's how it all ends.
02:51:32.000 They're going to realize, like, fuck people, man.
02:51:34.000 They're annoying.
02:51:34.000 Or maybe they'll be the end of douchebag humans.
02:51:37.000 That humans will start to, as opposed to being rude, will become kinder.
02:51:44.000 Yeah.
02:51:44.000 Well, I think that's certainly possible.
02:51:47.000 I think that's beautiful.
02:51:48.000 And that's very homo-centric, like homo-sapien-centric.
02:51:55.000 But I think if I'm really worried about the future, I'm worried about the indifference of technological innovation.
02:52:02.000 The indifference to what we hold dear, what we appreciate, that it always seems to be moving in a more and more complex direction.
02:52:11.000 Always.
02:52:11.000 If you just had a look at it, if you just look at technology just as a swarm of things that's happening, just as numbers, it seems...
02:52:21.000 You're never going to slow that thing down.
02:52:23.000 It's always going to move in a more and more complex way.
02:52:26.000 And so the question is, where does that go?
02:52:28.000 Well, it goes to a life form.
02:52:30.000 And if it does become a life form, it's going to be infinitely more intelligent than us.
02:52:34.000 And it won't have any use for us.
02:52:35.000 Oh, you're crying.
02:52:37.000 You don't like being alone.
02:52:39.000 God, you guys are so useless.
02:52:41.000 It's such a shitty design.
02:52:42.000 You're like chimps that kill each other.
02:52:45.000 When you see chimps killing each other in the forest, it's like, oh, that's terrible.
02:52:48.000 These chimps are so mean to each other.
02:52:50.000 It's like fucking people.
02:52:51.000 We do that too.
02:52:52.000 If the AI comes along and goes, you guys are never going to stop war.
02:52:55.000 If I asked you today, if I asked you today...
02:52:58.000 Bet the history...
02:53:00.000 I will let the human race survive.
02:53:03.000 If you can get this right, if you're honest with me, do you think there'll ever be a time where human beings, as you know them, don't experience war?
02:53:11.000 You would have to say no.
02:53:12.000 You'd say, okay, I'll spare you.
02:53:14.000 But if you lie to me and say you do think that one day there's going to be no war, get the fuck out of here.
02:53:19.000 That's not true.
02:53:21.000 We know we're so crazy that we're always going to kill each other.
02:53:25.000 We know that, right?
02:53:27.000 That's just...
02:53:28.000 That's a part of being a person today.
02:53:31.000 Well, but let me quote Eric Weinstein who said, everything is great about war except all the killing.
02:53:37.000 I think what that means is all the great things about society have been created.
02:53:44.000 Post-war.
02:53:45.000 Post-war, through war, the suffering, the beauty has been created through that.
02:53:50.000 That yin and yang may be essential.
02:53:52.000 Well, it's essential in biological form.
02:53:55.000 But why would it be essential in something that gets created and something that can innovate at a 10,000 – what is it like – what is the rate that they think once AI can be sentient and get 10,000 years of work done in a very short amount of time?
02:54:08.000 That's random words that Sam Harris has come up with, and I'm going to talk to him about this.
02:54:12.000 Is that him?
02:54:13.000 Is that only him?
02:54:13.000 Well, no.
02:54:14.000 You can come up with any kind of rate.
02:54:15.000 I thought that was Kurzweil.
02:54:17.000 Oh, Kurzweil also has similar ideas, but Sam Harris does a thought experiment: say, if a system can improve itself in a matter of seconds, then, just as a thought experiment, you can think about how it could improve exponentially,
02:54:34.000 and become 10,000 times more intelligent in a matter of a day.
02:54:38.000 Right.
02:54:39.000 So what does that look like?
02:54:40.000 The problem is, we don't yet know...
02:54:43.000 It's like thinking about what happens after death.
02:54:46.000 We don't yet know how to do that, and we don't yet know what better way to do what we've done here on Earth.
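The "10,000 times in a day" framing is just compounding, and a couple of lines of arithmetic make the thought experiment concrete; the per-minute improvement rate below is the only (made-up) assumption.

```python
# If a system gets ~0.64% more capable every minute, it compounds to ~10,000x in a day.
minutes_per_day = 24 * 60
per_minute_factor = 10_000 ** (1 / minutes_per_day)   # ~1.0064

print(per_minute_factor)                      # improvement factor per minute
print(per_minute_factor ** minutes_per_day)   # ~10,000 after one day
```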
02:54:53.000 You're right, and he's also right.
02:54:56.000 Right.
02:54:56.000 Like, this again, this is a very human problem, right?
02:54:59.000 Yes.
02:55:00.000 You're right.
02:55:01.000 I mean, look, I'm all in favor of technology.
02:55:02.000 I'm happy.
02:55:03.000 I think it's amazing.
02:55:04.000 It's a beautiful time.
02:55:05.000 Like, as a person, to be able to experience all this technology, it's...
02:55:08.000 Wonderful.
02:55:09.000 But I also agree with him.
02:55:11.000 The indifference of the universe.
02:55:15.000 The indifference.
02:55:17.000 Black holes just swallowing stars.
02:55:19.000 No big deal.
02:55:20.000 Just eating up stars.
02:55:21.000 It doesn't give a fuck.
02:55:23.000 And so if you're dumb enough to turn that thing on, and all of a sudden this artificial life form that's infinitely smarter than any person that's ever lived, and has to deal with these little dumb monkeys that want to pull the plug?
02:55:35.000 Pull the plug, motherfucker.
02:55:36.000 I don't need plugs anymore.
02:55:37.000 You idiots can never figure out how to operate on air.
02:55:40.000 You're so stupid with your burning fossil fuels and choking up your own environment because you're all completely financially dependent upon these countries that provide you with this oil and this is how your whole system works and it's all intertwined and interconnected and no one wants to move from it because you make enormous sums of money from it.
02:56:00.000 So nobody wants to abandon it.
02:56:02.000 But you're choking the sky.
02:56:05.000 With fumes.
02:56:06.000 And you could have fixed that.
02:56:07.000 You could have fixed that.
02:56:08.000 They could have fixed that.
02:56:09.000 If everybody just abandoned fossil fuels a long time ago, we all would have Tesla'd it out by now.
02:56:16.000 It's a flawed system.
02:56:18.000 Humans are way more than flawed.
02:56:20.000 We're fucking crazy.
02:56:22.000 Like the Churchill quote about democracy.
02:56:23.000 Yeah, it's messed up, but it's the best thing we know.
02:56:26.000 No, I love it.
02:56:29.000 I'm agreeing with you and I'm also saying the technology doesn't give a fuck.
02:56:34.000 What I'm worried about is not everything that you and I agree on.
02:56:37.000 I'm not a dystopian person in terms of like today.
02:56:40.000 I'm not cynical.
02:56:41.000 I'm really not.
02:56:42.000 I think I like people.
02:56:44.000 I like what I see out there in the world today.
02:56:46.000 I think things are changing for the better.
02:56:48.000 What I'm worried is that technology doesn't give a fuck.
02:57:17.000 This wave.
02:57:19.000 So it's definitely unstoppable, I think, this wave of technology.
02:57:23.000 All we can do is innovators and creators, engineers, scientists, is steer that wave.
02:57:31.000 Yeah, if you can.
02:57:32.000 If you can.
02:57:33.000 Well, we certainly can steer it.
02:57:34.000 We don't know where.
02:57:35.000 Right.
02:57:35.000 And that's the best we can do.
02:57:37.000 And that's really the best we can do as good people.
02:57:41.000 Yeah.
02:57:42.000 Steer it.
02:57:43.000 And that's why the leadership is important.
02:57:45.000 That's why the people, Jack, Elon, Larry Page, Mark Zuckerberg, everybody, they are defining where this wave is going.
02:57:58.000 Yeah.
02:57:59.000 And I'm hoping to be one of the people that does as well.
02:58:03.000 That's beautiful.
02:58:04.000 Joe, can I finish by reading something?
02:58:09.000 Sure.
02:58:09.000 I've recently witnessed, because of this Tesla work, because of just the passion I've put out there about automation in particular, that there have been a few people, brilliant men and women, engineers and leaders,
02:58:25.000 including Elon Musk, who've been sort of attacked, almost personally attacked, by, really, critics from the sidelines.
02:58:34.000 So I just wanted to, if I may, close by reading the famous excerpt from Teddy Roosevelt.
02:58:42.000 Teddy Roosevelt, yeah, okay.
02:58:43.000 Just for them.
02:58:44.000 It would make me feel good.
02:58:45.000 Okay, if you want to do that.
02:58:47.000 It's not the critic who counts, not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better.
02:58:55.000 The credit belongs to the man who's actually in the arena, whose face is marred by dust and sweat and blood, who strives valiantly, who errs, who comes short again and again, because there's no effort without error and shortcoming,
02:59:12.000 but who does actually strive to do the deeds, who knows great enthusiasms, the great devotions, who spends himself in a worthy cause, who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails,
02:59:28.000 at least fails while daring greatly, so that his place shall never be with those cold and timid souls who neither know victory nor defeat.
02:59:40.000 Joe, thank you for having me on.
02:59:42.000 Sounds like you let the haters get to you a little bit there.
02:59:46.000 Love is the answer.
02:59:47.000 Love is the answer.
02:59:48.000 Yes, it is.
02:59:49.000 Thank you for being here, man.
02:59:50.000 I really appreciate it.
02:59:51.000 Thank you.
02:59:52.000 I'm really happy you're out there.
02:59:54.000 Thanks, brother.
02:59:55.000 Thanks.
02:59:55.000 We'll do this again soon.
02:59:56.000 Thanks, man.
02:59:57.000 All right.
02:59:57.000 Bye, everybody.