The Joe Rogan Experience - February 04, 2020


Joe Rogan Experience #1422 - Lex Fridman


Episode Stats

Length

2 hours and 50 minutes

Words per Minute

166.2

Word Count

28,314

Sentence Count

2,732

Misogynist Sentences

43

Hate Speech Sentences

55


Summary

Lex and I start out talking about suits and ties: whether a tie is a liability in a fight, how you would defend a tie or collar choke, and why Secret Service guys wear breakaway ties. From there we get into the podcast's ten-year anniversary, Jack Dorsey and the difficulty of moderating Twitter, cancel culture and deplatforming, Bernie Sanders and Andrew Yang, the realistic timeline for AI and automation, and Elon Musk's first-principles approach to autonomous driving at Tesla.


Transcript

00:00:02.000 Three, two, one.
00:00:03.000 Lex, handsome as ever.
00:00:05.000 Thank you.
00:00:06.000 Well dressed.
00:00:06.000 Always feel like a slob when I'm around you.
00:00:08.000 Do you dress like that in real life or only when you do podcasts?
00:00:11.000 Yeah, so I have two outfits.
00:00:12.000 This and black shirt and jeans.
00:00:15.000 Slick outfit.
00:00:16.000 There's nothing more classic than a dark suit with a white shirt and a black tie.
00:00:22.000 Is that a black tie or is that a dark blue?
00:00:25.000 Black tie.
00:00:25.000 Black suit, black tie.
00:00:27.000 It's armor.
00:00:28.000 Yes.
00:00:28.000 It makes me feel like it focuses the mind.
00:00:31.000 Like a professional.
00:00:33.000 Yeah, like I'm taking this seriously.
00:00:34.000 Yes, yes, yes.
00:00:36.000 Like, you're fucking for real, man.
00:00:38.000 You got notes and shit?
00:00:39.000 Yeah, I got notes and shit, but I... But given this suit, like, I like to get, like, dirty.
00:00:46.000 Like, I like to work in a car or whatever.
00:00:48.000 Like, I don't want to...
00:00:49.000 Like, I love to get in a fight in this.
00:00:51.000 This isn't like me trying to protect myself from the messiness of the real world.
00:00:57.000 Oh, I understand.
00:00:58.000 This is armor.
00:00:59.000 It just looks good.
00:01:00.000 It just makes you feel like a professional.
00:01:01.000 Is it flexible?
00:01:04.000 Like, you know, they make clothes that are flex.
00:01:06.000 Yeah.
00:01:06.000 You can move in it?
00:01:07.000 I can move in it.
00:01:07.000 Oh, that's nice.
00:01:08.000 I mean, you showed me how you can choke me last time with the tie.
00:01:13.000 Did you get a breakaway tie?
00:01:15.000 No, I didn't.
00:01:16.000 But, you know, I kind of let you have that one because I think I can defend it pretty well.
00:01:21.000 Well, you're probably very good at defending chokes, yeah.
00:01:24.000 No, no, no.
00:01:24.000 With the tie.
00:01:25.000 I don't have a system yet.
00:01:26.000 I'll have to talk to John Danaher to develop a tie system.
00:01:29.000 All you have to do, man, is just take the back of the tie, cut it, put a little piece of Velcro on each end.
00:01:36.000 You got the same tie.
00:01:38.000 But I think you going under the tie to try to start the choke, actually, I mean, you're making yourself vulnerable.
00:01:46.000 Like maybe to an arm bar or something like that.
00:01:48.000 Don't be silly.
00:01:50.000 Don't be silly.
00:01:51.000 Well, listen, if someone grabs a hold of your collar, that's the same thing.
00:01:55.000 Ezekiel chokes are deadly, right?
00:01:57.000 Yeah, but it's not over.
00:01:58.000 Yeah, if you sink it in, it's over.
00:02:02.000 Collars are a real problem, right, in jiu-jitsu.
00:02:06.000 Yeah.
00:02:06.000 They're a real problem.
00:02:07.000 If someone gets deep on your collar...
00:02:09.000 Like even on this, with a suit, right?
00:02:11.000 Someone starts doing this, man, you're fucked.
00:02:13.000 Not good.
00:02:15.000 Not good.
00:02:16.000 Collars are not good.
00:02:17.000 If you go deep, if you get in deep.
00:02:19.000 Yeah, yeah, yeah.
00:02:19.000 Well, the problem with that is it's a handle.
00:02:22.000 It's worse than a collar because I'll get underneath that knot and I'll grab ahold of that bitch and then it's all just twisting.
00:02:28.000 Yeah, but you have to...
00:02:31.000 You're right.
00:02:31.000 I would have to get it.
00:02:33.000 I'd have to get it.
00:02:34.000 And you also kind of have to hold on to this part because it can loosen naturally unless you're really good at like...
00:02:39.000 Because it loosens...
00:02:41.000 Does it?
00:02:41.000 Yeah, it loosens naturally.
00:02:42.000 There's a system to this.
00:02:44.000 I think...
00:02:45.000 You haven't thought through this.
00:02:47.000 You don't think I have?
00:02:49.000 Dude, I try to choke people with ties on.
00:02:51.000 Just friends, yeah.
00:02:52.000 I'm like, let me grab a hold of that tie real quick.
00:02:54.000 What happens if I do this?
00:02:55.000 Yeah.
00:02:56.000 No.
00:02:56.000 No, not jujitsu people.
00:02:57.000 And also, it's probably a joke.
00:02:59.000 Yeah, yeah, yeah.
00:02:59.000 If I was fighting for my life, I think it'd be different.
00:03:02.000 But sure, you're a tough guy.
00:03:03.000 You are actually trained martial artists.
00:03:05.000 I mean, I'm not saying it would be easy to grab your tie and choke you to death.
00:03:07.000 What I'm saying is it's one more area of vulnerability that doesn't need to exist.
00:03:12.000 Yeah, but see, I'm disagreeing with you and saying, like, if I was gonna fight to the death, I would wear the suit.
00:03:19.000 Okay.
00:03:19.000 Because then I would look good.
00:03:20.000 Let me tell you something about CIA agents and Secret Service guys.
00:03:23.000 They wear breakaway ties.
00:03:25.000 That's because they're not good martial artists.
00:03:27.000 Oh, that's not true.
00:03:28.000 There's a lot of those guys are savages.
00:03:30.000 Are they?
00:03:31.000 Fuck yeah, man.
00:03:31.000 You mean like blue belts or purple belts?
00:03:33.000 No, black belts, man.
00:03:34.000 If you're a fucking, if you're a Secret Service guy and you're supposed to be protecting the president, I guarantee you, a bunch of those guys are savages.
00:03:40.000 I think they're smart enough to use guns.
00:03:42.000 That too.
00:03:43.000 Yeah.
00:03:43.000 But they don't, you know, if they have to wear a tie, a lot of people like to wear breakaway ties.
00:03:47.000 Is that a fact?
00:03:49.000 Mm-hmm.
00:03:49.000 I might be making this up right now.
00:03:50.000 I think that's a bro fact.
00:03:51.000 No, I know, but it's a little bro fact, but only like 10%.
00:03:55.000 I think, let's Google breakaway ties for self-defense.
00:04:02.000 Because, dude, look, I'm definitely a dummy, right?
00:04:06.000 Okay, I think about this stuff too much.
00:04:07.000 But when I was driving limos, I always felt super vulnerable when I had to wear that tie.
00:04:13.000 It looks good, though.
00:04:16.000 Actually, my first album, my first real album that I ever did for Warner Brothers was in 1999, and I wore exactly that outfit.
00:04:24.000 I wore a black suit with a white shirt and a black tie, and it looked dope.
00:04:29.000 It's called I'm Gonna Be Dead Someday.
00:04:31.000 Like a stand-up?
00:04:33.000 Mm-hmm.
00:04:34.000 Look at that.
00:04:34.000 Breakaway tie, son.
00:04:35.000 Low pro.
00:04:36.000 Breakaway tie.
00:04:37.000 Woo!
00:04:38.000 That's what I'm wearing.
00:04:38.000 Let me explain to you something.
00:04:39.000 Come get some.
00:04:40.000 Most people, when they're vulnerable, like say, I'm afraid I'm going to be picked on by bullies.
00:04:46.000 I learned a martial art how to defend myself.
00:04:49.000 Yes.
00:04:49.000 You, when you felt vulnerable wearing a tie, decided not to wear a tie as opposed to learn how to defend yourself while wearing a tie.
00:04:58.000 There must be a system.
00:04:59.000 I guarantee you.
00:05:00.000 Sure.
00:05:01.000 You could say that.
00:05:02.000 You could defend yourself if you had a dog collar around your neck too, but I wouldn't recommend it.
00:05:06.000 I wouldn't recommend it.
00:05:07.000 I took my dog out once.
00:05:10.000 I had a pit bull and he bit my cat.
00:05:12.000 He grabbed ahold of my cat.
00:05:13.000 It's a terrible story.
00:05:14.000 I had a crazy dog.
00:05:15.000 One of my dogs was a dog that I had gotten.
00:05:19.000 I was young and irresponsible in my 20s.
00:05:21.000 And I had gotten this dog that was bred from a pig hunting dog in Hawaii.
00:05:28.000 Wow.
00:05:28.000 And those dogs are hyper animal aggressive.
00:05:32.000 They're great with people.
00:05:33.000 He was great with people.
00:05:34.000 He loved people.
00:05:35.000 But everything that moved, he was like locked in on.
00:05:38.000 He would spend his days in my yard chasing lizards.
00:05:41.000 His thing was to jump up on the wall of the house and try to snatch lizards.
00:05:45.000 It was like a video game for him.
00:05:47.000 And my friend Eddie was terrified of this dog.
00:05:50.000 Eddie Bravo?
00:05:51.000 Yeah, and so Eddie would come over the house, and Frank would just decide that he runs shit when Eddie's around, because Eddie was so scared of him.
00:05:57.000 He'd be like, hmm, I think I'm gonna kill this cat.
00:05:59.000 So he just tried to kill my cat, and I got a hold of him in time, and I got my hand into his collar, and I choked him unconscious.
00:06:07.000 Like on top of his head like that?
00:06:08.000 Yeah, I just dug...
00:06:09.000 Oh, from behind?
00:06:09.000 Yeah, from behind, I just dug my hand under his collar, and I twisted, and I put him to sleep.
00:06:14.000 He went right out.
00:06:15.000 It's crazy.
00:06:16.000 Yeah, it works.
00:06:16.000 It works on dogs.
00:06:18.000 Yeah, I wasn't thinking from the back, I was thinking from the front.
00:06:22.000 Anywhere you can grab a dog collar, if you get your hand in there, if you're strong enough and you have good technique, you know how to go knee on belly, and then you twist it.
00:06:30.000 You can put a dog to sleep.
00:06:32.000 Wow, you're changing my mind.
00:06:34.000 See?
00:06:35.000 Yeah, but I don't know.
00:06:37.000 I mean, look, obviously you're going to be aware of that and you're going to defend, but it's an area of vulnerability.
00:06:44.000 Right.
00:06:45.000 Pull up 1999 and pull up I'm Gonna Be Dead Someday, the cover of that, because I'm literally dressed exactly like you.
00:06:51.000 On the cover or when actually doing the show?
00:06:55.000 No, I never wore it doing a show.
00:06:57.000 I think I just wore it for the cover.
00:06:59.000 Almost ironically.
00:07:01.000 No, I kind of like the way it looked.
00:07:03.000 You know, there it is.
00:07:04.000 Bam.
00:07:05.000 It's hard to tell there because that one's orange and the other one's hot pink.
00:07:09.000 But it's like the shirt collar's a little more open, like you don't give a damn.
00:07:15.000 Well, that was a long day, and there was a long photo shoot, and we were drinking.
00:07:19.000 Yeah, that's what it looks like.
00:07:20.000 There was a lot of chaos involved.
00:07:22.000 That's legit.
00:07:23.000 It's legit.
00:07:24.000 There was a lot of stuff going on there.
00:07:26.000 But I like that look.
00:07:30.000 It's a good look.
00:07:30.000 By the way, congratulations on the 10 years.
00:07:34.000 Oh, thank you very much.
00:07:34.000 I don't think you've celebrated.
00:07:36.000 All I see is on Jamie's Instagram like a naked picture of Bert.
00:07:41.000 The 10-year picture?
00:07:44.000 Yeah, we probably should do something.
00:07:46.000 It was December was officially 10 years, so it was two months ago.
00:07:50.000 Probably should have some sort of a party or something.
00:07:52.000 I know you don't like to talk about it, think about it, but you've inspired millions.
00:07:56.000 It's very nice.
00:07:58.000 It's a very nice side effect, but it's a weird gig, man.
00:08:02.000 It's a gig that became what it is.
00:08:05.000 Slowly, without me understanding what was happening, why it was happening, which makes it weirder and weirder.
00:08:10.000 And with it has come increasingly stronger levels of responsibility.
00:08:17.000 To where, you know, now I have to actually vet guests and think about what they're saying, whereas before I would have someone on if they're crazy, I was like, let that crazy motherfucker on, let's hear what he has to say.
00:08:26.000 And people would say a lot of crazy shit, and then they would say, oh, you know, you didn't push back, or you had this person on, and they said something irresponsible, and I had no idea what they were going to say.
00:08:37.000 There's a lot of people that have said some pretty outrageous things that I had no idea they were going to say.
00:08:41.000 Yeah, I saw the...
00:08:42.000 One of the things you inspired me to do is to start a podcast on artificial intelligence.
00:08:49.000 And I have Jack Dorsey as a guest coming up.
00:08:52.000 And that's a good example of somebody you got an insane amount of pushback on.
00:08:58.000 Yes, because they were mad that I didn't talk to him about censorship.
00:09:00.000 My take on it was...
00:09:04.000 It was certainly irresponsible on my part, the first podcast.
00:09:08.000 Because my take on it was, I just want to see what it's like to be a guy that starts this thing and it becomes probably one of the most important conversation tools the world has ever known.
00:09:21.000 And also along the way, it becomes something weird.
00:09:28.000 Like, now it's weird.
00:09:29.000 Twitter now is just this...
00:09:31.000 It's just 50% hot dumpster fire.
00:09:36.000 It's so much.
00:09:37.000 Yeah, but it's also amazing, inspiring stuff.
00:09:39.000 You can always find the dumpster fire in all kinds of conversations.
00:09:42.000 Yes.
00:09:42.000 The confusing thing to me about your conversation with Jack, which I didn't look at the internet before I listened to it, and I really enjoyed it.
00:09:51.000 It was interesting.
00:09:51.000 I learned a lot from your first conversation with Jack.
00:09:54.000 And then I looked at the internet that told me I'm supposed to hate that conversation.
00:09:59.000 And what I'm confused about is why.
00:10:03.000 Why is there such hatred thrown towards...
00:10:07.000 I also talked to the head of the YouTube algorithm, Search and Discovery.
00:10:12.000 A lot of hate towards YouTube.
00:10:14.000 A lot of hate towards Twitter.
00:10:16.000 A lot of hate towards Facebook.
00:10:17.000 And deservedly so.
00:10:18.000 There's some challenges and so on.
00:10:20.000 But they're doing like an incredible service.
00:10:22.000 And the algorithm they're trying to develop and control is really hard to develop and control.
00:10:28.000 Yes, for sure.
00:10:29.000 So the pushback that people get, it's almost like they're taking specific anecdotal pieces of evidence.
00:10:37.000 Or look, this person said this and it's...
00:10:41.000 It's not that problematic in our eyes, but they somehow got censored from the platform, removed from the platform.
00:10:47.000 And they don't look at the bigger picture of how challenging the entirety of it is and how incredible...
00:10:53.000 First of all, how incredible the platform is to have a conversation, like a global conversation like this, and how hard it is to do to achieve the goal of having...
00:11:02.000 It sounds like cheesy, but having a healthy conversation, a healthy discourse.
00:11:08.000 Because...
00:11:09.000 You want an algorithm and a platform that removes the assholes from the scene because it's a really difficult challenge because one person who's really loud, who's screaming in the room, comes to the party.
00:11:24.000 You have a cool party, a bunch of cool people, some communists, some right-wingers, whatever.
00:11:29.000 It doesn't matter.
00:11:30.000 They can all disagree.
00:11:31.000 But they're not assholes.
00:11:32.000 They're there to have an interesting debate, conversation, and so on.
00:11:36.000 And then there's somebody that comes and just starts screaming one slogan or something like that.
00:11:43.000 Or is trolling, is completely non-genuine in their way of communication.
00:11:48.000 They're destroying the nature of the conversation.
00:11:51.000 And then, of course, that person, if they get, you know, the bodyguards come in and say, can you please leave the party, sir?
00:11:58.000 Then they get extremely, that's exactly the kind of personality that's extremely upset.
00:12:03.000 And sometimes they almost look for that.
00:12:05.000 So what are you supposed to do as Jack Dorsey, as a leader of that kind of platform?
00:12:10.000 It's a very good question and I really think that there's no real answer.
00:12:14.000 It's one of the reasons why it's so frustrating.
00:12:16.000 You know, if you just let people say whatever they want whenever they want to, there's gonna be a lot of people that get turned off to that kind of a platform because you're gonna have a lot of people yelling out racial slurs, ethnic slurs,
00:12:32.000 gender slurs, homophobic slurs. There's going to be a bunch of people that are trolling.
00:12:38.000 There's going to be a bunch of people that just say things to rile people up and that's all they do.
00:12:42.000 There's going to be a bunch of people that just want to shit stir and they want to dox people.
00:12:46.000 So then you have to set parameters.
00:12:48.000 Like what are the parameters?
00:12:49.000 You can't dox people.
00:12:51.000 You can't – don't say racial slurs.
00:12:54.000 Don't say ethnic slurs.
00:12:56.000 It's you're managing at scale and you're managing an insane amount of people.
00:13:01.000 But then there's legitimate criticism that they lean towards progressive people and liberal people and they have woke politics.
00:13:12.000 Like, for instance, you can get banned from Twitter for life if you dead name someone.
00:13:15.000 So Lex, if you became a female and you changed your name to Ally, and I just said, fuck you, man, you're Lex. Banned for life.
00:13:27.000 That's what a dead name is. That's dead naming. Like if you wanted to call Caitlyn Jenner Bruce on Twitter, you would be dead naming her, and you would get banned for life. A woman named Meghan Murphy, who is a TERF... Do you know what a TERF is?
00:13:43.000 What do you think?
00:13:43.000 I don't know what a TERF is.
00:13:45.000 I'm sure you don't.
00:13:45.000 You're two balls deep in science.
00:13:47.000 Yeah.
00:13:48.000 TERF is trans-exclusionary radical feminist.
00:13:52.000 So trans-exclusionary...
00:13:54.000 Why do I have such a hard time with that word?
00:13:57.000 Exclusionary, right?
00:13:59.000 Exclusionary.
00:13:59.000 Why does it sound wrong?
00:14:01.000 Exclusionary sounds wrong.
00:14:03.000 What does it mean to be exclusionary to trans?
00:14:08.000 Turfs do not want trans people to have a say in women's issues.
00:14:16.000 I see.
00:14:16.000 They think that they are a different thing.
00:14:19.000 That there's women and women's issues and these feminists that have been female their whole life dealing with women's issues do not want trans people coming in and in many cases what you find is that trans people come in and then the conversation changes and it becomes about trans issues and they want these conversations to be about women's issues in feminist movements.
00:14:41.000 It's complicated, right?
00:14:42.000 She got banned from Twitter for life for saying a man is never a woman.
00:14:48.000 They made her take the tweet down, so she took a screenshot of it, took it down, and then put the screenshot back up, and then they banned it for life.
00:14:56.000 Should she get banned?
00:14:57.000 No!
00:14:58.000 No, she shouldn't, because biologically she's correct.
00:15:01.000 If there's an argument there, if there's an argument, a scientific argument, a man is never a woman, but can a man identify as a woman, and should you respect him? I'm not too deep into thinking about these specific issues,
00:15:21.000 but the question is whether you should get banned for being an asshole or you should get banned for lying.
00:15:27.000 Because I think lying is okay.
00:15:30.000 A lot of people lie on Twitter.
00:15:34.000 Insult.
00:15:34.000 You can insult people on Twitter as long as you're not specific about their gender.
00:15:38.000 The insult thing, that's where it gets, it's the party thing.
00:15:41.000 If you have the asshole douchebag, whatever term you want to use, they show up to the party.
00:15:46.000 And then if a person shows up to the party and a lot of people leave because they're annoying or whatever, that should be, like we should do something to discourage that behavior.
00:15:57.000 That's a good point.
00:15:58.000 However, let's paint a different picture of a party.
00:16:01.000 Let's have a party where everyone says, my pronouns are they, them, and zzer, and javu, and then you come in, you go, come on, bro, you're a guy.
00:16:13.000 And like, no, no, no, I'm a they, you fucking cisgendered, heteronormative piece of shit.
00:16:19.000 And then they want to kick you out of the party.
00:16:21.000 Yeah.
00:16:22.000 All you're saying is you're a guy.
00:16:23.000 Ban both of them.
00:16:25.000 No, wait.
00:16:25.000 Ban the person who's not open-minded or respectful for the...
00:16:31.000 Don't ban people.
00:16:34.000 No, no, no.
00:16:36.000 So, of course, it's been well documented by people now.
00:16:41.000 The reason we probably have the current president is that the people on the left are very also rude and disrespectful.
00:16:47.000 It's a small percentage of the people on the left.
00:16:49.000 It's very small.
00:16:49.000 This is part of the real issue.
00:16:51.000 They're all on Twitter.
00:16:52.000 They are all on Twitter, but it's also the small percentage when you...
00:16:56.000 It's so hard to have a group and call that group the left because the variables are so extreme.
00:17:06.000 There's so many different people that follow politics or that espouse to certain belief systems that recognize themselves as left.
00:17:16.000 Funny enough, you're probably on the left.
00:17:18.000 Yes, I'm very much on the left.
00:17:20.000 But I don't get considered to be on the left because I'm a cage fighting commentator.
00:17:25.000 With an American flag behind you.
00:17:27.000 Yeah, I'm very bro-ish.
00:17:28.000 I hunt.
00:17:29.000 I bow hunt, which is even more bro-ish.
00:17:32.000 And I am unabashedly masculine.
00:17:35.000 I'm a man.
00:17:36.000 And a comedian.
00:17:37.000 Yes, and I'm a dirty comedian, and I make fun of everything, including sacred cows, like gender, homosexuality, heterosexuality, my own kids, my wife, my mom, everybody.
00:17:49.000 I make fun of everybody.
00:17:50.000 And if you take that stuff out of context and just publish a bunch of it, it makes you look like a moron, or it makes you look like an asshole.
00:17:59.000 That's, you know, what is the left, right?
00:18:03.000 What is the left?
00:18:04.000 In my mind, the left, when I was a child, I always thought of the left because I grew up, my parents were hippies, right?
00:18:10.000 My stepdad was an architect and before that he was a computer programmer.
00:18:16.000 He had long hair until I was, I think I was 20 years old when he cut his hair.
00:18:21.000 I mean long, like down to his ass like a Native American.
00:18:24.000 Nice.
00:18:24.000 And he, you know, they always, he smoked pot when I was little.
00:18:29.000 I mean, he...
00:18:31.000 I was always around hippies.
00:18:33.000 I lived in San Francisco from the time I was seven till I was eleven.
00:18:38.000 And my family was very left-wing.
00:18:40.000 They were always pro-gay marriage, pro-gay rights, pro-racial equality, pro...
00:18:49.000 Just name it, man.
00:18:52.000 Pro-welfare, pro...
00:18:56.000 The idea was open-mindedness, education, all these things are good.
00:19:01.000 And war was bad.
00:19:04.000 There's a lot of things that maybe they had very strong beliefs on that maybe they weren't entirely nuanced on as well.
00:19:17.000 You find that about people on the left as much as you find that about people on the right.
00:19:22.000 But it's...
00:19:23.000 The radicals on both sides.
00:19:26.000 There's nothing wrong with being conservative, right?
00:19:28.000 There's nothing wrong with valuing hard work.
00:19:30.000 There's nothing wrong with someone who values fiscal frugality or someone who is, you know, you have a conservative view on economics or on social policies.
00:19:43.000 You know, and you want less government.
00:19:45.000 There's nothing wrong with those things either.
00:19:47.000 Yeah, that's when you get extreme.
00:19:48.000 Yeah.
00:19:49.000 That was an amazing guest you had recently, the guy that converted a bunch of folks from the KKK. Daryl Davis.
00:19:54.000 Yeah, wow.
00:19:55.000 Fuck.
00:19:55.000 This is him right here.
00:19:56.000 This is his CD. He's amazing.
00:19:58.000 He's an incredible human, man.
00:19:59.000 But that kind of thinking, I wish you saw more of that in politics.
00:20:03.000 Sort of like, not, even if you're on the left, to be, to talk to people on the right.
00:20:07.000 Right.
00:20:08.000 Instead of just shut them out, that's the problem with this idea of kicking people out of the party.
00:20:13.000 You kick people out of the party, guys like Daryl Davis never get to convert them.
00:20:17.000 There's been people from Twitter that have been converted.
00:20:20.000 You know, Megan Phelps is a famous one.
00:20:22.000 She was a part of the Westboro Baptist Church.
00:20:24.000 Her grandfather was Fred Phelps, that fucking famous crazy asshole who was like super rude, like who, you know, would make them...
00:20:33.000 Take those signs that say God hates fags and literally go to soldiers' funerals and say that soldiers died because God is angry that people are homosexual.
00:20:45.000 So Megan was completely entrenched in this toxic ideology.
00:20:51.000 And Twitter allowed her to escape that.
00:20:52.000 Yeah.
00:20:52.000 Yes.
00:20:53.000 She met her husband on Twitter from arguing.
00:20:55.000 It's a beautiful thing.
00:20:55.000 Back and forth.
00:20:56.000 And now she's out.
00:20:57.000 And if you talk to her, you would never believe it.
00:21:00.000 And man, not that long ago either.
00:21:02.000 Not that long ago, she was in that church like six years ago.
00:21:04.000 It's kind of incredible that you can sort of outgrow that mindset.
00:21:08.000 So no matter, I mean, that's inspiring that you can hold a mindset of hatred and outgrow, escape it.
00:21:13.000 Well, she was indoctrinated into it from the time she was a child.
00:21:17.000 And, you know, for her, it was the only life she knew, right?
00:21:20.000 Her family is in that.
00:21:22.000 And for her, she just, I mean, by whatever, for whatever grace of the grand universe plan, she had enough open-mindedness to take into consideration some of these other things that people were saying.
00:21:36.000 Man.
00:21:37.000 We have a problem today with cancel culture.
00:21:40.000 It's a real problem, is that you just want to write people off.
00:21:42.000 Well, those people still exist.
00:21:44.000 It's basically a cultural form of euthanasia.
00:21:47.000 You just want to go out and whack everyone who doesn't agree with you.
00:21:50.000 But if you do that, whether it's eugenics or whatever you want to call it, you just eliminate everyone who's not the way that you like.
00:22:01.000 Culturally eliminate them.
00:22:02.000 Take them out of the conversation.
00:22:03.000 They still exist.
00:22:05.000 They still exist.
00:22:06.000 So what happens then?
00:22:07.000 Well, then they're angry.
00:22:08.000 They're angry.
00:22:09.000 They're left out of the conversation and they don't grow.
00:22:12.000 And then you've written them off as a human being.
00:22:13.000 You said that they're 100% bad.
00:22:16.000 Now, if you had a spectrum of people in this world, 100% bad and 100% good.
00:22:21.000 I mean, there are some beautiful people that really are 100% good.
00:22:25.000 Like my friend Justin Wren, who runs Fight for the Forgotten Charity, he's about as close to 100% good as you can get.
00:22:31.000 I mean, this beautiful person goes to the Congo and makes wells for the pygmies and gets malaria.
00:22:39.000 He's got some crazy parasite now that they don't even know what it is.
00:22:44.000 They can't recognize it.
00:22:44.000 He's been suffering for eight months now, I think.
00:22:47.000 That's about as good as you can get, right?
00:22:50.000 And then there's people, you know, you could...
00:22:52.000 It's a gray area when you start to drift away from the...
00:22:55.000 I have the same thing in my...
00:22:57.000 That's the focus I have in the academic setting of science.
00:23:02.000 That's the inspiration of your podcast that you gave me, is to talk outside the people that are sort of conventionally accepted by the scientific community.
00:23:12.000 Like a little bit on the fringes.
00:23:13.000 On the quote-unquote fringes.
00:23:15.000 So you have the same thing in machine learning and artificial intelligence.
00:23:18.000 There's people that are working on specific, it's called deep learning, these learning methodologies that are accepted.
00:23:25.000 There's conferences and we all kind of accept the problems we're working on and there's people a little bit on the fringes.
00:23:30.000 There's people in neuroscience, actually anybody thinking about working on what's called artificial general intelligence is already on the fringes.
00:23:39.000 If you even raise the question, okay, so how do we build human level intelligence?
00:23:44.000 That's a little bit of a taboo subject.
00:23:46.000 The consciousness is called the C word for a while.
00:23:50.000 Consciousness.
00:23:51.000 Really?
00:23:51.000 Well, it's scientists.
00:23:53.000 I know.
00:23:53.000 I understand.
00:23:54.000 Explain it to me.
00:23:56.000 What's the aversion?
00:23:57.000 What is everyone worried about?
00:24:01.000 What are they worried about?
00:24:04.000 It's this culture of rolling your eyes the same way you might roll your eyes if somebody tells you the earth is flat.
00:24:12.000 They sort of put all other things in that category as well.
00:24:17.000 It's like, well, okay, whatever.
00:24:19.000 So in the case of consciousness, we really don't understand very much at all what consciousness is.
00:24:24.000 What the subjective experience, the fact that it feels like something to take in the world, that it's not just...
00:24:31.000 Raw sensory information being processed.
00:24:34.000 It actually feels like something to touch something, to taste something, to see something.
00:24:39.000 It's like incredible.
00:24:41.000 David Chalmers calls it the hard problem of consciousness.
00:24:44.000 Why do we feel it?
00:24:45.000 Okay.
00:24:46.000 But we don't have scientific, physics, engineering methods of studying consciousness.
00:24:50.000 So it immediately gets put into this bin that it's not an okay thing.
00:24:56.000 Like you're a little bit crazy.
00:25:00.000 Daryl was saying that that's a slur.
00:25:02.000 I never even thought of that.
00:25:03.000 I didn't think of what that meant.
00:25:05.000 Yeah, so they already put in this bin of you're not a legitimate researcher.
00:25:09.000 And the same kind of, you know, and I think we're now in a culture which is great.
00:25:15.000 You know, Eric Weinstein is good at this.
00:25:16.000 I'm hoping to be good at this.
00:25:18.000 You're good at this, at allowing those people on the fringes in and saying, what are your ideas?
00:25:23.000 Exploring those.
00:25:24.000 Of course, you have a greater and greater platform to where there is a line.
00:25:28.000 You don't want too far on the fringes.
00:25:31.000 Yeah, that's something I'm aware of now that I wasn't aware of, say, like three or four years ago.
00:25:37.000 And I used to have a lot of those...
00:25:39.000 I've had some people on that I would never have on again.
00:25:42.000 And then I've had some people on that I've been criticized for having them on.
00:25:46.000 I'm like, okay, I see why you are upset, but I think there's value in having conversations with people that are on the fringes.
00:25:54.000 There's people that are bad faith actors, right?
00:25:57.000 They act in bad faith.
00:25:58.000 Those are the ones you have to be careful of.
00:26:00.000 And sometimes you don't know who they are until you get to know them.
00:26:03.000 And then you've already kind of opened the door.
00:26:05.000 Like for some people, for like the Democratic, the legitimate 70-year-old-plus Democratic Party, Tulsi Gabbard is on the fringe.
00:26:14.000 Yeah.
00:26:14.000 Right?
00:26:15.000 But I think you having her on is great.
00:26:18.000 It's exploring, you know… She's one of the young minds exploring sort of the role of the United States, the foreign policy in the world, militarily, in terms of trade and so on.
00:26:29.000 So she has an excellent mind who I don't think is on the fringe.
00:26:33.000 I don't think she's on the fringe either.
00:26:35.000 Bernie Sanders for many people still is on the fringe.
00:26:38.000 Yeah.
00:26:40.000 I think he gets misrepresented, though.
00:26:42.000 Yeah, for sure.
00:26:43.000 One of the things that was tremendously beneficial for me is to sit down with him for hours and have a conversation.
00:26:50.000 And you go, oh, you're a real person.
00:26:53.000 You're not this wacky guy yelling about billionaires.
00:26:56.000 When you get these 90-second soundbites and these debates, you don't get a chance to know who someone is.
00:27:01.000 Yeah, so I used to listen to this...
00:27:04.000 I listen to a lot of radio on the left and the right to try to take in what people are thinking about.
00:27:09.000 I used to listen to this program, I think it's called the Tom Hartman program.
00:27:13.000 He's like a major lefty.
00:27:16.000 But he had this segment called Brunch with Bernie.
00:27:20.000 And he would invite Bernie Sanders like every Friday or something like that.
00:27:24.000 And just sort of the intellectual honesty and curiosity that Bernie exhibited was just fascinating.
00:27:31.000 Sort of like, as opposed to being a political thing that just repeats the same message over and over, which actually what it kind of sounds like when you listen to him now publicly, he's actually a thinking individual and somebody who's open and changing his mind, but within that has just completely been consistent.
00:27:50.000 What people are terrified of is that he's going to raise taxes on successful people and ruin business.
00:27:57.000 That's what people are worried about, that in doing that, it will crash the economy.
00:28:01.000 I don't know if they're right.
00:28:03.000 I don't even know if they're...
00:28:05.000 So first of all, the people are using the word socialist.
00:28:08.000 So you're saying, he's a socialist.
00:28:10.000 Do you really want socialism?
00:28:12.000 America is a great country because we're a capitalist kind of thing.
00:28:16.000 From my perspective, I think we already have a huge number of socialists.
00:28:20.000 Well, he's a democratic socialist.
00:28:22.000 It's a different perspective.
00:28:25.000 He just values workers.
00:28:27.000 The idea is he wants people to earn a living wage.
00:28:30.000 He wants people to not be indebted with a tremendous amount of student loan debt when you're just 21 years old and getting out of college.
00:28:39.000 He thinks it's insane, and I agree with him.
00:28:40.000 He doesn't want people to be burdened in this insane way if you ever get sick, and I agree with him.
00:28:47.000 He wants to improve the healthcare system.
00:28:48.000 I think as a community, if we're looking at the United States as a community, one of the things that, you know, look, it's great to support business.
00:28:56.000 It's great to have a strong economy.
00:28:58.000 It's great to give business the confidence to take chances, and a lot of people think Donald Trump does that.
00:29:04.000 It's also great to take care of our own and I don't think we do that enough.
00:29:09.000 I don't think we take care of our own enough, in terms of we have the same problems in the same inner cities that we've had for decade after decade after decade, and there's no significant attempt to change that. But meanwhile, we do these nation-building projects in other countries, and we have this interventionist foreign policy where we go and invade these countries and try to prop up new governments and try to support them, and we spend insane amounts of money doing that, and all the while we don't do
00:29:39.000 anything about our inner cities that are the exact same fucked up places that they were in the 70s and in the 60s.
00:29:45.000 Do you know who Michael Wood Jr. is?
00:29:48.000 No.
00:29:49.000 He was on the podcast a couple of times, and he used to be a police officer in Baltimore.
00:29:54.000 Yes, I know him.
00:29:55.000 Okay, so I listen to that podcast.
00:29:57.000 I'm just horrible with names.
00:29:58.000 His experience was, first of all, he found a piece of paper that showed a crime docket from the 1970s, all the stuff like drugs, crime, robbery.
00:30:12.000 It was all the same issues in the same neighborhoods that he was patrolling in today.
00:30:18.000 And he was like, holy shit.
00:30:20.000 And he realized like, oh, this is a quagmire.
00:30:23.000 And then he found out about the laws that were in place from way back in the day where you literally, if you were an African American, you couldn't buy a home in certain areas.
00:30:35.000 They had, what is that term?
00:30:37.000 Is it redlined?
00:30:38.000 Is that what the term is?
00:30:40.000 Where they designate certain areas where they literally won't sell homes to black people.
00:30:46.000 And he was becoming aware of this shit as he was a cop.
00:30:51.000 And, you know, in the beginning, he was all gung-ho.
00:30:54.000 He was like, I'm a cop.
00:30:55.000 You know, I'm here to bust bad guys and do the right thing.
00:30:58.000 And then along the way, he kind of recognized, you're dealing with systemic racism.
00:31:03.000 Redlining?
00:31:04.000 Redlining.
00:31:04.000 Yeah.
00:31:05.000 So, that hasn't been addressed.
00:31:08.000 It's all about, I mean, there's a million other things at home.
00:31:10.000 Education, everything.
00:31:11.000 Yes, all those things.
00:31:12.000 I think Bernie Sanders, when he talks about those things...
00:31:16.000 He seems like a guy who really cares about education, healthcare, and people that live in poverty.
00:31:23.000 Yeah.
00:31:24.000 I don't know if he's going to be able to do anything.
00:31:26.000 I don't know.
00:31:27.000 That's the main thing is like people say democratic socialist and so on is going to...
00:31:31.000 He's going to make a slight move into whatever direction he's trying to advocate, which in this case is more investment into the infrastructure and so on into our at home.
00:31:41.000 But like, you know, he's just one human being.
00:31:44.000 There has to be a Congress that represents the people.
00:31:47.000 And if there's anything, I think Congress is probably the most hated entity in all of the universe.
00:31:52.000 Like you look at all the polls of what people like and hate.
00:31:55.000 Like rats are above in terms of favorability ratings.
00:31:59.000 So Congress is really the broken system.
00:32:02.000 Bernie won't be able to do much, except take a little...
00:32:05.000 The role of the president, as I see it, is to...
00:32:09.000 One, the terrifying one, is to start wars.
00:32:12.000 And so it's a very serious responsibility you have to take.
00:32:16.000 And the second is to inspire the population.
00:32:18.000 In terms of executive power of enacting laws, there's not much power.
00:32:22.000 All you can do is...
00:32:25.000 What our current president is doing, sort of...
00:32:29.000 Inspiring, in that case, the Republicans in Congress to sort of work together to work on certain legislations.
00:32:37.000 So you can inspire the Congress and you can inspire the people, but you don't have actual direct power.
00:32:42.000 So Bernie is not going to turn America into a socialist country. He's going to take a small step, maybe focusing on one aspect, like healthcare or something like that,
00:32:57.000 like President Obama did, and try to make a little change.
00:33:01.000 So in that sense, people that are genuine and have ideas, like Andrew Yang is another one.
00:33:10.000 He has a ridiculous number of ideas.
00:33:12.000 I don't know if you've seen, but he thinks all cops should be purple belts in jiu-jitsu.
00:33:16.000 Yeah, I like it.
00:33:17.000 I'm like, go Andrew!
00:33:18.000 Fuck yeah!
00:33:19.000 He has a million other ideas like it.
00:33:20.000 He does!
00:33:21.000 Well, he's a genius.
00:33:22.000 I mean, he's a brilliant guy.
00:33:24.000 And he's an entrepreneur, so he comes at this stuff from a different angle.
00:33:28.000 Yeah, and he's open-minded.
00:33:29.000 Yes.
00:33:31.000 Well, I disagree with him on his evaluation of the state of artificial intelligence and automation in terms of its capabilities and having an impact on the economy.
00:33:42.000 You don't think it's going to be as much of a deal as he thinks it is?
00:33:45.000 On the time scale that he thinks it is.
00:33:47.000 But I also want to be careful sort of commenting on that because I think for him it's a tool to describe the concerns, the suffering that people go through in terms of losing their job, like the pain that people are feeling throughout the country.
00:34:04.000 It's like a mechanism he uses to talk to people about the future.
00:34:08.000 You know, there are people that are well off, like the different tech companies that should also contribute to investing in our community.
00:34:15.000 I mean, the specifics, I want to kind of sit back and relax a little bit.
00:34:20.000 It's like when you watch a sci-fi movie and the details are all really bad.
00:34:24.000 I want to just suspend disbelief or whatever and just enjoy the movie.
00:34:29.000 In the same way, the stuff he says about AI, he's not very knowledgeable about AI and automation.
00:34:34.000 So it touches me a little bit the wrong way.
00:34:37.000 We're not as far along.
00:34:40.000 The transformative effects of artificial intelligence in terms of replacing humans in trucking, autonomous vehicles, something I know a couple of things about, is not going to be as...
00:34:50.000 I can speak relatively confidently.
00:34:54.000 The revolution in autonomous vehicles will be more gradual than Andrew is describing.
00:35:00.000 But that's okay.
00:35:01.000 He has a million other ideas.
00:35:02.000 And UBI, nevertheless, the universal basic income or some kind of support structure of that kind, nevertheless, could be a very good idea for people that lose their job, for people to be mobile in terms of going from one type of job to another type of job,
00:35:18.000 so continually learning throughout their life.
00:35:20.000 It's just that artificial intelligence, in this case, I don't think will be the enemy.
00:35:25.000 There could be other things that are a little bit sort of neighbors of artificial intelligence, which is sort of the software world eating up some of the mechanization of factories and so on.
00:35:37.000 Maybe the fact that...
00:35:41.000 The kind of way that Tesla and Elon Musk are approaching the design and engineering of vehicles that are a little bit more software-centric will change, will sort of move some of the jobs from Detroit, Michigan, in terms of cars.
00:36:11.000 Yeah.
00:36:17.000 Yeah, so there'll be some job replacement and so on, but it's not this artificial intelligence, trucks will completely replace your job.
00:36:25.000 And in the case of trucks, you know, it's not...
00:36:27.000 There's a lot of complicated aspects about the impact of automation.
00:36:32.000 Sort of trucking jobs, there's actually a lot of need for jobs.
00:36:37.000 Like, there's not the truck...
00:36:40.000 That job, there's already people leaving that job sector.
00:36:44.000 It's a really difficult job.
00:36:45.000 It doesn't pay as well as it should.
00:36:47.000 It's really difficult to train people and so on.
00:36:50.000 So the impact that he talks about in terms of AI is a little bit exaggerated.
00:36:55.000 Like I said, a million really good ideas.
00:36:58.000 He's open-minded.
00:37:00.000 So in terms of, I think, the nice role of a president is to have ideas, like the Purple Belt one, to inspire people and inspire Congress to implement some of those ideas and be open-minded and not take yourself seriously enough to think that you know all the right answers.
00:37:19.000 Andrew Yang, Bernie is like that.
00:37:24.000 Although Bernie is like 78 years old, so he's getting up there.
00:37:29.000 Yeah, look at President Tulsi when he kicks the bucket.
00:37:32.000 You know what?
00:37:35.000 I think Hillary Clinton endorsed Bernie and Tulsi Gabbard for president.
00:37:39.000 Reverse endorsement.
00:37:41.000 Accidentally?
00:37:41.000 Well, yeah.
00:37:44.000 It's just such a petty thing to say that no one likes Bernie.
00:37:48.000 Like, come on, lady.
00:37:49.000 You're in the twilight of your life.
00:37:51.000 I think she's really aware of the fact that if she says something like that, people are going to like Bernie more.
00:37:58.000 I think it's an endorsement.
00:38:00.000 I don't think she has any idea of that.
00:38:02.000 I think she's super insulated.
00:38:04.000 I think she thinks that she can actually hamstring him by saying something like that.
00:38:08.000 And she doesn't understand that it just makes people realize that the things that they say about her are correct.
00:38:12.000 I don't think you gave her enough credit.
00:38:15.000 Really?
00:38:15.000 You gave her credit for killing Epstein.
00:38:18.000 I was joking.
00:38:19.000 I don't think she did it.
00:38:20.000 I think Bill did it.
00:38:20.000 I'm joking too.
00:38:24.000 Somebody did it.
00:38:26.000 I don't know who it was.
00:38:27.000 Maybe it's some scientist character.
00:38:29.000 Maybe he's still alive.
00:38:30.000 Could be.
00:38:31.000 That's what Eddie Bravo thinks.
00:38:32.000 Eddie Bravo thinks he's in Dominican Republic somewhere eating bananas and drinking Mai Tais.
00:38:37.000 It's a conspiracy on the conspiracy.
00:38:39.000 Yeah, well, Eddie's always like that.
00:38:41.000 He's many levels deep.
00:38:42.000 He plays 4D chess when it comes to conspiracies.
00:38:45.000 Do you think that Andrew Yang is off, but ultimately will be correct in terms of the automation timeline?
00:38:53.000 Do you think that maybe...
00:38:55.000 He doesn't know clearly as much as you know about automation and artificial intelligence.
00:39:00.000 But do you think that it's possible that, you know, I think he's looking at a timeline, I think he was thinking within the next 10 years, millions and millions of jobs are going to be replaced.
00:39:09.000 Do you think that it's more like 20 years or 30 years?
00:39:12.000 Yeah.
00:39:13.000 But still something to be concerned?
00:39:14.000 Exactly.
00:39:14.000 So the timeline, of course, nobody knows.
00:39:16.000 But I think the timeline is much – the timescale is more stretched out.
00:39:20.000 So 20, 30 years.
00:39:21.000 And it will continue.
00:39:23.000 There will be certain key revolutions.
00:39:26.000 And those revolutions, it's an incorrect word to use, but they will be stretched out over time.
00:39:32.000 I think the autonomous vehicle revolution is something – To achieve a scale of millions of vehicles that are fully autonomously navigating our streets, I think is 20, 30 years away.
00:39:44.000 And it won't be like all of a sudden.
00:39:46.000 It'll be gradual.
00:39:47.000 It'll be people like the former Google self-driving car project, the Waymo company, who's doing a lot of testing now. Incredible engineers.
00:39:56.000 I visited them for a day.
00:39:57.000 It'll be expanding their efforts slowly.
00:39:59.000 They're doing also way more trucks, autonomous trucking.
00:40:02.000 They're already deploying them in Texas, I think.
00:40:04.000 And then, of course, Tesla, who's this year going to approach a million vehicles, and they're trying to achieve full self-driving capability.
00:40:13.000 But that's going to be gradual.
00:40:15.000 I just got a new update for the Tesla.
00:40:18.000 Uh-oh.
00:40:18.000 Some new self-driving update.
00:40:20.000 It costs four grand.
00:40:22.000 And I was like, what is it?
00:40:25.000 But I think I was high.
00:40:27.000 And I was looking at my phone, and I was like, hmm, okay, let's do it.
00:40:31.000 Let's do it.
00:40:31.000 And so I got this update, but I'm like, what did I just pay for?
00:40:34.000 And I don't even know if I'm going to use it, but I think it can change.
00:40:39.000 I think it does everything.
00:40:40.000 I think it changes lanes.
00:40:41.000 Yeah.
00:40:42.000 Well, okay.
00:40:43.000 So I'm not exactly sure what the update is, but it's probably...
00:40:45.000 See if you can find out, Jamie.
00:40:46.000 So it's probably the quote-unquote full self-driving.
00:40:50.000 Yeah.
00:40:50.000 Very important.
00:40:51.000 I'm the safety person, I guess, on this podcast.
00:40:55.000 Tesla cannot drive itself fully, autonomously.
00:40:59.000 You have to keep your eyes on the road, always pay attention.
00:41:01.000 But I saw a guy sleeping on the internet, and he was fine.
00:41:03.000 Yeah, well...
00:41:04.000 In a car.
00:41:05.000 Yeah.
00:41:05.000 Out cold?
00:41:06.000 I'll look into it.
00:41:09.000 Was it on CNN? No, it was someone filmed the guy.
00:41:13.000 He was in his car, passed out.
00:41:15.000 Not just one.
00:41:15.000 There's been a few examples of that.
00:41:17.000 People commuting on their way to work, so out cold.
00:41:20.000 So some are for fun and fake, but it's certainly a real thing that you pass out and sleep.
00:41:25.000 We do that with manual-driven cars, too.
00:41:27.000 I enjoy driving home in my Tesla from the Comedy Store at like 1 o'clock in the morning, hitting that autopilot.
00:41:34.000 And I keep my hand on the wheel, but it's a level of relaxation.
00:41:38.000 Keep your eyes on the road?
00:41:39.000 Yes!
00:41:39.000 I'm not looking at my phone or anything stupid, but it's just like a doo-doo.
00:41:43.000 You press that double button, and I just...
00:41:46.000 And it changes lanes.
00:41:48.000 Oh, it doesn't change lanes.
00:41:49.000 It stays in the lane.
00:41:50.000 It can change lanes, but I think you have to prompt it.
00:41:56.000 There's an option for navigate on autopilot.
00:41:59.000 It'll take you everywhere you need to go.
00:42:01.000 But I think you need to step in at certain points.
00:42:03.000 Yeah, and you actually...
00:42:04.000 So now it can change lanes without you pressing...
00:42:08.000 That's what it is now?
00:42:09.000 Yeah, so you can do it automatically.
00:42:11.000 And they're doing hundreds of thousands, I think.
00:42:13.000 They're tracking the number of automated lane changes.
00:42:15.000 First of all, incredible that this is possible.
00:42:19.000 There's hundreds of thousands of automated lane changes without human initiation happening right now.
00:42:25.000 I mean, to me as a sort of a robotics person, it's just incredibly...
00:42:29.000 Here it is from WholeSnack on Twitter.
00:42:42.000 Right.
00:42:47.000 Wow, so you can't run stop signs.
00:42:49.000 So this isn't the update you paid $4,000 for.
00:42:51.000 That's part of that, but I'm actually surprised.
00:42:55.000 But it says Tesla's new update.
00:42:57.000 What's the time on this, Jamie?
00:42:58.000 This is December 24th.
00:43:00.000 I was looking for more recent.
00:43:01.000 So this isn't the exact update that you paid $4,000 for.
00:43:07.000 I think this is a general part of the full self-driving, which is $4,000.
00:43:12.000 And just to be clear, again, safety person: it detects traffic lights, but it doesn't stop at the traffic lights for you.
00:43:21.000 And maybe in this case it does emergency braking at the stop sign, but it's not good enough.
00:43:25.000 It's not good enough.
00:43:26.000 It's not there.
00:43:26.000 It's not there.
00:43:27.000 Don't trust it.
00:43:28.000 It's not there.
00:43:29.000 In fact, there's a lot of people, including myself, think we're quite a few years away, but also on the podcast, just like you, got a chance to talk to Elon Musk, meet him, talk to him in person, and realize that there's people in this world that can make the impossible happen.
00:43:47.000 You interviewed him as well?
00:43:49.000 Yeah, twice.
00:43:49.000 Tell me, what's that experience like for you?
00:43:55.000 So, you know, it's quite incredible in the sense that he is a legit engineer and designer, which is like a pleasure for me.
00:44:03.000 I've talked to a few CEOs, talked to Eric Schmidt, just CEOs, and they're a little bit more business oriented.
00:44:09.000 Elon is really, really focused on the first principles to the physics level of the problems that are being solved, whether that's SpaceX with the fundamentals of reusable rockets and going into deep space and colonizing Mars,
00:44:26.000 whether that's in Neuralink, getting to the core, the fundamentals of what it's like to have a computer communicate with the human brain.
00:44:33.000 And with Tesla, on the battery side, sort of saying...
00:44:37.000 He threw away a lot of the conventional thinking about what's required to build, first of all, an appealing electric car, but also one that has a long range.
00:44:48.000 That's something I don't know as much about.
00:44:50.000 But on the AI side, just...
00:44:52.000 I mean, he boldly said, from scratch, we can build a system ourselves in a matter of months, now a couple of years, that's able to drive autonomously.
00:45:03.000 I mean, most people would laugh at that idea.
00:45:05.000 Most roboticists that know from the DARPA challenge days, most of them know how hard this problem is.
00:45:11.000 He said, no, no, no, we're not only going to throw away LIDAR, which is this laser-based sensor, we're going to say cameras only, and we're going to use deep learning, machine learning, which is a learning-based system.
00:45:23.000 So it's a system that learns from scratch, and we're going to teach it to drive.
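For a sense of what "cameras only, learned from scratch" means in practice, here is a minimal sketch of an end-to-end network that maps a single camera frame straight to a steering command. It is a toy PyTorch illustration of the general idea, not Tesla's actual architecture; the class name, layer sizes, and input shape are all made up for the example.

```python
import torch
import torch.nn as nn

# Toy end-to-end driving network: camera pixels in, steering angle out.
# The behavior is learned from data rather than hand-coded rules.
class CameraToSteering(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(            # small convolutional feature extractor
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)              # single output: steering angle

    def forward(self, image):
        return self.head(self.features(image).flatten(1))

model = CameraToSteering()
frame = torch.randn(1, 3, 120, 160)               # stand-in for one camera frame
steering = model(frame)                           # forward pass produces the command
```

Trained on enough human driving data, a network shaped like this is what "teach it to drive" refers to: the mapping from pixels to control comes from learning, not from explicitly programmed rules.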
00:45:28.000 We're good to go.
00:45:50.000 Is it really impossible?
00:45:50.000 What you find out when you start to think about most problems from first principles is that it's not actually impossible.
00:45:57.000 And then you have to think, okay, so how do we make it happen?
00:45:59.000 How do we create an infrastructure that allows you to learn from huge amounts of data?
00:46:05.000 So one of the most revolutionary things that Tesla is doing and hopefully other car companies will be doing is the over-the-air software updates.
00:46:13.000 Just like the update that you got, the fact that, just like on your phone, you can get updates over time means you can have a learning system, a machine learning based system that can learn and then deploy the thing it learned over time, and do that weekly.
00:46:27.000 That sounds like maybe trivial, but nobody else is doing it and it's completely revolutionary.
00:46:34.000 So cars, once you buy them, they don't learn.
00:46:36.000 Most cars.
00:46:37.000 Tesla learns.
00:46:39.000 That's a huge thing.
00:46:41.000 Forget about Tesla Autopilot, all this stuff.
00:46:43.000 Just the fact that you can update the software, I think it's a revolutionary idea.
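To make the over-the-air learning loop described here concrete, below is a minimal Python sketch of the general pattern: collect data from a fleet, retrain a model, validate it, and push the new version out like a phone update on a roughly weekly cadence. Every function name here is a hypothetical placeholder for illustration, not Tesla's actual pipeline or APIs.

```python
# Hypothetical sketch of a fleet-learning + over-the-air update cycle.
# Every function below is a stand-in, not a real Tesla interface.

def collect_fleet_data():
    # In practice: driving clips and labels uploaded from customer cars.
    return ["clip_1", "clip_2", "clip_3"]

def retrain(model_version, data):
    # In practice: fine-tune the neural networks on the newly collected data.
    return model_version + 1

def passes_validation(model_version):
    # In practice: regression tests and safety checks gate every release.
    return True

def push_ota_update(model_version):
    # In practice: package the new weights and deploy them over the air,
    # the same way a phone receives a software update.
    print(f"Deploying model v{model_version} over the air")

model_version = 1
for week in range(3):  # pretend three weekly cycles
    data = collect_fleet_data()
    candidate = retrain(model_version, data)
    if passes_validation(candidate):
        push_ota_update(candidate)
        model_version = candidate
```

The point of the sketch is the loop itself: because deployment is just a software update, the learning system can keep improving after the car is sold.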
00:46:47.000 And then they're also doing everything else from scratch.
00:46:50.000 This is the first principles type of thinking.
00:46:53.000 The hardware.
00:46:54.000 So the hardware in your car, I don't know when you got the Tesla, but it should be hardware version 2. But that hardware performs what's called inference.
00:47:06.000 So it's already trained, it's already learned its thing, and it's just taking in the raw sensory input and making decisions.
00:47:12.000 Okay.
00:47:13.000 They built that hardware themselves from scratch.
00:47:15.000 Again, ballsy move.
00:47:17.000 Now they're building what they're calling, again, he's such a troll, Dojo, which is the name of the specialized hardware for training the neural networks, or training the models.
00:47:32.000 What training is, is the learning side of it.
00:47:34.000 So they're building their own like supercomputer.
00:47:36.000 Google has a TPU to improve the training.
00:47:40.000 TPU, what does that stand for?
00:47:42.000 Tensor processing unit.
00:47:43.000 It's the same idea as the more general GPUs, the graphics processing units NVIDIA makes, that all the nerds, all the people like me, have been using for machine learning to train neural networks.
00:47:55.000 It's what most gamers use to play video games, right?
00:47:58.000 But they have this nice quality that you can train huge neural networks on them.
00:48:03.000 TPU is a specialized hardware for training neural networks.
00:48:10.000 GPUs allow you to play video games and train neural networks.
00:48:13.000 TPUs clean some stuff up to make it more efficient, energy efficient, more efficient for the kinds of computation neural networks need.
00:48:20.000 Google has them.
00:48:21.000 A bunch of other companies have them.
00:48:23.000 You know, most car companies would be like, okay, let me partner with somebody else with Google to use their TPUs or use NVIDIA's GPUs.
00:48:33.000 Tesla's building it from scratch.
00:48:35.000 So that kind of from scratch thinking is incredible.
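To illustrate the training-versus-inference split described above, here is a minimal sketch assuming PyTorch: training runs backpropagation on a data-center accelerator (a GPU, or a TPU/Dojo-style chip), while the in-car hardware only runs the already-trained network forward on incoming sensor data. The tiny network and random tensors are stand-ins, not a real driving model.

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"  # GPU-style accelerator if available
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2)).to(device)

# --- Training: done in the data center; this is the step TPUs/Dojo-style chips accelerate ---
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
features = torch.randn(64, 8, device=device)          # stand-in sensor features
labels = torch.randint(0, 2, (64,), device=device)    # stand-in driving labels
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()   # backpropagation: the expensive learning step
    optimizer.step()  # weights change here

# --- Inference: what the in-car hardware does with the frozen, trained model ---
model.eval()
with torch.no_grad():                                  # no learning, just forward passes
    new_frame = torch.randn(1, 8, device=device)
    decision = model(new_frame).argmax(dim=1)          # e.g. pick an action/class
```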
00:48:40.000 And the other two things that I really like about Musk, the first is the hard work.
00:48:48.000 We live in a culture where, like so many people, I often don't sleep.
00:48:52.000 I do crazy shit in terms of just focus, stay up all night sometimes.
00:48:57.000 And often people recommend to me that balance is really important.
00:49:01.000 Taking a break is important.
00:49:04.000 That you rejuvenate yourself, you return to it with fresh ideas.
00:49:08.000 All those things are true.
00:49:09.000 Sleep is important.
00:49:10.000 You had people on the podcast tell you how important sleep is.
00:49:13.000 But here's what most people don't advise me:
00:49:18.000 Hard work is more important.
00:49:20.000 Passion is more important than all of those things.
00:49:23.000 That should come first.
00:49:24.000 And then sleep empowers it.
00:49:26.000 Rest empowers it.
00:49:27.000 Rejuvenation empowers it.
00:49:28.000 Especially in engineering disciplines.
00:49:30.000 Hard work is everything.
00:49:31.000 And he's sort of unapologetic about that.
00:49:35.000 It's not like a...
00:49:36.000 Come to us.
00:49:38.000 Come work with us.
00:49:39.000 It'll be a friendly environment with free snacks.
00:49:42.000 It's like, you're going to work the hardest you've ever worked, whether you agree with him or not, on the most important problems of your life.
00:49:52.000 I like that kind of thinking because it emphasizes the hard work.
00:49:57.000 The other part is meeting him in person. I don't know if you got to interact with him off-mic, because when he was on mic with you, he was very kind of...
00:50:10.000 It was hard to bring it out of him.
00:50:11.000 In person before that, he was very jovial and friendly and huggy.
00:50:16.000 He's great.
00:50:17.000 Yeah, he's fun.
00:50:17.000 And then once he got on the microphone, I was like, oh, this is heavy lifting.
00:50:21.000 To bring this out of him.
00:50:22.000 So then we started drinking.
00:50:23.000 Drinking.
00:50:24.000 And then, oh, yeah.
00:50:25.000 It helps a lot.
00:50:26.000 And then once the drinking, you know, then I got to see who he is.
00:50:31.000 Yeah, I should have done that.
00:50:33.000 Your wife's drinking.
00:50:35.000 Yeah.
00:50:36.000 No, I mean, the thing that's really interesting is he's gone...
00:50:40.000 If you look at his biography, like the kind of stress he's been under in terms of he's been at the brink of losing his companies several times.
00:50:49.000 Yes.
00:50:49.000 And he, you know, he lost a child.
00:50:53.000 And he's just...
00:50:55.000 That's the other thing that inspired me is...
00:50:58.000 It's that he can be a good dad while running so many companies.
00:51:02.000 Because I often wonder about the kind of hours I pull and what I'm doing.
00:51:07.000 Can I have a family?
00:51:09.000 Because I'd love to be a father.
00:51:10.000 Can I have a family?
00:51:12.000 Can I be a good person?
00:51:13.000 It's very, very, very, very difficult if you're working 18 hours a day to give your kids the time that they need.
00:51:19.000 But it's possible.
00:51:21.000 Not 18 hours.
00:51:22.000 I believe there are, in life, months, maybe years, where you have to do the 18 hours a day.
00:51:28.000 But not always.
00:51:30.000 There's time for everything.
00:51:30.000 Right.
00:51:31.000 Do the sprint.
00:51:32.000 Sprints, yeah.
00:51:33.000 And then establish everything and then sit back.
00:51:35.000 But the problem with a lot of guys like him is, first of all, it's very difficult to find a replacement for the way he thinks, right?
00:51:42.000 So if he's a CEO of these companies and he's the one who's the mastermind behind all these things and then he wants to step back, finding a commensurate replacement is insanely difficult.
00:51:54.000 Because most people who would be... Yeah, and there's not many people like him.
00:52:03.000 That's interesting.
00:52:04.000 That's actually the disappointing thing to me, is that his kind of thinking is a rarity.
00:52:11.000 I'm not sure why that is exactly.
00:52:13.000 Well, he's...
00:52:14.000 I joke around about it, but I think there's a spectrum of evolution.
00:52:22.000 And his mind is clearly way more advanced than my mind.
00:52:26.000 There's something going on in his mind in terms of his attraction to engineering issues, solutions to global problems, solutions to traffic problems, pollution problems, all the things that he's...
00:52:41.000 the Internet.
00:52:42.000 I mean, he's trying to give the world Internet.
00:52:45.000 I mean, he's got all these things going simultaneously.
00:52:47.000 And one of the things that I got out of him when I was talking to him...
00:52:50.000 Was that he almost has a hard time containing these ideas that are just pouring out of his head like a raging river, like he's trying to catch handfuls of water as this raging river of ideas is going through his head. You know, and when he described his childhood, he thought that everybody was like that, and then as he got older, you know, he thought he was insane. And... Yeah.
00:53:15.000 I can relate to that.
00:53:18.000 I'm trying to learn how to talk, but I have trouble talking because there's like a million ideas running in my head.
00:53:24.000 Anything you say, I'll immediately start.
00:53:27.000 There's these weird tangents that go off, and I want to start thinking about them.
00:53:31.000 Is that true with a lot of people in your line of work?
00:53:33.000 Yeah.
00:53:33.000 I think so.
00:53:34.000 I think that's kind of puzzle solving.
00:53:36.000 Like, that's where the comfort is.
00:53:37.000 I'm just surprised that a CEO is able to continue being that kind of puzzle solver.
00:53:42.000 Did you see that tweet that he made about his plans?
00:53:45.000 Like, he put a tweet up in, I think it was 2006?
00:53:51.000 And then he's essentially done all those things.
00:53:53.000 He's done all those things.
00:53:54.000 Now, the thing is, most people, so a lot of people love Elon Musk, but there's quite a large community of people that don't love him so much.
00:54:05.000 Well, that's always the case.
00:54:06.000 I don't know.
00:54:07.000 I don't think so.
00:54:08.000 With anybody great.
00:54:09.000 I don't know if that's always the case.
00:54:10.000 When is it not the case?
00:54:12.000 I don't know.
00:54:13.000 Who accomplishes as many things as that guy does where everybody loves him?
00:54:18.000 It's a difficult...
00:54:19.000 I mean, I'm not a historian, but I could say Steve Jobs.
00:54:25.000 Terrible example.
00:54:26.000 So many people hated that guy.
00:54:28.000 So many people hated that guy.
00:54:29.000 I have personal friends that are involved in technology that wouldn't use Apple products because he's such a twat.
00:54:34.000 Sure.
00:54:35.000 They didn't want to have anything to do with him.
00:54:36.000 They knew people that were engineers under him.
00:54:37.000 They said it was horrible and mean and it just required so much.
00:54:41.000 He would scream at people and insult them.
00:54:44.000 He had these ideas in his head that he needed to get done.
00:54:47.000 And if you couldn't work the hours that you needed to do what he wanted to accomplish, he would treat you like shit.
00:54:56.000 Yeah, you're right.
00:54:57.000 I just wish the world was better.
00:54:59.000 Like with all people like that, like with Steve Jobs and with Elon Musk, when he dies, people will always, you'll remember the greatness, right?
00:55:09.000 Yeah.
00:55:09.000 So that's how it seems to work.
00:55:11.000 It's just sad that you can't celebrate that currently.
00:55:13.000 But I do think there's one particular aspect of his personality that I also share that pisses people off really bad, which is, like you said, he had a plan, but he's late on that plan.
00:55:25.000 He keeps promising things and he keeps being like a year or two or three late.
00:55:30.000 And that really, I don't know if it actually angers people or if people that already don't like you use that as a thing to say why they don't like you, but it's certainly a thing that people say a lot.
00:55:43.000 But I think that's an essential element of doing extremely difficult things is over-promising and trying to over-deliver.
00:55:50.000 That's the whole point.
00:55:51.000 Is to say, to make all the engineers around you believe that it's doable in a year.
00:55:57.000 That's essential to do it in two years.
00:56:00.000 And truly believing it seems to be essential.
00:56:05.000 Well, didn't he have people pay full price for that Roadster?
00:56:10.000 Like, you got on a list...
00:56:11.000 Ahead of time, yeah.
00:56:11.000 Yes.
00:56:12.000 So you paid a quarter of a million dollars for a car that's essentially vaporware.
00:56:17.000 Yeah.
00:56:17.000 But the thing...
00:56:18.000 So I don't know.
00:56:19.000 There's a whole bunch of financial people that get, like, mad at that kind of idea.
00:56:22.000 Oh, yeah.
00:56:22.000 They get furious.
00:56:23.000 Like, there's investors.
00:56:24.000 You know, it's like...
00:56:25.000 I think it's the most shorted stock in history.
00:56:28.000 Yeah.
00:56:29.000 So...
00:56:29.000 But it keeps kicking ass.
00:56:31.000 I don't...
00:56:32.000 It confuses the fuck out of people.
00:56:33.000 Both...
00:56:34.000 To me, the stock market is the most boring thing ever and people, it's gambling.
00:56:40.000 Yes.
00:56:40.000 And so people trying to say they're experts in investing in the stock market, I blocked, I removed those people from my life, because they don't say any interesting ideas.
00:56:51.000 I said it.
00:56:52.000 But, you know, when you're doing legitimate investment, yes, that's a really important service to society.
00:57:00.000 But if you're commenting on the fundamentals of engineering problems that real engineers are trying to solve, that's not interesting to me.
00:57:08.000 So that kind of stuff upsets, I think, financial folks.
00:57:13.000 But the beautiful thing is when you have people buy vaporware And you bring that vaporware to reality.
00:57:20.000 That's the amazing thing.
00:57:22.000 He will definitely bring that roadster to reality.
00:57:25.000 If he doesn't die, that roadster will happen.
00:57:28.000 If he dies, bail out now.
00:57:32.000 Same with that insane Cybertruck.
00:57:34.000 Yeah, Cybertruck is fucking awesome.
00:57:37.000 It's so ridiculous.
00:57:39.000 If he lives long enough, you better believe there's humans being put on Mars.
00:57:42.000 Whether it's him or he gets everybody else.
00:57:45.000 See, that one I'm skeptical of.
00:57:48.000 Just the type of people that are going to want to go.
00:57:52.000 See, you're not talking about the engineering problem.
00:57:55.000 No.
00:57:55.000 I think it's possible, ultimately, you know, I mean, it's, look, can we put people in space?
00:58:02.000 For sure, we've definitely done it.
00:58:05.000 Can we put things, well, some people think space is fake.
00:58:08.000 Space is fake.
00:58:09.000 That's, do you ever Google hashtag space is fake?
00:58:12.000 It's wonderful.
00:58:14.000 It's a testament to the education system in this country.
00:58:17.000 Well, on that tiny little tangent, I've joked about Flat Earth and Space is Fake a little bit, almost like saying that's an interesting way of being open-minded.
00:58:27.000 And then I realized that's not something to joke about.
00:58:31.000 That there is a community of people that take it extremely seriously.
00:58:34.000 And then some of them thanked me for acknowledging the possibility of... Oh.
00:58:39.000 And then I had said, okay.
00:58:40.000 Bless their little hearts.
00:58:42.000 Okay.
00:58:42.000 This is not...
00:58:43.000 And their little brains.
00:58:46.000 But I appreciated their open-mindedness, but they should take introduction to physics.
00:58:51.000 MIT OpenCourseWare provides courses on physics.
00:58:55.000 They should...
00:58:56.000 Can a regular person just sign up for that?
00:58:58.000 Yeah, yeah.
00:58:59.000 It's open free.
00:59:00.000 So how does that work?
00:59:01.000 What do you have to do in order to take those courses?
00:59:04.000 It's all made available online.
00:59:06.000 Just go to MIT.org, or is it .edu?
00:59:09.000 MIT OpenCourseWare is the website.
00:59:11.000 I mean, most people.
00:59:12.000 Oh, and it's all on YouTube now.
00:59:14.000 Oh, that's beautiful.
00:59:14.000 It's all lectures.
00:59:16.000 There are like millions of views, introductory lectures to physics, mathematics, statistics.
00:59:20.000 I have courses on there.
00:59:21.000 Aha, physics.
00:59:23.000 But in order to understand the work that has been done to recognize the fact that the Earth is round, what would you recommend right away?
00:59:32.000 Classical mechanics with exponential focus, experimental focus?
00:59:36.000 See, none of those things are going to...
00:59:38.000 No, no, classical mechanics is good.
00:59:40.000 But if you're a dingbat, you're not going to be able to absorb all that?
00:59:44.000 Look up the Wikipedia page for gravity, I think.
00:59:47.000 That's not going to help either.
00:59:49.000 They say gravity's never been proven.
00:59:50.000 No one understands gravity.
00:59:52.000 There's no one who actually understands what gravity is.
00:59:54.000 We just know the effects of it.
00:59:56.000 It's actually magnetism.
00:59:57.000 Yes, for sure.
00:59:59.000 So you have to undertake the effort of proving the Wikipedia article for gravity wrong.
01:00:06.000 But Wikipedia, bro, what a terrible example.
01:00:09.000 Wikipedia is sketchy.
01:00:10.000 It says I'm Brian Callen's brother.
01:00:12.000 It says I got celiac disease.
01:00:14.000 It says a bunch of shit that's not real.
01:00:15.000 How do you know you're not related?
01:00:19.000 No.
01:00:19.000 I'm pretty sure.
01:00:22.000 I mean, he might as well be my brother.
01:00:24.000 I don't know if it says it anymore, but whatever.
01:00:26.000 Someone put it in there again.
01:00:27.000 Fuck it.
01:00:28.000 Wikipedia is actually another distributed system that's incredibly surprising to me that it works.
01:00:34.000 Yeah, it is, right?
01:00:35.000 Because even though there is a lot of misinformation in it and there's a lot of, you know, falsehoods, there's a lot of really good information as well, particularly about historical figures and interesting stuff.
01:00:46.000 If you want to find facts on things, it's great research.
01:00:50.000 And on science and technical topics.
01:00:51.000 So not like nutrition science or things where there's a lot of debates.
01:00:55.000 But for physics and math and so on, it's really good.
01:00:57.000 It's really, really good.
01:00:58.000 So it's community supported by other physicists.
01:01:02.000 But moving back from Flat Earth, can we go back to why you think we're not going to be colonizing Mars?
01:01:08.000 Oh, I'm not saying ever.
01:01:10.000 I'm just saying the problem to me is the type of people that would want to do it.
01:01:15.000 Because they can't return.
01:01:17.000 That's the real issue with going to Mars is that you can't return.
01:01:21.000 You don't think there's a huge number of non-crazy explorers in this world?
01:01:25.000 That want to die on Mars?
01:01:26.000 Yeah.
01:01:26.000 I had a whole bit about it.
01:01:28.000 I really believe that it's the fringe of the fringe that would be willing to die on Mars.
01:01:36.000 No.
01:01:37.000 I'd be willing to die on Mars.
01:01:38.000 Really?
01:01:39.000 Stay here.
01:01:39.000 Come on, I like you.
01:01:40.000 Don't go over there.
01:01:42.000 No, it's...
01:01:43.000 What do you like about me?
01:01:44.000 It's all temporary.
01:01:45.000 What's all temporary?
01:01:46.000 Life?
01:01:47.000 Life.
01:01:47.000 Life is temporary.
01:01:48.000 Right.
01:01:48.000 You're gonna die someday.
01:01:50.000 Sure, but if you decide to die on fucking Mars, I'm like, bro...
01:01:54.000 You'll be sending me emails from Mars.
01:01:55.000 Dude, I fucked up.
01:01:57.000 I won't be sending you emails.
01:01:59.000 This is the thing.
01:02:01.000 You're into the Native Americans.
01:02:04.000 I've been reading and following your work there.
01:02:07.000 I'm obsessed, man.
01:02:08.000 I've been obsessed about World War II and World War I, but you're converting me to both the warrior cultures and the suffering in that world.
01:02:17.000 The suffering is insane.
01:02:18.000 It's insane.
01:02:19.000 This book, Black Elk, man, it details his life from...
01:02:24.000 He was a young boy during Custer's last stand.
01:02:27.000 He was there when Custer was killed.
01:02:29.000 Who was he?
01:02:29.000 Black Elk.
01:02:30.000 Yeah, Black Elk.
01:02:30.000 The guy.
01:02:32.000 What do you call that?
01:02:33.000 He's an Oglala Lakota medicine man.
01:02:35.000 Medicine man, yeah.
01:02:36.000 Yeah, and he just lived through the transition.
01:02:39.000 He lived through the transition of them battling with the U.S. soldiers to them being on the reservation.
01:02:46.000 Fucking insane poverty.
01:02:50.000 Insane.
01:02:51.000 Just the stories of people, the illnesses and the deaths, how many people's children died, malnourishment, starvation, abuse, and then just how much they hated where they were living and how they were living.
01:03:10.000 On the reservation.
01:03:10.000 Yeah, it's horrific, man.
01:03:12.000 It's horrific.
01:03:13.000 It's like...
01:03:15.000 It's hard to imagine.
01:03:16.000 It's hard to imagine when you're reading that this just happened.
01:03:21.000 He's talking about the really horrible parts at the end were in the early 1920s, 1930s.
01:03:29.000 It's hard to imagine.
01:03:30.000 It's hard to imagine that this tribe, from 100 years prior, in the 1820s, were living wild and free, and they were, you know, living the same way they'd lived for hundreds of years, and had this incredible relationship with the land, and this incredible religion that they practiced, where they worshipped the earth and the animals and the sky, and they had all these concepts
01:04:01.000 for the way you should live your life and how to guarantee prosperity and how to guarantee success.
01:04:09.000 Man, they had a fascinating culture.
01:04:14.000 And it's gone.
01:04:15.000 It was wiped off the face of the map.
01:04:18.000 There was nothing like it anywhere else on Earth.
01:04:20.000 There was no culture anywhere on Earth that was like the Native American culture in the 1600s, 1700s, 1800s.
01:04:29.000 But in that period of time, they had this spectacular way of life.
01:04:34.000 And it was often very cruel and very ruthless, and they warred on each other.
01:04:40.000 This idea that Native Americans were living in peace and harmony with each other is nonsense.
01:04:46.000 Yeah, so I started, I was listening while doing Hills yesterday, kicked my ass.
01:04:50.000 I was listening to The Empire of the Summer Moon.
01:04:52.000 Fucking great book.
01:04:54.000 I commented on your Instagram, like saying something...
01:05:00.000 You know, basically admiring the purity of that way of life.
01:05:05.000 And I got so much shit by people saying, oh, you think rape and murder is pure and admirable?
01:05:11.000 So there is certainly an aspect to their way of life, which is sort of the warrior ethos, right?
01:05:17.000 The Comanches in particular.
01:05:18.000 They were the most ruthless, the most warlike.
01:05:22.000 That's all they did.
01:05:23.000 Basically like the Genghis Khan, the same kind of, the same horses, the innovators actually, war innovators.
01:05:29.000 Yeah.
01:05:29.000 And all they ate was meat as well.
01:05:31.000 I mean, all they ate was buffalo.
01:05:33.000 I mean, they essentially rode with the buffalo, killed buffalo, hunted buffalo, and then raided other tribes.
01:05:39.000 And then until the white man came, and then they started raiding the white man and killing the white man.
01:05:43.000 But they were, you know, at war with white people for hundreds of years.
01:05:49.000 I mean, they were the reason why the West was hard to settle.
01:05:53.000 I mean, the sneaky shit, I don't know if you've gotten to the point where they were giving people these big swaths of land in Oklahoma, and they essentially set them up to be killed by the Comanche.
01:06:05.000 They will say, hey, go out here.
01:06:07.000 We'll give you 1,600 acres.
01:06:09.000 It's all yours.
01:06:09.000 And they're like, oh, terrific.
01:06:11.000 Let's get our family and get in a wagon.
01:06:12.000 And no one let them know that the wildest motherfuckers that have ever lived on this continent were running that place.
01:06:19.000 And they would go there and just get slaughtered.
01:06:21.000 And one after another, families were wiped out that way and people were kidnapped.
01:06:25.000 And that lady that I have on the wall outside...
01:06:29.000 Yeah.
01:06:44.000 It's crazy, man.
01:06:46.000 It's the craziest story.
01:06:48.000 There's all these tribes that some are probably more warlike, some are more peaceful.
01:06:52.000 Yes.
01:06:52.000 That had a way of life here.
01:06:55.000 I don't want to romanticize too much.
01:06:57.000 I mean, most people don't believe me, but I really like that way of life, that closeness to nature.
01:07:02.000 You said texting from Mars or whatever.
01:07:06.000 I like, you know, I wouldn't choose it.
01:07:10.000 But I would be happier if I was forced into it.
01:07:12.000 It seems like a counterintuitive notion.
01:07:15.000 Because I'm so weak.
01:07:16.000 I'm so soft.
01:07:17.000 Even running hills yesterday, I realized how soft I am.
01:07:20.000 Well, you work too much.
01:07:22.000 Yeah.
01:07:22.000 No, behind a computer with my little fingers typing, right?
01:07:26.000 But you're also a black belt in jiu-jitsu.
01:07:28.000 You're also a martial artist.
01:07:30.000 Me against a Comanche warrior, good luck.
01:07:33.000 I think you'd fuck a Comanche up.
01:07:35.000 They don't know how to fight for real.
01:07:37.000 If they had a weapon, they'd kill you.
01:07:38.000 I think you're...
01:07:39.000 No, I... Listen, first of all, they were pretty small.
01:07:45.000 They weren't very big people.
01:07:47.000 Second of all, they didn't know jiu-jitsu.
01:07:49.000 The average person that doesn't know jiu-jitsu, you're going to choke the fuck out of them.
01:07:53.000 That'd be fun, actually, to sort of go into different warring cultures.
01:07:56.000 I'd go into Genghis Khan's times without weapons to see what kind of combat styles they had.
01:08:02.000 Just send Francis Ngannou.
01:08:03.000 He'd clean out the entire fucking crew.
01:08:07.000 I mean, just send Royce Gracie.
01:08:08.000 Yeah, for sure.
01:08:10.000 Francis Ngannou in all generations will be screwed.
01:08:15.000 He's not as interesting.
01:08:17.000 Yeah, right.
01:08:17.000 It's just overwhelming.
01:08:19.000 But I think that if you had real jiu-jitsu skills...
01:08:24.000 You know, what you know now today, particularly because jiu-jitsu has evolved so much.
01:08:29.000 I mean, even the jiu-jitsu of 2020 is so radically different from the jiu-jitsu of, you know, 1990. It's radically different, like almost unrecognizable in a lot of ways.
01:08:40.000 Clearly, though, the basics are still the most important, and there are some of the greats of all time who just operate with the basics, whether it's Roger Gracie or Rickson Gracie.
01:08:52.000 There's a lot of great, great jiu-jitsu players that just have those solid basics that are just honed to a razor-sharp edge.
01:09:07.000 Kron, Kron Gracie, he's got...
01:09:09.000 And when I say basic, it is a compliment.
01:09:12.000 I mean, arm bars, triangles, guillotines, renegade chokes, those types of things, but perfected to a level that is, they don't participate in a lot of the more modern, there's a lot of crafty, weird stuff that a lot of guys try today.
01:09:30.000 And some of the greats, even the greats that participate in jiu-jitsu matches today and are effective at it, don't really have that kind of style.
01:09:39.000 Yeah, I mean, but Kron actually has some more creativity.
01:09:43.000 If you look at Roger Gracie, that's like...
01:09:45.000 Very basic.
01:09:46.000 I don't even know if he does footlocks.
01:09:48.000 Like, I think my favorite thing to do is on YouTube, just watch Roger Gracie matches.
01:09:53.000 Like, he looks like he's half asleep.
01:09:55.000 And he demolishes the greatest black belts in the world slowly by just, like in a half-asleep way, taking them down, passing their guard, going to mount and doing a choke.
01:10:07.000 Yeah.
01:10:07.000 It's like the, against, I don't know, Buchecha, against...
01:10:12.000 Just the best.
01:10:13.000 Well, my instructor, Jean Jacques Machado, same thing, man.
01:10:16.000 Just his style is just solid basics of jiu-jitsu.
01:10:21.000 And he has a saying that the more you know, the less you use, which is really interesting.
01:10:26.000 Well, you mentioned Comanche warriors and the meat.
01:10:29.000 Yeah.
01:10:29.000 Congrats on the carnivore diet.
01:10:34.000 Yeah, man.
01:10:35.000 Here's something crazy.
01:10:36.000 I got off that diet for this weekend, because I did the month, and then once Saturday came around, I ate Italian food, I had Girl Scout cookies, pasta, and then yesterday I went to Disneyland.
01:10:49.000 So yesterday I went way, way off the diet and I had ice cream and I ate all kinds of shitty food and I was getting back pains and knee pains and all these kind of weird pains that went away when I was on the diet.
01:11:04.000 Now, this is not a testament against plant-based diets, because I was eating shit, shitty food, and pasta, which is a lot of bread.
01:11:16.000 White pasta.
01:11:17.000 Yeah, spaghetti.
01:11:18.000 That stuff causes inflammation.
01:11:20.000 It just does.
01:11:22.000 You know, it just does.
01:11:22.000 Sugar causes inflammation.
01:11:23.000 But it's interesting to have this great month where basically two weeks in after the diarrhea died off, I had two solid weeks of no aches and pains and feeling great.
01:11:35.000 I was like, this is wild.
01:11:36.000 This is really wild.
01:11:37.000 I feel amazing.
01:11:38.000 And then two days of eating shit and like my back hurts right now.
01:11:44.000 I'm sitting here.
01:11:44.000 My back is hurting.
01:11:45.000 My knee was hurting yesterday.
01:11:47.000 All those weird aches come right back.
01:11:52.000 Well, the nice thing about the Joe Rogan effect is that, with you trying this diet and talking about keto a lot, it's become more socially acceptable to do.
01:11:59.000 Because I've been eating keto or low carb for many years and doing fasting, like 24, 48 hour fasts.
01:12:06.000 And I always kind of keep it more on the down low.
01:12:08.000 But even...
01:12:09.000 This time, I like traveling.
01:12:12.000 What I like to do when traveling is I'm trying to be, given my current situation, not spend much money.
01:12:22.000 One of the best ways to go, either carnivore or keto, is to go to McDonald's and just order beef patties.
01:12:31.000 They'll sell you just beef patties at McDonald's?
01:12:32.000 Just beef patties.
01:12:33.000 $1.50.
01:12:34.000 Really?
01:12:34.000 For a patty, for a quarter pound, yeah.
01:12:36.000 So you can, you know, like, it's like usually what I eat is about two pounds of meat a day.
01:12:41.000 And that's, what is it?
01:12:45.000 I don't know.
01:12:45.000 That's like 15 bucks.
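For what it's worth, a quick back-of-the-envelope check of that figure, using the roughly $1.50-per-quarter-pound-patty price mentioned above (treating it as loose arithmetic, not an exact menu price):

$$
\frac{2\ \text{lb/day}}{0.25\ \text{lb/patty}} = 8\ \text{patties}, \qquad 8 \times \$1.50 = \$12\ \text{per day},
$$

so "like 15 bucks" is in the right ballpark, a few dollars on the high side.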
01:12:47.000 So you've been doing this carnivore thing too?
01:12:49.000 Mm-hmm.
01:12:50.000 How long have you been doing it for?
01:12:51.000 Off and on.
01:12:53.000 For the carnivore, I've done it since the first time I dived into either your podcast or Jordan Peterson or that kind of thing.
01:13:03.000 But before then, I've been doing keto.
01:13:05.000 My favorite meal is just meat and I know some people hate cauliflower, but cauliflower or green beans.
01:13:13.000 Why do you worry if people hate cauliflower?
01:13:15.000 Why'd you have to make that distinction?
01:13:18.000 I don't know.
01:13:19.000 Some people hate cauliflower.
01:13:21.000 Who's out there hating cauliflower?
01:13:23.000 Who the fuck are those people?
01:13:24.000 That's a weird thing to hate.
01:13:25.000 I just had a bunch of people say cauliflower sucks recently, so yeah, you're right.
01:13:31.000 If you cook it right, it doesn't suck.
01:13:32.000 You know what's good?
01:13:33.000 Buffalo cauliflower, like buffalo wings, buffalo sauce, cauliflower, fucking delicious.
01:13:38.000 What's that?
01:13:38.000 But that's a sauce.
01:13:39.000 Yeah.
01:13:40.000 No, sauce is like you're giving in to your weakness.
01:13:43.000 Yeah.
01:13:44.000 It's spices.
01:13:46.000 And you're giving away...
01:13:46.000 No, see, like...
01:13:47.000 Cauliflower?
01:13:48.000 A blander taste, to me, is better because you get to appreciate the fundamentals of the food.
01:13:53.000 Oh, okay.
01:13:54.000 So, I don't know.
01:13:55.000 I just enjoy it.
01:13:56.000 You do salt meat?
01:13:57.000 Salt, yeah.
01:13:58.000 Oh, how do you do that when you can just appreciate the fundamentals of the meat?
01:14:03.000 Yeah.
01:14:05.000 Good point, yeah.
01:14:06.000 You don't like hot sauce?
01:14:07.000 I'm playing checkers, you're playing chess.
01:14:09.000 Do you like hot sauce?
01:14:10.000 I put hot sauce on everything.
01:14:11.000 Yeah.
01:14:12.000 Yeah, I do.
01:14:13.000 But I stay away from it.
01:14:14.000 Like, I try to...
01:14:16.000 Listen, food to me right now in my life is a source of energy, not a source of pleasure.
01:14:22.000 But it can be both.
01:14:24.000 Unfortunately, I'm not addicted to drugs.
01:14:26.000 I'm not addicted to many things.
01:14:27.000 But with food, my mind, I don't know how to moderate.
01:14:31.000 Really?
01:14:31.000 So, like, anything pleasurable is a problem for me in terms of food.
01:14:35.000 Like cookies.
01:14:36.000 You put two cookies in front of me.
01:14:38.000 I don't know how to eat just one of them.
01:14:41.000 My brain is terrible at it.
01:14:44.000 This is Girl Scout cookie season, son.
01:14:47.000 They changed the name of Samoas.
01:14:49.000 Those are my favorite.
01:14:50.000 And now they have a new name.
01:14:53.000 I think they call them Tagalongs or something like that.
01:14:57.000 They've been that name for a while, I think.
01:14:59.000 Really?
01:15:00.000 When did they change?
01:15:01.000 Those are separate things, though, I think.
01:15:02.000 Am I talking about the wrong thing?
01:15:04.000 Maybe.
01:15:04.000 The ones that are like, they're chocolate on the bottom.
01:15:08.000 Dude, they're so good.
01:15:09.000 Samoas have that coconut in them.
01:15:10.000 Yeah, yeah, yeah.
01:15:11.000 Yeah.
01:15:12.000 That doesn't sound like the words of a man who's going to stick to the carnivore diet.
01:15:15.000 Oh, I'm going to stick.
01:15:16.000 All right.
01:15:17.000 Yeah, I mean, I'll have cheat days.
01:15:19.000 Or cheat meals, I should say.
01:15:20.000 Tagalong's the peanut butter.
01:15:21.000 Oh, that's right.
01:15:22.000 Those are good.
01:15:23.000 Those are fucking good.
01:15:24.000 What are the Samoas now?
01:15:25.000 What do they call them now?
01:15:27.000 But they just changed.
01:15:28.000 They just changed to something new.
01:15:32.000 I've never eaten a Girl Scout cookie.
01:15:34.000 Ever?
01:15:34.000 No.
01:15:35.000 What are you, a robot?
01:15:36.000 I'm a Russian, too.
01:15:38.000 Basically a robot.
01:15:39.000 Basically.
01:15:40.000 I ate six of those, and I was feeling like shit.
01:15:43.000 I'm like, oh.
01:15:46.000 What?
01:15:47.000 Maybe.
01:15:48.000 This is just a very quick Caramel Delight.
01:15:50.000 Is that what it's called now?
01:15:51.000 I don't think so.
01:15:51.000 Just someone who might have owned another company or whoever they were paying to make them.
01:15:55.000 I was wondering if it was a racial issue.
01:15:57.000 That's what the question was, was someone saying, because it was racist, this is an odd question.
01:16:01.000 The answer is no, the names of the cookies are owned by the two different companies who make them.
01:16:04.000 So they outsource it and they just, you know...
01:16:07.000 They change the name.
01:16:08.000 Racial issue.
01:16:09.000 Wow.
01:16:09.000 Because Samoa.
01:16:10.000 Like someone might be like sensitive to having a cookie named after an island.
01:16:15.000 And people are like, hey fuckface, that's our island, not your cookie.
01:16:18.000 You know?
01:16:20.000 See, those cookies don't even sound good to me anymore.
01:16:22.000 What about American cheese?
01:16:24.000 Is that okay?
01:16:25.000 American cheese?
01:16:26.000 No, not okay.
01:16:27.000 Wait, American?
01:16:28.000 Oh, yeah.
01:16:29.000 I stay with a Russian tvorog.
01:16:33.000 It's a cottage cheese.
01:16:34.000 Well, there's Swiss cheese, there's American cheese, and that's it, right?
01:16:39.000 Is there any other countries that are named specifically...
01:16:44.000 The cheese named after the country?
01:16:46.000 I bet you it's not even American, just like french fries aren't French.
01:16:48.000 I bet you American cheese is not even American.
01:16:50.000 Do you remember when there was freedom fries for a while?
01:16:52.000 People were trying to call fries freedom fries, like post 9-11 because they were mad that France didn't want us going over to Iraq.
01:16:58.000 Yeah, and then people who hate freedom banned it.
01:17:01.000 Yeah, freedom fries.
01:17:03.000 Oh, that's so dumb.
01:17:05.000 The thing I really like, actually, I think that's the thing that people don't often talk about, is the focus.
01:17:13.000 My life, I think a lot of people do this, is being able to focus for long periods of time.
01:17:18.000 And that's why I stuck with keto, or fasting especially.
01:17:23.000 Yes, the focus is pretty tremendous.
01:17:26.000 Well, that's what I really got with the carnivore diet.
01:17:29.000 The flatness of my energy, the lack of dips and valleys, peaks and valleys, it's amazing.
01:17:35.000 Yeah, it's great.
01:17:36.000 And the fasting helps me too.
01:17:38.000 Like Jack Dorsey does only what's called OMAD one meal a day.
01:17:43.000 You could just say one meal a day.
01:17:45.000 This OMAD stuff.
01:17:46.000 Jesus Christ.
01:17:48.000 I'm a hip Reddit lingo guy.
01:17:52.000 I think on Reddit it's OMAD. Oh, okay.
01:17:56.000 No, I don't know.
01:17:56.000 One meal a day.
01:17:57.000 But, you know, a 24-hour fast.
01:18:01.000 That's a careful weapon you have to play with, at least for me.
01:18:06.000 It's weird.
01:18:07.000 It helps your mind really focus.
01:18:09.000 I can sit sometimes for five, six hours a day, like programming, really thinking, and lose track of time, and really focus.
01:18:15.000 But when you interact with other human beings, You're kind of a little bit of an asshole.
01:18:21.000 Like, I am, sorry.
01:18:23.000 I mean, when I am...
01:18:24.000 In a way where...
01:18:29.000 It's funny, but if there's something about a person that's full of crap, you are more likely to point that out.
01:18:38.000 When you're on keto or carnivore?
01:18:40.000 No, it's irrespective of diet.
01:18:43.000 Keto, carnivore, whatever, is the fasting.
01:18:45.000 Oh, the fasting.
01:18:46.000 The fasting.
01:18:47.000 Really?
01:18:48.000 So is it more irritable?
01:18:49.000 Is that what it is?
01:18:50.000 I think it's irritable, but you also see things more clearly.
01:18:54.000 I don't know.
01:18:54.000 I'll talk to my parents or something like that.
01:18:58.000 When I'm more well-fed, I'll be like, just enjoy having fun with them.
01:19:01.000 And if I'm fasted, I'll be like...
01:19:04.000 Why are you always judging me kind of thing, right?
01:19:06.000 Like, you realize the thing, the aspects of the interaction, which are problematic, and you want to sort of highlight them.
01:19:13.000 I'm just sort of noticing it, which is problematic when you're in a working environment, especially sort of deliberating, discussing with other engineers how to solve a problem.
01:19:24.000 I'm more likely, especially when, you know, leading a team, to say that somebody is a little bit full of shit.
01:19:31.000 When I'm fasting, as opposed to being a little bit more kind and eloquent about expressing why they're full of shit.
01:19:38.000 I found myself feeling more aggressive and more inclined to use recreational insults.
01:19:46.000 When fasting or carnivore?
01:19:47.000 Carnivore.
01:19:48.000 What's a recreational insult?
01:19:49.000 Like, come on, fuckface.
01:19:50.000 Fuckface.
01:19:51.000 You know, like saying something like that to someone or fill in the blank with whatever other words you would like to use.
01:19:56.000 That sounds like an academic paper.
01:19:57.000 The rate of fuckface goes up.
01:19:59.000 Well, just in casual conversation, I'd find myself using fun insults more often.
01:20:07.000 But fun with the intent of kindness behind?
01:20:10.000 No.
01:20:10.000 I mean, having fun.
01:20:11.000 Even talking about people who aren't there.
01:20:13.000 Just having fun.
01:20:14.000 But...
01:20:16.000 That's also a function of being a comedian.
01:20:19.000 We do that to each other really bad.
01:20:21.000 Like, I had a birthday.
01:20:24.000 My friends made me a cake that said, Happy Birthday, faggot.
01:20:28.000 That kind of shit is just so a part of the culture of comedians.
01:20:33.000 Like, everybody calls everybody bitch.
01:20:36.000 Everybody, you know, it's just fun.
01:20:38.000 Yeah, which is awesome because this comedian culture is now at full-on war with the cancel culture.
01:20:43.000 And it's like two armies of people who don't give a damn and people who give way too much of a damn.
01:20:50.000 Well, I have mixed feelings about all that stuff.
01:20:53.000 But I ultimately feel like the direction it's moving in, the reason why it's happening is for good.
01:20:59.000 I think there's a lot of people that are complaining about things and they're trying to cancel people and all that stuff.
01:21:04.000 And it's, you know, ultimately, some of it's misguided.
01:21:07.000 But I think the ideas behind it, like the primary push, like the gravity behind it, is people want less racism, less discrimination, less of a lot of things.
01:21:24.000 But then along the way you have hypocritical human behavior that gets involved in this, and you have people that are, you know, deeply flawed themselves but pointing out minor flaws in other people, and then they get exposed and they feel horrible.
01:21:37.000 For every person who participates in this cancel culture, it's like… The wave is coming back at you.
01:21:45.000 I mean, it comes in and it comes out.
01:21:47.000 And if you go too far out on that fucking pier, it's gonna get you.
01:21:52.000 And this is part of it that we're learning.
01:21:55.000 And I think...
01:21:56.000 What people are today, like, if you look at humanity from, like, the 1930s, it was hard, man.
01:22:09.000 People lived in a hard way.
01:22:11.000 It was ruthless.
01:22:12.000 If you watch films from the 1900s, early 1900s, first of all, the domestic violence was so normal.
01:22:24.000 Like heroes in movies in the 50s and 60s just smacked women in the face.
01:22:29.000 Heroes.
01:22:30.000 Smacked their wives.
01:22:32.000 Hit their kids.
01:22:34.000 It was a different world.
01:22:36.000 And people will look probably at our time today and say, you know, people openly ate meat, meaning not, or like, I could see a few...
01:22:49.000 Not engineered meat.
01:22:50.000 Not engineered meat, right?
01:22:51.000 Sort of ate meat from factory farms.
01:22:54.000 As opposed to recreationally hunting it themselves and eating what they've hunted or engineered meat, lab meat.
01:23:03.000 Yeah, or you can get ethically raised food.
01:23:06.000 I mean, there are a lot of ranchers.
01:23:08.000 Like, it's one of the things that ButcherBox does very well, is they make sure that they have relationships with ranchers who have a commitment to ethically raised animals and ethically killed animals.
01:23:19.000 And what that means is, you know, they don't participate in anything that has anything to do with factory farming, no antibiotics, no added hormones ever.
01:23:27.000 And that is possible.
01:23:28.000 I mean, people have been eating animals from the beginning of time, literally.
01:23:33.000 97% of the world eats animals.
01:23:36.000 And this idea that the only way to do it is through factory farming, I don't think that's correct.
01:23:43.000 The idea is, if you eat meat, you participate in factory farming, and that's horrific.
01:23:52.000 I don't think that's true.
01:23:53.000 But I do think it is true when it comes to fast food for the most part.
01:23:56.000 And that's unfortunate.
01:23:58.000 And I think if they could, I mean, we need more transparency for sure when it comes to that stuff.
01:24:05.000 And that's one of the reasons why those ag-gag laws, agricultural gag laws, are such a problem; they prevent people working in these factory farming situations from exposing what goes on.
01:24:15.000 There's laws that prohibit them from exposing the horrors of these environments.
01:24:21.000 That's a real problem.
01:24:22.000 That's a real issue that's clearly designed to protect that industry and allow them to commit these crimes.
01:24:28.000 Yeah, it's one of the things.
01:24:31.000 I'm conscious of my own hypocrisy in this.
01:24:34.000 I deeply, unfortunately, love meat.
01:24:39.000 And I'm aware of how unethical factory farming is.
01:24:48.000 And so those two things I have to sit with and be conscious of.
01:24:52.000 That's a question.
01:24:53.000 When did that happen?
01:24:55.000 When did the factory farming thing happen?
01:24:56.000 You go back to the 1930s, there was no factory farming.
01:24:59.000 It was just farming.
01:25:01.000 Are you sure?
01:25:01.000 I think it was probably incremental.
01:25:03.000 Are you sure it wasn't in the 1930s, there wasn't already some mass...
01:25:07.000 So what is factory farming?
01:25:08.000 It's a scale, but also sort of the suffering.
01:25:12.000 There's a certain line you start to cross where it just feels...
01:25:15.000 I mean, it's unclear at which point it really becomes torture versus...
01:25:20.000 Agriculture.
01:25:21.000 Agriculture.
01:25:22.000 That's an interesting line.
01:25:23.000 And we kind of...
01:25:25.000 Yeah, there's probably a good answer for that.
01:25:28.000 The real problem is at scale.
01:25:29.000 It's probably fast food.
01:25:29.000 The birth of fast food is really probably where it exploded.
01:25:32.000 Where's McDonald's?
01:25:33.000 McDonald's probably started 100 years ago.
01:25:35.000 I don't know.
01:25:36.000 I don't know.
01:25:37.000 I'm not sure when it started.
01:25:39.000 At scale, you know, the feeding of massive amounts of people that aren't growing anything.
01:25:45.000 That's the real issue.
01:25:46.000 The real issue is whether you're in New York City or Shanghai or...
01:25:52.000 Los Angeles, large, gigantic metropolitan areas that aren't growing anything.
01:25:56.000 They got to get a lot of food to those people.
01:25:59.000 If you have 20 million people like in Los Angeles, 20 million people eat meat.
01:26:02.000 That's a lot of meat.
01:26:05.000 Yeah.
01:26:06.000 You gotta feed them.
01:26:07.000 You gotta feed them.
01:26:08.000 Oh, there's a sign that steps up.
01:26:09.000 I think lab engineer meat is kind of interesting.
01:26:11.000 Yeah, it is interesting.
01:26:13.000 How much have you paid attention to it?
01:26:14.000 Not much.
01:26:15.000 I'm waiting.
01:26:17.000 This is the horrible thing.
01:26:18.000 I'm very cognizant of the fact that I kind of don't allow my brain to think much about this whole space, because I love meat and I'm trying to save money.
01:26:27.000 I get it.
01:26:28.000 Right.
01:26:29.000 So you eat those McDonald's quarter-pounder patties.
01:26:32.000 The life of a scientist, right?
01:26:36.000 The scientists, and especially now, have taken a leap.
01:26:39.000 That's a difficult leap.
01:26:40.000 So I'm still affiliated with MIT, but I decided to leave my full-time position.
01:26:47.000 Why?
01:26:47.000 Do a startup.
01:26:48.000 So I want to try to build...
01:26:52.000 Trying to build the kind of thing I dreamed about.
01:26:54.000 We talked about the movie Her.
01:26:56.000 I've been working.
01:26:57.000 That's been 80-90% of my day.
01:26:59.000 In fact, me doing the podcast is trying to, is not trying, is already successful at giving me enough money for food and shelter.
01:27:07.000 Tell people the name of the podcast so they can...
01:27:09.000 Artificial Intelligence Podcast.
01:27:12.000 Lex Fridman.
01:27:13.000 Lex Fridman.
01:27:14.000 Listen to, what is it, Elon Musk, Eric Weinstein's on there.
01:27:19.000 I talk with Garry Kasparov, Chomsky, Sean Carroll.
01:27:23.000 Sean Carroll is brilliant.
01:27:24.000 He is brilliant.
01:27:25.000 What is it like talking to Chomsky?
01:27:27.000 He talks slow.
01:27:30.000 Well, I talk...
01:27:32.000 Most people say my voice is very boring, and I talk slowly.
01:27:36.000 To those people, I say, go fuck yourself, Chomsky.
01:27:39.000 I love you.
01:27:40.000 I love you.
01:27:41.000 You're right.
01:27:42.000 I'm trying to actually...
01:27:44.000 It's very difficult to express thoughts, like Sam Harris struggles with this too, to express thoughts with the kind of humor and eloquence that they are in your brain, like to convert them.
01:27:58.000 As a comedian, you're essentially a storyteller.
01:28:02.000 So you probably don't even know how you did it.
01:28:06.000 You're like Roger Gracie.
01:28:08.000 You've probably developed this art of storytelling, of being able to laugh and make other people laugh, of bouncing back and forth.
01:28:15.000 To me, most of my life has been spent behind a book or computer, thinking interesting thoughts, but not connecting with other people and doing that dance of conversation.
01:28:25.000 So learning that dance...
01:28:27.000 While also thinking is really tough.
01:28:30.000 So with Chomsky, it was like a pleasure, because we can both be robots.
01:28:34.000 But I think he's like 92 years old.
01:28:38.000 Is he really?
01:28:38.000 Yeah.
01:28:39.000 And the thing I loved about him.
01:28:41.000 So, you know, there's all that political stuff that I don't pay attention to.
01:28:43.000 I mean, he's a major sort of activist.
01:28:45.000 But he's also a linguist that thinks that language is at the core of everything, of cognition.
01:28:51.000 Right.
01:28:51.000 So it's at the bottom.
01:28:53.000 Everything starts with language.
01:28:54.000 Cognition, reasoning, perception, all of that is things built on top of language.
01:28:59.000 So it's a brilliant sort of seminal research.
01:29:02.000 But at 92 years old, he still looked in my eyes and really listened and really thought and really sharp ideas came out.
01:29:11.000 You do the same thing.
01:29:13.000 People ask me, like, what it's like to meet Joe Rogan.
01:29:15.000 You don't take yourself too seriously.
01:29:18.000 Even with your celebrity, with the popularity of the podcast, that's a huge thing.
01:29:21.000 And with Chomsky, what was really surprising to me is while he's pretty stubborn on his ideas and so on, people criticize him, he's so stubborn in his ways, he didn't take himself too seriously.
01:29:31.000 I sat there, I'm just some kid talking to him.
01:29:34.000 He really listened.
01:29:36.000 The stupid questions, the interesting questions, he really listened.
01:29:39.000 At 92 years old, to have that kind of curiosity, I was like, I'm so happy when I see that kind of thing.
01:29:45.000 Yeah, that's a wonderful example of a career academic who's still just concentrating on ideas.
01:29:52.000 Ideas.
01:29:52.000 Yeah, and still thinking, always.
01:29:55.000 You know, because academics can be like really any other...
01:30:00.000 endeavor, any other discipline, you can get lazy, right?
01:30:04.000 You see that in almost every walk of life.
01:30:07.000 There's certain people that rest on their laurels.
01:30:09.000 And especially when you become popular, you get really good at explaining.
01:30:14.000 So you get like, you do these talks, you do these lectures, you start saying the same thing over and over, and you forget to listen.
01:30:22.000 Because of this podcast, the Artificial Intelligence podcast, but also Joe Rogan, two different groups of fans whom I both love, people come up to me and start a conversation, and I love it, just listening to them.
01:30:37.000 And I hope I never lose that.
01:30:39.000 I'm younger than Chomsky, but I hope you stay that way.
01:30:43.000 It's nice if you have the time.
01:30:44.000 It's a problem if you're in the rush and someone wants to talk to you about something very deep.
01:30:49.000 Yes.
01:30:49.000 I've had those moments where someone says, hey man, I've got to ask you.
01:30:53.000 And then I'm like, dude, this is a long conversation.
01:30:56.000 I can't do this right now.
01:30:57.000 I'm in a rush.
01:30:58.000 That's the burden.
01:30:58.000 That's your burden, actually.
01:31:00.000 I'm in a beautiful place, which I don't think will last too long, which is I'm not sufficiently famous.
01:31:07.000 Those things don't happen often enough to where I can have that conversation.
01:31:11.000 Right.
01:31:11.000 You have the luxury.
01:31:12.000 Right.
01:31:12.000 Although let me say, I got to hang out with Brian Callen, who I've been a huge fan of on New Year's Eve.
01:31:18.000 I got to watch the old man dance.
01:31:22.000 And this funny thing happened.
01:31:25.000 He's a celebrity.
01:31:27.000 So we're hanging out and two times somebody came up to me and Brian.
01:31:33.000 And they said, wow, it's Lex Fridman.
01:31:35.000 It's so good to meet you.
01:31:39.000 And then they completely ignored Brian.
01:31:41.000 That must have killed him.
01:31:43.000 He must have been like, motherfucker.
01:31:45.000 It made me so proud.
01:31:46.000 It's because it's in Boston, and I think it's like nerds and whatever.
01:31:49.000 Sure, yeah.
01:31:51.000 That is funny, though.
01:31:52.000 That's hilarious.
01:31:54.000 He's one of my...
01:31:55.000 I mean, it was incredible.
01:31:56.000 I didn't know you guys were friends until we all came together in the podcast and so on.
01:31:59.000 I was a huge fan of his...
01:32:02.000 He's one of my oldest friends.
01:32:04.000 We met on MADtv.
01:32:07.000 I was a host one week.
01:32:09.000 And he was one of the stars of the show.
01:32:12.000 He's an awesome guy, man.
01:32:14.000 Like a really underappreciated person.
01:32:15.000 And he's a guy that...
01:32:19.000 Yeah.
01:32:38.000 I didn't know that.
01:32:39.000 He's had a bunch of brilliant people on his podcast.
01:32:42.000 He's had a bunch of really interesting intellectuals and scientists.
01:32:48.000 I think it's mixed mental arts or something like that.
01:32:51.000 Yeah, and he's doing it with Hunter, his friend, but he stopped doing it with him.
01:32:56.000 He's an unusual guy, Brian Callen is, because he's silly, but he's also brilliant.
01:33:01.000 Yeah, you can see that.
01:33:03.000 Eric Weinstein has the same quality, obviously, from different worlds.
01:33:06.000 The silliness.
01:33:07.000 You can see through the silliness that there's an intelligent, first of all, a good human being there, but also an intelligent human being.
01:33:12.000 But at the same time, he's like the butt of every joke.
01:33:14.000 I appreciate that so much.
01:33:16.000 I love silly people.
01:33:18.000 Silly people are so much more fun.
01:33:19.000 The people that are easily offended and easily upset, like, ugh, he's so exhausting.
01:33:25.000 Silly people are the best.
01:33:27.000 Yeah.
01:33:27.000 I actually, so I played your theme song on guitar, and Brian Callen was researching it, like, how do you play it?
01:33:35.000 And it was a Jerry theme song, and there's a Brian Callen singing video of Joe Rogan's Shoulders for Days.
01:33:45.000 Yeah, some silly song he made out.
01:33:48.000 So I'm working on a deal I'm going to try to figure out, because I can play guitar and play the theme song I put up online.
01:33:54.000 You guys going to work together and make an album?
01:33:55.000 No, we're going to make an out, like a Joe Rogan theme is going to come up with some words on there.
01:33:59.000 What's the notes?
01:34:00.000 You got pages and pages of notes in front of you.
01:34:03.000 Is there stuff that you really wanted to discuss?
01:34:06.000 Yeah, yeah, yeah.
01:34:06.000 Well, we haven't talked about AI at all, but let me, at least Boston Dynamics would be interesting to talk about.
01:34:12.000 There was a fake video that I sent Jamie today.
01:34:15.000 These motherfuckers, they keep getting me.
01:34:17.000 There's a new fake video.
01:34:19.000 I think that was the same one.
01:34:20.000 I think someone just took another clip from it.
01:34:22.000 Oh, is it?
01:34:22.000 Those guys have been making VFX videos on YouTube for 10 plus years.
01:34:25.000 They're really good at it.
01:34:26.000 It's so good.
01:34:27.000 For people who don't know, there's a YouTube channel where people...
01:34:30.000 I think it's a single YouTube channel that does visual effects, like fake humanoid robots or robot-dog robots that kind of resemble something like Boston Dynamics.
01:34:43.000 They do some crazy stuff with guns.
01:34:45.000 Yeah, this one they gave the robot a gun.
01:34:48.000 Pull it up, Jamie.
01:34:50.000 Who is the gentleman?
01:34:52.000 Corridor Digital on YouTube is the guys that make it.
01:34:55.000 Corridor Crew is the YouTube channel, I think.
01:34:56.000 Fucking incredible that it's not real.
01:34:58.000 It looks so real.
01:35:00.000 And so the robot, they kick it, they hit it with a hockey helmet, or a hockey stick, rather.
01:35:05.000 It's their long video they made a while ago.
01:35:08.000 They might have made a new one, which was one out in the desert, but I think I had seen it before.
01:35:11.000 See, they trick you with the Boston Dynamics.
01:35:14.000 It's Boss Town.
01:35:16.000 Boss Town Dynamics.
01:35:18.000 It looks so realistic.
01:35:20.000 But here's the thing.
01:35:21.000 We're not that far off from this thing.
01:35:23.000 No, okay, okay, okay.
01:35:24.000 Let's walk it back.
01:35:26.000 Let's walk it back.
01:35:28.000 It's not realistic.
01:35:30.000 In what way?
01:35:31.000 It looks human realistic.
01:35:34.000 So you can tell it's a human.
01:35:35.000 Like a robotics person can tell it's a human.
01:35:37.000 Because it's really difficult to do that kind of motion, that kind of movement.
01:35:41.000 Oh, like when it's getting shot?
01:35:43.000 Well, not the getting shot.
01:35:44.000 So there's a lot of movement it does for the purpose of comedy.
01:35:47.000 Right.
01:35:48.000 Like, it actually is on purpose trying to look like a human for the comedic internet effect, like a human that's getting pissed off and so on.
01:35:56.000 Yeah.
01:35:56.000 Those qualities are like another order of magnitude.
01:35:58.000 Like this here, where it's like, give me that.
01:36:00.000 Yeah, come on, give me that, give me that.
01:36:01.000 Aw, come on.
01:36:03.000 So for a term...
01:36:04.000 Oh!
01:36:04.000 Yeah.
01:36:05.000 Yeah.
01:36:08.000 Doing Bruce Lee type of movements.
01:36:10.000 Some of those are just comedic.
01:36:12.000 You don't need a Terminator-type robot.
01:36:14.000 Right, but they do have legitimate robots that can do backflips now.
01:36:20.000 So it's really backflips, parkour.
01:36:23.000 This one's real.
01:36:24.000 This is all real.
01:36:25.000 It's manipulation.
01:36:26.000 So all of these robots, depending on what we're talking about here, but those are remote controlled.
01:36:32.000 And these are single demonstrations that they've perfected.
01:36:35.000 So it's really important to distinguish between the body of the robot and the brain of the robot.
01:36:43.000 So these bodies, unlike anything else, unlike a Roomba, unlike a drone, which can also be very threatening, these bodies somehow...
01:36:51.000 We anthropomorphize them and they terrify us.
01:36:55.000 I don't know what it is.
01:36:56.000 I met Spot Mini in person.
01:36:57.000 That was one of the most transformative moments in my life.
01:37:00.000 Really?
01:37:00.000 Because I know how dumb it is, but the experience of it like...
01:37:05.000 It's not even a head.
01:37:07.000 It's supposed to be a hand, but it looks like a head.
01:37:09.000 And it, like, looking up at me with that hand, I felt like I was like...
01:37:14.000 It was magic.
01:37:15.000 It was like a...
01:37:16.000 It was like Frankenstein coming to life.
01:37:19.000 It was this moment of creation.
01:37:22.000 And what I realized is my own brain sort of anthropomorphizing.
01:37:25.000 The same way you're, like, looking at these robots and you're thinking, these things are terrifying...
01:37:31.000 In 10, 20 years, where are we going to be?
01:37:35.000 That's our brain playing tricks on us.
01:37:37.000 Because the key thing that's a threat to humanity or an exciting possibility for humanity is the intelligence of the robots, the brains, the mind.
01:37:46.000 And these robots have very, very little intelligence.
01:37:49.000 So in terms of being able to perceive and understand the world, very importantly, very importantly, to learn about the world.
01:37:58.000 From scratch.
01:37:59.000 So the terrifying thing is, you've talked often about this philosophical kind of notion that Sam Harris talks about, sort of exponential improvement: being able to reach human-level intelligence, then superhuman-level intelligence, and in a matter of days become more intelligent than that.
01:38:14.000 That's all learning process.
01:38:17.000 That's being able to learn.
01:38:18.000 That's the key aspect.
01:38:19.000 We're in the very early days of that.
01:38:22.000 There's an idea of, you know, Big Bang is a funny word for one of the most fundamental ideas in the nature of our universe.
01:38:31.000 Same way, self-play is a term for, I think, one of the most important powerful ideas in artificial intelligence that people are currently working on.
01:38:42.000 So self-play. I don't know if you're familiar with the companies DeepMind and OpenAI, so Google DeepMind, and a couple of games.
01:38:49.000 I know you're a first-person shooter guy, but StarCraft and Dota 2. So last year, these are, what do you call them, real-time strategy games, I guess, where people win millions of dollars in e-sport competitions.
01:39:03.000 And so OpenAI separately had OpenAI Five, which took on Dota 2. Dota 2 is the computer game that grew out of a Warcraft 3 mod. It's one of the most popular esport games.
01:39:16.000 And then DeepMind took on StarCraft with their AlphaStar system.
01:39:20.000 And the key amazing thing is they're similar to AlphaGo and AlphaZero that learn to play Go.
01:39:25.000 It's the mechanism of self-play.
01:39:27.000 That's the exciting mechanism that, I think, if you can figure out how to make it have an impact on more serious problems than games, would be transformative.
01:39:37.000 Okay, what is it?
01:39:38.000 It's learning from scratch in a competitive environment.
01:39:41.000 Think of it as two white belts training against each other and trying to figure out how to beat each other, without ever having black belt supervision and structure, slowly getting better that way, inventing new moves that way.
01:39:59.000 Eventually, they get better and better by that competitive process.
01:40:04.000 That's the machine playing itself without human supervision.
01:40:30.000 It'll continue to improve.
01:40:31.000 Let me pause you here because this is one of the things that I think probably translates to AI as it does to Jiu Jitsu.
01:40:39.000 You need more than one opponent.
01:40:42.000 Like you can't have one input.
01:40:43.000 One person training with one person specifically and singularly, you're not going to develop the type of game that you need to become a real black belt in Jiu Jitsu.
01:40:55.000 A hundred percent.
01:40:56.000 Exactly.
01:40:56.000 So that's part of the brilliance of this mechanism.
01:40:59.000 So imagine you didn't just have white belts, you had an opportunity to generate a new random white belt.
01:41:06.000 Like a fat, big one, a little one, and all kinds of different ones.
01:41:11.000 One that loves weed named Eddie Bravo.
01:41:14.000 A passive one.
01:41:15.000 A passive one.
01:41:15.000 And let them play.
01:41:18.000 What you find is Jiu-Jitsu might be simpler than the general problem of different kinds of StarCrafts and so on.
01:41:27.000 But there is sets of strategies in this giant space.
01:41:32.000 There are these complex hierarchical strategies, like high-level strategies and then specifics of different moves that emerge, some of which you didn't even realize existed.
01:41:42.000 And that requires that you start with the huge amounts of random initial states, like the fat person, the skinny person, the aggressive person, and so on.
01:41:50.000 And then you also keep injecting randomness in the system, so you discover new ideas.
01:41:56.000 So even when you reach purple belt, you don't continue with those same people.
01:42:01.000 You start your own school.
01:42:02.000 You start expanding to totally random new ideas and expanding in this way.
01:42:06.000 And what you find out is there are strategies totally surprising to human beings, like in the game of chess, or in the game of Go, or in the game of StarCraft.
01:42:14.000 This self-play mechanism can do what sort of AI people have dreamed of, which is be creative.
01:42:21.000 Create totally new behaviors, totally new strategies that are surprising to human experts.
01:42:26.000 That's why Go was so astounding to them, right?
01:42:29.000 Because it's such a complex game.
01:42:32.000 Such a hard game.
01:42:33.000 And it's able to...
01:42:34.000 Well, the first astounding thing is it's able to beat the world champion.
01:42:38.000 Yeah.
01:42:39.000 The second astounding thing about both chess and Go is it's able to create...
01:42:43.000 Totally new ideas.
01:42:45.000 I'm not good enough at chess or Go to understand the newness of them, but grandmasters talk about the way AlphaZero plays chess, and they say there's a lot of brilliant, interesting ideas there.
01:42:58.000 Very counterintuitive ideas.
01:43:00.000 And that's such a...
01:43:01.000 And that's all.
01:43:02.000 The first breakthroughs didn't have as much self-play.
01:43:06.000 They were trained on human experts.
01:43:08.000 But AlphaZero and AlphaStar and OpenAI Five, these systems are all fundamentally self-play, meaning no human supervision, starting from scratch.
01:43:19.000 So no black belt instructor.
01:43:22.000 And that means...
01:43:24.000 So learning from scratch, that's exceptionally powerful.
01:43:28.000 That's a process where, from zero, you can get to superhuman-level intelligence in a particular task in a matter of days.
01:43:38.000 That's super powerful, super exciting, super terrifying if that's kind of what you think about.
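To make the self-play mechanism concrete, here is a minimal, hypothetical Python sketch, not DeepMind's or OpenAI's actual training code: a population of randomly initialized rock-paper-scissors "white belts" that improves only by playing itself, with mutation standing in for the injected randomness described above. All names and numbers are illustrative.

```python
# A toy illustration of self-play (not DeepMind or OpenAI code): a population
# of rock-paper-scissors strategies improves purely by playing each other,
# with no labeled data and no "black belt" supervision.
import random

MOVES = [0, 1, 2]  # 0 = rock, 1 = paper, 2 = scissors

def payoff(a, b):
    """+1 if move a beats move b, -1 if it loses, 0 on a tie."""
    return 0 if a == b else (1 if (a - b) % 3 == 1 else -1)

def sample(strategy):
    """Draw a move from a mixed strategy (three probabilities)."""
    return random.choices(MOVES, weights=strategy)[0]

def mutate(strategy, noise=0.05):
    """Inject randomness so new 'ideas' keep entering the population."""
    raw = [max(1e-6, p + random.uniform(-noise, noise)) for p in strategy]
    total = sum(raw)
    return [p / total for p in raw]

def round_robin_fitness(population, games=100):
    """Average payoff of each strategy against the rest of the population."""
    fitness = [0.0] * len(population)
    for i, s in enumerate(population):
        for j, t in enumerate(population):
            if i != j:
                fitness[i] += sum(payoff(sample(s), sample(t)) for _ in range(games)) / games
    return fitness

def self_play(generations=40, pop_size=10):
    # Start from "white belts": random, often badly lopsided strategies.
    population = [mutate([random.random() + 0.01 for _ in MOVES], noise=0.5)
                  for _ in range(pop_size)]
    for _ in range(generations):
        fitness = round_robin_fitness(population)
        ranked = sorted(zip(fitness, population), key=lambda fp: -fp[0])
        survivors = [s for _, s in ranked[: pop_size // 2]]
        # Refill the population with mutated copies of the winners.
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return population

if __name__ == "__main__":
    final = self_play()
    # Surviving strategies tend to hover around the balanced 1/3-1/3-1/3 mix
    # (they may keep cycling near it), found purely by playing themselves.
    print([[round(p, 3) for p in s] for s in final[:3]])
```

The same loop, with a vastly harder game and learned policies instead of three probabilities, is the basic shape of the self-play systems being described here.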
01:43:44.000 The challenge is we don't know how to do that in the physical space, in the space of robots.
01:43:50.000 There's something fundamentally different about being able to perceive, to understand this environment, to do common sense reasoning.
01:43:58.000 The thing we really take for granted is our ability to reason about the physics of the world: the fact that things have weight, that you can stack things on top of each other, the fact that some things are hard, some things are soft, some things are painful when you touch them.
01:44:13.000 All that. Like, there seems to be a giant Wikipedia inside our brain of common sense, dumb logic, that's very tough to build up.
01:44:23.000 It seems to be an exceptionally difficult learning problem that Boston Dynamics will have to solve in order to achieve even the same kind of... Yeah.
01:44:55.000 That's an exceptionally difficult thing to arrive at, because ultimately these systems operate on a set of objectives. And what a lot of people who think about artificial general intelligence say is that the objectives we inject into these systems, that they're trained on, need to have, one, uncertainty, so they should always doubt themselves.
01:45:17.000 Just like if you want to be a good black belt, you should always be sort of... Always open-minded.
01:45:22.000 Sort of relax.
01:45:23.000 Always learn techniques.
01:45:24.000 It's okay to get submitted.
01:45:26.000 So always have a degree of uncertainty about your world view.
01:45:30.000 The kind of thing we criticized Twitter outrage mobs for not having.
01:45:35.000 So having uncertainty.
01:45:37.000 And the other thing is always have a place where there should be human supervision.
01:45:42.000 And I think we have good mechanisms for that in place.
01:45:51.000 I'm very optimistic about where these kinds of learning systems...
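As a purely illustrative sketch of those two ideas, an objective that carries uncertainty plus a standing place for human supervision, here is a tiny hypothetical Python agent. It is not a real safety mechanism and not anything the speakers built; it just treats disagreement among its candidate objectives as doubt and defers to a human when that doubt gets too high. The threshold, objectives, and actions are all made up.

```python
# Toy sketch only: an "agent" that is unsure which objective is the right one
# and asks a human whenever its candidate objectives disagree too much.
from dataclasses import dataclass
from statistics import mean, pstdev
from typing import Callable, List

@dataclass
class UncertainAgent:
    candidate_objectives: List[Callable[[str], float]]  # plausible reward functions
    doubt_threshold: float = 0.3  # arbitrary illustrative value

    def act(self, action: str) -> str:
        scores = [objective(action) for objective in self.candidate_objectives]
        doubt = pstdev(scores)  # disagreement between objectives = self-doubt
        if doubt > self.doubt_threshold:
            return f"defer to human before '{action}' (doubt={doubt:.2f})"
        return f"do '{action}' (expected value={mean(scores):.2f})"

if __name__ == "__main__":
    agent = UncertainAgent(candidate_objectives=[
        lambda a: 1.0 if a == "recommend calm video" else 0.2,
        lambda a: 1.0 if a == "recommend calm video" else 0.9,
    ])
    print(agent.act("recommend calm video"))     # objectives agree -> act
    print(agent.act("recommend outrage video"))  # objectives disagree -> ask a human
```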
01:45:54.000 The exciting thing is, Boston Dynamics is not opening up their platform.
01:46:07.000 So they're working with a few people.
01:46:09.000 I'm trying to make time to make it happen.
01:46:14.000 To work with them to build stuff on top of the platform.
01:46:18.000 Sorry, I'm referring to Spot Mini as a platform.
01:46:21.000 So this robot is this dumb...
01:46:23.000 It's like a Roomba.
01:46:24.000 It's a dumb mechanistic thing that can move for you.
01:46:27.000 But you can add a brain on top of it.
01:46:30.000 So you can make it learn.
01:46:31.000 You can make it see the world and so on.
01:46:33.000 That's all extra.
01:46:34.000 That's not what Boston Dynamics offers.
01:46:36.000 So they want to work with people like me to add that kind of capability.
01:46:40.000 And that's exciting because...
01:46:41.000 Now you can have hundreds of people start to add interesting learning capabilities.
01:46:48.000 So I may have to retract my words about how far away we are with the capabilities of these robots once they now open it up to the world.
01:46:57.000 So I was speaking to Boston Dynamics.
01:46:59.000 I think they're solving the really hard robotics problem.
01:47:01.000 But once you open it up to the huge world of researchers that are doing machine learning and doing computer vision and doing AI research, the kind of capabilities they might add to these robots might surprise us.
01:47:13.000 That's where people are concerned, right?
01:47:15.000 The big leaps.
01:47:16.000 The big leaps, yeah.
01:47:34.000 Did you see Black Mirror?
01:47:37.000 Yeah, Black Mirror.
01:47:38.000 You know that episode, Heavy Metal?
01:47:40.000 Heavy Metal.
01:47:41.000 Very difficult to pull that off.
01:47:44.000 For now.
01:47:45.000 For now.
01:47:46.000 And you had a conversation with Nick Bostrom, who I'm also talking with on the podcast.
01:47:54.000 One of the things he mentioned is...
01:47:56.000 So I don't think he thinks about this stuff a lot.
01:47:58.000 I do about military applications.
01:48:00.000 I talk to folks.
01:48:01.000 That's one of the things people don't, just like with me, they kind of put to the side they don't want to think about military applications.
01:48:07.000 I would be more worried about drones than I would be about robot dogs.
01:48:12.000 Because the kind of stuff we saw in the Black Mirror episode is really difficult to pull off, to make a robot learn.
01:48:20.000 Well, drones are kind of more impressive, right?
01:48:22.000 Because they hover, they can move through 3D space, they have Hellfire missiles attached to them.
01:48:28.000 I mean, there's a lot of crazy shit that they can absolutely do right now with drones.
01:48:32.000 And you're talking about large-scale drones, but you can think of small-scale drones.
01:48:37.000 And I think there's also a Black Mirror episode with drones where they take over.
01:48:43.000 I haven't seen that one.
01:48:45.000 I think there's drones everywhere and they're kind of doing your basic friendly government surveillance, mass surveillance kind of thing.
01:48:54.000 I think they sell in the episode that it's for a good cause.
01:48:58.000 Spoiler alert, but I think they start killing everybody.
01:49:01.000 Yeah.
01:49:03.000 Of course.
01:49:04.000 Wasn't there, there has been research done on making artificial insects that have like little cameras inside of them that look like a dragonfly or some sort of bug and they fly around and they can film things.
01:49:17.000 And the thing that terrifies a lot of people is going more microscopic than that, more like robots inside the body that help you cure diseases and so on, certain things, even at the nanoscale.
01:49:30.000 So basically creating viruses.
01:49:32.000 Creating new viruses.
01:49:34.000 Little tiny ones.
01:49:35.000 Yeah.
01:49:36.000 And if they learn, they can be pretty dumb.
01:49:39.000 But on a mass scale, you don't have to be intelligent to destroy all of human civilization.
01:49:45.000 So...
01:49:46.000 So the real question about this artificial intelligence stuff that everybody seems to – the ultimate end of the line, what Sam Harris is terrified of, is it becoming sentient and it making its own decisions and deciding that we don't need people?
01:50:02.000 That's what everybody's really scared of, right?
01:50:06.000 I'm not sure if everybody's scared of it.
01:50:08.000 Yeah, they might be.
01:50:09.000 I think that's a story that's the most compelling, the sexiest story that the philosopher side of a Sam Harris is very attracted to.
01:50:19.000 I am also interested in that story, but I think achieving sentience, I think that requires also creating consciousness.
01:50:28.000 I think that requires creating the kind of intelligence and cognition and reasoning abilities That's really, really difficult.
01:50:36.000 I think we'll create dangerous software-based systems before then.
01:50:41.000 They'll be a huge threat.
01:50:42.000 I think we already have them.
01:50:44.000 The YouTube algorithm, the recommender systems of Twitter and Facebook and YouTube, from everything I know, having talked to those folks, having worked on it, the challenging aspect there is they have the power to control minds,
01:51:01.000 what the mass population thinks.
01:51:05.000 And YouTube itself and Twitter itself don't have direct ability to control the algorithm exactly.
01:51:14.000 One, they don't have a way to understand the algorithm.
01:51:17.000 And two, they don't have a way to control it.
01:51:19.000 Because...
01:51:20.000 But what I mean by control is, control it in a way that leads to, in aggregate, a better civilization.
01:51:29.000 Meaning like, sort of the Steven Pinker, the better angels of our nature, sort of encourage the better sides of ourselves.
01:51:35.000 It's very difficult to control a single algorithm that recommends the journey of millions of people through the space of the internet.
01:51:46.000 It's very difficult to control that.
01:51:48.000 And I think that intelligence instilled in those algorithms will have a much more potentially either positive or detrimental effect than sentient killer robots.
01:52:00.000 I hope we get to sentient killer robots.
01:52:02.000 Because that problem I think we can work with.
01:52:06.000 I'm very optimistic about the positive aspects of approaching sentience, of approaching general intelligence.
01:52:14.000 There's going to be a huge amount of benefit and I think there will be...
01:52:19.000 There's a lot of mechanisms that can protect against that going wrong.
01:52:24.000 Just from knowing the...
01:52:26.000 We know how to control intelligent systems.
01:52:30.000 When they are sort of in a box, when they are singular systems. When they're distributed across millions of people and there's not a single control point, that becomes really difficult.
01:52:40.000 And that's the worry for me is the distributed nature of dumb algorithms.
01:52:46.000 On every single phone, sort of controlling the behavior, adjusting the behavior, adjusting the learning journey of different individuals.
01:52:56.000 To me, the biggest worry and the most exciting thing is recommender systems, what they're called at Twitter, at Facebook, at YouTube, YouTube especially.
01:53:06.000 That one, just like I think you mentioned, there's something special about videos in terms of educating and sometimes indoctrinating people. And YouTube...
01:53:18.000 ...has the hardest time...
01:53:22.000 I mean, they have such a difficult problem on their hands in terms of that recommendation because they don't...
01:53:30.000 This is a machine learning problem, but knowing the contents of tweets is much easier than knowing the contents of videos.
01:53:39.000 Our algorithms are really dumb in terms of being able to watch a video and understand what's being talked about.
01:53:44.000 So all YouTube is looking at is the title and the description.
01:53:49.000 And that's it.
01:53:50.000 Mostly the title.
01:53:51.000 It's basically keyword searching.
01:53:54.000 And it's looking at the clicking, viewing behavior of the different people.
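Since the explanation above boils down to "title keywords plus clicking and viewing behavior," here is a deliberately dumb, hypothetical sketch of that kind of ranking in Python. It is not YouTube's actual algorithm; every title, number, and function name below is invented for illustration. The point it makes is that keyword overlap amplified by raw engagement can surface exactly the kind of content discussed next.

```python
# A deliberately dumb sketch of "title keywords plus watch behavior" ranking,
# in the spirit of the description above. NOT YouTube's algorithm; all data
# below is made up.
from collections import Counter

def keywords(title: str) -> set:
    return {w.lower().strip(",.?!") for w in title.split()}

def recommend(watch_history, candidates, top_k=2):
    """Rank candidate videos by keyword overlap with watched titles,
    weighted by each candidate's historical click-through rate."""
    history_words = Counter()
    for title in watch_history:
        history_words.update(keywords(title))

    def score(video):
        overlap = sum(history_words[w] for w in keywords(video["title"]))
        return overlap * video["ctr"]  # engagement amplifies keyword matches

    return sorted(candidates, key=score, reverse=True)[:top_k]

if __name__ == "__main__":
    history = ["Is the Earth really flat?", "Flat earth evidence explained"]
    candidates = [
        {"title": "Flat earth PROOF they hide from you", "ctr": 0.12},
        {"title": "How satellites measure the Earth", "ctr": 0.04},
        {"title": "Cooking pasta from scratch", "ctr": 0.20},
    ]
    for video in recommend(history, candidates):
        print(video["title"])
    # Keyword overlap plus raw engagement pulls the conspiracy title to the
    # top, which is exactly the failure mode being discussed.
```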
01:53:59.000 So it figures out that the Flat Earth... Well, YouTube in particular,
01:54:19.000 they're trying to do something about the influx of conspiracy theory videos and the indoctrination aspect of them.
01:54:28.000 One of the things about videos is, say if someone makes a video And they make it on a very particular subject, and they speak eloquently and articulately, but they're wrong about everything they're saying.
01:54:41.000 They don't understand the science.
01:54:42.000 Say if they're talking about artificial intelligence.
01:54:44.000 They're saying something about things that you are an expert in.
01:54:48.000 They could, without being checked, without someone like you in the room that says that's not possible because of X, Y, and Z, without that, they can just keep talking.
01:54:59.000 So one of the things they do, whether it's about flat earth or whether it's about dinosaurs being fake or nuclear bombs being fake, they can just say these things and they do it with an excellent grasp of the English language, right?
01:55:12.000 So they say it. They're very compelling in the way they speak.
01:55:16.000 They'll show you pictures and images, and if you are not very educated, and you don't understand that this is nonsense, especially if you're not skeptical, you can get roped in.
01:55:26.000 You can get roped in real easy, and that's a problem.
01:55:29.000 And it's a problem with some of the people that work in these platforms.
01:55:34.000 Their children get indoctrinated, and they get angry.
01:55:38.000 Their children get indoctrinated.
01:55:40.000 Now, what's interesting is they get indoctrinated also with right-wing ideology, and then people get mad that they're indoctrinated by Ben Shapiro videos.
01:55:51.000 So they'll get pissed off with that.
01:55:54.000 But you're okay with left-wing.
01:55:56.000 Right.
01:55:56.000 Why?
01:55:57.000 Because you're left wing.
01:55:59.000 So then it becomes like, okay, what is a problem?
01:56:02.000 What's really a problem?
01:56:03.000 And what is just something that's opposed to your personal ideology?
01:56:07.000 And who gets to make that distinction?
01:56:10.000 And that is where the arguments for the First Amendment come into play.
01:56:15.000 Like, should these...
01:56:18.000 Social media companies that have massive amounts of power and influence, should they be held to the same standards as the First Amendment?
01:56:26.000 And should these platforms be treated as essentially a town hall, like where anyone can speak, and there's a platform?
01:56:36.000 And there's a real problem that there's not that many of them.
01:56:39.000 This is a real problem.
01:56:40.000 The real problem is like...
01:56:41.000 Twitter is the place where people go to argue and talk about shit.
01:56:46.000 And Twitter maybe has a competitor on Facebook, but YouTube certainly doesn't have a competitor.
01:56:50.000 YouTube doesn't have any competitor.
01:56:51.000 I mean, there's Vimeo, there's a few other platforms, but realistically, it's YouTube.
01:56:57.000 You know, YouTube is a giant, giant platform.
01:57:02.000 What is this?
01:57:03.000 Alphabet reports YouTube ad revenue for the first time.
01:57:06.000 Video service generated $15.1 billion in 2019. Holy shit!
01:57:14.000 In comparison, I just looked up Twitch ad revenue was supposedly around $500 to $600 million.
01:57:22.000 Wow, that's a big difference.
01:57:23.000 And what about Facebook?
01:57:25.000 Facebook is stupendously valuable.
01:57:29.000 Probably way higher than that.
01:57:31.000 By the way, Facebook I don't think pays, like YouTube paid for my McDonald's burgers yesterday.
01:57:39.000 Yeah, Facebook's not right.
01:57:40.000 Facebook is not, and Twitter and Instagram, I don't think, are paying you directly.
01:57:44.000 There's a lot of calls to break up Facebook.
01:57:47.000 I mean, I'm on Facebook, but I'm not on it.
01:57:49.000 I don't use it.
01:57:50.000 It's just connected to my Instagram.
01:57:52.000 When I post something on Instagram, it goes to Facebook as well.
01:57:54.000 I never go to Facebook.
01:57:55.000 There's a Joe Rogan Facebook group that's a dumpster fire of brilliant folks.
01:58:00.000 Let me just put it that way.
01:58:02.000 Look at this.
01:58:03.000 Facebook's revenues amounted to $21.8 billion.
01:58:07.000 In just the fourth quarter.
01:58:08.000 Jesus Christ, just the fourth quarter, the majority of which were generated through advertising.
01:58:12.000 The company announced over 7 million active advertisers on Facebook during the third quarter of 2019. That probably, though, also includes Instagram.
01:58:20.000 That thing with YouTube is just YouTube, not Google, not YouTube Premium, not anything else, just the ads.
01:58:26.000 And to be fair, so the cash they have, they spend like Facebook AI research groups, some of the most brilliant.
01:58:34.000 It's a huge group that's doing general open-ended research.
01:58:38.000 Google Research, Google Brain, Google DeepMind are doing open-ended research.
01:58:42.000 They're not doing the ad stuff.
01:58:44.000 They're really trying to build...
01:58:46.000 That's the cool thing about these companies having a lot of cash is they can bring some of the smartest people and let them work on whatever in case it comes up with a cool idea.
01:58:54.000 Like autonomous vehicles with Waymo.
01:58:57.000 It's like, let's see if we can make this work.
01:58:59.000 Let's throw some money at it even if it doesn't make any money in the next 5, 10, 20 years.
01:59:04.000 Let's make it work.
01:59:05.000 That's the positive side of having that kind of money.
01:59:07.000 Yeah, that makes sense.
01:59:09.000 As long as they keep doing those kind of things.
01:59:12.000 The real concern, though, is that they're actually severely influencing the democratic process.
01:59:21.000 It's difficult. Certainly with Jack Dorsey...
01:59:24.000 Jack Dorsey, in terms of the CEOs I've interacted with, I think was one of the good guys.
01:59:28.000 Yes, I agree.
01:59:30.000 He wants a Wild West Twitter.
01:59:34.000 Well, he doesn't know it.
01:59:35.000 He wants a good Twitter.
01:59:36.000 He's kind of thinking about Wild West.
01:59:38.000 But his idea is to have two.
01:59:41.000 Oh, two Twitters?
01:59:42.000 One that's filtered and one that's like...
01:59:44.000 Anything goes.
01:59:48.000 Yee-haw!
01:59:49.000 But I think the point is nobody knows what's the best kind of Twitter.
01:59:54.000 Even having two Twitters.
01:59:55.000 Do you really want the Wild West?
01:59:56.000 Do you want the First Amendment to say free speech for everyone?
02:00:00.000 It's a difficult...
02:00:02.000 The gray area there: you were just talking about YouTube, with certain people saying things about AI or autonomous vehicles, where, say, I'm supposedly an expert.
02:00:10.000 But I disagree with a lot of people.
02:00:12.000 And if those people make videos and maybe they don't have a PhD, God forbid, like are they not an expert either?
02:00:20.000 Am I right?
02:00:21.000 I'm actually personally sick of the academic sort of cathedral thinking that says only if you have a PhD can you be an expert.
02:00:28.000 Like I'm not an expert.
02:00:30.000 I'm an idiot.
02:00:30.000 Do you feel like that line is getting more blurred with the access to all those MIT courses that are online and the extreme amount of data that's available to people, that there are going to be a lot of people that, even though they might not be classically trained, they have a massive amount of information?
02:00:48.000 And have an open mind.
02:00:49.000 Yeah.
02:00:53.000 I've recorded a podcast.
02:00:56.000 First of all, shout out to Jamie for being an incredible mastermind of audio production.
02:01:04.000 The reason I'm giving a shout out is because I suck so badly, I didn't have to do it.
02:01:09.000 I do it all myself.
02:01:11.000 But I do it pretty good.
02:01:15.000 When you learn it yourself from scratch, just like with Jiu Jitsu or with music and so on, I learn guitar from scratch.
02:01:20.000 You can learn with the online materials they have now.
02:01:23.000 You can become really good.
02:01:25.000 And the journey you take is not the traditional conformist journey through that education process.
02:01:33.000 You take your own journey.
02:01:34.000 And when you have millions of people taking their own journey through that process, There's going to be brilliant people without a PhD or without ever having gone to college.
02:01:42.000 Right.
02:01:42.000 And they...
02:01:44.000 I mean, it's difficult to know what to do with that, especially about political questions. Like economists.
02:01:50.000 There's these, you know, Paul Krugman, Nobel Prize winner...
02:01:56.000 Economists, Harvard economists, you know, they're supposed to be the holders of the truth about the fundamentals of our economy: when is there going to be a crash, what's good for the economy, the left, the right, what taxation system is good for the economy.
02:02:10.000 But nobody really knows.
02:02:11.000 Same with like nutrition science, psychology, economics, anything that involves humans is a giant mess that expertise can come from anywhere.
02:02:21.000 Right.
02:02:22.000 Like Rhonda Patrick, I think she gets criticized because – she's kind of young.
02:02:28.000 Yeah.
02:02:29.000 And she's – I would say she's incredibly knowledgeable as one of the world sort of experts.
02:02:34.000 But I think academia probably doesn't acknowledge her as an expert.
02:02:37.000 She's like young.
02:02:39.000 She recently got a PhD.
02:02:40.000 I don't – I'm not even sure – like there's that kind of hierarchy that people push down.
02:02:45.000 She's been unjustly criticized by people who don't even know her actual credentials.
02:02:51.000 There was one guy who was criticizing her and saying, well, she's not a clinical researcher.
02:02:55.000 That's one of the things he was backing his criticism on.
02:02:57.000 Like, no, that's exactly what she is.
02:02:59.000 And she's been doing that for years.
02:03:01.000 Like, you don't know what the fuck you're talking about.
02:03:03.000 People get very touchy with her because she's young and also because she's incredibly brilliant.
02:03:09.000 Like, that lady, she brings stacks and stacks of notes when she comes here.
02:03:12.000 She doesn't even look at them.
02:03:13.000 She just rattles off all those studies off the top of her head.
02:03:16.000 She has a massive amount of data available.
02:03:18.000 And she's very unbiased in her perceptions of things.
02:03:23.000 She's all about what do the results say?
02:03:27.000 What have the studies proven?
02:03:29.000 What can we learn from those studies?
02:03:30.000 And what do we have to take into consideration when we're assessing this data?
02:03:34.000 She's brilliant.
02:03:36.000 She's off the charts brilliant.
02:03:38.000 And people get fucking jealous, and I've seen it.
02:03:40.000 I've seen it with weaker, lazy minds in academia that criticize her that had, at least at one point in time, had a larger platform.
02:03:49.000 I think her platform is bigger now, and I'm happy that I've played a part in that.
02:03:53.000 And I don't want to be a social justice warrior, but I have seen women being criticized more harshly in a lot of domains of science.
02:04:00.000 I think you're right.
02:04:01.000 Yeah.
02:04:01.000 Well, you know, she's pretty, too.
02:04:03.000 There's a lot of things wrong there, you know?
02:04:05.000 Young, pretty.
02:04:05.000 Yeah, I get criticized for that, too.
02:04:07.000 Like, good-looking.
02:04:07.000 Beautiful guy.
02:04:09.000 Dressed well.
02:04:09.000 Good-looking, funny, yeah.
02:04:10.000 And handsome.
02:04:11.000 No.
02:04:11.000 Actually, I get criticized as this guy's an idiot.
02:04:14.000 He's boring.
02:04:15.000 Why can't he be more like Joe Rogan?
02:04:17.000 Okay.
02:04:18.000 What else you got there?
02:04:20.000 What else?
02:04:20.000 The notes wise?
02:04:21.000 Yeah.
02:04:22.000 I gotta talk to you about martial arts.
02:04:23.000 Okay.
02:04:24.000 I gotta talk to you about...
02:04:25.000 Well, so I'm a huge fan of wrestling.
02:04:30.000 I'm a huge fan of the Dagestan region.
02:04:34.000 Yes.
02:04:35.000 And I've gotten a lot of shit for it.
02:04:37.000 Posted that, you know, Conor's gonna be cowboy before that happened.
02:04:41.000 I'm also a huge fan of the different styles of fighters in MMA. And I'm surprised how much shit actually Conor gets.
02:04:51.000 Even though he brought...
02:04:52.000 Besides...
02:04:53.000 Besides sort of all the...
02:04:57.000 All the mess that came with him.
02:04:59.000 He also brought an interesting style, an interesting way of approaching fights, an interesting style of thinking and also philosophizing about fighting, which I think is amazing.
02:05:10.000 It clashes with the ideas of Khabib.
02:05:13.000 To me, Khabib Nurmagomedov... It'd be...
02:05:17.000 So I posted that Conor would be Cowboy, and...
02:05:21.000 Well, I didn't know Dana was going to say what he was going to say, but he would face Masvidal, and I thought he could beat Masvidal.
02:05:26.000 And then the biggest fight ever, 30,000 people in Moscow against Khabib.
02:05:32.000 For a rematch.
02:05:33.000 For a rematch would be the greatest fight ever.
02:05:35.000 Getting past Masvidal is not easy.
02:05:37.000 And Khabib getting past Tony Ferguson is not easy either.
02:05:40.000 Not easy, yeah.
02:05:41.000 Both of those fights...
02:05:42.000 And first of all, Masvidal is now going to fight Usman, which is very interesting.
02:05:47.000 Yeah, that's July.
02:05:49.000 Very, very interesting fight.
02:05:52.000 Usman is such a tank.
02:05:54.000 He's fucking terrifying.
02:05:56.000 They hate each other.
02:05:57.000 There's a lot of animosity and a lot of shit-talking.
02:06:02.000 But it's also, the more that happens, the better it is for both of them in terms of revenue generated.
02:06:08.000 It's a really interesting fight.
02:06:10.000 Now, let me tell you something.
02:06:11.000 When Masvidal was at the Conor cowboy fight, when they put the camera on him, biggest pop from the crowd.
02:06:18.000 The biggest.
02:06:19.000 Pop meaning like applause, screaming and stuff.
02:06:21.000 People went nuts.
02:06:22.000 They saw him, they were like, yeah!
02:06:24.000 And he was like...
02:06:26.000 Wearing that robe.
02:06:27.000 He's hilarious.
02:06:29.000 Dude, I mean, he's a slow starter in terms of his career, being recognized for the kind of fighter that he is now, and also being recognized publicly as a superstar.
02:06:41.000 But his time has come.
02:06:42.000 He is here.
02:06:43.000 He is a fucking star.
02:06:45.000 When that camera went on him...
02:06:47.000 And the audience saw him, that crowd went bananas.
02:06:50.000 The entire T-Mobile arena, they went crazy.
02:06:54.000 Yeah, that would be an epic fight.
02:06:56.000 But in terms of the great, I just think, maybe it's me, the romanticized notion of Rocky IV, but Conor vs.
02:07:04.000 Khabib in Moscow, I can just see it with Putin and Fedor sitting there.
02:07:09.000 I'll sit next to him.
02:07:11.000 Do you think they would do it in Moscow?
02:07:12.000 Yeah.
02:07:13.000 30,000 people.
02:07:14.000 If Conor went to Moscow, man, good luck getting out of there if you win.
02:07:21.000 Good luck getting out of there if you lose.
02:07:24.000 He's so loved there.
02:07:26.000 Khabib is so loved in Russia.
02:07:28.000 But I think Russian people also love MMA generally.
02:07:34.000 The number of people that love fighting in Russia is huge.
02:07:40.000 And I know it seems like on the internet, Khabib is like, they love Khabib and Conor's hated.
02:07:48.000 But I think ultimately, they love a good, what does Conor call it?
02:07:53.000 A good...
02:07:54.000 Heel?
02:07:54.000 No, no.
02:07:56.000 A good scrap, I think he calls it.
02:07:57.000 A good fight.
02:07:58.000 Yeah.
02:07:59.000 Yeah, so I think that would be probably the biggest fight of all time.
02:08:02.000 And I think actually Conor has a shot.
02:08:05.000 I love...
02:08:07.000 Khabib is probably my favorite fighter.
02:08:08.000 I love that style of fighting.
02:08:09.000 I like the Saitiev brothers that Frankie Edgar mentioned to you.
02:08:16.000 They're probably the...
02:08:17.000 Buvaisar Saitiev is the greatest freestyle wrestler of all time.
02:08:22.000 Just epic.
02:08:23.000 His brother Adam has a match against the Soldier of God.
02:08:29.000 What's his name?
02:08:29.000 Yoel Romero.
02:08:30.000 Yoel Romero at the 2000 Olympics in the finals.
02:08:34.000 Yoel Romero looks like, if you were to imagine the most terrifying opponent ever, he's just like shredded, ripped.
02:08:43.000 And then Adam Saitiev looks like, I don't know, a dad bod, very skinny nerd.
02:08:49.000 And he just effortlessly destroys him.
02:08:54.000 Really?
02:08:55.000 Yeah.
02:08:55.000 With a trip.
02:08:56.000 Let me see that video.
02:08:57.000 Is it online?
02:08:59.000 2000 Olympic Sydney Finals.
02:09:01.000 Adam Saitiev versus...
02:09:03.000 Spell his name?
02:09:05.000 Adam Saitiev.
02:09:07.000 S-A-I-T-I-E-V versus...
02:09:13.000 Yoel Romero.
02:09:14.000 Yoel Romero's fighting Israel Adesanya in March.
02:09:19.000 That is giant.
02:09:21.000 I can't...
02:09:22.000 Maybe not too much of a nerd, but...
02:09:24.000 Well, he definitely doesn't look as built as Yoel.
02:09:27.000 Yoel's a freak.
02:09:29.000 Yoel's probably the freakiest athlete I think I've ever seen personally in terms of his build, like his small waist.
02:09:35.000 He hugged that guy and picked him up.
02:09:37.000 I was at the end of it.
02:09:39.000 What moment was I looking for in here?
02:09:40.000 There's a couple moments where he scores points.
02:09:44.000 Yoel's got him down here.
02:09:48.000 Yeah, so he's up by a 4-1.
02:09:49.000 And I think, once again, he takes him down.
02:09:52.000 There's a 4-0 there.
02:09:52.000 I don't see.
02:09:55.000 Just start it off in the beginning so we could watch it.
02:09:58.000 There's a certain moment.
02:09:59.000 I mean...
02:09:59.000 There it goes.
02:09:59.000 Just start it right there.
02:10:01.000 They're basically technicians.
02:10:03.000 Yes, for sure.
02:10:03.000 When you look at the Dagestani people, I mean, there's such emphasis on technique.
02:10:08.000 Yeah.
02:10:08.000 Of everything else.
02:10:09.000 But also toughness.
02:10:10.000 It's like they have both things.
02:10:13.000 And this is one of the things that George St. Pierre told me about training in Russia.
02:10:17.000 Excuse me, training in Montreal.
02:10:19.000 What a takedown right there.
02:10:20.000 Fuck, that was spectacular.
02:10:22.000 Oh my God, it's amazing.
02:10:23.000 Look at that.
02:10:25.000 I think that was an inner trip, an ouchi-gari type of trip.
02:10:30.000 It covers his mouth.
02:10:32.000 What George St. Pierre told me about training with Russian nationals in Montreal, he said they're so technical, in that you get a lot of Americans that are definitely technical, but they emphasize being hard and tough, and grueling training routines and grinding,
02:10:52.000 butting heads in practice.
02:10:53.000 And he said, whereas the Russian nationals are far more committed to drilling, far more committed to the technical aspects of exchanges and going through one technique after the other, chaining these techniques together,
02:11:10.000 understanding the paths.
02:11:11.000 Also, one of the, at least to me, one of the differences, it could be similar to Yoel Romero's actual philosophy, but the philosophy of the Dagestani, the Russian people, the Soviet Union, is that recognition, fame, money, all that stuff doesn't matter.
02:11:27.000 Even winning doesn't matter.
02:11:28.000 The purity of the art is what matters.
02:11:31.000 At least with the Saitiev brothers, that's what they stood for.
02:11:34.000 Well, that mirrors what Khabib says about Conor, that he doesn't want to have a rematch with him.
02:11:39.000 He's like, fuck that dude.
02:11:41.000 That's there.
02:11:41.000 I mean, Khabib is a little bit more of the modern age.
02:11:45.000 I mean, he has an Instagram and Twitter and so on, right?
02:11:47.000 And Khabib, despite what he says, also does a little bit of trash talking.
02:11:52.000 He still plays the game a little bit.
02:11:54.000 I'm going to change his face.
02:11:56.000 That's my favorite.
02:11:57.000 Send me location.
02:11:58.000 I want to change his face.
02:11:59.000 What most people say, especially when I'm here, is that I'm basically what you'd get if Khabib did science.
02:12:09.000 I take that as a compliment.
02:12:10.000 That's one of my favorite quotes he's ever said, though.
02:12:12.000 I want to change his face.
02:12:13.000 I want to change his face.
02:12:14.000 It's terrifying.
02:12:15.000 It's terrifying because he can do it.
02:12:17.000 The cool thing is with Conor, that doesn't affect him.
02:12:20.000 The confidence he has.
02:12:22.000 The confidence that Conor has is just incredible.
02:12:25.000 Well, that he wants to do it again.
02:12:26.000 But I know for a fact that Conor was going through a whole lot of shit before that fight and did not have the best training camp.
02:12:32.000 And if he did an amazing training camp for this, like he really prepared.
02:12:35.000 Like he did for Conor.
02:12:36.000 Or excuse me, Cowboy.
02:12:37.000 The Cowboy fight, his coaches were saying he's never looked better.
02:12:40.000 That he just was on fire and so focused and so...
02:12:45.000 So accurate and precise in training, and he was just on fire, and it just seemed that all of the bullshit and the distractions and all the things that sort of come with being the kind of global superstar that Conor is, he managed to figure out a way to get away from those and to just really concentrate on his craft and pull everything to a championship level again. And goddammit,
02:13:12.000 he looked like it against Cowboy. And to see the contrast of those two cultures, I mean, it is a Rocky IV type of situation.
02:13:19.000 Yeah.
02:13:20.000 Because you better believe Conor McGregor will resume trash talking.
02:13:23.000 Who knows?
02:13:23.000 He might not.
02:13:24.000 He might not.
02:13:25.000 He might not.
02:13:25.000 I mean, he didn't in this fight with Cowboy at all.
02:13:28.000 He didn't do any trash talking.
02:13:30.000 I wonder if maybe he has learned.
02:13:32.000 And I wonder if, you know, his desire to beat Khabib...
02:13:37.000 It eclipses his desire to get inside of his head and play all the games that he usually plays and the promotional games that ultimately probably won't be necessary.
02:13:47.000 But I think, you know, the UFC is trying to push for it right now.
02:13:50.000 They're pushing for it right now, a rematch with Khabib, but they're ignoring Tony Ferguson in a lot of ways, in my eyes.
02:13:56.000 And I'm like, that is the boogeyman.
02:13:58.000 That's going to be exciting.
02:13:59.000 It can go anywhere.
02:14:01.000 He's the boogeyman, dude.
02:14:02.000 He doesn't get tired.
02:14:03.000 He doesn't get tired.
02:14:04.000 He slices everybody up.
02:14:05.000 He's lost one fight in X amount of years, and that was because he had a broken arm.
02:14:12.000 Michael Johnson broke his arm.
02:14:13.000 So when you think about what Tony has been able to do to world-class fighters, what he did to Donald, I mean, he just smashed Donald's face.
02:14:23.000 He smashed Anthony Pettis.
02:14:25.000 He smashes everybody.
02:14:27.000 Tony Ferguson's the goddamn boogeyman.
02:14:28.000 He really is.
02:14:30.000 He doesn't get tired, man.
02:14:32.000 Do you think he'll get taken down?
02:14:33.000 For sure he'll get taken down.
02:14:37.000 He's not scared to be taken down.
02:14:40.000 That's the difference between Tony and everyone else.
02:14:42.000 If he gets taken down, he might let him take him down and just attack off of his back and elbow the shit out of him off of his back.
02:14:49.000 He's fucking dangerous off his back.
02:14:52.000 He's hard to control.
02:14:53.000 He scrambles very, very well.
02:14:56.000 He also has fantastic submissions, and he catches them from everywhere.
02:15:00.000 I mean, he catches triangle chokes, D'Arce chokes.
02:15:03.000 His D'Arce chokes are spectacular.
02:15:05.000 He's got one of the best D'Arce chokes in the sport.
02:15:07.000 And he's not scared.
02:15:10.000 Ooh, if Khabib gets submitted?
02:15:12.000 My god, that would be crazy.
02:15:13.000 What if he puts Khabib asleep?
02:15:17.000 Do you remember when Dustin Poirier caught Khabib in a guillotine?
02:15:20.000 Yeah, he did.
02:15:22.000 He caught Khabib in a guillotine.
02:15:23.000 Listen, that is not where you want to be with Tony Ferguson.
02:15:26.000 You do not want to be in that position with Tony Ferguson.
02:15:28.000 That's a different kind of guillotine.
02:15:31.000 Dustin Poirier is primarily a striker.
02:15:34.000 Clearly he has submission skills.
02:15:35.000 He submitted guys before.
02:15:37.000 He submitted Max Holloway.
02:15:38.000 Dustin Poirier is a bad motherfucker, no doubt about it.
02:15:41.000 But when it comes to pure submission skills...
02:15:45.000 Tony Ferguson has an edge.
02:15:47.000 And, you know, he's a black belt, a 10th planet black belt.
02:15:50.000 He's a master of submissions and a great wrestler and a great scrambler.
02:15:55.000 And the thing about him that's so fucking terrifying is his cardio.
02:16:00.000 It's all the things, right?
02:16:01.000 It's the striking, it's the grappling, it's the submission abilities.
02:16:05.000 But he's not going to get tired.
02:16:06.000 He doesn't get tired.
02:16:07.000 And his mind is impenetrable.
02:16:09.000 His mind's impenetrable.
02:16:11.000 Yeah, people are looking past that fight.
02:16:12.000 Fuck no, not me, man.
02:16:14.000 I don't understand it.
02:16:15.000 When the UFC's talking about, you know, look at everybody he's fought.
02:16:20.000 Beat the fuck out of everybody.
02:16:22.000 Edson Barboza, Rafael dos Anjos.
02:16:25.000 Tony Ferguson, yeah.
02:16:25.000 Yeah, I mean, he smashes people.
02:16:28.000 He smashes people.
02:16:31.000 Yeah.
02:16:31.000 I mean, it's crazy.
02:16:32.000 I would say it probably could be his toughest fight to do.
02:16:37.000 I think it is his toughest fight.
02:16:39.000 I do.
02:16:39.000 I think that puts...
02:16:41.000 You know, a lot of people put Khabib close to the top 10 of all time.
02:16:45.000 Oh, he's in the top 10 of all time.
02:16:47.000 In my eyes.
02:16:48.000 He's 28-0.
02:16:49.000 As a lightweight.
02:16:50.000 Who cares about the record?
02:16:52.000 You look at the people you've beat.
02:16:53.000 Sometimes we idolize people for the perfection of the record too much.
02:16:56.000 Dude, the way he ragdolled Rafael Dos Anjos, the way he steamrolled, like, I mean, he's beaten top-flight competition and made them look like they have no business being in there with him.
02:17:09.000 But I think if you beat Tony Ferguson, I mean, that places him, that's immense.
02:17:12.000 And people put him above, like, I don't know, I think Conor deserves to be in that story, in that top, like, 15, top 10. Perhaps.
02:17:24.000 Perhaps.
02:17:25.000 Jose Aldo.
02:17:26.000 Yes.
02:17:27.000 I don't know why people look past Jose Aldo or Eddie Alvarez.
02:17:32.000 Oh yeah.
02:17:33.000 The Eddie Alvarez fight was unbelievable.
02:17:36.000 At least, maybe I'm just biased in the sense that I thought there's no way that Conor beats Jose Aldo.
02:17:43.000 And then there's no way Conor beats Eddie Alvarez moving up a weight class.
02:17:48.000 I always thought he's going to lose.
02:17:50.000 And being surprised makes me bump up...
02:17:53.000 Conor's ability in my head.
02:17:54.000 Well, he's phenomenal.
02:17:56.000 With Conor, it seems to be a matter of how focused he is, and who he is fighting, and where he is at in his life.
02:18:05.000 His life is so chaotic.
02:18:07.000 He's always filled with so many distractions.
02:18:09.000 I mean, think about all the crazy shit that he's done, the throwing, the...
02:18:13.000 Throwing the dolly at the bus and just all the nutty shit he's done.
02:18:18.000 But it's nice that he seems to be still hungry to fight even though he probably has a lot of money in the bank.
02:18:23.000 Well, he certainly was hungry to fight Cowboy.
02:18:26.000 I mean, he looked fantastic in that fight.
02:18:29.000 And again, he's worth a couple hundred million dollars.
02:18:32.000 So it's just the pure love of the game.
02:18:35.000 Pure love of the game.
02:18:36.000 And that's kind of the warrior ethos that Khabib comes from, and it's cool to see that.
02:18:41.000 Mind if I... Nobody's ever said anything in Russian, actually, probably in the Joe Rogan podcast.
02:18:46.000 No, I don't think so.
02:18:47.000 If you ever need a translator...
02:18:48.000 Okay.
02:18:49.000 I'm your man.
02:18:49.000 No, can I read a...
02:18:51.000 So, Saitiev, just a few lines in Russian.
02:18:55.000 Okay, sure.
02:18:56.000 So, Buvaisar Saitiev would read Boris Pasternak, a famous Russian poet who won the Nobel Prize, before every match, and he kind of captures that ethic.
02:19:06.000 So, this is the poem.
02:19:07.000 I'll say it in Russian.
02:19:09.000 Okay.
02:19:09.000 And then in English.
02:19:10.000 Please.
02:19:11.000 Okay.
02:19:11.000 Okay.
02:19:13.000 [Recites the poem in Russian.]
02:19:33.000 I know there's a bunch of Russian people that would appreciate that.
02:19:36.000 The translation is a bit crappy.
02:19:38.000 It's very difficult to translate the Russian language.
02:19:40.000 But it's...
02:19:41.000 The others, step by step, will follow the living imprint of your feet.
02:19:46.000 But you yourself must not distinguish your victory from your defeat.
02:19:50.000 And never for a single moment betray your credo or pretend.
02:19:54.000 But be alive.
02:19:56.000 Only this matters.
02:19:57.000 Alive and burning to the end.
02:20:00.000 So this is the end of a poem that represents the fact that most of the poem says that fame, recognition, money, none of that matters.
02:20:11.000 The winning and losing, none of that matters.
02:20:13.000 What matters is the purity of the art.
02:20:15.000 Just giving yourself completely over to the art.
02:20:18.000 So like others will write your story.
02:20:20.000 Others will tell whether you did good or bad.
02:20:23.000 Others will inspire using your story.
02:20:25.000 But as the artist, and in the case of Pasternak he's a poet, a writer, wrote Dr. Zhivago, you should only think about the art and the purity of it and the love of it.
02:20:39.000 And so when you look at Buvaisar Saitiev and the brothers and the whole Dagestan region, they shunned fame.
02:20:48.000 So the thing that Khabib is thrust into this MMA world, which is fundamentally, I mean, it's a popular sport.
02:20:56.000 It's an interesting thing.
02:21:00.000 I mentioned, I think last time I was on here, the most terrifying human being.
02:21:04.000 You know how investors, when they buy a penny stock, see that it's gonna blow up? To me, the most terrifying human being in the heavyweight division, the Russian tank I mentioned last time, is Sadulaev, who now just continues destroying everybody. And it looks like he's already won the gold medal and a bunch of world championships. He's a heavyweight. The heavyweights in the UFC should be scared.
02:21:26.000 Is he gonna fight MMA? So the hard thing...
02:21:30.000 Spell his name and let's get a video so we can look at it.
02:21:34.000 There he is.
02:21:34.000 Jamie's got it.
02:21:35.000 Bam!
02:21:37.000 I will never join MMA. No, hold on a second.
02:21:40.000 The MMA. The MMA. That's part of the quote.
02:21:46.000 And that's not...
02:21:51.000 Yeah, that's closer to where he was chasing.
02:21:54.000 He's still chasing the Olympics.
02:21:56.000 How do you say the name?
02:21:58.000 How did you say his first name?
02:22:00.000 Well, I just call him the Russian tank.
02:22:02.000 But it's Abdulrashid Sadulaev.
02:22:07.000 Okay.
02:22:08.000 23, 24 years old.
02:22:11.000 And I think his tension is, he says he has a lot of close friends who are MMA fighters.
02:22:18.000 He loves watching it.
02:22:19.000 He feels a lot for them.
02:22:21.000 But it's not the very thing that this poem gets at.
02:22:27.000 He thinks that wrestling, the pure sport of wrestling, is all about courage, skill.
02:22:33.000 He describes it in this way.
02:22:34.000 He thinks MMA also has to have this component of trash talk and showmanship.
02:22:41.000 And he doesn't like it.
02:22:44.000 But I think that MMA needs that guy.
02:22:48.000 Like a heavyweight Khabib.
02:22:50.000 Heavyweight Khabib.
02:22:52.000 Every Conor needs a Khabib.
02:22:53.000 Every showman needs a person who says showmanship sucks.
02:22:59.000 Every Ali needs a Frazier.
02:23:00.000 Yeah, Frazier, right.
02:23:02.000 But this guy is terrifying.
02:23:04.000 I think he would do the same thing to the Khabib division.
02:23:09.000 Again, humble technique is everything, but just strength-wise is also a monster.
02:23:14.000 And is he really thinking about fighting or no?
02:23:17.000 It's hard to say.
02:23:18.000 It's hard to say because, again, one of the greatest wrestlers of all time really focused on 2020 Olympics.
02:23:24.000 He's throwing punches here.
02:23:25.000 I think what's going to happen is, once he likely wins gold at this Olympics, he's going to, you know... this titanic ship, a 23-, 24-year-old ship, is going to start thinking and turning.
02:23:39.000 Maybe there is artistry.
02:23:41.000 Maybe there is skill and courage in mixed martial arts.
02:23:45.000 Well, there definitely is.
02:23:46.000 I mean, he doesn't have to do the trash talking thing.
02:23:49.000 There's a lot of people that are very stoic that fight and they don't participate in any of that stuff.
02:23:55.000 You know, and then there's people that thrive on that stuff.
02:23:58.000 I mean, it's really up to you.
02:24:01.000 The UFC doesn't tell you you have to talk trash.
02:24:04.000 You know?
02:24:04.000 I mean, results are what matters.
02:24:07.000 Right.
02:24:07.000 And it's not even trash that's interesting.
02:24:09.000 I think stories are interesting.
02:24:10.000 Yeah.
02:24:11.000 That's why people like team sports, like NFL. Did you watch the Super Bowl yesterday?
02:24:18.000 No, I didn't.
02:24:18.000 I went to Disneyland.
02:24:21.000 I wanted to talk to you about something that was at Disneyland.
02:24:23.000 What's that?
02:24:24.000 There's a new Star Wars ride.
02:24:26.000 Yeah.
02:24:26.000 This crazy Star Wars ride.
02:24:27.000 And it's a 20-minute ride.
02:24:31.000 I mean, it's a crazy long ride, and a lot of it, you're in like a vehicle.
02:24:36.000 Yes.
02:24:36.000 And the vehicle is all programmed by computers.
02:24:40.000 The direction of the vehicle, the way the vehicle moves, it's very complex.
02:24:44.000 There's no tracks.
02:24:45.000 So you're riding around in this vehicle, and the vehicle, like, they're shooting at you, the vehicle has to back up, you go into this new door, the vehicle knows how to go around a corner, and what's that guy's name?
02:24:55.000 Kylo Ren is trying to cut through the wall.
02:24:57.000 Spoiler alert.
02:24:58.000 Fucking, this new ride is amazing.
02:25:01.000 It's crazy how intricate and complicated it is and how far off the deep end Disneyland went to create this thing.
02:25:09.000 I mean, it looks so crazy.
02:25:11.000 I mean, you're like, how much money did this cost?
02:25:14.000 This is it right here.
02:25:15.000 So you're riding around in these things and stormtroopers are shooting at you.
02:25:19.000 Are there rails or no?
02:25:20.000 No.
02:25:20.000 No, there's no rails, man.
02:25:22.000 Everything is done by computer.
02:25:25.000 The computer tracks out the environment and knows where each one of these go.
02:25:30.000 And by the way, there's several cars moving at the same time.
02:25:34.000 So there's people in front of you.
02:25:35.000 They're in cars.
02:25:36.000 They get shot at.
02:25:37.000 And look at the fucking scale of this place.
02:25:40.000 So that's one of them giant four...
02:25:42.000 Four-legged robot things that's in Star Wars.
02:25:44.000 So you're moving underneath them.
02:25:46.000 There's giant cannons that you have to move through.
02:25:49.000 It's in Rise of the Resistance.
02:25:51.000 It's trackless.
02:25:53.000 Yeah.
02:25:53.000 So this represents...
02:25:54.000 There's lines on the ground.
02:25:55.000 There's some...
02:25:56.000 I think those lines on the ground are just the wheels going the same way over and over and over again.
02:26:01.000 Yeah, no.
02:26:02.000 Sorry.
02:26:02.000 I just wanted to sort of commentate, but they're probably not used by the computer vision.
02:26:06.000 So yeah, I think it's probably LiDAR-based.
02:26:08.000 It's...
02:26:09.000 I don't know what it's based on, but the computer is coordinating all of these different things at the same time.
02:26:16.000 You go through this room and you're seeing battles outside and you feel it.
02:26:20.000 You see the walls get hit like that.
02:26:22.000 It's fucking crazy, man.
02:26:25.000 It's amazing.
02:26:26.000 I mean, the line is bonkers.
02:26:29.000 So the robotics aspect of this, like the AI aspect of this, is probably minimal.
02:26:34.000 Look at that.
02:26:35.000 Look at that.
02:26:35.000 You're in this thing.
02:26:36.000 You move through this room.
02:26:37.000 And in the background, you're watching these starships shooting at each other.
02:26:42.000 It's all timed perfectly.
02:26:44.000 Yeah.
02:26:45.000 It's crazy, man.
02:26:46.000 Yeah, so to make this happen, I mean these are people that are willing to probably invest hundreds of millions in this.
02:26:54.000 Guaranteed.
02:26:54.000 So I think there's some, like there's very minimal AI in this because AI creates uncertainty and uncertainty is very undesirable in situations like this.
02:27:03.000 Yes.
02:27:04.000 Yeah, I don't think there's any AI in it, but for sure there's some sort of automation, some computer automation.
02:27:10.000 Yeah, but it's basic software.
02:27:13.000 It's software automation.
02:27:14.000 Don't call this basic.
02:27:15.000 Don't you dare.
02:27:16.000 Don't you dare, Lex.
02:27:18.000 Star Wars is not even real.
02:27:21.000 Oh!
02:27:21.000 Who are you?
02:27:22.000 There's reusable rockets being launched on a monthly basis, and we're going to colonize Mars for real soon.
02:27:28.000 That's real.
02:27:28.000 That's more interesting, for sure.
02:27:30.000 Definitely.
02:27:30.000 But this is dope.
02:27:31.000 So I'll be sitting on Mars while you...
02:27:34.000 No, I'll be here playing fucking Disneyland rides.
02:27:36.000 And then I'll go home and sleep in a bed and breathe air.
02:27:39.000 Fuck, you'll be out there on Mars.
02:27:40.000 And the history books will remember you.
02:27:43.000 The history books?
02:27:45.000 I don't know.
02:27:46.000 History books don't matter once you're dead.
02:27:48.000 Yeah.
02:27:49.000 I mean, it's nice that we have access to the history books, and I praise the historians for sure, but I'm not interested in making history.
02:27:57.000 Yeah, I don't know actually why I said that because I don't care about the history books.
02:28:00.000 Just the exciting – it's one of the only frontiers that we can actually be explorers in.
02:28:06.000 Like we've explored – well, the depths of the ocean we haven't exactly explored.
02:28:10.000 Right, yeah.
02:28:10.000 But the outer space, that's like – Man, that's like the most exciting possibility for engineering and science that we can explore.
02:28:19.000 And the mind, like I mentioned.
02:28:21.000 We don't know shit about the mind and exploring that with neuroscience, with AI. Just all of that, the consciousness.
02:28:28.000 Oh, the other thing you talked about with Bostrom, the simulation.
02:28:31.000 Yes.
02:28:32.000 I wanted to talk to you about that, too.
02:28:34.000 Because you brought up Bostrom.
02:28:37.000 Bostrom relies on, I mean, he was relying on theories in terms of like mathematical theories of probability to say that he thinks it's more likely that we're in a simulation.
02:28:52.000 Yeah.
02:28:52.000 Yeah, the thing he's articulated, I don't think he's come up with the idea of the simulation.
02:28:57.000 He just kind of really thought about it deeply.
02:29:00.000 He came up with the simulation argument, which is these three categories he described to you, the possible outcomes.
02:29:06.000 I think the first one is we destroy ourselves before we ever create a simulation.
02:29:12.000 The second one is that we would lose interest in creating a simulation at some point.
02:29:18.000 And the third one is we're living in a simulation.
02:29:20.000 Yeah.
02:29:21.000 Where do you lean?
02:29:24.000 I think there's going to—the three paths that he highlighted, it makes it sound like it's so clear that it's just three, but I think there's going to be a huge amount of possibilities of the kinds of simulations.
02:29:38.000 Like to me, I keep asking, you know, to ask Elon Musk about the simulation where he said— What's on the other side?
02:29:47.000 What's outside the simulation.
02:29:49.000 Yeah, I think I asked, what would you ask an AGI system?
02:29:52.000 You said, what's outside the simulation is the question.
02:29:54.000 Yeah, he believes in it.
02:29:56.000 Or at least he entertains it as a troll.
02:29:59.000 Elon Musk embodies the best of the Twitter internet troll, a meme, and a brilliant engineer and designer in one.
02:30:10.000 It's like a quantum state that you can't quite figure out what's the coupling.
02:30:15.000 Because I don't know if he's trolling, but I'm the same way.
02:30:18.000 I love asking people about the simulation, even though I get a little bit of hate from the scientific community.
02:30:23.000 Why do you get hate from the scientific community about simulation?
02:30:27.000 Because it's a ridiculous notion if you think of it literally because it's not a testable thing.
02:30:34.000 We don't know how to test.
02:30:35.000 Why are you talking about this?
02:30:37.000 Why do you sit down with Elon Musk and talk about the simulation when you're sitting with a world expert in particular aspects of rockets or robotics?
02:30:46.000 I'm an expert.
02:30:47.000 I can't believe I just said that.
02:30:48.000 I'm not an expert in anything.
02:30:49.000 But I know a few things about autonomous vehicles.
02:30:52.000 I like to think of it as...
02:31:11.000 How would we build a simulation?
02:31:13.000 What would be a compelling enough virtual reality game that you want to stay there for your whole life?
02:31:19.000 That's a first step there.
02:31:20.000 That's useful to think about what is our reality?
02:31:24.000 What aspects are most interesting for us humans to be able to perceive with our limited perception and cognitive abilities, interpret and interact with?
02:31:34.000 And then the bigger question then is how do you build a larger scale simulation that would be able to create that virtual reality game, which I think is a possible future.
02:31:45.000 We're already creating virtual worlds for ourselves on Twitter and social networks and so on.
02:31:50.000 I really believe that virtual reality...
02:31:54.000 We'll spend more and more of our lives in the next 50 to 100 years in virtual worlds.
02:32:00.000 And the simulation hypothesis and simulation discussion is part of that.
02:32:06.000 I think there's...
02:32:08.000 The question of what's outside simulation is really interesting.
02:32:10.000 That's the other way of...
02:32:12.000 Because what created us?
02:32:14.000 What started the whole thing?
02:32:16.000 It's the modern version of asking...
02:32:19.000 What is God?
02:32:21.000 What does God look like?
02:32:23.000 It's asking, what does the programmer look like?
02:32:27.000 I think that's a fascinating question.
02:32:31.000 But arguing that we're already living in a simulation, I think you've got stuck on that little point.
02:32:36.000 I think it's not that...
02:32:38.000 There's a bit of a language barrier, too.
02:32:41.000 There's a technical...
02:32:43.000 I think Nick is legit.
02:32:45.000 It's funny.
02:32:46.000 Nick is a legit philosopher.
02:32:47.000 And so he's been fighting battles in the philosophy game.
02:32:50.000 Like, you ask, does somebody disagree with him on these hypotheses?
02:32:54.000 And there's a bunch of philosophers that disagree with him, including Sean Carroll on the philosophical level.
02:32:59.000 And a lot of the arguments are in philosophy and they're sort of technical and they're about language and about terms and so on.
02:33:06.000 But I think, yeah, it's very possible that we live in a simulation.
02:33:13.000 I think...
02:33:15.000 One of the constructs of physics, theoretical physics, with many worlds interpretation of quantum mechanics, as Sean has talked to you about, reveals some interesting fundamental building blocks of our reality.
02:33:28.000 There's something I don't think people have talked to you about, which is the coolest thing to me, the most amazing thing that nobody can explain.
02:33:37.000 Yeah, things called cellular automata.
02:33:40.000 And there's a guy, a mathematician named John Conway who came up in the 70s with a thing called Game of Life.
02:33:47.000 And cellular automata can be one-dimensional or two-dimensional, but Game of Life is a two-dimensional grid where every single little cell is really dumb and behaves based only on the cells next to it.
02:34:00.000 And it's born when there's like a certain number, like three cells alive next to it and it dies otherwise.
02:34:07.000 So it's like a simple rule for birth and death.
02:34:09.000 And all it knows is its nearby surrounding and its own life.
02:34:14.000 And if you take that system with a really dumb rule and expand it in size, arbitrary complexity emerges.
02:34:21.000 You can have Turing machines, so you can simulate perfect computers with that system.
02:34:28.000 And it can grow, and all these behaviors grow.
02:34:30.000 Like, if people Google Game of Life, you can watch this extremely dumb, simple system just grow arbitrary complexity.
02:34:43.000 And what you start to realize is that from such incredibly simple building blocks that don't know anything about the bigger world around them, you can build our entire universe.
02:34:56.000 You can build the kind of complexities we see in us.
02:34:59.000 So we think that God...
02:35:01.000 It's like he's designing every little aspect or whatever of our world, or with the simulation hypothesis,
02:35:07.000 that the simulation is designed by hand, like, I'm going to craft these things.
02:35:11.000 What you realize is all you need to do is just set some initial conditions, set some really basic rules, and allow the system to grow.
02:35:21.000 As long as it can grow arbitrarily, just crazy stuff, amazing stuff can happen.
02:35:26.000 From simplicity, complexity can emerge.
02:35:30.000 If you study this a little bit closer, just watch it.
02:35:34.000 People can watch Game of Life on YouTube and think about what it's showing for 10 minutes.
02:35:39.000 It'll blow your mind.
02:35:41.000 The fact that from...
02:35:44.000 Simplicity, arbitrary complexity, beauty can emerge.
02:35:49.000 It's incredible.
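For anyone who wants to see the rule being described rather than just watch videos of it, here is a minimal sketch of Conway's Game of Life in Python. The grid size, the wrap-around edges, and the "glider" starting pattern are illustrative choices; only the birth-and-death rule itself comes from Conway.

```python
def step(grid):
    """Apply one generation of Conway's Game of Life to a 2D grid of 0/1 cells."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the 8 neighbors, wrapping around the edges for simplicity.
            n = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            if grid[r][c]:
                # A live cell survives with 2 or 3 live neighbors, dies otherwise.
                new[r][c] = 1 if n in (2, 3) else 0
            else:
                # A dead cell is born when exactly 3 neighbors are alive.
                new[r][c] = 1 if n == 3 else 0
    return new

# Seed a 10x10 grid with a "glider", a tiny pattern that walks across the grid.
grid = [[0] * 10 for _ in range(10)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1

for _ in range(4):
    grid = step(grid)
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid))
    print()
```

Running it prints the glider stepping diagonally across the grid, the simplest example of structure emerging from the dumb local rule.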
02:35:50.000 For the simulation, the creator of the simulation is probably some 13-year-old nerd living in his mom's basement.
02:35:59.000 He probably just set some rules in this video game and pressed play.
02:36:02.000 And then arbitrary complexity can emerge.
02:36:05.000 It can have a Joe Rogan, it can have an Elon Musk, all the technologies that we've developed, and probably millions of other alien species that are living throughout our universe.
02:36:15.000 Jesus.
02:36:17.000 So the, yeah, that to me, the cellular automata reveals that the simulation is much easier to create than we might think.
02:36:28.000 But there's a lot of variability in the kinds of simulations we'll create.
02:36:32.000 I think the simulation hypothesis thinks there's one.
02:36:36.000 But I think there's going to be a lot of varieties.
02:36:40.000 There's a lot of possible different rule sets.
02:36:42.000 There's a lot of different...
02:37:06.000 Suffering.
02:37:07.000 So consciousness brings with it this idea of basically, you know, subjective experience.
02:37:13.000 A subjective experience comes with the idea of pain and fear and so on.
02:37:18.000 This is, again, my Russian romanticization of it, but I think fear of death is essential.
02:37:23.000 Scarcity is essential for beauty, for life.
02:37:26.000 And that's a nice feature of this little simulation we've got going on.
02:37:31.000 That there could be a lot of different alternatives, I think.
02:37:34.000 It could be less individualistic, and consciousness could be present in different kinds of forms.
02:37:40.000 So I see there's a lot more options than those three that he highlights.
02:37:45.000 And we can destroy ourselves in a lot of interesting ways.
02:37:49.000 Entire civilizations, from AI to nuclear weapons to biological weapons, to all kinds of weapons.
02:37:53.000 So it's almost like whether it's a simulation or not is almost irrelevant.
02:37:59.000 The complexity of the existence and all of the various pushes and pulls that keep everything together, they're almost operating like some grand plan.
02:38:16.000 Whether they like it or not.
02:38:17.000 Whether or not a grand plan exists.
02:38:19.000 All these different things are happening and everything is moving in a very specific direction, right?
02:38:24.000 It's moving towards further complexity.
02:38:29.000 Like, I was having a conversation with a friend of mine last night where we were talking about phones.
02:38:34.000 And we were like, you know, like, when are they ever going to look at a phone and say, I think we're good.
02:38:40.000 We don't have to, the camera works great.
02:38:43.000 Signal's great.
02:38:44.000 You can call people.
02:38:45.000 You can text people.
02:38:46.000 Let's just stop innovating right here.
02:38:47.000 And we're both laughing.
02:38:48.000 Like, it's never going to happen.
02:38:49.000 But even though we admit, like, if you have an iPhone 11 or a Pixel 4, is that what you have?
02:38:56.000 Yeah, Pixel 4, yeah.
02:38:57.000 They work great.
02:38:58.000 You don't really need anything better.
02:39:01.000 Like, in terms of the way our culture works, you get so much done on these things.
02:39:06.000 You can bank on them.
02:39:07.000 Is it okay if I'm drinking all your waters, by the way?
02:39:09.000 Yeah, we have a lot of water, yeah.
02:39:10.000 Please.
02:39:11.000 Okay.
02:39:11.000 You're so Russian.
02:39:14.000 That's very polite of you.
02:39:17.000 The existence itself, whether or not there's a design to it, seems to operate in a manner that would indicate there's a design.
02:39:29.000 The design doesn't have to be real.
02:39:31.000 It doesn't have to be a simulation.
02:39:33.000 It doesn't have to be a grand plan, but it moves in the same way as if it's a grand plan.
02:39:40.000 It's weird.
02:39:40.000 It's hard to put into words, but there's a different force and a momentum like the evolutionary process.
02:39:46.000 The fact that life was created, the fact that there's always a kind of progress.
02:39:50.000 And also, just like with the Native Americans, the fact that suffering seems to be a constant story that was woven in.
02:39:58.000 We constantly progress, but that progress seems to keep creating the other and torturing it.
02:40:06.000 There seems to be constant suffering and war and so on through this growth process.
02:40:11.000 Death is a huge part of that.
02:40:14.000 And conflict.
02:40:16.000 Conflict.
02:40:16.000 Even social conflict.
02:40:17.000 Like we were talking about social justice warriors and that type of thing.
02:40:21.000 I think they almost have to exist.
02:40:24.000 It's almost like the world creates a space for them and people find a way to fill that space.
02:40:29.000 Yeah.
02:40:30.000 The conflict, by the way, also, I don't know if you're even aware, you're kind of, even though you were thrust into politics, you're not aware of politics, but there is the Iowa caucus going on today.
02:40:40.000 It's like the first vote for the Democrats.
02:40:42.000 Yeah.
02:40:42.000 Bernie's leading in the polls.
02:40:44.000 Which is interesting.
02:40:45.000 But that's the fun little...
02:40:48.000 Americans have their own little conflict going on here.
02:40:52.000 Oh, there's always going to be conflict with all groups of people, with everything.
02:40:57.000 I mean, there's conflict in the comedy community.
02:40:59.000 I'm sure there's conflict in the AI and autonomous vehicle community.
02:41:03.000 There's conflict.
02:41:05.000 I mean, and those things are critical.
02:41:08.000 You know, you learn from conflict.
02:41:10.000 If everything was just simple and easy, and there was no resistance whatsoever, nothing would get done.
02:41:15.000 And also, your own personal systems would never get tested.
02:41:19.000 I feel like every adversity that you experience is really a gift, because on the other end of that adversity, there's an opportunity for massive growth.
02:41:26.000 What was that Think and Grow Rich quote that...
02:47:31.000 Lovato said the other day? Every adversity carries the seed of an equivalent advantage.
02:41:40.000 I mean, just that.
02:41:46.000 Yeah.
02:41:46.000 That's a beautiful way to see it.
02:41:48.000 That's a beautiful quote.
02:41:49.000 I had to write it down.
02:41:50.000 I bought that book, too.
02:41:51.000 I'm going to get to that once I'm worn out on Native Americans.
02:41:54.000 I've got about seven other Native American books.
02:41:57.000 I've been, like I mentioned, doing the startup since August, and it's been a bit of a torture.
02:42:05.000 The self-doubt is pretty hardcore because I've been failing nonstop.
02:42:09.000 So I'm trying to build, spending most of my day programming.
02:42:12.000 You're trying to build a her.
02:42:14.000 Or a she.
02:42:15.000 Whatever it is.
02:42:15.000 Is it she or her?
02:42:16.000 What is it?
02:42:16.000 Her.
02:42:17.000 Her.
02:42:17.000 But no, on that path, there's a particular thing because you want to create a business.
02:42:22.000 You have to create tools that people would enjoy using on the path.
02:42:26.000 That's a long journey to create a companion that can form a deep friendship.
02:42:31.000 That seems so weird.
02:42:34.000 Everything seems weird until your life becomes better because of it.
02:42:40.000 Like, flying cars seem weird.
02:42:43.000 Oh yeah.
02:42:44.000 To me still.
02:42:45.000 In fact, Uber and Hyundai just partnered.
02:42:48.000 They're still pushing this idea of flying cars, electric VTOLs.
02:42:51.000 I just feel like people are going to slam into each other.
02:42:54.000 Unless they are autonomous and they have like magnets, so they repel.
02:42:59.000 You know, like, they can't hit each other, they get close, and they just go...
02:43:02.000 And what happens when they hit, like, to me, the...
02:43:05.000 Like, what does an accident look like?
02:43:07.000 They fall on your head.
02:43:08.000 Yeah.
02:43:08.000 You're hanging out in your house trying to watch, you know, Black Mirror.
02:43:11.000 Or also, like, currently most accidents people can walk away from.
02:43:15.000 Like, cars today are incredible.
02:43:17.000 Right.
02:43:17.000 And I don't know how you can walk away from an in-air crash.
02:43:21.000 Good question.
02:43:22.000 Very good question.
02:43:23.000 You probably won't.
02:43:24.000 Yeah.
02:43:25.000 Fuck, that's scary.
02:43:26.000 Yeah, but any technology kind of seems...
02:43:30.000 Awkward or weird, you can be terrified of it or you can think it's weird until it takes over.
02:43:36.000 I mean, none of us know what that would look like to have a closer connection with AI systems.
02:43:43.000 I don't know.
02:43:44.000 One of the things in this book that I'm in the middle of, I'm actually towards the end of this Black Elk book, is it details the invention of the automobile and the implementation of it and how the world changes.
02:43:57.000 That was the other surprising thing about this book is it's so recent.
02:44:00.000 It's crazy.
02:44:01.000 Really, really recent.
02:44:02.000 So during this time where Black Elk was a young boy, sees Custer get killed, takes his first scalp, remembers the sound of the man gritting his teeth as he's cutting his hair off, like cutting his scalp off.
02:44:18.000 And then later on in his life, as he's an older man, the world goes from very few automobiles to most people have an automobile during his lifetime.
02:44:32.000 Most travel is by automobile.
02:44:34.000 What does he say about this world, this new world?
02:44:38.000 I'll let you read the book.
02:44:40.000 He doesn't even know about this world.
02:44:42.000 He knows about the world in the 1930s.
02:44:44.000 I believe he died in the late 30s.
02:44:47.000 It's scary to be born.
02:44:49.000 Not scary, but I don't know what it would feel like to...
02:44:52.000 Be born in this natural world, to see the kind of suffering at the hands of the U.S. military, and then see the technology of the Industrial Revolution kind of propagate and be faced with that.
02:45:05.000 I don't know what that would feel like.
02:45:06.000 I don't know which world is better.
02:45:08.000 Which world represents progress?
02:45:11.000 Right, right.
02:45:11.000 What is progress?
02:45:13.000 Yeah.
02:45:13.000 What is progress?
02:45:14.000 I mean, progress seems to be inevitable complexity.
02:45:18.000 Inevitable, never-ending complexity.
02:45:21.000 And then there's this push towards that.
02:45:24.000 And I've always wondered...
02:45:26.000 If, I mean, Elon has this saying that human beings are the biological bootloader for AI. And I've always thought that if you paid attention to the human being's desire for materialism, like materialism seems to be like a constant.
02:45:42.000 Throughout cultures, people want things.
02:45:44.000 And when they have things, they want better things.
02:45:46.000 They want newer things.
02:45:47.000 Well, that generates a consistent level of innovation inside that civilization, that culture.
02:45:54.000 People are going to make better stuff because people are going to want better stuff.
02:45:57.000 So they're going to improve upon things.
02:45:59.000 Well, if you just scale that and you keep going, improving, improving, what do you get to?
02:46:04.000 Well, you get to something like artificial intelligence.
02:46:07.000 You get to something like some sort of...
02:46:13.000 Some sort of an event, some sort of a thing where the world changes.
02:46:16.000 And I think technology will help us ride that wave.
02:46:19.000 I'm an optimist in that sense.
02:46:21.000 We haven't talked about much, but I'm an optimist on Neuralink.
02:46:25.000 I think there'll be a few exciting developments there.
02:46:28.000 So Neuralink, Brain Computer Interfaces, I think it's a really exciting possibility there.
02:46:33.000 That Nick Bostrom was also skeptical about.
02:46:35.000 I'm more positive about it.
02:46:37.000 Increasing the bandwidth of our brain, being able to communicate with the internet, with the information.
02:46:44.000 It doesn't necessarily need to be through brain-computer interfaces, but increasing that bandwidth to expand our ability of our mind to reason.
02:46:52.000 Not to expand the ability to reason, sorry.
02:46:55.000 To take the mechanism of our mind's ability to reason and expand it with access to a lot of information.
02:47:01.000 And increase that bandwidth to be able to reason with facts.
02:47:04.000 Just like we can look up stuff on Wikipedia now, increasing the speed at which we can do that can, I think, fundamentally transform our ability to think.
02:47:13.000 Do you think that that's ever going to be a wireless option?
02:47:16.000 Because right now they have to drill holes in your head, right?
02:47:19.000 Yeah, I think there could be other interfaces, I think.
02:47:23.000 Yeah, I think so.
02:47:24.000 But also, like I said, weird technology.
02:47:27.000 Holes in your head sounds terrifying right now, but it could be normal.
02:47:33.000 Like ear piercing.
02:47:34.000 Well, yeah, ear piercings, yeah.
02:47:37.000 But there's something special about that.
02:47:39.000 Like, hey, did you get suited for Neuralink yet?
02:47:41.000 Billy's only 13. He's not ready for Neuralink.
02:47:44.000 We're going to wait until he's 16. He's like, Dad, all my friends have it.
02:47:48.000 Come on, Dad.
02:47:49.000 I want to get fitted.
02:47:51.000 And just like surgery, you take knee surgery, you take all surgery except brain surgery, and you take that for granted.
02:47:57.000 Yeah.
02:47:57.000 You're like okay with it, but on the brain it's...
02:48:00.000 Yeah, it's scary.
02:48:01.000 It's sketchy.
02:48:02.000 Can I... Because I know you probably got to go.
02:48:04.000 Yeah.
02:48:04.000 What do you got?
02:48:05.000 Last...
02:48:06.000 Can I close it out with a poem?
02:48:07.000 Let's do it.
02:48:08.000 Because I'm that guy.
02:48:09.000 Okay.
02:48:10.000 Because I've been doing the startup.
02:48:12.000 I've been suffering, so I'm reading a lot of Bukowski.
02:48:15.000 Oh, Bukowski poems.
02:48:16.000 Do you get drunk when you read them?
02:48:18.000 Of course.
02:48:18.000 Some whiskey.
02:48:20.000 Roll the dice.
02:48:21.000 Not vodka?
02:48:22.000 Vodka is for friends and family.
02:48:26.000 When you're by yourself, it's whiskey?
02:48:28.000 No, a man does not drink by himself.
02:48:31.000 Some men do.
02:48:32.000 Well, this man doesn't.
02:48:35.000 But it's more like relaxed thinking.
02:48:38.000 That drink is whiskey.
02:48:39.000 Vodka is we're going crazy.
02:48:41.000 Oh, okay.
02:48:41.000 We're going dark.
02:48:43.000 We're going to raid.
02:48:45.000 And pillage.
02:48:46.000 Roll the dice or go all the way by Charles Bukowski.
02:48:50.000 If you're going to try, go all the way.
02:48:53.000 Otherwise, don't even start.
02:48:55.000 If you're going to try, go all the way.
02:48:58.000 This could mean losing girlfriends, wives, relatives, jobs, and maybe your mind.
02:49:04.000 Go all the way.
02:49:06.000 It could mean not eating for three or four days.
02:49:08.000 It could mean freezing on a park bench.
02:49:11.000 It could mean jail.
02:49:12.000 It could mean derision, mockery, isolation.
02:49:16.000 Isolation is the gift.
02:49:18.000 All the others are a test of your endurance or how much you really want to do it.
02:49:22.000 And you'll do it, despite rejection and the worst odds, and it will be better than anything you can imagine.
02:49:29.000 If you're going to try, go all the way.
02:49:32.000 There's no other feeling like that.
02:49:35.000 You'll be alone with the gods, and the nights will flame with fire.
02:49:39.000 Do it.
02:49:40.000 Do it.
02:49:41.000 All the way.
02:49:43.000 All the way.
02:49:45.000 You will ride life straight to perfect laughter.
02:49:48.000 It's the only good fight there is.
02:49:53.000 Take a picture of you while you're reading that.
02:49:55.000 Pick up that piece of paper real quick.
02:49:58.000 We'll fake it.
02:49:58.000 Fake it for Instagram.
02:50:00.000 All right.
02:50:00.000 Fake it for Instagram.
02:50:01.000 People on Instagram that watch it will know.
02:50:04.000 Fake.
02:50:04.000 Fake.
02:50:05.000 Look up.
02:50:06.000 Fake.
02:50:08.000 That was awesome.
02:50:09.000 Appreciate you, brother.
02:50:10.000 Thank you very much for coming in here.
02:50:12.000 It's always a pleasure.
02:50:12.000 We've got to do it more often.
02:50:14.000 Yeah.
02:50:14.000 Ten more years.
02:50:15.000 Yes.
02:50:15.000 Ten more years.
02:50:16.000 Bye, everybody.
02:50:18.000 Nice.
02:50:19.000 Thanks, bro.
02:50:19.000 That was great, man.