The Joe Rogan Experience - September 25, 2024


Joe Rogan Experience #2206 - Chamath Palihapitiya


Episode Stats

Length

2 hours and 47 minutes

Words per Minute

162.9

Word Count

27,336

Sentence Count

2,044

Misogynist Sentences

16

Hate Speech Sentences

13


Summary

In this episode, the boys talk about how the internet has changed the way we consume news, how it affects us, and how we now live in a world where we have no idea what we should be doing with our time. They also talk about what it's like to be a conservative news anchor on Fox News and how to deal with the constant stream of talking points pushed by both the right and the left. They discuss living in a bipolar world, how the outrage machine is ruining our ability to make sense of things, and why we shouldn't obsess over everything going on in the world, because it's not really about us anymore. And, of course, there's a whole lot more. Enjoy the episode and tweet us what you thought of it! Timestamps: 4:00 - What's the worst thing about the internet? 6:30 - How do you deal with it? 8:15 - What are we supposed to do with all the new technology? 9:20 - How can we stop caring about what's going on? 12:30 - What's going to happen to the world? 13:00 - What's the point of the internet now? 14:40 - Is there any such thing as an outrage machine? 15:30 - What do we need to do? 16:40 - How do we know what we can control? 17:20 - Why should we be prepared for the internet in the 21st century? 21:15 - Is there a better way for the internet to make us better? 22:00 - What's a good thing? 23:30 - Should we be worried about the future of the Internet? 27:10 - How do we become aware of the real problem? 29:10 - What are you going to do in the next 5 years? 30:40 - What are our brains going to be? 32:00 - Are we going to have a better version of our brain?


Transcript

00:00:15.000 It is the best clip because he's like a totally different person.
00:00:18.000 Well, it's what he really is.
00:00:20.000 But he really is.
00:00:20.000 Yeah, it's like the Ellen thing, you know, it's like...
00:00:25.000 I mean, he really did lose his shit there.
00:00:28.000 Oh, like, weirdly, you know?
00:00:30.000 Like, I got the Christian Bale one, because, like, he's in character, he's in an intense scene.
00:00:36.000 Some guy's fucking around in the background, like, God damn it, stop fucking around!
00:00:39.000 I get that.
00:00:40.000 Like, he's in this frenzy of this intense scene.
00:00:44.000 But what is Bill doing?
00:00:50.000 Republican talking points on Fox News?
00:00:52.000 No, it was before.
00:00:53.000 It was when he was on A Current Affair.
00:00:55.000 It's when he's doing gossip and stuff.
00:00:57.000 Oh, that's right.
00:00:58.000 He was a gossip guy.
00:01:00.000 He was like an Entertainment Tonight type guy.
00:01:02.000 Inside Edition.
00:01:03.000 One of those deals.
00:01:04.000 Inside Edition.
00:01:04.000 Oh, is that what it was?
00:01:05.000 That's what it's called.
00:01:05.000 Ah, yeah.
00:01:07.000 And fucking those things.
00:01:09.000 They never go away.
00:01:10.000 It's such a weird environment, the left and right.
00:01:15.000 There's no, like, centrist news source on television.
00:01:19.000 There's no, like, this is probably what's going on, news source.
00:01:22.000 It's always one or the other, and it's like you're living in a bipolar person's brain.
00:01:29.000 I think, like...
00:01:31.000 Part of what's happened is we used to have news, and you could make a good living in news, and, you know, journalists were really sort of the top of the social hierarchy in some way, shape, or form, because they were this check and balance. And then somewhere along the way, this business model focused people on clicks, and nobody told the rest of the world that the underlying incentives were going to change. And so that's where you find yourself, where there's very little news,
00:02:01.000 I think.
00:02:02.000 There's a lot of opinion.
00:02:04.000 And then the problem with opinion is that feeds the outrage machine.
00:02:08.000 And that's the, you know, the clickometer.
00:02:10.000 The clickometer doesn't go high when you're like, hey guys, I studied this equation in this show.
00:02:15.000 Right.
00:02:17.000 There's a 50% chance of this.
00:02:19.000 Nobody cares about that.
00:02:21.000 It's either like it's totally, totally bad or it's totally, totally good because it just, it amps people up.
00:02:27.000 And that's a real bummer because I think like you don't know what to think anymore.
00:02:32.000 Well, it's also a completely novel new thing that we weren't prepared for.
00:02:37.000 So before there was social media and online news, no one was prepared for the world of the algorithm.
00:02:46.000 No one was prepared for being like...
00:02:50.000 Literally everything that freaks you out is what you'll be shown, because that shows that you're engaging, and that's how it's set up, which is just so contrary to the rest of history. I mean, it was always "if it bleeds, it leads" in the news, because they wanted to win ratings. But they only had so much control, and it was only on for an hour. Exactly. And now it's just this 24/7 anxiety fest, and it's omnipresent,
00:03:15.000 and it's basically made to convince you that you need more of that thing.
00:03:23.000 It's like a bad diet.
00:03:25.000 The first few shots of it, the first few bites of it, taste amazing.
00:03:30.000 But, you know, if you eat the full toffee cake every day for the next 365 days and then for the next year, I mean, you're going to get diabetes and heart disease.
00:03:41.000 So the version of that is like your brain just gets totally fried.
00:03:45.000 And then the bigger problem is when you're presented with, maybe it's not even the truth, but an opinion that you should consider.
00:03:53.000 You get totally shut out of it.
00:03:55.000 And that's like the real problem.
00:03:57.000 You build these antibodies in your body where this other version that says, take a step back and reconsider, it's not allowed.
00:04:05.000 Right.
00:04:06.000 And then if you said to yourself, well, how do you even start?
00:04:10.000 Where do you go?
00:04:11.000 A friend of mine works at Facebook and he showed me Threads.
00:04:17.000 We were playing poker last week and he showed me Threads.
00:04:20.000 And what's incredible is Threads and X are like the exact polar opposites in some ways.
00:04:27.000 And he was showing me in the context of, like, how the outrage machine on Threads works.
00:04:31.000 And the way it works is, like—and this woman wrote this article about it.
00:04:36.000 But what they'll do is they'll post something that says, 2 plus 2 equals 5. Just that.
00:04:42.000 That kind of thing?
00:04:43.000 And you'll get, like, a million views.
00:04:45.000 And then it'll first start with, like, you know, folks that are, like, trying to gently nudge this person.
00:04:50.000 Actually, you know, I want you to reconsider 2 plus 2 actually equals 4. You know?
00:04:54.000 And then it builds and it builds and it builds.
00:04:57.000 And so there's this weird version of how people react to information, and it tends to be kid gloves and then people just lose their mind at some point.
00:05:08.000 So it's kind of like...
00:05:08.000 And then over here, I think on X, what you find is there's a lot more structural data, but then it can easily get lost in the noise because there's just a few things that just constantly consume what the algorithms want to amplify and what is important in the moment.
00:05:23.000 And I think that finding a way to like probably blend the two is probably what is the best in the sense that there's probably some stuff over here that doesn't make it over here.
00:05:33.000 And there's probably a lot of stuff over here that doesn't make it over here.
00:05:37.000 A little bit of that diet for everybody probably goes a long way.
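To make the engagement mechanic described here concrete, here is a minimal caricature of an engagement-ranked feed. The Post fields and scores below are hypothetical illustrations, not any platform's actual ranking code:

```python
# A deliberately crude caricature of an engagement-ranked feed.
# Post fields and scores are hypothetical; this is not any platform's code.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # a model's guess at clicks and replies
    provocation: float           # proxy: strong reactions drive engagement

def rank_feed(posts: list[Post]) -> list[Post]:
    # Nothing in this objective asks whether a post is true or useful,
    # so "2 plus 2 equals 5" wins if it reliably provokes replies.
    return sorted(
        posts,
        key=lambda p: p.predicted_engagement * (1.0 + p.provocation),
        reverse=True,
    )

feed = rank_feed([
    Post("2 plus 2 equals 5", predicted_engagement=0.9, provocation=0.8),
    Post("There's a 50% chance of this", predicted_engagement=0.2, provocation=0.1),
])
print(feed[0].text)  # the provocative post tops the feed
```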
00:05:40.000 Is there heavy content moderation on Threads?
00:05:44.000 I don't know.
00:05:45.000 I don't use it.
00:05:45.000 I don't either.
00:05:46.000 I don't have it.
00:05:46.000 I just saw it in that moment.
00:05:48.000 But when he described how the engagement farming works, it just sounded kind of like ludicrously, you know...
00:05:59.000 So to explain to people, 2 plus 2 equals 5 is a bizarre thing that was going around where they were talking about how math is racist.
00:06:08.000 Exactly.
00:06:08.000 Basically, that's where it gets to.
00:06:10.000 Math is racist.
00:06:10.000 Which is...
00:06:11.000 Oh, no.
00:06:15.000 If math is racist, we have a real problem.
00:06:19.000 Because that means everything's racist, because everything's math.
00:06:22.000 You know, we had this thing in, I don't know if you saw this, in the Bay Area in COVID, the city of San Francisco, you know, brought together their board of education or whatever, and they eliminated a bunch of AP classes, including like a bunch of AP math classes.
00:06:39.000 And part of it was because of this reason, because they felt it was exclusionary.
00:06:43.000 And I think what it misses is that there's all this other stuff you could do to kind of like even the starting line for folks.
00:06:51.000 But if you rip out things like AP Math, like, take a step back: in a world of 8 billion people, what are the odds that there's literally only one Steve Jobs or literally only one Elon Musk,
00:07:10.000 meaning capable of that kind of execution?
00:07:13.000 I suspect that there's maybe two.
00:07:17.000 And so part of our social responsibility as adults is how do you make it so that that second Steve Jobs can find a path to do stuff?
00:07:30.000 And I'm not saying AP math is the answer, but I'm saying there are people that I remember when I was growing up, I wasn't particularly good at anything in school.
00:07:41.000 But there were a few subjects which I was just like, wow.
00:07:44.000 The little time I spent in school, I felt a little safe.
00:07:48.000 I could connect.
00:07:49.000 It built up a little bit of my self-confidence, and I didn't feel so marginal.
00:07:54.000 I'm sure there's a lot of kids like that.
00:07:56.000 For some very small subset, maybe that's what that kind of class is.
00:07:59.000 It pushes them to a boundary that they didn't know was possible.
00:08:03.000 You're teaching them stuff.
00:08:04.000 That's really cool.
00:08:06.000 So I understand what the intent is, but then the byproduct is there's a small group of folks that get shut out, and then that person that could be that Steve Jobs-like person, that Elon Musk-like person, is held a little bit back.
00:08:22.000 And I think that that hurts all of us.
00:08:24.000 So you got to find a way where we're doing just a little bit better.
00:08:29.000 This episode is brought to you by The Farmer's Dog.
00:08:31.000 Dogs are amazing.
00:08:33.000 They're loyal.
00:08:33.000 They're lovable.
00:08:34.000 Just having Marshall around can make my day 10 times better.
00:08:38.000 I'm sure you love your dog just as much, and you want to do your best to help them live longer, healthier, happier lives.
00:08:45.000 And a healthy life for your dog starts with healthy food, just like it does for us.
00:08:50.000 There's a reason having a balanced diet is so important.
00:08:53.000 So how do you know if your dog's food is as healthy and as safe as it can be?
00:08:59.000 Farmer's Dog gives you that peace of mind by making fresh, real food developed by board-certified nutritionists to provide all the nutrients your dog needs.
00:09:08.000 And their food is human-grade, which means it's made to the same quality and safety standards as human food.
00:09:16.000 Very few pet foods are made to this strict standard.
00:09:19.000 And let's be clear, human-grade food doesn't mean the food is fancy.
00:09:23.000 It just means it's safe and healthy.
00:09:25.000 It's simple.
00:09:26.000 Real food from people who care about what goes into your dog's body.
00:09:31.000 The Farmer's Dog makes it easy to help your dog live a long, healthy life by sending you fresh food that's pre-portioned just for your dog's needs.
00:09:40.000 Because every dog is different.
00:09:41.000 And I'm not just talking about breeds.
00:09:43.000 From their size, to their personality, to their health, every dog is unique.
00:09:47.000 Plus, precise portions can help keep your dog at an ideal weight, which is one of the proven predictors of a long life.
00:09:55.000 Look, no one, dog or human, should be eating highly processed foods for every meal.
00:10:00.000 It doesn't matter how old your dog is, it's always a great time to start investing in their health and happiness.
00:10:06.000 So, try The Farmer's Dog today.
00:10:08.000 You can get 50% off your first box of fresh, healthy food at thefarmersdog.com slash rogan.
00:10:15.000 Plus, you get free shipping.
00:10:17.000 Just go to thefarmersdog.com slash rogan.
00:10:20.000 Tap the banner or visit this episode's page to learn more.
00:10:24.000 Offer applicable for new customers only.
00:10:26.000 Well, isn't that part of the problem with eliminating gifted classes?
00:10:31.000 I think they're doing that in New York.
00:10:34.000 Is that where they're doing that?
00:10:35.000 Find out if that's the case.
00:10:37.000 There's some where there's this hot controversy about eliminating the concept of gifted classes.
00:10:43.000 But the reality is there's some people that are going to find regular classes, particularly mathematics and some other things, they're going to find them a little too easy.
00:10:52.000 They're more advanced.
00:10:54.000 They're more advanced students, and those students should have some sort of an option to excel, and it should be inspiring. Maybe intimidating, but also inspiring to everybody else.
00:11:03.000 I mean, that's part of the reason why kids go to school together.
00:11:06.000 Look how hard she works.
00:11:08.000 She works so much harder than me.
00:11:10.000 Look how much she's getting ahead.
00:11:11.000 Fuck, I gotta work harder.
00:11:12.000 And it really does work that way.
00:11:14.000 That's how human beings, in cooperation, that's how they grow together.
00:11:20.000 And I think that it used to be the case that in high school, you would be really cool with people that were going to, like, specific high schools to get really good at something.
00:11:32.000 Remember like that show on TV, Fame?
00:11:34.000 Yes.
00:11:35.000 Right?
00:11:36.000 And so, well, that was more about the performing arts, right?
00:11:38.000 Right.
00:11:38.000 But that was amazing.
00:11:39.000 It's like, you know, if you knew a kid that was going to one of those schools, what you'd say is, wow, you are incredibly talented in this one specific thing.
00:11:47.000 Go push the boundary of that and see what happens.
00:11:51.000 I think we owe it to ourselves to say that.
00:11:54.000 Yes.
00:11:54.000 Right?
00:11:55.000 There are 330 million people in the United States.
00:12:00.000 Don't you think that if we created a bunch of different ways for people to figure out what they're super good at, things are better, not worse?
00:12:09.000 What is the answer?
00:12:10.000 Do you think things take a huge step back?
00:12:15.000 You have more Joe Rogans, you have more Kevin Harts, you have more great actors, you have more great directors, but you also have more engineers.
00:12:23.000 You have more scientists, you have more doctors, and you created a way for them to just go deep in something where their curiosity took them.
00:12:31.000 That's okay.
00:12:32.000 What's wrong with that idea?
00:12:33.000 There's nothing wrong.
00:12:34.000 It sounds optimal.
00:12:36.000 It sounds pretty reasonable.
00:12:37.000 It sounds great.
00:12:38.000 It's just a matter of resources and then also completely revamping how you teach kids.
00:12:44.000 This is my gripe with this whole ADHD thing.
00:12:49.000 You know, I've talked to many people who have varying opinions on whether or not that's an actual condition or whether or not there's a lot of people that have a lot of energy and you're sitting in a class that's very boring and they don't want to pay attention to it.
00:13:02.000 So instead, you drug them and you give them medication that is essentially speed and lets them hyper-focus on things and now all of a sudden, little Timmy's locked on.
00:13:12.000 You know, it was really just the medication that he needed.
00:13:15.000 And I think for a lot of those kids, if they found something that was really interesting to them, you know, maybe they're really bored with this, but they're really excited by biology.
00:13:25.000 Maybe there's something that, like, resonates with their particular personality and what excites them, and, you know, they could find a pathway.
00:13:33.000 And instead, we have this very rigid system that wants to get children accustomed to the idea of sitting still for an hour at a time, over and over and over again throughout the day, being subjected to people who aren't necessarily that motivated or getting paid that well.
00:13:51.000 Well, we're going to probably talk about AI today, but let's just touch on this just in this one second.
00:13:58.000 We are going to create computers that are able to do a lot of the rote thinking for us.
00:14:09.000 What that means is, I think, the way that humans differentiate ourselves is that we're going to have to have judgment and taste.
00:14:17.000 Those are very defining psychological characteristics, in my opinion.
00:14:22.000 But what that means is, if you go back to how school is taught, what you said is very limiting for what the world is going to look like in 30 years.
00:14:32.000 You know, in 30 years where you have a PhD assistant that's in your pocket that can literally do all of the memorization, spell checking, grammar, all of the fact recall for you.
00:14:45.000 Teaching that today is probably not going to be as important as interpreting it.
00:14:51.000 Like, how do you teach kids to learn to think, not to memorize and regurgitate?
00:14:56.000 So we have to flip, I think, this education system.
00:14:59.000 We have to try to figure out a different way to solve this problem because, like, you can't set children in this generation up, our kids, to go and have to compete with a computer.
00:15:15.000 That's crazy.
00:15:16.000 It's crazy.
00:15:16.000 That thing can make a Drake song in three minutes.
00:15:19.000 The computer is going to win.
00:15:20.000 So what can't the computer do is, I think, maybe a reasonable question.
00:15:24.000 And I think the computer...
00:15:26.000 In a lot of cases, can't express judgment.
00:15:30.000 It'll learn, but today it's not going to be able to, the same way that humans can.
00:15:35.000 It's going to have different taste, right?
00:15:37.000 So the way that we interpret things, the same way that you motivate people, like all the psychology, all these things that are sort of like the softer skills that allowed humans to cooperate and like work together, that stuff becomes more important when you have a fleet of robots.
00:15:52.000 And so if you go all the way back to school today, the school system is unfortunately in kind of a pretty tough loop.
00:16:05.000 Like, look, teachers, I think, are going to become the top three or four important people in society.
00:16:15.000 And the reason is because they are going to be tasked with teaching your kids and my kids how to think, not to memorize.
00:16:22.000 Don't tell me what happened in the War of 1812. You can just use a search engine or use ChatGPT and find out the answer.
00:16:32.000 But why did it happen?
00:16:34.000 What were the motivations?
00:16:35.000 If it happens again, what would you do differently or the same?
00:16:39.000 And those kinds of reasoning and judgment things, I think, we're still far ahead of those computers.
00:16:44.000 So the teachers have to teach that, which means you have to pay them more, you have to put them in a position to do that job better.
00:16:50.000 And then back to what you said, I've lived this example of ADHD in my family.
00:16:57.000 I have five kids.
00:16:57.000 One of the kids was diagnosed with it.
00:17:01.000 And unfortunately, what happens is the system a little bit closes in on you.
00:17:04.000 So on the one side, they give you a lot of benefits, I guess.
00:17:08.000 I put it in quotes because you get these emails that say if they want extra time, if they want this, if they want, you know, they'll give you a computer, for example, to take notes so that you don't handwrite.
00:17:20.000 So those feel like aids to help you, right?
00:17:24.000 But then on the other side, you know, one person was very adamant, like, hey, you want to medicate?
00:17:31.000 And my ex-wife and I were just like, under no circumstances are we medicating our child.
00:17:37.000 That was a personal decision that we made with the information that we had knowing that specific kid.
00:17:42.000 All kids are different, so I don't want to generalize.
00:17:45.000 And then the crazy thing, Joe, what we did was we took the iPad out of the kid's hand.
00:17:50.000 And we said, you know, we had these very strict device rules.
00:17:55.000 And then COVID turned everything upside down.
00:17:58.000 And you're just surviving.
00:18:00.000 You're sheltering in place.
00:18:02.000 Five kids running around.
00:18:04.000 They're not really being, you know, taught by the schools.
00:18:08.000 The schools won't convene the kids.
00:18:11.000 And so what do you do?
00:18:12.000 You just hand them the device.
00:18:13.000 Everything was through the device.
00:18:16.000 The little class they got, through the device.
00:18:18.000 The way that they would talk to their friends, through the device.
00:18:20.000 So it reintroduced itself in a way that we couldn't control.
00:18:25.000 And then we saw this slippage.
00:18:28.000 And then what we did was we just drew a bright red line and we said, we're taking it out of your hands.
00:18:33.000 No more video games.
00:18:35.000 No more iPad.
00:18:37.000 We're going to dose it in very small doses.
00:18:39.000 And he had an entire turnaround.
00:18:41.000 But then here's what happened.
00:18:43.000 I took my eye off the ball a little bit this summer because it was like he had a great year.
00:18:49.000 He reset.
00:18:50.000 His self-confidence was coming back.
00:18:51.000 I was like, man, this is amazing.
00:18:53.000 And then I do the thing that, you know, a lot of people would do.
00:18:57.000 Here, you can have an hour.
00:18:58.000 Yeah, it's fine.
00:18:59.000 You know, talk to your friends, you know.
00:19:01.000 And then it started again, and then again, now we just have to reset.
00:19:04.000 So, at least in our example, what we have found, and it may not apply to everybody, but for us, him not being bathed in this thing had a huge effect.
00:19:19.000 Playing basketball outside, roughhousing with his brothers, having to talk to his friends, having to talk to us, watching movies.
00:19:29.000 You know, or we would just sit around because, by the way, what I noticed was like my kids had a hard time watching movies or listening to songs on Spotify for the full duration.
00:19:43.000 They'd get to the hook and they'd be like, forward.
00:19:45.000 Next.
00:19:45.000 And they'd be like, you know, they'd watch like eight minutes next.
00:19:48.000 And I was like, what are you guys doing?
00:19:50.000 Like, this is like enjoying the fullness.
00:19:53.000 They couldn't even sit there for three and a half minutes.
00:19:56.000 Yeah.
00:19:56.000 So what at least my son was learning was to just chill a little bit.
00:20:04.000 Be there.
00:20:05.000 Be able to watch the show.
00:20:06.000 And these shows move at a glacial pace relative to what they're used to if they're playing a video game.
00:20:12.000 Or TikTok.
00:20:13.000 Or TikTok.
00:20:14.000 Yeah, because TikTok, they're like this.
00:20:17.000 Boom, boom, boom, boom.
00:20:19.000 And it's helped.
00:20:20.000 It's not a cure.
00:20:22.000 But it just goes back to what you're saying, which is like, if you give parents options. And I heard this crazy stat.
00:20:32.000 I don't know if this is true.
00:20:34.000 If you take your devices away from a kid, the kid will feel isolated from their other students.
00:20:41.000 The critical mass—I don't know if this is true or not, but it's what I was told, so I'll deal with it—was that if you get a third of the parents, so like in a class of 20, if you get a third of the parents to agree as well, no devices, the kid feels zero social isolation.
00:20:58.000 Because it becomes normative.
00:21:00.000 It's normal.
00:21:02.000 You've got a flip phone and you're texting like this to your parents or you're calling.
00:21:08.000 I don't know.
00:21:09.000 It may be worth trying.
00:21:10.000 There was a crazy thing.
00:21:12.000 I don't know, Jimmy, if you can find this, but there was a crazy thing.
00:21:14.000 Eton College.
00:21:16.000 Which is like the most elite, if you will, private school in the UK. It's kind of where like all the prime ministers of the United Kingdom have matriculated through Eton College.
00:21:28.000 It's like high school, fancy high school.
00:21:31.000 They sent a memo to the parents for the incoming class.
00:21:36.000 And the headmaster said, when you get on campus with your child, we're going to give you like what is basically a Nokia flip phone.
00:21:45.000 You are going to take the SIM card out of this kid's iPhone or Android.
00:21:49.000 You're going to stick it in this thing.
00:21:51.000 And this is how they're going to communicate with you and communicate amongst each other while they're on campus.
00:21:56.000 Wow.
00:21:58.000 Mandatory.
00:21:59.000 Mandatory.
00:22:01.000 I thought this was incredible.
00:22:03.000 I don't know what the impact is, but that takes a lot of courage and I thought that's amazing.
00:22:11.000 Well, it's great because then if they're communicating, they're only communicating.
00:22:15.000 They're not sharing things or Snapchatting each other back and forth and the addictive qualities.
00:22:22.000 of these phones. Because if you think about the course of human evolution, and you think of how we adapted to agriculture and civilization and we essentially became softer and less muscular and less aggressive, that took a long time.
00:22:39.000 That was a long time.
00:22:40.000 This thing is hitting us so quickly.
00:22:44.000 And one of the bizarre things is it creates a disconnection.
00:22:50.000 Even though you're being connected to people consistently and constantly through social media, there's a disconnection between human beings and normal behavior and learning through interaction with each other, social cues,
00:23:05.000 all the different things that we rely on to learn how to be a friend and to learn how to be better at talking to each other.
00:23:13.000 I have a rule with my oldest who's 15. He'll call me.
00:23:19.000 He'll call me.
00:23:22.000 Or even when I call him.
00:23:29.000 It's like a grunt greeting.
00:23:33.000 Right.
00:23:33.000 They don't know how to talk anymore.
00:23:36.000 And I'm like, hello?
00:23:39.000 Hello?
00:23:42.000 And so I went through this thing where I would just hang up.
00:23:45.000 And I'm like, you know, beep, hang up.
00:23:47.000 And then he would call me back, huh?
00:23:54.000 And then finally I said, I just want you to have these building blocks.
00:24:00.000 They may sound really stupid to you right now, but looking people in the eye, being able to have a normal conversation, and be patient in that conversation.
00:24:12.000 It's going to be really valuable for you.
00:24:15.000 People will really be connected to you.
00:24:17.000 You may not feel that and you may think this is like lame and stupid what I'm telling you, but I was like, just try to just try to do it.
00:24:26.000 And then what's so funny is like, I would tell this story about like, you know, our kids go to like, you know, very well meaning private school, right?
00:24:36.000 And I almost think like sometimes like, again, we're not teaching necessarily kids to think for themselves.
00:24:43.000 We're asking them to memorize a bunch of things.
00:24:46.000 And one of the things that I worry that we've taught our kids to memorize are like the fake greetings and salutations.
00:24:53.000 So on the one end, you have what's really visceral, which is, oh.
00:24:58.000 Yeah.
00:25:21.000 Thank you.
00:25:22.000 And I'm like, what are you doing?
00:25:24.000 Who taught you that?
00:25:24.000 And I thought, were you taught at school to say thank you like that?
00:25:28.000 You could just say thank you.
00:25:30.000 Right.
00:25:31.000 Thanks.
00:25:31.000 I appreciate that.
00:25:31.000 Just look somebody in the eye.
00:25:32.000 Thank you.
00:25:33.000 But what concerns me is as this tech gets more and more invasive in terms of how human beings, particularly children, interface with it, and as it gets – I mean, really, we would just be guessing as to what comes out of AI and to what kind of world we're even looking at in 20 years.
00:25:51.000 It seems like it's having a profound effect on the behavior of human beings, particularly young human beings and their development.
00:25:59.000 How old are you?
00:26:00.000 I'm 48. I'm 57. So when I grew up, there was zero of this.
00:26:05.000 And I got this slow trickle through adulthood from when I was a child, the VHS tapes and answering machines through the big tech.
00:26:14.000 Yeah, you had the rotary phone.
00:26:15.000 Yes.
00:26:15.000 Yeah, exactly.
00:26:16.000 So we went through the whole cycle of it, which is really interesting.
00:26:20.000 So you get to see this profound change in people and what it's doing to kids.
00:26:27.000 And you've got to wonder, what is that doing to the species?
00:26:29.000 And is that going to be normal?
00:26:33.000 Is it going to be normal to be emotionally disconnected and very bizarre in our person-to-person interface?
00:26:43.000 I think that when technologies get going, you have this little burst.
00:26:49.000 It's like these Cambrian moments.
00:26:51.000 You get these little bursts which are overwhelmingly positive.
00:26:54.000 I don't know what your reaction was, but my reaction when I first saw the iPhone, I was blown away.
00:27:02.000 And I think the first four or five years was entirely positive.
00:27:07.000 Because it was just so novel.
00:27:09.000 Like you took this big computer, and we effectively shrunk it to this little computer, made it half to a third of the cost, and lo and behold, supply-demand, just the number of computers, tripled and quadrupled and quintupled, and so many more people were able to be a part of that economic cycle.
00:27:28.000 All positive.
00:27:30.000 Then you get a little dip.
00:27:31.000 And the little dip is when I think we lose a little bit of the ambition of that first moment and we get caught up in the economics of the current moment.
00:27:41.000 And what I mean by that is, you know, the last five or ten years, I think why you feel this viscerally is...
00:27:48.000 We haven't had a big leap forward from the iPhone of really 2014-15.
00:27:54.000 And I'm not picking on the iPhone, I'm just like a mobile device.
00:27:57.000 So what have you had over the last 10 years?
00:27:59.000 You've had an enormous amount of money get created by an enormous number of apps.
00:28:07.000 And the problem is that they are in a cul-de-sac.
00:28:09.000 And so they'll just iterate in this one way that they understand, because the money is really good, quite honestly.
00:28:16.000 And the incentives of the capital markets will tell you to just keep doing that.
00:28:22.000 But then I think what happens is something shocks us out of it and then we get the second wave.
00:28:27.000 So if you go all the way back to look at the PC, the first moment of the PC in the 70s and the early 80s was incredible.
00:28:36.000 You had these people that were able to take it and do all kinds of really interesting things.
00:28:41.000 It was pure.
00:28:43.000 Then you had sort of like the 90s and the early 2000s, and what was it?
00:28:48.000 It was duopolistic at best, Microsoft and Intel.
00:28:53.000 And what they were able to do was extract a huge tax by putting all of these things on folks' desks, and it was still mostly positive.
00:29:01.000 But it was somewhat limited because most of the spoils went to these two companies and all the other companies basically got a little bit run over.
00:29:10.000 And then it took the DOJ to step in in 2000 and try to course correct that on behalf of everybody basically.
00:29:18.000 And then what happened was the internet just exploded.
00:29:21.000 And the internet blew the doors wide open.
00:29:24.000 And all of a sudden, if you had a PC, you didn't have these gatekeepers.
00:29:28.000 It actually didn't even matter whether you were running on Intel anymore.
00:29:31.000 You just needed a browser.
00:29:32.000 So you didn't need Microsoft Windows, right?
00:29:35.000 And you didn't need Intel.
00:29:38.000 And then just the internet just explodes.
00:29:41.000 So we have a positive moment followed by, you know, call it 10 or 15 years of basically economic extraction.
00:29:50.000 And then we have value.
00:29:52.000 I think today it's like we've invented something really powerful.
00:29:57.000 We've had 10 or 15 years that were largely economic.
00:30:02.000 And again, I think, you know, this is like the problem.
00:30:04.000 I'm going to sound like every other, you know, nerd from Central Casting from Silicon Valley telling you this, but I do think that there's a version of this AI thing which blows the doors wide open again.
00:30:17.000 And I think we owe it to ourselves to figure out how to make that more likely than not likely.
00:30:22.000 Well, it seems it's inevitable, right?
00:30:24.000 AI's emergence and where it goes from here on is inevitable.
00:30:28.000 It's going to happen.
00:30:28.000 And we should probably try to steer it at least in a way that benefits everybody.
00:30:34.000 And I agree with you.
00:30:35.000 There is a world I could see where AI changes everything.
00:30:40.000 And one of the things that makes me most hopeful is a much better form of translation so that we'll be able to understand each other better.
00:30:49.000 It's a giant part of the problem in the world.
00:30:52.000 It's, you know, the Tower of Babel.
00:30:54.000 So we really can't communicate with each other very well.
00:30:57.000 So we really don't know what the problems are in these particular areas or how people feel about us.
00:31:02.000 Can't empathize.
00:31:04.000 Yeah, we can't.
00:31:05.000 It's very easy to not empathize with someone where you don't even know what their letters are.
00:31:10.000 Have you been in a situation where you have a translator with a thing in your ear?
00:31:14.000 No.
00:31:15.000 Empathy is zero.
00:31:16.000 Because the problem is the person there is giving it to you in a certain tone because it's first person.
00:31:23.000 Oh, I've had that when I interview fighters.
00:31:25.000 I've had translators.
00:31:26.000 Yeah, but like when you're here, it's very hard to feel empathy for this person because it's this person that you're focused on because you're trying to catch it.
00:31:33.000 So you hear the words.
00:31:35.000 I think somewhat of the meaning is a little bit lost.
00:31:39.000 Then you go back to this person and you say something and they're in the same problem that you are.
00:31:43.000 So I agree that the translation thing is cool.
00:31:46.000 I think that there's going to be some negative areas.
00:31:53.000 And I think that there's going to be a lot of pressure on certain jobs, and we've got to figure that out.
00:31:58.000 So it's not all roses.
00:31:59.000 But some areas, if you imagine them, I'll give you a couple if you want, are just bananas, I think.
00:32:06.000 Okay.
00:32:07.000 Okay, so I'll go from the most likely to, like, the craziest.
00:32:12.000 Okay.
00:32:12.000 Okay, so most likely today, do you know if you know somebody that's had breast cancer?
00:32:18.000 If they go into a hospital, a random hospital in America, and the doctor says, we need to do a lumpectomy, meaning we need to take some mass out of your breast to take the cancer out.
00:32:31.000 What do you think the error rate today is across all hospitals in America?
00:32:36.000 It's about 30%.
00:32:37.000 Wow.
00:32:38.000 And in regional hospitals, so places that are poor, right, or places that are in far-flung parts of the United States, it can be upwards of 40%.
00:32:48.000 This is not the doctor's fault, okay?
00:32:51.000 The problem is that you're forcing him or her to look with their eyes into tissue and try to figure out, well, where is the border where the cancer stops?
00:33:05.000 So for every 10 surgeries, what that means are a week later.
00:33:10.000 So imagine this.
00:33:11.000 You get a breast cancer surgery.
00:33:13.000 They take it out.
00:33:14.000 They send it to the pathologist.
00:33:15.000 The pathologist takes between 7 and 11 days.
00:33:19.000 So you're kind of waiting.
00:33:21.000 Seven of the calls come back.
00:33:23.000 You're clean margins.
00:33:25.000 You're great.
00:33:26.000 Now go to the next step.
00:33:28.000 Three of the calls, I'm sorry, there's still cancer inside your body.
00:33:32.000 Three.
00:33:33.000 So these women now go back for the next surgery.
00:33:37.000 But the problem is one of those women will get another call that says, I'm sorry, there's still cancer.
00:33:43.000 And so what is that?
00:33:45.000 That's a computer vision problem.
00:33:48.000 Right?
00:33:49.000 That's a problem that can actually be solved literally today.
00:33:55.000 We have models, we have tissue samples of women of all ages, of all races, right?
00:34:04.000 So you have all of the different boundary conditions you'd need to basically get to a 0% error rate.
00:34:11.000 And what's amazing is that is now working its way through the FDA. So call it within the next two years, there'll be an AI assistant that sits inside of an operating room.
00:34:22.000 The surgeon will take out what they think is appropriate, they'll put it into this machine, and it'll literally, I'm going to simplify, but it'll flash red or green.
00:34:31.000 You got all the cancer out.
00:34:33.000 You need to take out a little bit more just right over here.
00:34:37.000 And now you get it out and now all of a sudden instead of a 30% error rate, you have a 0% error rate.
00:34:43.000 That's amazing.
00:34:44.000 That's today because you have this computer that's able to help you.
00:34:49.000 And all we need is the will and the data that says, okay, we want to do this, just show me that it works.
00:34:58.000 And show me what the alternative would be if we didn't do it.
00:35:01.000 And the alternative turns out to be pretty brutal.
00:35:04.000 14 surgeries for every 10 surgeries?
00:35:06.000 Like, I mean, that's not what the most advanced nation in the world should be doing.
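The arithmetic behind "14 for every 10": ten first surgeries at a 30% positive-margin rate produce about three repeat surgeries, which produce about one more, and so on. A minimal sketch of that series, using only the rates quoted above:

```python
# Expected surgeries when each lumpectomy leaves positive margins ~30% of
# the time (the rate quoted above); the series itself is plain arithmetic.
def expected_surgeries(patients: float, redo_rate: float, rounds: int = 20) -> float:
    total, pending = 0.0, patients
    for _ in range(rounds):
        total += pending        # surgeries performed this round
        pending *= redo_rate    # fraction called back for another pass
    return total

print(round(expected_surgeries(10, 0.30), 1))  # ~14.3, the "14 for every 10"
print(round(expected_surgeries(10, 0.40), 1))  # ~16.7 at the 40% regional rate
```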
00:35:10.000 Okay, so if you do it for breast cancer, the reason why breast cancer is where folks are focusing is because it gets so much attention, and it's like prime time.
00:35:21.000 But it's not just breast cancer.
00:35:23.000 Lung cancer, pancreatic cancer, stomach cancer, colon cancer.
00:35:30.000 If you look at any kind of tumor, So if you're at the stage where you're like, we need to get this thing, this foreign growing thing out of our body, we should all have the ability to just do that with zero percent error,
00:35:46.000 and it will be possible in the next couple of years because of AI. Okay, so that's kind of like a, that's cool and it's coming.
00:35:54.000 I think between years two and years five, you're going to see this crazy explosion in materials.
00:36:03.000 And this is going to sound maybe dumb, but I think it's one of the coolest things.
00:36:08.000 If you look at the periodic table of elements, what's amazing is we keep adding.
00:36:16.000 So there's like 118 elements.
00:36:18.000 We actually just theoretically forecasted there's going to be 119, so we created a little box.
00:36:24.000 It's theoretical, but it's going to show up.
00:36:27.000 And they forecasted that there's going to be 142. Okay?
00:36:32.000 So the periodic table of elements, quote-unquote, grows.
00:36:35.000 But when you look at the lived world today...
00:36:39.000 We live in this very narrow expression of all of those things.
00:36:43.000 We use the same few materials over and over and over again.
00:36:46.000 But if you had to solve a really complicated problem, don't you think the answer could theoretically be in this?
00:36:54.000 Meaning, if you took, I'm going to make it up, selenium and then doped it with titanium, 1%, or if you doped it with boron, 14%, all of these things are possible.
00:37:06.000 It's like stronger than the strongest thing in the world, and wow, and it's lighter than anything.
00:37:11.000 So now you can make rockets with it and send it all the way up with less energy.
00:37:15.000 It's all possible.
00:37:16.000 So why haven't we figured it out?
00:37:19.000 Because the amount of energy and the amount of computers we need to solve those problems, which are super complicated, haven't been available to us.
00:37:30.000 I think that is this next phase of AI. So what you said, which is we're going to have these PhD level robots and agents.
00:37:37.000 In the next two to five years, we're going to come up with all kinds of materials.
00:37:41.000 You know, you'll have a frying pan that's like nonstick, but doesn't have to heat up like all, you know, whatever you want from like the most benign to the most incredible stuff.
00:37:49.000 We'll just re-engineer what's on earth.
00:37:54.000 That's going to be crazy.
00:37:56.000 It's going to be incredible.
00:37:57.000 We all benefit from that, the kinds of jobs that that creates.
00:38:02.000 We don't even know what job class that is, to work with selenium and boron.
00:38:05.000 Again, I'm making up these elements, so please don't.
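A rough sense of why this search has been compute-bound: even a coarse grid over one base element, two dopants, and their concentrations explodes combinatorially. The grid choices below are illustrative assumptions, not figures from the conversation:

```python
# Back-of-the-envelope size of a doped-materials search space, under
# illustrative assumptions: one base element, two dopants, and dopant
# concentrations sampled in 1% steps up to 20% each.
from math import comb

elements = 118                        # currently known elements
dopant_pairs = comb(elements - 1, 2)  # unordered pairs of candidate dopants
steps = 20                            # concentration grid per dopant

candidates = elements * dopant_pairs * steps**2
print(f"{candidates:,} candidate recipes")  # 320,299,200
# Each recipe needs a physics simulation to evaluate, which is why energy
# and compute have been the binding constraints.
```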
00:38:08.000 So the point is that that's like in the middle phase.
00:38:13.000 So our physical lived world is going to totally transform.
00:38:18.000 Imagine a building that's made of a material that bends.
00:38:22.000 You can just go like this and nothing changes to it.
00:38:25.000 Why would that be important?
00:38:27.000 Well, if you want to protect yourself from the crazy unpredictability of climate, in the areas where it's susceptible to that, maybe you can construct these things much cheaper.
00:38:38.000 Well, earthquakes.
00:38:39.000 Earthquakes.
00:38:40.000 You could construct more of them.
00:38:41.000 Imagine in San Francisco, you could build buildings that solved the housing crisis but do it in a way that was cheaper because the materials are totally different and you could just prove that these things are bulletproof.
00:38:52.000 So instead of spending a billion dollars to build a building because you got to go hundreds of feet into the earth, you just go 50 feet and it just figures it out.
00:39:04.000 So that's possible and I think there will be people that use these AI models to go and solve those things.
00:39:11.000 And then after that, I think you get into the world of it's not just robots that are in a computer, but it's like a physical robot.
00:39:20.000 And those physical robots are going to do things that today will make so much sense in hindsight.
00:39:27.000 So an example, I was thinking about this this weekend.
00:39:30.000 Imagine if you had a bunch of Optimus robots, like Tesla's robot, and they were the beat cops.
00:39:38.000 They were the highway patrol.
00:39:40.000 Now, what happens there?
00:39:43.000 Well, first, you don't put humans in the way.
00:39:48.000 I suspect then the reaction of those robots could be markedly different.
00:39:52.000 Now, those robots would be controlled remotely, right?
00:39:55.000 So the people that are remote now can be a very different archetype, right?
00:40:00.000 Instead of the physical requirements of policing, you now add this other layer, which is the psychological elements and the judgment.
00:40:08.000 So my point is that if you had robots that were able to do the dangerous work for humans, I think it allows humans to do, again, judgment, you know, those areas of judgment which are very gray and fuzzy.
00:40:23.000 It'll take a long time for computers to be able to replace us to do that.
00:40:27.000 I really do think so.
00:40:29.000 I think the biggest thing that we have done as a disservice to what is coming is... Mm-hmm.
00:40:54.000 It up-levels us.
00:40:55.000 You used to have to remember the details of some crazy theory, random detail fact.
00:41:01.000 Now you can just Google it.
00:41:02.000 You can leave your mind to focus on other things.
00:41:08.000 The creativity to write your next set, to think about the next interview, to think about your business, because you're occupying less time with the perfunctory stuff.
00:41:20.000 I think these models are doing that, and they're going to get complemented with physical models, meaning physical robots.
00:41:28.000 And they're going to do a lot of work for us that we have not done, or today that we do very precariously.
00:41:37.000 You know, like, should a robot go in and save you from a fire?
00:41:40.000 I think it can probably do a pretty good job.
00:41:42.000 We're good to go.
00:41:59.000 Again, it allows humans to focus on the things that we're really, really differentiated at.
00:42:08.000 I do think it creates complications, but we have to figure those out.
00:42:13.000 So that's kind of like a short, medium, long term.
00:42:16.000 Well, I see what you're saying in the final example as the rosy scenario.
00:42:20.000 That's the best case option, right?
00:42:23.000 That it gives people the freedom to be more creative and to pursue different things.
00:42:29.000 I think there's always going to be a market for handmade things.
00:42:34.000 People like things.
00:42:35.000 They like an acoustic performance.
00:42:38.000 They like stuff where it's very human and very real.
00:42:42.000 But there's a lot of people that just want a job.
00:42:46.000 And these people maybe just aren't inclined towards creativity and maybe they're very simple people who just want a job and they just want to work.
00:42:57.000 Those are the people that I worry about.
00:42:58.000 I worry about them as well.
00:43:00.000 And I think that, like, I didn't live in the agrarian economy nor in the Industrial Revolution, so I don't know how we solved this problem.
00:43:09.000 But we have seen that problem two times.
00:43:13.000 And each time, we found a way.
00:43:16.000 And this goes back to sort of like news and politics and like just working together.
00:43:21.000 But in each of those moments, we found a way to make things substantively better for all people.
00:43:28.000 Like I saw this crazy stat.
00:43:30.000 In 1800, do you know how many people lived in extreme poverty?
00:43:34.000 How many?
00:43:35.000 80%.
00:43:36.000 Whoa.
00:43:37.000 You know where we are today?
00:43:38.000 Sub 10%.
00:43:39.000 Single digits.
00:43:41.000 And it's a straight line that goes like this.
00:43:43.000 And that was through an agrarian revolution.
00:43:45.000 It was through the industrial revolution.
00:43:47.000 So it is possible for humans to cooperate to solve these problems.
00:43:52.000 I don't know what the answer is, but I do think you are right that it will put a lot of pressure on a lot of people.
00:43:59.000 But that's why we got to just figure this out.
00:44:02.000 What are your thoughts on universal basic income as a band-aid to sort of mitigate that transition?
00:44:09.000 I'm pretty sympathetic to that idea.
00:44:13.000 I grew up on welfare.
00:44:15.000 So what I can tell you is that there are a lot of people who are trying their best and for whatever set of boundary conditions can't figure it out.
00:44:26.000 I agree.
00:44:27.000 I grew up on welfare as well.
00:44:28.000 Yeah.
00:44:28.000 Yeah.
00:44:29.000 And so if I didn't have that safety net, you know, my parents' struggles, I think, would have gotten even worse than what they were.
00:44:42.000 So I'm a believer in that social safety net.
00:44:45.000 I think it's really important.
00:44:46.000 It's best case scenario, right?
00:44:47.000 Because your parents worked their way out of it.
00:44:49.000 My parents worked their way out of it.
00:44:51.000 But some people are just content to just get a check.
00:44:56.000 And this is the issue I think that a lot of people have is that people will become entitled and just want to collect a check.
00:45:03.000 And if it's a substantial portion of our country, like if universal basic income, if AI eliminates, let's just say a crazy number, like 70% of the manual labor jobs, truck drivers, construction workers, all that stuff gets eliminated.
00:45:18.000 That's a lot of people without a purpose.
00:45:21.000 And one of the things about a good day's work and earning your pay is it makes people feel self-sufficient.
00:45:27.000 It makes people feel valuable.
00:45:29.000 It gives them a sense of purpose.
00:45:31.000 They could look at the thing that they did, maybe build a building or something like that, and drive their kids by, hey, we built that building right there.
00:45:38.000 Oh, wow.
00:45:39.000 It's a part of their identity.
00:45:41.000 And if they just get a check, and then what do they do?
00:45:44.000 Just play video games all day?
00:45:46.000 That's the worst case scenario, is that people just get locked into this world of computers and online, and just receive checks and have the bare necessities to survive and are content with that, and then don't contribute at all.
00:46:03.000 The jobs that, like, let's put it this way.
00:46:06.000 If we were sitting here in 1924, whatever, 100 years ago, you know, right in the midst of the turn of the Industrial Revolution, we would have seen a lot of folks that worked on farms, and we would have wondered,
00:46:25.000 well, where are those jobs going to come from?
00:46:29.000 And I think that now when you look back, it was like, not obvious, but you could see where the new job classes came from.
00:46:40.000 It's like all of these industries that were possible because we built a factory.
00:46:44.000 And a factory turned out to be a substrate.
00:46:47.000 And then you built all these different kinds of businesses which created different kinds of jobs on top of it.
00:46:54.000 I would hope that if we do this right, this next leap is like that, where we are in a period where it's hard to know with certainty what this job class goes to over here.
00:47:08.000 But I think you have a responsibility to go and figure it out, and talk it out, and play it out.
00:47:15.000 Because the past would tell you that humans, when they're unimpeded, have a really good ability to invent these things.
00:47:26.000 So, I don't know, maybe what it is, is by 2035, there's a billion people that have traveled to Mars.
00:47:35.000 And you're building an entire planet from the ground up.
00:47:40.000 There'll be all sorts of work to do there.
00:47:43.000 Boy, what kind of people are going to go first there?
00:47:47.000 I think that there'll be a lot of people that are frustrated with what's happening here.
00:47:50.000 Yeah.
00:47:51.000 Sure.
00:47:52.000 Just like the people that got on the Pinta, the Santa Maria, and made their way across the ocean.
00:47:58.000 It all starts with a group of people that are just like, I'm fed up with this.
00:48:02.000 Yeah.
00:48:02.000 But to want to go to a place that doesn't even have an atmosphere that's capable of sustaining human life, and you can only go back every couple of years, like...
00:48:15.000 Those people are going to be psychos.
00:48:16.000 You're going to have a completely psychotic—it's Australia on meth.
00:48:22.000 It's like the worst-case scenario of the cast-outs of society.
00:48:27.000 Sorry, just like what you say is—it's so true, but if you think about what that decision looked like 400 years ago, when that first group of prisoners was put on a boat and sent to Australia.
00:48:41.000 Right.
00:48:42.000 That's probably what it felt like.
00:48:44.000 Yeah.
00:48:44.000 Most people on the mainland, when they were like, ciao-ciao, were probably thinking, man, this is insane.
00:48:50.000 Right.
00:48:50.000 So it'll always look like that.
00:48:52.000 It'll be easier to rationalize it in hindsight.
00:48:55.000 But I do think that there will be a lot of people that want to go when it's possible to go.
00:49:00.000 And, like, look, you know, we're in this studio.
00:49:04.000 We could be anywhere.
00:49:06.000 We could be in Salt Lake City.
00:49:08.000 We could be in Rome.
00:49:09.000 We could be in Perth.
00:49:11.000 You don't know.
00:49:12.000 It's all the same.
00:49:13.000 Especially today.
00:49:14.000 So you could be on Mars.
00:49:16.000 Yeah, you could.
00:49:17.000 You wouldn't know.
00:49:17.000 Yeah.
00:49:19.000 That could be the future.
00:49:22.000 Instantaneous communication with people on other planets, just like you could talk to people in New Zealand today.
00:49:28.000 So that's an amazing example of an innovation in material science that we have been experimenting with for years.
00:49:38.000 So basically at the core of what you just said is a semiconductor problem.
00:49:44.000 It's a doping problem.
00:49:47.000 Is it silicon germanium?
00:49:48.000 Is it silicon germanium with something else?
00:49:52.000 And the thing, Joe, is that to answer what you just said is a tractable problem that has been bounded by energy and computers.
00:50:02.000 And we're at a point where we're almost at infinite energy. And what I'm saying is very specific: we're at the point where, right in the distance, the marginal cost of energy is basically zero.
00:50:19.000 The marginal cost, meaning to generate the next kilowatt, is going to cost, like, sub a penny.
00:50:25.000 Even with today's stuff, you don't need nuclear, you don't need any of that stuff.
00:50:28.000 We're just on this trend line right now.
00:50:31.000 Because of AI, we're at the point where the cost to get an answer to a super-complicated question is going to be basically zero.
00:50:41.000 When you put those two things together, what you just said, we will be in a position to answer.
00:51:07.000 Are not.
00:51:08.000 They're actually not that crazy.
00:51:10.000 These things are achievable technical milestones.
00:51:15.000 Everything will boil down to a technical question that I think we can answer.
00:51:19.000 You want a hoverboard?
00:51:20.000 We could probably figure it out.
00:51:22.000 Well, then also with quantum computing and one of the things about AI that's been talked about is this massive need for energy.
00:51:31.000 And so they're going – at least it's been proposed to develop nuclear sites specifically to power AI, which is wild.
00:51:39.000 Yeah.
00:51:41.000 I have to be – You got to dance around this?
00:51:48.000 No, I'll tell you what I think.
00:51:54.000 Okay, well, maybe before I give you my opinion, I'll tell you the facts.
00:52:01.000 Okay.
00:52:05.000 Today, it costs about 4 cents a kilowatt hour.
00:52:09.000 Just don't forget the units.
00:52:10.000 Just remember the 4 cents concept.
00:52:12.000 20 years ago, it cost like 6 or 7 cents.
00:52:15.000 If you go and get solar panels on your roof, it basically costs nothing.
00:52:20.000 In fact, you can probably make money.
00:52:22.000 So it costs you like negative 1 cent because you can sell the energy in many parts of America back to the grid.
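A back-of-the-envelope using the per-kilowatt-hour figures just quoted; the household usage number is an assumed round figure, not from the episode:

```python
# Back-of-the-envelope household swing using the quoted $/kWh figures.
# The annual usage is an assumed round number, not from the episode.
grid_cost = 0.04        # $/kWh, the "4 cents" quoted for today
rooftop_cost = -0.01    # $/kWh, net of selling surplus back to the grid
annual_usage_kwh = 10_000

swing = (grid_cost - rooftop_cost) * annual_usage_kwh
print(f"${swing:,.0f} per year per home")  # ~$500, times ~110 million homes
```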
00:52:29.000 But if you look inside the energy market, the cost has been compounding.
00:52:34.000 And you would say, well, how does this make sense?
00:52:36.000 If the generation cost keeps falling, why does my end-user cost keep going up?
00:52:43.000 This doesn't make any sense.
00:52:45.000 And when you look inside, we have a regulatory...
00:53:02.000 We're giving you a monopoly, effectively.
00:53:04.000 In this area of Austin, you can provide all the energy.
00:53:08.000 Now, Texas is different, but I'm just using it as an example.
00:53:11.000 But in return, I'm going to allow you to increase prices, but I'm going to demand that you improve the infrastructure.
00:53:21.000 Every few years, you've got to upgrade the grid.
00:53:23.000 You've got to put money into this, money into that.
00:53:26.000 Over the next 10 years, we've got to put a trillion dollars, America collectively, into improving the current grid, which I think will not be enough, because it is aging, and most importantly, it's insecure.
00:53:42.000 Meaning folks can penetrate that, folks can hack it, folks can do all kinds of stuff.
00:53:47.000 And then it fails in critical moments.
00:53:50.000 I think that in Austin you had a whole bunch of really crazy outages in the last couple of years.
00:53:55.000 People died.
00:53:57.000 In 2024, that's totally unacceptable.
00:54:01.000 So I think as people decide that they want resilience, you're going to see 110 million power plants, which is every homeowner in the United States.
00:54:17.000 Everybody's going to generate their own energy.
00:54:19.000 Everybody's going to store their energy in a power wall.
00:54:23.000 This stuff is going to become, I mean, absolutely dirt cheap.
00:54:28.000 And it'll just be the way that energy is generated.
00:54:32.000 So you have this, but this is not the whole solution, because you still need the big guys to show up.
00:54:39.000 When you look inside of like the big guys, so like now you're talking about these 2000 utilities that need to spend trillions of dollars.
00:54:47.000 They can do a lot of stuff right now to make enough energy to make things work.
00:54:53.000 But when you look at nuclear, I would just say that there are two different kinds of nuclear.
00:55:00.000 There's the old and the new.
00:55:01.000 The old stuff, I agree with you, it's just money and you can get it turned back on.
00:55:07.000 It's a specific isotope of uranium.
00:55:09.000 You can deal with it.
00:55:10.000 Everybody knows in that world how to manage that safely.
00:55:15.000 But then what you have are like these next generation things.
00:55:18.000 And this is where I get a little stuck.
00:55:20.000 And I'm not smart enough to know all of it.
00:55:22.000 But I'm close enough to be slightly ticked off by it.
00:55:27.000 There's a materials and a technical problem with these things.
00:55:31.000 And what I mean goes back to materials.
00:55:33.000 Some of these next-gen reactors need a material that will take you, like, 50 years in America and the world to, like, harvest an ounce.
00:55:44.000 The only place where you can really get it is the moon in sufficient quantity.
00:55:48.000 Are you really going to—I mean, that's how it's going to work?
00:55:52.000 Like, you're going to— Go to the moon to harvest energy.
00:55:53.000 You're going to go to the moon.
00:55:54.000 You're going to harness this material.
00:55:56.000 Then, you know, schlep it all the way back to some place in Illinois to make—I find that hard to believe.
00:56:04.000 What is the material?
00:56:06.000 I can find it.
00:56:07.000 It's in an email that one of my folks sent me.
00:56:10.000 But it's like—it's a certain form of reactor that uses a very rare material to create the plasmonic energy that can generate all of this stuff.
00:56:21.000 And it's just very hard to find on Earth.
00:56:23.000 So I kind of scratch my head.
00:56:24.000 What's the benefit of this particular type of reactor?
00:56:27.000 Enormous energy.
00:56:29.000 So like, you know, a solar cell gets this much energy, you know, a nuclear reactor does this, and like this other thing does that.
00:56:36.000 And it's super clean and...
00:56:37.000 So my point is, these next-gen reactors, I think, have some pretty profound technical problems that haven't been figured out.
00:56:44.000 I applaud the people that are going after it.
00:56:49.000 But I think it's important to not oversell that because it's super hard and there's still some profound technical challenges that haven't been solved yet.
00:57:00.000 You know, we just got past what's called positive net energy, meaning, you know, I'm making up a number, you put 100 units of energy in and you try to get out at least, like, 100.01.
00:57:19.000 And we're kind of there.
00:57:21.000 So that's where we are on these next-gen reactors.
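The "positive net energy" milestone being described is usually written as the fusion gain factor Q; the December 2022 National Ignition Facility shot is the commonly cited result, with figures as widely reported:

$$Q \equiv \frac{E_{\text{out}}}{E_{\text{in}}} \approx \frac{3.15\ \text{MJ fusion yield}}{2.05\ \text{MJ laser energy on target}} \approx 1.5$$

The "100 units in, 100.01 out" framing is the right hedge: that gain counts only the energy delivered to the target, and the wall-plug energy that drove the lasers was roughly a hundred times the fusion yield, so whole-facility gain remains far below 1.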
00:57:24.000 The old generation of reactors, I'm a total believer in.
00:57:28.000 And we should be building these things as fast as possible so that we have an infinite amount of energy.
00:57:35.000 By the way, if you have infinite energy, the most important thing I think that happens is you have a massive peace dividend.
00:57:42.000 The odds of the United States going to war when we have infinite energy approach zero.
00:57:48.000 But isn't the problem...
00:57:51.000 With introducing this to other countries, and I believe it was India where they introduced nuclear power plants, then they realized very quickly they could figure out how to make nuclear weapons from that.
00:58:02.000 Yes, when the uranium degrades, it can be used to make weapons-grade uranium.
00:58:06.000 And the real problem would be if that is not a small handful of countries that have nuclear weapons, but the entire world, it could get very sketchy.
00:58:17.000 I think you're touching what I think objectively to me is the single biggest threat facing all of us today.
00:58:39.000 I escaped a civil war.
00:58:40.000 So I've had a lived experience of how destructive war can be.
00:58:46.000 The collateral damage of war is terrible.
00:58:49.000 Where were you?
00:58:50.000 In Sri Lanka.
00:58:51.000 And, you know, I was part of the ethnic majority, Sinhalese Buddhist.
00:58:57.000 But, you know, they were fighting the Hindu Tamil minority.
00:59:03.000 And it was a 20-year civil war.
00:59:06.000 It flipped the whole country upside down from an incredible place with 99% literacy to just a, you know, a struggling developing third world country.
00:59:16.000 And so we moved to Canada.
00:59:19.000 We stayed in Canada.
00:59:22.000 My parents did whatever they could.
00:59:24.000 And they got run over by that war.
00:59:27.000 They went from a solidly middle-class life to my father struggling with alcoholism and not really working.
00:59:36.000 And my mother went from being a nurse to being a housekeeper.
00:59:39.000 And it was dysfunctional.
00:59:43.000 It really crippled, I think, their dreams for themselves.
00:59:47.000 And so, you know, they breathe that into their kids.
00:59:49.000 Fine.
00:59:50.000 But that can't be the solution where hundreds of millions or billions of people have to deal with that risk.
00:59:56.000 And I am objectively afraid that we have lost the script a little bit.
01:00:02.000 I think that folks don't really understand how destructive war can be, but also that there are not enough people objectively afraid of this.
01:00:14.000 And that's what sends my spidey senses up and says, hold on a second.
01:00:18.000 When everybody is telling you that this is off the table and not possible, shouldn't you just look at the world around you and ask, are we sure that that's true?
01:00:30.000 And I come and I think to myself, wow, we are at the biggest risk of my lifetime.
01:00:37.000 And I think the only thing that is probably near this is maybe at some point in the Cold War, I don't know because I was so young, definitely the Cuban Missile Crisis.
01:00:50.000 But it required JFK to draw a hard line in the sand and say, absolutely not.
01:00:58.000 Will we be that fortunate this time around?
01:01:00.000 Are we going to find a way to eliminate that existential risk?
01:01:04.000 This is why my current vein of political philosophy is mostly that, which is the Democrats and the Republicans, there's just so much fighting over so many small-stakes issues, in the sense that some of these issues matter more or less at different points,
01:01:26.000 but there is one issue above all where, if you get it wrong, nothing matters, and that is nuclear war.
01:01:33.000 And you have two and a half nuclear powers now that are out and about extending and projecting their power into the world—Russia, China, and Iran.
01:01:50.000 That wasn't what it was like 10 years ago.
01:01:52.000 That wasn't what it was like 25 years ago.
01:01:55.000 It wasn't even what it was like four years ago.
01:01:58.000 I just don't think enough people take a step back and say, hold on a second, if this thing escalates, all this stuff that you and I just talked about won't matter.
01:02:08.000 Whether our kids are on Adderall or not, or the iPad, don't give them so much Fortnite, or material science, or Optimus.
01:02:18.000 It's all off the table because we will be destroying ourselves.
01:02:24.000 And I just think that that's tragic.
01:02:26.000 We have an enormous responsibility right now for the village elders of the world to tell people, guys, we are sleepwalking into something that you can't walk back from.
01:02:38.000 One of the strangest things about us is the kind of wisdom that's necessary to sort of see the future and prognosticate and see where this could go, especially based on the history of human beings and how many times things have fallen apart. You were talking about Sri Lanka,
01:02:56.000 but there's many examples all over the world of civilizations that were thriving, that were pounded into dust.
01:03:03.000 And because every day is similar for us, we have this inability to look forward and to make that leap and see the potential for disaster that all these things have.
01:03:17.000 And this is what freaks me out about when people talk openly about, you know, we have to win with Russia versus Ukraine.
01:03:25.000 I'm like, what are you talking about?
01:03:27.000 What is winning?
01:03:28.000 Yeah.
01:03:28.000 What does that mean?
01:03:29.000 This sounds insane.
01:03:30.000 And then applauding the long-range attacks into Russia now, like this escalation.
01:03:37.000 Oh, you know, they're attacking Russia now.
01:03:39.000 They'll show them.
01:03:40.000 Are you in a movie?
01:03:42.000 Do you think that this always ends up with the good guys winning?
01:03:46.000 Because that's not the case in human history at all.
01:03:50.000 And not only that, there is no good guy if people start launching nukes.
01:03:53.000 Everybody's a bad guy.
01:03:54.000 And everybody's fucked.
01:03:55.000 And that's on the table.
01:03:58.000 When you see long-range Israeli bombing campaigns in Lebanon and you see...
01:04:04.000 What's going on with Ukraine and Russia?
01:04:06.000 Who knows?
01:04:07.000 Who knows how this escalates?
01:04:09.000 Who knows what the retaliatory response is?
01:04:12.000 Who knows what the response to the response is?
01:04:15.000 Let me add to this by saying we know what the response will not be.
01:04:21.000 It will not be measured.
01:04:23.000 It will not be calm.
01:04:24.000 It will not be, hey, let's get on the phone and talk about it.
01:04:27.000 Right.
01:04:27.000 Like the thing is, there was a long period of time where, you know, America was the leading moral actor in the world, right?
01:04:37.000 And I think that we spoke from a place of wisdom, but also like earned respect.
01:04:43.000 But we forget that at the end of the Cold War, it's not that we vanquished the USSR as much as they imploded from within.
01:04:52.000 It was just an economic calamity.
01:04:54.000 They just couldn't afford to keep up with us.
01:04:57.000 And the reason was we had these two edges.
01:05:00.000 We had a technological edge and we had an economic edge.
01:05:08.000 And when you put those two things together, it created a lot of abundance.
01:05:12.000 Now, we can talk about how some of that is not equal, which I also agree with, but it allowed America to be sort of effectively, for a long period of time, the top dog.
01:05:26.000 The honest reality is that's not where we are today.
01:05:29.000 We are one of two or three.
01:05:32.000 And the problem with that is that you can't look back in history and try to live your life like what it was like in the good old days.
01:05:40.000 You know, we're not the high school football star anymore.
01:05:44.000 So we need to live in a more modest way, in a more reliable and consistent way, with neighbors that have also done well for themselves, and just realize that they have their own incentives.
01:05:58.000 And when you tell them to do something, they're not always going to listen.
01:06:02.000 So if we don't understand that and find a way to de-escalate these things, what you said is going to happen.
01:06:11.000 Something is going to be one step too far, a reaction, a reaction, a reaction, and then eventually somebody will overreact.
01:06:22.000 And that is all just so totally avoidable.
01:06:26.000 And it just frustrates me that we objectively don't understand that.
01:06:31.000 We sweep it under the carpet and we talk about all the other things.
01:06:37.000 And I understand that some of those things, all of those things, let's say, matter.
01:06:42.000 But at some point in time, nothing matters.
01:06:46.000 Because if you don't get this right, nothing matters.
01:06:50.000 And I think we have to find a way of finding people that draw a bright red line and say, this is the line I will never cross under any circumstance.
01:07:00.000 And I think America needs to do that first because it's what gives everybody else the ability to exit stage left and be okay with it.
01:07:07.000 The other problem that America clearly has is that there's an enormous portion of what controls the government, whether you want to call it the military-industrial complex or military contractors.
01:07:22.000 There's so much money to be made in pushing that line.
01:07:25.000 Pushing it to the brink of destruction but not past.
01:07:29.000 Maintaining a constant state of war but not an apocalypse.
01:07:33.000 And as long as there's financial incentives to keep escalating, and you're still getting money, and they're still signing off on hundreds of billions of dollars to funnel this, and it's all going through these military contractors bringing over weapons and gear, the windfall is huge.
01:07:53.000 The amount of money is huge and they do not want to shut that off for the sake of humanity, especially if someone can rationalize.
01:08:00.000 You get this diffusion of responsibility when there's a whole bunch of people together and they're all talking about it.
01:08:04.000 Everyone's kind of on the same page and you have shareholders that you have to represent.
01:08:08.000 Like the whole thing is bananas.
01:08:10.000 So I think you just said the key thing.
01:08:12.000 This may be super naive.
01:08:17.000 But I think part of the most salvageable feature of the military-industrial complex is that these are for-profit, largely public companies that have shareholders.
01:08:30.000 And I think that if you nudge them to making things that are equally economically valuable or more, ideally more, they probably would do that.
01:08:46.000 What would be an example of that other than weapons manufacturing?
01:08:50.000 Like what would be equally economically viable?
01:08:52.000 So part of it – when you look at the primes, the five big kind of like folks that get all of the economic activity from the Department of Defense – what they act as is an organizing principle for a bunch of subs underneath,
01:09:12.000 effectively.
01:09:12.000 They're like a general contractor, and then they have a bunch of subcontractors.
01:09:17.000 There's a bunch of stuff that's happening in these things that you can reorient if you had an economy that could support it.
01:09:26.000 So, for example, when you build a drone, okay?
01:09:31.000 What you're also building is a subcomponent, a critical and very valued subcomponent: all the navigation, all the communications, all of it has to be encrypted.
01:09:39.000 You can't hack it.
01:09:40.000 You can't do any of that stuff.
01:09:43.000 There is a broad set of commercial applications for that that are equal to and greater than just the profit margin of selling the drone, but they don't really explore those markets.
01:09:54.000 If, for example, we are multi-planetary, I'll just go back to that example, I will bet you those same organizations will make two or three times as much money by being able to redirect that same technology into those systems that you just described.
01:10:15.000 Hey, I need an entire communications infrastructure that goes from Earth to the Moon to Mars.
01:10:20.000 We need to be able to triangulate.
01:10:22.000 We need internet access across all these endpoints.
01:10:24.000 We need to be real-time from the get-go.
01:10:28.000 There's just an enormous amount of economic value to do that.
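For scale on the "real-time from the get-go" requirement, the hard physical floor is one-way light delay, t = d/c:

$$t_{\text{Moon}} \approx \frac{3.84 \times 10^{5}\ \text{km}}{3.0 \times 10^{5}\ \text{km/s}} \approx 1.3\ \text{s}, \qquad t_{\text{Mars}} \approx 3\ \text{to}\ 22\ \text{min}$$

depending on where Earth and Mars sit in their orbits. So an Earth-to-Moon link can feel close to real-time, while Earth-to-Mars traffic carries minutes of unavoidable latency that any such infrastructure has to design around.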
01:10:32.000 So, again, we have these very siloed parts of the economy that are limited by what we know.
01:10:39.000 And what we know is part of what you just said, which is we built these things, and then some people convince others to use the things that we built.
01:10:47.000 And I think instead of saying like there's like some crazy nefarious plot to always go to war, instead if we say if you make the thing and you could sell it to a different market and make more money, would these people do it or are they hell-bent on war?
01:11:01.000 I think it's more that they would just do it.
01:11:03.000 As long as there's not still a business for war.
01:11:07.000 So then it reduces war to a very different kind of business.
01:11:13.000 I think it's more kind of drone-oriented.
01:11:21.000 I'm not saying that war will go away.
01:11:25.000 There's no utopia where war goes away.
01:11:28.000 Which is a crazy thing to say, really.
01:11:30.000 Unfortunately, I think that we're always going to be fighting for some resource.
01:11:34.000 So the last 30 or 40 years, all these forever wars, we were fighting over energy, effectively.
01:11:39.000 This is my one bright spot that I think about with AI as well.
01:11:44.000 It's like everyone's terrified of the open border situation and criminals coming across the border.
01:11:51.000 Wouldn't the solution be to not have a place like a desperate third-world country where people are trying to escape on foot with their families?
01:12:01.000 If we were living next door to another United States, and you could just travel freely back and forth between the two of them, because it really didn't matter.
01:12:10.000 Both of them are equally safe.
01:12:11.000 Both of them have equal economic prosperity.
01:12:14.000 Both of them are, you know, equally democratically governed.
01:12:18.000 No problem.
01:12:19.000 Just go over there.
01:12:20.000 Go over here.
01:12:20.000 I mean, it kind of used to be like that with Canada.
01:12:22.000 You know, with Canada, you used to go over there with a driver's license.
01:12:25.000 You used to be able to go back and forth between the United States.
01:12:28.000 I mean, I think the first time I went to Canada, I did not have a passport.
01:12:30.000 I had a driver's license.
01:12:32.000 And it was kind of the same sort of deal.
01:12:35.000 It was just accepted.
01:12:36.000 Oh, that place is cool.
01:12:37.000 We're cool.
01:12:37.000 We're next to each other.
01:12:39.000 If the whole world was like that...
01:12:40.000 It'd be incredible.
01:12:41.000 Right.
01:12:41.000 It would be incredible.
01:12:42.000 And I think AI makes that possible.
01:12:45.000 I think so, too.
01:12:46.000 I think it makes that possible, and we're going to have to deal with a few very uncomfortable factors, one of them being the illegal drug trade, and another one being the consequences of prohibition, and forcing people into doing things- Prohibition of drugs?
01:13:03.000 Of drugs, yes.
01:13:04.000 I'm not comfortable with that.
01:13:07.000 But I feel like the only way to disempower illegal drug manufacturing is to have legal drug manufacturing that's regulated.
01:13:15.000 The only way to stop fentanyl overdoses is to have cocaine become legal.
01:13:20.000 But the problem with that is you're going to get a bunch of people that are addicted to cocaine.
01:13:23.000 Can I ask you a question?
01:13:24.000 Sure.
01:13:25.000 I don't do drugs, so I don't understand.
01:13:27.000 What's the step before it?
01:13:29.000 What causes you to want to do fentanyl?
01:13:32.000 They don't do fentanyl on purpose.
01:13:34.000 They started because of a prescription?
01:13:36.000 No.
01:13:37.000 Most fentanyl overdoses are from fentanyl that's cut into other drugs, particularly party drugs like molly and ecstasy, cocaine, even heroin, things along those lines where people think that they're getting a pure thing, but they're getting it from the cartel.
01:13:53.000 It's laced with this?
01:13:54.000 Yes.
01:13:56.000 It's cheap, and very small amounts of fentanyl do incredible damage.
01:14:01.000 Like the amount of fentanyl that can kill you is like the head of a nail.
01:14:04.000 It's very small.
01:14:05.000 Have you seen it in relationship to a penny?
01:14:07.000 I've seen a picture of it.
01:14:08.000 It's crazy.
01:14:09.000 So the problem is if you have a drug and you've cut it with a bunch of other things because you want to sell as much of it as possible, you add a bunch of things into it.
01:14:19.000 And to increase the potency, they add fentanyl.
01:14:22.000 And because of that, it's all done illegally, it's unregulated, so a lot of people die.
01:14:27.000 And the numbers in the United States, I think, are upwards of 100,000 people a year.
01:14:31.000 Is there something to do to motivate folks to not do the party drugs?
01:14:36.000 Well, I think you're going to have to have a massive education campaign, and people are going to have to understand it the same way they understand cigarettes.
01:14:43.000 Like, cigarette smoking in young people is down quite a bit from the 80s, right?
01:14:48.000 And I think that's because of people understanding the consequences of it.
01:14:52.000 But you're always going to have people that want to smoke cigarettes.
01:14:55.000 And my belief is that they should be able to smoke cigarettes.
01:14:58.000 I don't think you should do Adderall all day, but if you get a prescription, you can do it.
01:15:03.000 But at least in my mind, you're getting Adderall from a pharmacy and that pharmacy is going to give you actual Adderall and not some fentanyl-laced thing that's going to kill you and you have no idea.
01:15:15.000 You think you're taking the same thing you've always taken and then one day you're dead.
01:15:19.000 And that's the case with a lot of people today, especially with party drugs.
01:15:24.000 I don't think you should do heroin.
01:15:43.000 To relationships and families, to societies, how many alcoholics, how many drunk driving accidents, how many drunk people murdered other people.
01:15:51.000 There's just horrible consequences of alcohol, but I completely support alcohol being legal.
01:15:57.000 Yeah, same.
01:15:57.000 But we have learned how to consume alcohol as a culture.
01:16:01.000 What about what happened in Oregon?
01:16:03.000 You know, like Oregon, where they legalized it, and now they've recriminalized it?
01:16:07.000 Well, first of all, they were already off the rails.
01:16:11.000 This is not like legalizing drugs in San Francisco in the year 2000. It's a completely different scenario.
01:16:23.000 So you have these people that are really...
01:16:27.000 They're accustomed to tents and subsidizing drug addicts, and they're accustomed to this very bizarre breakdown of civil society, where you're seeing open-air drug markets and everyone's fine with it, and it's somehow or another kind and compassionate to allow this to take place everywhere.
01:16:48.000 And that's what you have in Portland, right?
01:16:50.000 Portland is probably one of the most liberal cities that we have, one of the most leftist cities that we have.
01:16:54.000 So for Oregon to do it that way, I think it's, like, an awesome libertarian notion to say, you know what?
01:17:02.000 We shouldn't make any of these things a crime.
01:17:06.000 These are personal choices and you can make good personal choices or bad ones and we'll have everything.
01:17:11.000 But the problem is the fabric of society, the encouragement of discipline and of hard work and of accomplishment had been eroded to the point where accomplishment meant that there was something wrong with you.
01:17:25.000 Right.
01:17:25.000 Like, if you were a person that was eat the rich, tax the rich: if a person had accomplished something great, it wasn't because of some extraordinary effort they put in, even if it was.
01:17:34.000 It was that you did something to fuck over other people, and that's the only way you get rich in this world.
01:17:39.000 And it's just bizarre.
01:17:42.000 And it permeated Portland.
01:17:44.000 So when you introduce free heroin to that, you're going to get more problems.
01:17:51.000 And you're subsidizing people for living on the streets, which they do.
01:17:54.000 I mean, I watched this interview where they were talking to these people in the Pacific Northwest, where they moved there specifically so they could be homeless.
01:18:01.000 Because they knew that they'd get money.
01:18:03.000 And there's no incentive to get out of those tents.
01:18:06.000 There's no incentive.
01:18:07.000 There's free food and free drugs, and they give you money.
01:18:10.000 And they're like, okay.
01:18:11.000 Right, so that didn't fail because of the actual drug policy, but more because of the other social policies.
01:18:16.000 I think, have you ever heard of Dr. Carl Hart?
01:18:19.000 No.
01:18:20.000 He's a professor at, is he at Columbia?
01:18:24.000 I believe he's at Columbia.
01:18:26.000 Carl Hart was a straight chemist, like a guy studying chemistry and studying these substances at Columbia.
01:18:37.000 And he was a clinical researcher.
01:18:40.000 And along the way, he started realizing that our understanding of these drugs and the pros and cons of them had been flavored heavily by propaganda, particularly the sweeping Controlled Substances Act of 1970 that made almost everything illegal,
01:18:56.000 which was really to target civil rights groups and anti-war people.
01:19:00.000 And so this was during the Nixon administration.
01:19:02.000 They made a bunch of things that were psychedelics.
01:19:05.000 MKUltra, is that when that happened?
01:19:07.000 Well, MKUltra was actually before that.
01:19:09.000 It was when they were experimenting with people, particularly with LSD. So when they started doing this, they made everything illegal, and now the only way you can get any of these things is through illegal sources.
01:19:24.000 So you're getting them through the cartels.
01:19:25.000 So if we don't have the ability to legalize things...
01:19:31.000 And the problem with legalization, there's no good answer here.
01:19:36.000 So to keep things illegal, you're going to have fentanyl overdoses.
01:19:39.000 I know people have lost their children.
01:19:41.000 I know people have lost their brothers and sisters to this.
01:19:44.000 It's a horrible thing that happens.
01:19:48.000 You're also going to get heroin overdoses if it's legal.
01:19:50.000 So, Jesus Christ. And you're going to get more people to try it because it's legal.
01:19:55.000 But it's just like prohibition during the 1920s or the 1930s, rather.
01:20:00.000 When they had that, what they did was they enabled the mob, and they enabled organized crime, and that was the rise of Al Capone, and that was the rise of the moonshiners. You essentially... you always had a demand, and the people that were willing to supply that demand were criminals.
01:20:18.000 And they were criminals in this country.
01:20:19.000 We're empowering criminals that are essentially running Mexico, which is bizarre.
01:20:24.000 You know, I know during the latest election...
01:20:26.000 How many assassinations were there?
01:20:27.000 Was it 37 or 35?
01:20:29.000 During their latest elections...
01:20:31.000 In Mexico?
01:20:32.000 In Mexico.
01:20:33.000 There were at least 35 assassinations.
01:20:37.000 Wow.
01:20:38.000 Are there European countries that do this well?
01:20:40.000 Portugal.
01:20:42.000 Portugal has decriminalized everything.
01:20:44.000 And they saw a massive decrease in HIV, a massive decrease in drug addiction and all sorts of other things.
01:20:50.000 I don't know what the long-term study is on that.
01:20:53.000 37. 37!
01:20:56.000 37 assassinated candidates.
01:20:58.000 I mean, you have to play ball with the cartel over there.
01:21:02.000 You know, just like you had to play ball with the mob in the 1930s.
01:21:06.000 You have to play ball.
01:21:08.000 They have the guns and they're really mean and they'll do whatever the fuck they want to.
01:21:12.000 The 2021 midterms when 36 candidates were killed.
01:21:16.000 Oh my God.
01:21:16.000 Jeez.
01:21:17.000 Oh my God.
01:21:18.000 Jeez.
01:21:19.000 Yeah, it's wild down there.
01:21:21.000 That is a direct...
01:21:23.000 result of having trillions of dollars being made by selling illegal drugs, most of them made to sell to America.
01:21:32.000 I'm sure they sell them to other places as well.
01:21:34.000 Is it all fentanyl, or is it all...
01:21:36.000 Oh, it's all kinds of things.
01:21:37.000 No, I don't think there's a lot of money in that.
01:21:39.000 The real money is in, like, meth, cocaine.
01:21:43.000 They have a problem with illegal marijuana that's grown in the United States on national forest land.
01:21:50.000 Because what happened is, especially in California, my friend John Nores, he wrote a book called Hidden War, and he was a fish and game officer and, you know, basically wanted to be the guy that, like, checks your fishing license.
01:22:03.000 Like, great job, you're out in the outdoors.
01:22:05.000 And one day...
01:22:08.000 It's the beginning of a movie.
01:22:10.000 It is the beginning of a movie.
01:22:11.000 I'm sure they're probably doing a movie on it.
01:22:13.000 But one day, they find this creek that has dried up, and they think that perhaps...
01:22:18.000 Oh, they diverted the water.
01:22:19.000 Someone's diverted the water.
01:22:20.000 They thought it was a farmer that had done something inappropriate or whatever.
01:22:23.000 So they follow the creek up, and they find this illegal grow op that's run by the cartel.
01:22:28.000 And then they become a tactical unit, and they have Belgian Malinois and bulletproof vests and machine guns.
01:22:36.000 The whole thing's crazy, and they get in shootouts with the cartel in National Forest Land.
01:22:39.000 Oh, my God.
01:22:40.000 Because it's a misdemeanor to grow pot illegally in a state where pot is legal.
01:22:46.000 So California has legal marijuana.
01:22:48.000 You could go to any store, anywhere, use credit cards.
01:22:52.000 It's open, free market.
01:22:54.000 If you follow the rules, you can open up a store.
01:22:57.000 But if you don't follow the rules, you can sell it illegally, and it's just a misdemeanor.
01:23:01.000 I wanted to learn about marijuana, the market.
01:23:05.000 But you can't process the money, I think, right?
01:23:07.000 In some states.
01:23:09.000 I know in Colorado, it was a real problem.
01:23:11.000 And in Colorado, they had to do everything in cash.
01:23:14.000 Yeah, it's like Breaking Bad, like bricks of cash and all of these places.
01:23:16.000 Well, they were using mercenaries.
01:23:18.000 They're essentially using military contractors to run the money back and forth to the bank, because you had to bring the bank the money in bulk.
01:23:28.000 So you'd have a million dollars in an armored car, and a bunch of guys tailing the car in front of the car, and they're driving it to the bank, and everyone knows there's a million dollars in that car.
01:23:39.000 So you have to really be fortified.
01:23:42.000 And so it was very sketchy for a lot of people.
01:23:45.000 I don't know what the current condition in Colorado is now.
01:23:49.000 I don't know if they still have to do it that way.
01:23:52.000 A couple of companies... I remember, the reason I know this is, like, a guy came and pitched me on some business, and he was, like, the software for all that.
01:24:02.000 And, you know, I think the company went public, and I just realized it went sideways, because, like, nobody wanted to touch it, because they didn't want to build rails.
01:24:11.000 Yeah.
01:24:24.000 Yeah, it's great.
01:24:25.000 And they're trying to diminish that.
01:24:28.000 The latest step during the Biden administration is to change it to Schedule 3. And that's a proposal.
01:24:34.000 That would help.
01:24:36.000 But really, it should be just like alcohol.
01:24:38.000 It should be something that you have to be 21 years old to buy.
01:24:41.000 You should have to have an ID. And we should educate people how to use it responsibly.
01:24:45.000 And we should also pay attention to...
01:24:47.000 whoever the fuck is growing it, and make sure they're not going wacky.
01:24:51.000 There's people that are botanists that are out of their mind potheads that are just 24-7 hitting bongs, and they're making stuff that'll put you on Mars without Elon Musk.
01:25:01.000 I remember the...
01:25:03.000 The problem that somebody raised, I read this in an article, was you need to make it more legal than it is today so that you can get folks to put some version of a nutritional label on the thing and show intensity,
01:25:19.000 right?
01:25:19.000 Because the intensity is not regulated.
01:25:20.000 Right.
01:25:21.000 Right.
01:25:21.000 Well, they do regulate it in California.
01:25:23.000 If you go to good places in California, let's say this is 39% THC, which is very high.
01:25:28.000 This is 37. But then there's also the problem with one thing that marijuana seems to do to some people that alcohol doesn't necessarily...
01:25:39.000 Some people have a propensity for alcoholism, and it seems to be genetic.
01:25:43.000 But there's a thing that happens with marijuana where people who have a tendency towards schizophrenia, marijuana can push them over the edge.
01:25:51.000 And Alex Berenson wrote a great book about this called Tell Your Kids.
01:25:55.000 And I've personally witnessed people who've lost their marbles.
01:25:59.000 And I think it's people that have this propensity.
01:26:04.000 Because one of the things that I think is beneficial about marijuana in particular, and this is one of the things that freaks people out, is the paranoia.
01:26:13.000 Right?
01:26:14.000 Well, paranoia, I feel like what it is, is a hyper-awareness.
01:26:18.000 And I think it pushes down all these boundaries that you've set up, all these walls, and all these blinders, so that you see the world for what it really is.
01:26:28.000 And it freaks a lot of people out.
01:26:29.000 But what I think it does is it...
01:26:32.000 Ultimately makes you more compassionate and kinder and nicer.
01:26:35.000 And you realize like...
01:26:36.000 In the moment or after?
01:26:38.000 Afterwards.
01:26:38.000 Afterwards.
01:26:39.000 I think it's a tool for recognizing things that you are conveniently ignoring.
01:26:47.000 And, you know, my friend Eddie told me about this once.
01:26:50.000 He was saying, if you're having a bad time and you smoke marijuana, you're going to have a worse time.
01:26:55.000 Because you're already freaking out.
01:26:56.000 You're already freaking out about something.
01:26:58.000 If you're going through a horrible breakup and you get high, like, oh, no one loves me!
01:27:02.000 But if you're having a great time with your friends, you'll probably just laugh and be silly, right?
01:27:07.000 Because you're not freaking out about something.
01:27:09.000 You're probably in a good place mentally, which we should all strive to be in a good place.
01:27:14.000 I have this weird psychosomatic...
01:27:19.000 My father was an alcoholic, and I didn't drink at all in my teens, in my 20s, and mostly in my 30s.
01:27:28.000 And then in my mid-30s, I started drinking wine, and I love wine, and I think I can handle it.
01:27:33.000 I really enjoy it.
01:27:34.000 I love it.
01:27:35.000 I do, too.
01:27:36.000 But I cannot drink hard alcohol.
01:27:39.000 The minute that it touches my lips, I get severe hiccups.
01:27:43.000 I mean, like, debilitatingly bad hiccups.
01:27:46.000 Really?
01:27:47.000 Any kind of alcohol.
01:27:48.000 So you think it's psychosomatic?
01:27:49.000 I think it's completely psychosomatic because it makes no logical sense.
01:27:52.000 Right.
01:27:53.000 If the tequila touches my lip, I just start hiccuping like crazy.
01:27:56.000 And it's like this weird protective thing that I think my brain has developed because my dad used to drink some stuff that would, like, make you blind.
01:28:05.000 Like moonshine.
01:28:07.000 It was like 150 proof, and the guy would just chug it.
01:28:10.000 I mean, he was...
01:28:10.000 Well, I think there are whiskey connoisseurs, and there's, like, scotch. Like, old scotch does have a fantastic taste.
01:28:20.000 It's got an interesting sort of an acquired taste.
01:28:24.000 But there's real wine connoisseurs.
01:28:26.000 Wine is incredible.
01:28:27.000 Wine is a different animal.
01:28:29.000 The flavor of wine is spectacular.
01:28:31.000 It's the most delicious of all alcohols without being sweet.
01:28:34.000 I completely agree with you.
01:28:36.000 Yeah.
01:28:36.000 It's a different thing.
01:28:37.000 Like, the people that say they're tequila connoisseurs, like, shut up.
01:28:40.000 It all tastes like shit.
01:28:42.000 Some of it just tastes less like shit.
01:28:44.000 The great tequila tastes less shitty.
01:28:46.000 It's alcohol.
01:28:47.000 Yeah.
01:28:47.000 And it's like, then there's some flavor around the alcohol.
01:28:50.000 Yeah, I drink a glass of wine with a steak and it's like, oh, this wine's fucking great.
01:28:55.000 I totally agree with you.
01:28:56.000 And it puts you in a calm state; it relaxes me.
01:29:01.000 You don't want to go drive fucking wild and get crazy and get in a fight when you're drinking wine.
01:29:05.000 Exactly.
01:29:07.000 Right.
01:29:10.000 Right.
01:29:29.000 And then when you discover and you learn, like, I remember, like, my wife's Italian, so we spent a lot of time in Italy in the summer with our kids.
01:29:36.000 And when we were there, you find these incredible Italian white Chardonnays, okay?
01:29:43.000 In the summer, I'm just going to be honest with you, there is nothing better to drink in the world.
01:29:48.000 It's better than water.
01:29:49.000 It's cold.
01:29:50.000 It's refreshing.
01:29:52.000 It's got a great bouquet.
01:29:54.000 But these bottles are 40, 50, 60, 80, maybe maximum 100 euros.
01:29:59.000 Max.
01:30:00.000 And then you'll spend $1,000 on some stupid white burgundy from France.
01:30:05.000 And it tastes half as good.
01:30:07.000 You know, the other great example of this is like Chateau Petrus.
01:30:11.000 I don't want to get in trouble with the Chateau Petrus guys, but I'll just be honest.
01:30:15.000 Those bottles cost $2,000, $3,000, $4,000, $5,000, $6,000.
01:30:18.000 You see some in restaurants, $19,000.
01:30:20.000 I've never bought one.
01:30:21.000 I've tasted it once, because in Vegas I had a host, and once he gave me a thing, he says, Chamath, you can use this for your dinner.
01:30:32.000 And I'm like, well, is there a cap?
01:30:34.000 And he's like, this time, no cap.
01:30:37.000 And first I was like, God, I must have lost a lot of money.
01:30:41.000 When they say no cap, I was like, so then I said, fuck it, I'm just going to try this.
01:30:49.000 So I went to Caesars, and it was like $4,000 or $5,000.
01:30:53.000 I mean, I would never buy this normally.
01:30:55.000 But I got it because it was free.
01:30:58.000 Joe's okay.
01:30:59.000 All this buildup in my mind.
01:31:01.000 This is, oh my god, this is going to be ethereal.
01:31:03.000 It's going to be ambrosia.
01:31:04.000 It was not ambrosia.
01:31:07.000 Whereas, you can find these other ones that are made by, you know, folks that just put their entire lives into it.
01:31:14.000 You taste the whole story.
01:31:17.000 I just think it's incredible.
01:31:18.000 But it's a weird status thing, the expensive wine.
01:31:21.000 It's just like Cuban cigars.
01:31:23.000 It's really dumb.
01:31:24.000 Yeah, it's a weird thing.
01:31:25.000 The real skill is being able to know price versus value.
01:31:29.000 And when you know it, it's so satisfying because it's like, oh, this is just delicious.
01:31:34.000 And then when your friends enjoy it, they're like, oh, my God, this is delicious.
01:31:38.000 And I'm like, yeah, that's 80 bucks.
01:31:40.000 Yeah.
01:31:40.000 How?
01:31:41.000 And I'm like, well, it's very hard to find.
01:31:43.000 So then the skill is like, it's funny, I'll tell you, this is how bad wine has gotten for me.
01:31:49.000 Meaning like, I love the story.
01:31:51.000 I love the people.
01:31:52.000 I want to support the industry.
01:31:54.000 So I went and I registered for an alcohol license at the ABC in California.
01:32:00.000 Really?
01:32:01.000 Because I was tired and frustrated of trying to buy retail.
01:32:06.000 Because you have to go through folks that have their own point of view.
01:32:10.000 And I was like, well, if we just become, you know, we as in like me and a friend of mine.
01:32:16.000 And so we set up a thing.
01:32:18.000 We set up a little LLC. I filed the paperwork.
01:32:22.000 And it's called like, you know, CJ Wine LLC, you know, my friend, me and Joshua.
01:32:29.000 And we're able to negotiate directly with the wineries.
01:32:33.000 And we're able to get it from wholesalers in Europe or in South Africa or in Australia.
01:32:40.000 And it just allows us to buy a bottle, try it if we really like it.
01:32:45.000 Thursday nights at my house is always poker.
01:32:47.000 We serve it to our friends.
01:32:49.000 If they like it, then we can buy a couple cases that I can share with my friends, and you get it at wholesale.
01:32:53.000 It's a great little hack.
01:32:54.000 Is there a limitation?
01:32:56.000 Is there a certain specific amount that you have to buy?
01:32:59.000 No, I look like a retail store.
01:33:01.000 It could be like Amazon.
01:33:03.000 And so a retail store could just buy a few bottles?
01:33:05.000 They could buy a case.
01:33:07.000 They could buy a few bottles.
01:33:09.000 That's a little bit harder.
01:33:10.000 So you have to have a more personal relationship.
01:33:13.000 But then the really good stuff, you can buy a few cases and then pass them on to your friends.
01:33:18.000 I think wine's incredible.
01:33:20.000 And with food, it's incredible.
01:33:21.000 But when I hear people that are going to open up their own wine label, I'm like, oh, good lord.
01:33:27.000 How much do you know about wine?
01:33:29.000 Like, oh, I'm going to start a wine business.
01:33:31.000 Like, what?
01:33:32.000 I went to a couple of these wineries, and I just asked them, just out of like, just explain to me how you got there.
01:33:40.000 And all I could think of was, man, this is way too complicated.
01:33:44.000 But these folks, it's like animal husbandry.
01:33:47.000 They're breeding this vine with this vine, but then they're going to cleave off this little bit.
01:33:52.000 So it's a breeding program over 10 and 20 and 30 years, and it's like, this is really complicated.
01:33:59.000 Oh yeah, they do weird stuff.
01:34:01.000 They'll splice avocado trees with, what is that nut?
01:34:08.000 Pistachios.
01:34:09.000 So they'll take avocado trees and they splice them with pistachios to make the tree more sturdy.
01:34:15.000 You can take two different species of tree and if you cut them sideways and splice them together they'll grow.
01:34:21.000 A friend of mine started a company that's making, like, potatoes.
01:34:24.000 And he makes, like, these ginormous potatoes like this.
01:34:27.000 It's an incredible thing.
01:34:28.000 Because, like, the yield is through the roof.
01:34:30.000 And, you know, his vision, I think, is: I'll be able to feed the world in a cheaper and more abundant way.
01:34:39.000 But it's all this engineering.
01:34:40.000 He hacked the chromosome of the potatoes.
01:34:43.000 Oh my god.
01:34:43.000 That's incredible.
01:34:44.000 And like it generates a huge potato and it generates a seed.
01:34:49.000 I didn't know this, but potatoes don't have seeds.
01:34:51.000 In order to plant more potatoes, you chop up the potato into, like, quarters or eighths, and you stick the potato into the soil.
01:35:00.000 Oh, he's playing God.
01:35:01.000 And so he's like, no, this is dumb.
01:35:04.000 I'm going to make a huge potato and I'm going to have the potato have the seed.
01:35:07.000 Then I'll just take the seed and I'll plant it in the ground.
01:35:09.000 So when you usually do it, do you have to have that potato and they have to soak it so a sprout comes out of it and then plant it?
01:35:15.000 Is that what they do?
01:35:16.000 I don't know, but I mean...
01:35:18.000 Because I've seen that before.
01:35:19.000 Yeah.
01:35:20.000 I've always wondered, like, how are they doing that?
01:35:23.000 I don't know.
01:35:23.000 But he's, like, going after potatoes and then he wants to go after fruit.
01:35:27.000 Like, I'm a...
01:35:28.000 I love fruit.
01:35:29.000 But fruit, like, tilts me, because, like, every time I go to a store...
01:35:34.000 Sometimes you'll get them in season, like, you know, white nectarines and white peaches, maybe some of the most incredible fruits ever created.
01:35:42.000 But if you get one in a bad season, it's just, like, inedible.
01:35:44.000 Yeah, they're dull.
01:35:45.000 They're dull.
01:35:46.000 They're terrible.
01:35:47.000 Mangoes are the same.
01:35:48.000 I love mangoes.
01:35:49.000 And when you see like these mangoes in the summertime, like, I don't know where Europe gets their mangoes, but this is probably one of the best features of Europe.
01:35:57.000 Like they have these mangoes that are just like this.
01:35:59.000 Well, they're organic, too.
01:36:01.000 They're just they're just incredible.
01:36:02.000 Well, they still have real tomatoes.
01:36:04.000 Like, you have to search for real tomatoes here.
01:36:07.000 Like, if you want an heirloom tomato, that's an actual tomato.
01:36:11.000 The tomatoes that we have are just these freaks.
01:36:14.000 You know, I... Have you ever worn...
01:36:17.000 Glucose monitor?
01:36:18.000 Glucose monitor?
01:36:18.000 No, I haven't.
01:36:19.000 I wore one for 90 days.
01:36:22.000 And my wife, when she was younger...
01:36:25.000 Well, she runs a pharma company, so she has a...
01:36:29.000 proclivity for science, obviously, and she thinks about a lot of this stuff scientifically, but she also broke her back when she was 11. So she's very sensitive to inflammation.
01:36:40.000 So she's hacked a lot of food so that she can minimize inflammation.
01:36:45.000 I wore this thing, and I was...
01:36:49.000 totally blown away.
01:36:50.000 The things that I thought were healthy for me, my body was like, this is radioactive.
01:36:55.000 Like what stuff?
01:36:56.000 So like the way that I ate rice or quinoa, I would have like a small amount of rice or I would have brown rice or I'd have black rice or I'd have quinoa because I was like, oh, it's more protein.
01:37:07.000 It didn't really matter.
01:37:09.000 My body reacted with this massive sugar spike.
01:37:13.000 The minute I cooked it off, put it in the fridge, waited 24 hours, and ate it the next day: no glycemic load whatsoever.
01:37:22.000 Same with potatoes.
01:37:24.000 Potatoes.
01:37:24.000 I found that pasta, if I made it more al dente than what I was used to, no glycemic load.
01:37:30.000 And like the problem that's frustrating...
01:37:32.000 More al dente meaning less cooked?
01:37:34.000 Less cooked.
01:37:35.000 Has a totally different reaction in my body than if I make it soft and smushy.
01:37:42.000 Huh.
01:37:43.000 So I've trained my body to have like really, you know, al dente pasta.
01:37:47.000 And I see that...
01:37:49.000 The glycemic load is much lower.
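A minimal sketch of the standard glycemic-load arithmetic behind this comparison, GL = GI × available carbohydrate (g) / 100. The GI values for fresh versus cooled rice below are illustrative assumptions, not measurements from the conversation:

```python
def glycemic_load(glycemic_index: float, carbs_g: float) -> float:
    """Standard formula: GL = GI * available carbohydrate (g) / 100."""
    return glycemic_index * carbs_g / 100.0

# Illustrative, assumed values: refrigerating cooked starch for ~24 hours
# converts part of it to resistant starch, which lowers the measured
# glycemic index when the food is eaten later.
serving_carbs_g = 45.0  # roughly one cup of cooked white rice
gi_fresh = 73.0         # typical published GI for freshly cooked white rice
gi_cooled = 53.0        # assumed lower GI after cooling (illustrative)

print(f"fresh:  GL = {glycemic_load(gi_fresh, serving_carbs_g):.1f}")   # about 33
print(f"cooled: GL = {glycemic_load(gi_cooled, serving_carbs_g):.1f}")  # about 24
```

The mechanism (starch retrogradation into resistant starch) is real and documented; the effect is directional, lowering the load rather than zeroing it out as the "no glycemic load whatsoever" phrasing suggests.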
01:37:52.000 Just the point is, like, the food supply in the United States, I think it's the most precarious it's ever been.
01:38:00.000 It's brutally hard to figure this out.
01:38:02.000 I mean, is everybody supposed to get a glucose monitor and then figure out what little things, you know, trigger insulin spikes?
01:38:11.000 Well, that's not possible.
01:38:12.000 And then even if you do find out, how do you get it in a cheap, affordable way?
01:38:17.000 I mean, no wonder, like, everybody's, you know, really struggling with this.
01:38:21.000 Have you ever worn a glucose monitor when you were in Italy?
01:38:25.000 No.
01:38:26.000 I wore it here.
01:38:26.000 I'd like to know, because there's a difference in the way my body responds.
01:38:31.000 I can tell you how my body feels.
01:38:34.000 If I take a picture... like, I mean, I try to work out, and I take pretty detailed readings: what's my BMI, what's my muscle mass, what's my fat percentage?
01:38:42.000 And I always take those readings right before I go.
01:38:47.000 And when I look afterwards, and I don't do anything when I'm there.
01:38:52.000 You know, I swim in the sea when I can, like when I'm on vacation or whatever.
01:38:56.000 I walk a lot, but nothing else.
01:38:59.000 No weights, no nothing.
01:39:03.000 My muscle mass stays the same.
01:39:06.000 My fat percentage goes down.
01:39:11.000 I look healthier.
01:39:14.000 And I feel really great.
01:39:19.000 And all I do is I just eat what's in front of me.
01:39:21.000 I don't think about quantities, whatever.
01:39:23.000 But when I'm back in the United States, so I get to be there, call it six weeks a year, right?
01:39:28.000 But when I'm back in the United States, I have to go back on lockdown.
01:39:32.000 Because like a lot of people, you know, I had this thing, like if you look at a picture of me in Sri Lanka, I look like old Dave Chappelle.
01:39:43.000 I was like this.
01:39:46.000 I was just a total stick figure.
01:39:49.000 Within one year of being in North America, in this case in Canada, when you look at the school pictures, I was fat.
01:39:58.000 Couldn't explain it to you.
01:39:58.000 It's just the difference in the food system.
01:39:59.000 And my parents were making the same things because they wanted to have that comfort of what they were used to.
01:40:04.000 Mmm.
01:40:07.000 I don't know if it was the food supply or not, but...
01:40:10.000 It has to be.
01:40:10.000 It has to be.
01:40:11.000 It has to be.
01:40:12.000 Everybody says the same thing.
01:40:13.000 And then my whole family has struggled with it, you know?
01:40:16.000 So I think that there's something...
01:40:18.000 And then when I go now to Italy, as a different reference example, it's like I'm in the best shape of my life.
01:40:25.000 Yeah, you feel completely different.
01:40:27.000 Even when you eat things like pizza over there, you don't feel like you ate a brick.
01:40:31.000 I've eaten pizza here and I love it, but when I'm over, I'm like, oh, what did you do?
01:40:36.000 What did you do?
01:40:36.000 Like you ate a brick.
01:40:37.000 But over there, it's just food.
01:40:39.000 It tastes great.
01:40:40.000 The pasta doesn't bother you.
01:40:42.000 Nothing bothers you.
01:40:43.000 It's just whatever they're doing.
01:40:45.000 And there's many things.
01:40:46.000 There's one of them, they're not using enriched flour.
01:40:48.000 And another thing is they have heirloom flour, so it hasn't been maximized for the most amount of gluten.
01:40:54.000 I'm curious to see what Bobby does if Trump wins in this whole...
01:40:58.000 Make America healthy again.
01:41:00.000 I don't exactly know what his plans are.
01:41:02.000 Yeah.
01:41:03.000 What's possible?
01:41:05.000 Like, how much can you really affect with regulation?
01:41:07.000 How much can you really bring to light?
01:41:09.000 And what are we going to learn about our food system?
01:41:12.000 I mean, even Canada, one of the things about the hearings that they just had was they were comparing Lucky Charms that they sell in the United States that are very brightly colored versus Lucky Charms they sell in Canada.
01:41:23.000 It's a completely different looking product because in Canada it's illegal to use those dyes that we use ubiquitously.
01:41:31.000 And those dyes are terrible for you.
01:41:32.000 We know they're terrible for you.
01:41:34.000 Canada knows they're terrible for you, which is why they're illegal up there.
01:41:36.000 The food tastes the same.
01:41:38.000 It still sucks.
01:41:39.000 It's still bad for you.
01:41:40.000 It's still covered in sugar.
01:41:41.000 But at least it doesn't have that fucking poison that just makes it blue or red.
01:41:46.000 And it's impossible to teach my kids healthy eating habits as a result of this.
01:41:51.000 The food in the United States, it's everywhere, and it's beaten into you that this is a cheap way of getting caloric intake, and it's full of all this stuff you can't pronounce.
01:42:05.000 It's garbage, yeah.
01:42:06.000 It's all garbage.
01:42:07.000 And it's so common.
01:42:10.000 And then if you're in what they would call a food desert, if you're in a place that only has fast food, like, my god, your odds of being metabolically healthy if you're poor and you're living in a place that's a food desert...
01:42:21.000 It's impossible.
01:42:22.000 It's fucked.
01:42:23.000 It's impossible.
01:42:23.000 You're fucked.
01:42:24.000 It's too hard.
01:42:24.000 And it's also very expensive, which is even crazier.
01:42:27.000 It's so expensive to eat well and to eat like clean and make sure that you don't have any additives and garbage in your food.
01:42:34.000 Do you remember in, like, the 90s and 2000s, when what we were told was that fat was bad?
01:42:41.000 Yeah.
01:42:42.000 And like you would see sugar-free and I would just buy it.
01:42:46.000 Oh, yeah.
01:42:47.000 Sugar-free was great.
01:42:48.000 I was like, sugar-free, I'm doing the healthy thing here.
01:42:49.000 This is great.
01:42:50.000 Margarine.
01:42:51.000 Margarine.
01:42:51.000 Or then I would see fat-free and I'd be like, oh, I'll do that.
01:42:54.000 Yeah.
01:42:55.000 And it turned out all this stuff was just...
01:42:57.000 Well, it's such a small number of people that caused all that.
01:43:02.000 That's what's so terrifying.
01:43:03.000 There's a small number of people who bribed these scientists to falsify data so that they could blame all this coronary artery disease and heart disease on saturated fat, when it was really sugar that was causing all these problems.
01:43:21.000 And we had a very dysfunctional understanding of health for the longest time.
01:43:26.000 The food pyramid was all fucked up.
01:43:28.000 The bottom of the food pyramid was all bread and carbs.
01:43:31.000 You know, it's so nuts.
01:43:33.000 And it just made a bunch of, like, really sloppy humans.
01:43:36.000 And you could see it in the beaches, the photos from the 1960s versus looking at people in the 2000s.
01:43:42.000 Have you had Casey and Callie Means on?
01:43:43.000 They're coming on.
01:43:44.000 They're coming on.
01:43:46.000 They have an incredible story.
01:43:48.000 Should I say it?
01:43:49.000 Sure.
01:43:50.000 They have this incredible story that they tell about what happened.
01:43:55.000 And what they say is in the 80s, when you had these mergers, one of the mega mergers that happened was tobacco company with food company.
01:44:06.000 There was two of them.
01:44:07.000 And a lot of the scientists started to focus some of their energy on taking that skill—I'll just put that in quotes—of making something very addictive and transposing it to food, right?
01:44:19.000 It's like, okay, if I'm at RJR and I'm used to making cigarettes, how do I think about structurally building up something that wants you to eat more, but now instead of smoking, instead of a cigarette, it's a Twinkie or whatever?
01:44:32.000 And a lot of the food science that we have in America is built on the back of a bunch of these mega-mergers where you had these scientists go and build super, quote-unquote, addictive food.
01:44:44.000 And that was a failure of the—I mean, obviously it was a failure of the capital markets, but it was a failure of public health.
01:44:49.000 Well, it's a failure of our regulatory process.
01:44:54.000 It's a failure of how we expose the public to this, and of making sure that whatever this is, it's labeled the same way cigarettes are.
01:45:05.000 Because if you want to buy cigarettes, you can buy them today, but it's going to have a big warning that tells you this can kill you.
01:45:11.000 Totally.
01:45:13.000 Arguably, sugar is probably as difficult to kick as nicotine is.
01:45:19.000 And there's a lot of other problems.
01:45:20.000 I smoked when I was younger.
01:45:23.000 I found it much easier to stop smoking than it was for me to cut out sugar.
01:45:30.000 Wow.
01:45:30.000 Cutting out sugar is basically impossible.
01:45:32.000 It's very hard.
01:45:33.000 You encounter it everywhere.
01:45:35.000 I have a friend who has diabetes, and he got type 2 diabetes, and he's thin, my friend Duncan.
01:45:39.000 And one of the things he found out is that it was all just from eating too much sugar.
01:45:44.000 When he stopped eating sugar, he's like, oh my god, I have so much energy.
01:45:48.000 Like, this is what I'm supposed to feel like?
01:45:50.000 He had thought that it was just life.
01:45:54.000 Like he's in his 40s now.
01:45:56.000 Lethargic.
01:45:57.000 Yeah.
01:45:58.000 I need a nap.
01:45:59.000 And he didn't realize he was poisoning himself.
01:46:01.000 Yeah.
01:46:02.000 And that's what most people are doing.
01:46:05.000 Most people out there are drinking regular soda, eating candy, eating burgers with sugar in the bun and bullshit, and bread, and french fries cooked in seed oils.
01:46:16.000 You're just poisoning yourself.
01:46:18.000 You should fact check this because I may get the number wrong, but there was something that came out very recently about the percentage of the U.S. food stamp system that goes to soda.
01:46:30.000 And it was like 10% of the total budget?
01:46:34.000 Something nutty like that.
01:46:35.000 It's like some ginormous amount of money is just basically giving folks sugar water.
01:46:41.000 And you wonder why now the solution is just to give everybody on the back end of it Ozempic.
01:46:47.000 It's also like, let's be real.
01:46:49.000 That's not food, okay?
01:46:52.000 It's something you put in your mouth, but you can't buy cigarettes with food stamps, right?
01:46:56.000 So if you can't buy cigarettes with food stamps, why should you be able to buy something that's really bad for you?
01:47:01.000 I mean, what would change if we said food stamps, we're going to actually increase the amount that you get, but we're going to regulate what you can buy, and you have to buy all the things from the outside of the store?
01:47:12.000 I don't even think you have to regulate it.
01:47:15.000 Think of what has happened because of companies like Uber Eats and DoorDash, as an example.
01:47:20.000 What have they done?
01:47:22.000 And I'll tell you why I think this is important.
01:47:24.000 Those guys have gone out, and Cloud Kitchens, so there's three companies.
01:47:29.000 They have bought up every single kind of warehouse in every part of every city and suburb in America.
01:47:37.000 And what they put in there are these things that they call ghost kitchens.
01:47:42.000 So that when you launch the app and a lot of the times when you get a drink from Starbucks, it's not coming from the actual Starbucks down the street.
01:47:50.000 It's coming from a ghost kitchen.
01:47:51.000 Why?
01:47:52.000 Because they centralize all the orders and it creates an economy of scale.
01:47:56.000 Why am I telling you this?
01:47:57.000 I think that there is a way for food stamps to sit on top of that infrastructure and just deliver food.
01:48:13.000 You know, like, the choices are...
01:48:16.000 I understand.
01:48:16.000 Yeah, you're going to have people that choose that Big Mac, because it is delicious.
01:48:21.000 Yeah, and I think that they are delicious.
01:48:22.000 And once a year, I have a Big Mac.
01:48:26.000 But I think if you're going to tie it to something like a government subsidy, at least the government should have a conversation with themselves that says, well, we can ship them all of this healthy food and, you know, change a bunch of boundary conditions in these folks' lives.
01:48:41.000 Like, I've seen it.
01:48:45.000 I know what it's like to be overweight.
01:48:47.000 It sucks.
01:48:49.000 Your self-confidence is negative one.
01:48:53.000 The way I dressed, the way I felt, it just took an enormous amount of work to overcome it, and you're still left with it.
01:49:00.000 Then there's the physical pains.
01:49:01.000 Then there's the internal issues that you create for yourself.
01:49:05.000 This is not a win.
01:49:06.000 And so I get it that the hamburger tastes good, but this is where the government, I think, has a responsibility to say, look, we're going to spend hundreds of billions of dollars a year, so let's spend it in a smart way.
01:49:19.000 We're not going to give you pop and soda anymore.
01:49:22.000 And we're going to start introducing some fruit, some vegetables, some fiber.
01:49:27.000 Like, why can't you do that?
01:49:29.000 That's like a very reasonable thing to do, especially if on the same hand, the other hand of the government is going to go and negotiate insulin prices and metformin prices and GLP-1s.
01:49:40.000 Right.
01:49:41.000 That is the issue.
01:49:42.000 See what Bobby Kennedy was talking about with these GLP-1s?
01:49:46.000 He was comparing the amount of money spent on GLP-1s to what you could give every obese American: free healthy food and a gym membership?
01:49:57.000 For 10%, I think.
01:49:59.000 I think what Bobby said is the cost of GLP-1s at current course and speed would be $3 trillion a year.
01:50:06.000 Isn't that amazing?
01:50:07.000 Just something that controls your appetite.
01:50:09.000 And for $300 billion, you can give everybody food.
01:50:14.000 By the way, you can use this ghost kitchen infrastructure to give the food in a prepped way, right?
01:50:19.000 So you can even make life super simple for them.
01:50:22.000 The idea that you would spend – we don't have $3 trillion to spend.
01:50:26.000 But we have a responsibility to make sure that people don't kill themselves with food.
01:50:30.000 But now there's an industry that's making $3 trillion by giving people these GLP-1s.
01:50:36.000 And the problem is just like every other industry, once it starts making money, it does not want to stop.
01:50:43.000 By the way, I think that they should be allowed to make money.
01:50:47.000 But what I'm saying is, in a free market, every actor is allowed to act rationally.
01:50:53.000 And actually, what you want is everybody to have their own incentives and to act naturally.
01:50:59.000 That's when you get the best outcome.
01:51:01.000 Because if you're acting with some shadow agenda, you're not going to necessarily do the right thing.
01:51:05.000 So my point is, in this narrow example, the government's job is to get the best healthcare outcome.
01:51:15.000 Because if they're doing any form of long-term planning, it's pretty obvious.
01:51:20.000 Like, we are hurtling toward a brick wall on this healthcare issue with respect to people's health.
01:51:27.000 You don't have a solution.
01:51:28.000 The only solution cannot be to push it all onto Medicaid and then triple and quadruple the budget deficit that we already don't have a path to pay down.
01:51:38.000 Right.
01:51:38.000 Well, the only other thing that I could think of is if there was some sort of a way that would be effective at establishing discipline other than just promoting it.
01:51:52.000 I could conceive of, especially when you're dealing with something like Neuralink or some sort of a new way of programming the mind where it just changes whatever the behavior pattern is that accepts these foods as choices.
01:52:11.000 Like, lobotomize your appetite.
01:52:13.000 That would be a very dystopian place.
01:52:17.000 Sketchy to fucking be an early adopter.
01:52:19.000 If you want this subsidy, you need to get this brain implant.
01:52:21.000 It would not be a good place.
01:52:22.000 That would be bad.
01:52:24.000 That's worst case scenario.
01:52:25.000 Best case scenario is you just have...
01:52:29.000 Like a national scale promotion of health and wellness and abandonment of this body positivity nonsense and fat doctors and people that are telling you that every weight is a healthy weight and all food is food and to think otherwise is discriminatory,
01:52:45.000 which you're hearing from people.
01:52:47.000 And by the way, that stuff is funded.
01:52:50.000 And that's what people need to know.
01:52:51.000 That nonsense is actually funded.
01:52:53.000 They pay people to be influencers, and they're getting paid by these food companies to say these nonsense things that are scientifically, factually incorrect.
01:53:03.000 They're not true.
01:53:04.000 It is not healthy in any way, shape, or form to be obese.
01:53:09.000 And when they tell you that you can be metabolically healthy and still be obese, that it's okay.
01:53:15.000 It's not okay.
01:53:16.000 It's not okay.
01:53:17.000 That's just not true.
01:53:17.000 And is that fat shaming?
01:53:19.000 You can call it whatever the fuck you want, but it doesn't change what it does to the human body.
01:53:23.000 And you don't make someone better by refusing to make them feel bad about being robustly unhealthy.
01:53:31.000 Well, it's an enormous disservice to folks if we don't expose an alternative path.
01:53:40.000 Okay, we're spending this much money.
01:53:42.000 We spend so much money in all kinds of random stuff, like just a simple example that we saw this past week.
01:53:47.000 $50 billion spent between rural broadband and chargers.
01:53:54.000 We have no rural broadband and we have three chargers.
01:53:58.000 No, this is the data.
01:54:00.000 That's $50 billion.
01:54:04.000 Okay, that's not the $300 billion that Bobby- Explain what you mean by that.
01:54:07.000 So, in Congress, when they come together to pass these bills, sometimes what happens is there's a lot of horse trading, right?
01:54:18.000 And you get what's called a Christmas tree bill, which is like everybody gets to hang something off the Christmas tree.
01:54:24.000 And the crazy part in the United States is these little baubles are now $10 billion here, $50 billion there, whatever.
01:54:32.000 So we passed, a few years ago, something that was meant to get rural broadband into thousands of people's homes.
01:54:44.000 And initially it was given to SpaceX and Starlink specifically.
01:54:49.000 And then I think it was pulled back.
01:54:52.000 And they said, you know what?
01:54:53.000 There's a better, quote-unquote, better way to do it.
01:54:57.000 And there's other folks that decided that the better option would just be to, like, lay fiber.
01:55:05.000 Now, I'm just going to—this is not a judgment.
01:55:08.000 I'm just going to show you something.
01:55:11.000 The thing is, like, if you were, like, in Kansas from here, okay?
01:55:16.000 Between here and Kansas is like a big distance, right?
01:55:19.000 It's like this.
01:55:21.000 When you're in orbit, it's just here.
01:55:23.000 Right.
01:55:24.000 It's just infinitely shorter distance.
01:55:26.000 So just at a practical level, solving that problem technically is way better through satellites.
01:55:33.000 Fine.
01:55:34.000 They make, I think, what was the right decision.
01:55:37.000 They unmake it.
01:55:38.000 And so they say, we're going to go and figure out a way to lay fiber, whatever.
01:55:42.000 They've laid zero fiber.
01:55:44.000 So there's these thousands of people that were promised broadband that you can get for basically $100 a month or less now, and instead they're spending upwards of thousands and thousands per home and haven't delivered any.
01:55:59.000 Isn't it like $42 billion?
01:56:01.000 That's $42 billion.
01:56:03.000 And then the other example was a $7 billion program to install EV chargers, $7 billion, and they've managed to install three.
01:56:12.000 So all I'm saying is, if you add the two together, that 50 billion, that's one-sixth of what Bobby Kennedy was talking about in terms of giving people organic food.
01:56:21.000 So maybe we can't give everybody organic food, but you can get tens of millions of people, right?
01:56:27.000 Start with the 40 million people that's on Snap.
01:56:29.000 You can get maybe 10 million of those people.
01:56:32.000 Get them organic food right away.
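For what it's worth, the "one-sixth" arithmetic here checks out if you take the quoted program figures at face value; this sketch just reruns the numbers from the conversation, nothing here is audited:

```python
# Rerunning the arithmetic with the figures quoted in the conversation.
broadband_program = 42e9  # claimed rural broadband spend, zero fiber delivered
charger_program = 7e9     # claimed EV charger program, three chargers built
food_program = 300e9      # the $300 billion food figure from earlier

combined = broadband_program + charger_program  # $49B, the "$50 billion" above
print(f"Share of the food program: {combined / food_program:.3f}")  # ~0.163, about one-sixth

# Implied annual budget per person if that money covered 10 million of the
# ~40 million people on SNAP, as suggested above.
print(f"Per person per year: ${combined / 10e6:,.0f}")  # $4,900
```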
01:56:35.000 So there's all of this waste.
01:56:36.000 All we got to do is just like focus a little bit, take some of these dollars that are just like falling into the seat cushions, it seems like, and just reallocate it somewhat intelligently into the things that really matter.
01:56:50.000 It seems worse than falling into the seat cushions if it's $42 billion and none of those people have received internet access.
01:56:56.000 That's really insane.
01:56:57.000 $42 billion, nobody's got that.
01:56:59.000 I should say, I used the Starlink Mini this past week in Utah in the mountains.
01:57:05.000 It's amazing.
01:57:06.000 It's amazing.
01:57:07.000 It's the size of this book.
01:57:08.000 It's amazing.
01:57:09.000 It's crazy.
01:57:10.000 It literally fits in a small laptop case.
01:57:13.000 My friend couldn't believe that was it.
01:57:15.000 I was like, this is it.
01:57:16.000 All you do is plug it in.
01:57:17.000 I got a very early one, and I used to carry it around in a laundry basket.
01:57:23.000 It was incredible.
01:57:25.000 And then when you see what they've done in a couple of years...
01:57:30.000 Yeah.
01:57:52.000 And then I think, man, this is like, it's $100?
01:57:55.000 It's less than $100?
01:57:56.000 It's crazy.
01:57:57.000 And then, you know, in a year or two, you'll be able to run your cell phone over this thing?
01:58:01.000 Yeah.
01:58:02.000 This is crazy.
01:58:03.000 Yeah, in a year or two, it'll probably be straight to your cell phone, and you won't need that dish anymore.
01:58:08.000 But right now, the dish is a small iPad, and you just sit it down in a field.
01:58:12.000 And we plugged a cord to it, and you don't even have to have a cord.
01:58:15.000 It's a battery that comes with it.
01:58:17.000 So how much do you think it really takes to give every American just a Starlink dish?
01:58:24.000 Pretty fucking cheap.
01:58:25.000 It's very cheap.
01:58:26.000 Well, it wouldn't be $42 billion.
01:58:28.000 I can guarantee you that.
01:58:29.000 And by the way, the cost of that would basically fall through the floor if you put in an order for 50 million of these units.
01:58:35.000 Right.
01:58:35.000 You know?
01:58:36.000 Right.
01:58:36.000 SpaceX would make them for like $8.
01:58:38.000 Right.
01:58:38.000 You know what I mean?
01:58:39.000 And it's fast internet, too, which is even crazier.
01:58:42.000 Yeah.
01:58:42.000 Yeah.
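To make the "it wouldn't be $42 billion" point concrete, here is a rough cost sketch for a 50-million-unit buy. The per-dish prices below are invented for illustration; only the 50 million count and the $42 billion comparison come from the conversation.

```python
# Illustrative only: bulk cost of "a Starlink dish for every American"
# at a few hypothetical per-dish prices.
units = 50e6  # the 50 million unit figure floated above

for unit_price in (100, 300, 600):  # hypothetical dollars per dish
    total_billions = units * unit_price / 1e9
    print(f"${unit_price}/dish -> ${total_billions:.0f}B total")
# Even the highest of these, $30B, comes in under the $42B broadband program.
```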
01:58:43.000 There's all of this stuff that...
01:58:47.000 We should do.
01:58:49.000 We just need a few folks, I don't know, that can either course correct or just can shine a light on it.
01:58:56.000 I mean, it's like this thing where I'm like so optimistic, married to enormous fear and like, you know, just like a little...
01:59:05.000 I kind of go back and forth between these things.
01:59:08.000 Well, let me paint the ultimate dystopian solution.
01:59:11.000 Yeah.
01:59:12.000 The ultimate – part of our problem is we have corruption.
01:59:16.000 We have what you were talking about with deals, sort of like the border wall deal had money in it for Ukraine.
01:59:25.000 There's all these weird deals.
01:59:26.000 There's bills that don't make any sense.
01:59:28.000 How did you add all this stuff?
01:59:30.000 Why is this 2,000 pages?
01:59:31.000 How many people who signed it actually read it?
01:59:34.000 AI government.
01:59:35.000 AI government solves all those problems.
01:59:38.000 AI government is not corrupt.
01:59:40.000 AI government just works literally for the people.
01:59:43.000 And instead of having all these state representatives and all these bullshit artists that pretend to be working on their truck and they don't know what the fuck they're doing, they're just doing it for an ad, you don't have any of that anymore.
01:59:52.000 Now everything's governed with AI. The problem is who's controlling the AI? And is there some sort of an ultimate regulatory body that makes sure that the AI isn't biased or tainted?
02:00:03.000 I think there's a step before that which is a lot more palatable.
02:00:08.000 I thought about your version, and the problem that you state is the key problem, which is how is this model trained?
02:00:16.000 Who got their hands on that core stuff, the weights and the values of that?
02:00:21.000 Who decides?
02:00:23.000 And at some point, there is already today in AI models a level of human override.
02:00:29.000 It's just a natural facet of how these things are.
02:00:32.000 There is a way to reinforce the learning based on what you say and what I say.
02:00:37.000 It's a key part of how an AI model becomes smart.
02:00:40.000 It starts off as primordial and then Joe and Chamath and all these other people are clicking and saying, yes, that's a good answer, bad answer, ask this question, all this stuff.
02:00:52.000 Who are those people?
02:00:53.000 Right.
02:00:53.000 And it could be gamed as well, right?
02:00:55.000 Like you could organize.
02:00:56.000 I think at scale we haven't figured out – we haven't seen it yet, but it will happen when the incentives are that high.
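The "good answer, bad answer" clicking being described is, in broad strokes, pairwise preference collection, the raw material for RLHF-style training. A minimal sketch of what one such record might look like; all names and fields here are hypothetical illustrations, not any lab's actual schema:

```python
# Minimal sketch of pairwise preference collection (RLHF-style feedback).
# Names and fields are hypothetical, not a real system's schema.
from dataclasses import dataclass

@dataclass
class Preference:
    prompt: str
    chosen: str    # the answer the rater clicked "good" on
    rejected: str  # the answer the rater clicked "bad" on
    rater_id: str  # the "who are those people?" question lives in this field

def record_preference(prompt: str, answer_a: str, answer_b: str,
                      rater_id: str, prefers_a: bool) -> Preference:
    """Record one human judgment between two candidate answers."""
    chosen, rejected = (answer_a, answer_b) if prefers_a else (answer_b, answer_a)
    return Preference(prompt, chosen, rejected, rater_id)

# A coordinated group of raters could tilt this dataset in one direction,
# which is exactly the gaming-at-scale worry raised above.
dataset = [record_preference("Is sugar addictive?", "Arguably, yes.", "No.",
                             rater_id="rater-001", prefers_a=True)]
```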
02:01:02.000 And we've seen distortions, right?
02:01:04.000 Like the Gemini AI that was asked to make Nazi soldiers.
02:01:08.000 It made these multiracial Nazi soldiers.
02:01:31.000 And that kind of stuff.
02:01:32.000 Right.
02:01:32.000 So that is a problem in that AI is not clean, right?
02:01:38.000 It's got the greasy fingerprints of modern civilization on it and all of our bizarre ideologies.
02:01:44.000 But there's a step before it that I think can create a much better government.
02:01:48.000 So it's possible today, for example, to understand—have you ever done a renovation on your house?
02:01:55.000 Yes.
02:01:55.000 Okay, so you make plans.
02:01:58.000 You go and, you know, your architect probably pays an expediter to stand in line in City Hall.
02:02:06.000 There's a person that goes and reviews that plan.
02:02:09.000 They give you a bunch of handwritten markups based on their understanding of the building code.
02:02:14.000 You can't use this lead pipe.
02:02:15.000 You need to use aluminum.
02:02:16.000 This window's too small.
02:02:18.000 All this stuff.
02:02:19.000 You come back.
02:02:19.000 You revise.
02:02:20.000 You go do this two or three times on average to do a renovation.
02:02:23.000 And then they issue your permits.
02:02:27.000 Now, an AI can actually just ingest all the rules, know exactly what's allowed and what's not allowed, and it can take your plans and instantly tell you, Joe, fix these 19 things.
02:02:39.000 You fix those things.
02:02:40.000 You go to the city.
02:02:41.000 You can show that it maps to all the rules.
02:02:44.000 So you can streamline government.
02:02:45.000 You can also point out where they're making decisions that don't map to what the rules say.
02:02:50.000 That, I think, is going to be a really important first step because it allows us to see where maybe this administrative state has grown unwieldy, where you got to knock some of this stuff back and clean up some of the cruft because there's rules on top of rules and one conflicts with the other.
02:03:07.000 I bet you there are things on the books today that are like that.
02:03:11.000 100%.
02:03:12.000 We have no way of knowing.
02:03:13.000 Right.
02:03:14.000 But I do think an AI can tell you these things and say, just pick which one.
02:03:19.000 It's A or it's B. And I think that that starts to cut back a lot of the difficulty in just making progress.
02:03:28.000 Right.
02:03:29.000 You know?
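The permit-review idea sketches out naturally as rules-as-code: encode each provision as a machine-checkable predicate, run the plan against all of them, and report everything in one pass instead of over two or three review cycles. The rules, thresholds, and plan fields below are invented for illustration, not any real building code:

```python
# Toy sketch of automated plan review: check a renovation plan against
# machine-readable rules and report every violation at once.
# Rules, thresholds, and plan fields are invented for illustration.
rules = {
    "min_window_area_sqft": ("window_area_sqft", lambda v: v >= 5.7),
    "no_lead_pipe":         ("pipe_material",    lambda v: v != "lead"),
}

plan = {"window_area_sqft": 4.0, "pipe_material": "lead"}

violations = [name for name, (field, ok) in rules.items() if not ok(plan[field])]
print("Fix these things:", violations)
# Because every rule is explicit and testable, two rules that constrain the
# same field in incompatible ways can also be detected and surfaced;
# that is the "it's A or it's B" cleanup described above.
```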
02:03:29.000 You know, one of the things that I thought was extraordinary that Elon was getting pushed back on was his idea of making the government more efficient.
02:03:41.000 And auditing the various programs to find out how to make them more efficient.
02:03:46.000 And a lot of people really freaked out about that.
02:03:49.000 And their main freak out, the main argument from intelligent people that I saw was, what are you going to do?
02:03:55.000 Are you going to fire all these people that are in charge of government?
02:03:59.000 I don't think the answer for ineffective government is to let the same people keep doing the same thing just because otherwise you'd have to fire them.
02:04:06.000 That sounds insane.
02:04:08.000 And to say that the government is as efficient as is humanly possible or even close to it, no one believes that.
02:04:16.000 No rational person believes that.
02:04:18.000 Everyone believes there's bureaucracy.
02:04:19.000 Everyone believes there's a lot of nonsense going on.
02:04:22.000 Everyone believes that, like, look at the difference between what Elon has been able to accomplish with SpaceX versus what NASA has been doing recently.
02:04:32.000 Look at the difference between what they're able to accomplish with Starlink versus this $42 billion program that yielded zero results.
02:04:39.000 Look at the difference between all these different things that are done in the private sector when there's competitive marketplace strategies.
02:04:47.000 Like you have to figure out a way to get better and more efficient and you can't afford to have a bunch of people in your company that are doing nothing.
02:04:56.000 And that are creating red tape and making things harder to progress.
02:05:00.000 That's bad for the business.
02:05:03.000 That's the argument for letting private companies take over things.
02:05:07.000 By the way, I think that what...
02:05:09.000 And just to build on what you're saying, people jump to this conclusion that like government shouldn't exist.
02:05:15.000 It's not some anarchic thing; like, government's actually very important.
02:05:21.000 They create incentives.
02:05:23.000 And then those of us in private industry go out and try to meet those incentives or take advantage of them.
02:05:28.000 That's very normal, okay?
02:05:30.000 And a well-functioning government creates very good incentives.
02:05:35.000 An incredible example of this is in the 1950s.
02:05:40.000 Do you know what the GDP of Singapore was?
02:05:44.000 No.
02:05:44.000 It was the same as the GDP of Jamaica.
02:05:49.000 And then you fast forward 70 years and you understand what good governance looks like.
02:05:54.000 We actually were talking about Singapore yesterday, how extraordinarily efficient their recycling program is.
02:06:00.000 It's unbelievable.
02:06:01.000 I mean, it's really amazing what they do.
02:06:05.000 They really recycle.
02:06:07.000 They recycle how we think we're recycling.
02:06:09.000 They really do.
02:06:10.000 They really separate the plastic.
02:06:11.000 They break it up.
02:06:12.000 They use it to make power.
02:06:13.000 They use it to make road materials.
02:06:15.000 They make building materials out of it.
02:06:17.000 They reuse everything.
02:06:22.000 I think?
02:06:42.000 And it works incredibly.
02:06:44.000 You can do that in the United States.
02:06:46.000 The thing that we would benefit a lot from is, if we could just point out all the ways in which there's either too many laws or laws are conflicting, you can at least have a conversation about batting those back.
02:07:00.000 And the second is, if you look inside of private equity, there is one thing that they do which I think the government would hugely benefit from, and it's called zero-based budgeting.
02:07:14.000 And this is an incredibly powerful but boring idea.
02:07:18.000 What private equity does when they buy a company, some of them, the best ones, is they'll look at next year's budget and say, what should the budget be?
02:07:28.000 Well, guess what's going to happen, Joe, in your company?
02:07:30.000 Everybody runs and says, I need X for this, Y for that, Z for this, and you have this budget that's just ginormous.
02:07:38.000 Instead, what some of the best private equity folks do is say, we're starting with zero.
02:07:44.000 Next year's budget is zero.
02:07:45.000 We're spending nothing now.
02:07:47.000 Let's build it back up meticulously block by block.
02:07:52.000 So somebody comes in.
02:07:53.000 Okay, what is it exactly that you want to do?
02:07:56.000 I want to build an interface that allows...
02:07:59.000 They start saying something.
02:08:01.000 No.
02:08:03.000 Okay, what do you want to do?
02:08:04.000 I want to upgrade the factory so that we can make a more high-yield...
02:08:08.000 Okay, done.
02:08:09.000 You're in.
02:08:09.000 How much do you need?
02:08:10.000 Okay.
02:08:11.000 One by one by one.
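As a sketch, the zero-based process being described reduces to a simple loop: the budget starts at zero, and only concretely justified line items get added back, one by one. The items and amounts below are invented for illustration:

```python
# Minimal sketch of zero-based budgeting: start from zero, add back only
# line items with a concrete justification. Items and amounts are invented.
requests = [
    ("upgrade the factory for higher yield", 2_000_000, True),   # concrete ask: approved
    ("build an interface that allows...",      500_000, False),  # vague ask: rejected
]

budget = 0  # next year's budget starts at nothing
approved = []
for item, amount, justified in requests:
    if justified:  # the "okay, what exactly do you want to do?" gate
        budget += amount
        approved.append(item)

print(f"Rebuilt budget: ${budget:,} across {len(approved)} line items")
```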
02:08:13.000 And if you go and you do that inside the government, what you probably would find is that same group of people would probably enjoy their job a lot more.
02:08:21.000 They'd actually be, their hands would be on the controls in a much more directed way.
02:08:27.000 We'd spend a lot less because a lot of this stuff probably just goes by the wayside and we don't even know, you know.
02:08:34.000 And people would just be more able to go and work.
02:08:37.000 You could do what you wanted to do.
02:08:39.000 I could do what I wanted to do.
02:08:41.000 Elon could do what he wants to do.
02:08:43.000 There was a thing, I tweeted it out today.
02:08:49.000 He cannot get the FAA to give him a flight permit for Starship 5 and 6. So they're sitting in dry dock, right?
02:09:02.000 They're slow rolling the approval, right?
02:09:05.000 It takes him less time to build these starships now than it does to get government approval.
02:09:11.000 That's what he said.
02:09:13.000 Meanwhile, the FCC, which is a sister organization to the FAA, fast-tracked the sale of 220 radio stations in part to some folks that were foreign entities, right before an election that touched like 160 million Americans.
02:09:31.000 When you look at that, you would say, how can some folks cut through all the red tape and get an answer quickly?
02:09:39.000 How can other folks be waiting around for something that just seems so obvious and so exceptional for America?
02:09:48.000 And there's no good answer.
02:09:52.000 I don't know what the answer is.
02:09:55.000 I don't think any of us know.
02:09:57.000 No, and then there's just folks that are stuck in space.
02:10:00.000 Meanwhile, there's these two people stuck in space.
02:10:03.000 Yeah.
02:10:05.000 And...
02:10:05.000 And Jamie said they were supposed to be there for how long?
02:10:07.000 Eight hours?
02:10:08.000 Uh, yeah.
02:10:10.000 They were supposed to be there for eight hours.
02:10:11.000 They were supposed to be quick.
02:10:12.000 They were supposed to be quick.
02:10:13.000 And they've been there for months.
02:10:15.000 They're gonna be there till February.
02:10:16.000 That's so insane.
02:10:19.000 They're gonna be there till February.
02:10:20.000 How terrifying must that be?
02:10:22.000 I mean, I... For maybe you and me?
02:10:24.000 Eight days.
02:10:25.000 Eight days.
02:10:26.000 They were supposed to be there for eight days.
02:10:30.000 I think I would freak out.
02:10:32.000 100%.
02:10:33.000 I think they would too.
02:10:34.000 How do you not?
02:10:35.000 Well, I read this article where they interviewed them.
02:10:38.000 Now, this could be the party line.
02:10:39.000 I don't know.
02:10:39.000 But they're like, this is great.
02:10:41.000 It's my natural place.
02:10:42.000 I'm happy here.
02:10:42.000 Oh, good lord.
02:10:44.000 I can't believe that's real.
02:10:46.000 I had a friend of mine who went to— That's what they say to themselves.
02:10:48.000 To keep from going crazy.
02:10:49.000 Well, yeah.
02:10:49.000 A friend of mine went to space.
02:10:51.000 The founder of Cirque du Soleil, Guy Laliberte.
02:10:54.000 And he brought a super high...
02:10:57.000 Google talks about it like it's already over.
02:10:59.000 What?
02:11:00.000 But it's still going on.
02:11:01.000 No, it's still going on.
02:11:01.000 It says it until February 21st.
02:11:03.000 Yeah, February 2025. It hasn't happened yet.
02:11:05.000 Yeah.
02:11:06.000 We're stuck in space.
02:11:07.000 It says like they ended up spending eight months...
02:11:09.000 That's great.
02:11:09.000 That's just AI. Wow.
02:11:12.000 Yeah, it could be way more.
02:11:15.000 It could be way, way more.
02:11:16.000 That's weird.
02:11:17.000 That's another flaw with AI, right?
02:11:21.000 Read it like that?
02:11:22.000 I wonder what the incentive is for AI to lie to you about that.
02:11:25.000 How does AI not know it's not 2025 yet?
02:11:29.000 We're stuck in space until February of 2025?
02:11:32.000 Well, that's just a straight-up error.
02:11:34.000 That's a weird error, though.
02:11:36.000 It is a weird error.
02:11:37.000 But these poor people, you know?
02:11:41.000 So my friend that was up there said it was incredible.
02:11:45.000 He has this funny story where he was a smoker.
02:11:48.000 He still is a smoker, but...
02:11:50.000 This was like 20 years ago, so he was going up on like a Soyuz rocket.
02:11:55.000 And he shows up, I guess in Siberia is where they do the launches.
02:12:00.000 And he was really stressed out because he had to stop smoking and he had to stop drinking and all this stuff.
02:12:06.000 And he shows up and the cosmonauts are smoking.
02:12:09.000 Oh, no.
02:12:10.000 They're like, oh, it's totally fine.
02:12:11.000 Don't worry.
02:12:11.000 So they smoke in space?
02:12:12.000 No, no, no.
02:12:13.000 I'm saying on the ground while they were training.
02:12:14.000 Oh, boy.
02:12:15.000 So they go up.
02:12:16.000 He does eight days.
02:12:16.000 He comes back down.
02:12:18.000 He took these incredible high-res pictures of, like, all the parts of the Earth.
02:12:22.000 He said it was the most incredible thing.
02:12:24.000 But, you know, when you get back, he's like, I was ready to get back.
02:12:27.000 Did you see this latest report?
02:12:29.000 There's, like, real controversy about some finding that the James Webb Telescope has discovered?
02:12:35.000 And there's some talk of some large object moving towards us that's course-correcting.
02:12:42.000 Yeah.
02:12:43.000 This is the weird part about it.
02:12:45.000 And there's all these meetings, and so all the kooky UAP people are all over it saying, disclosure is imminent.
02:12:52.000 There's a mothership headed towards us.
02:12:57.000 So it gets fun.
02:12:58.000 I don't know what they mean by course correcting.
02:13:00.000 What does that mean?
02:13:01.000 And how do they know it wasn't impacted by something else that diverted it?
02:13:05.000 It could have been that.
02:13:06.000 It could have just been the gravitational fields.
02:13:08.000 It could have been an orbital path.
02:13:09.000 But they're not telling anybody.
02:13:11.000 There's something going on.
02:13:13.000 Do you think they would tell people?
02:13:16.000 Like, imagine if there was a giant chunk of steel, of iron rather, that's headed towards us.
02:13:22.000 That's a great question.
02:13:23.000 I think the question is, what would we do if we knew?
02:13:26.000 Do we have the capability of moving that thing?
02:13:31.000 Would the FAA wait five months to give Elon the— I think you'd probably send as many—but see, I mean, it's all a physics problem at that point.
02:13:40.000 Right.
02:13:40.000 It's also a problem of breaking it up.
02:13:42.000 Exactly.
02:13:42.000 If it breaks up, then you have smaller pieces that are hitting everywhere instead of one large chunk.
02:13:48.000 Isn't this like the perfect reason why being multi-planetary just makes a lot of sense?
02:13:54.000 Sure.
02:13:55.000 Like, for example, would you get on an airplane if they said, hey, Joe, this is the best airplane in the world.
02:14:00.000 It's the most incredible.
02:14:01.000 It's the most luxurious.
02:14:03.000 It has the best weather.
02:14:04.000 You can surf.
02:14:05.000 But there's only one navigation system.
02:14:07.000 And if it goes out, boop!
02:14:10.000 You'd never do that.
02:14:12.000 Right.
02:14:12.000 Would you ever get on that airplane?
02:14:13.000 No.
02:14:14.000 So, you know, I think we owe it to ourselves to have some redundancy.
02:14:19.000 Right.
02:14:19.000 Yeah.
02:14:20.000 But ultimately, I always wonder, you know, like the universe sort of has these patterns that force innovation and constantly move towards further and further complexity.
02:14:31.000 And if you were going to have intelligent life that existed on a planet, what better incentive to get this intelligent life to spread beyond that planet than to make that planet volatile, with super volcanoes, earthquakes, solar flares,
02:14:48.000 asteroid impacts, all sorts of different possibilities that motivate this thing to spread.
02:14:54.000 But to say, like, this is fragile, and it's not forever, so create some redundancy.
02:15:01.000 I mean, I was raised Buddhist.
02:15:04.000 I'm not that religious in that way, but I'm kind of weirdly spiritual in this other way, which is, I do think the universe is basically, it's littered with answers.
02:15:17.000 You just got to go and find out what the right questions are.
02:15:20.000 So to your point, with all these natural phenomena on Earth, you know, if that's the answer, well, the question is like, do we want to be a single-planet species or do we want to have some built-in redundancy?
02:15:37.000 And, you know, maybe 100 years from now that builds on top of what happens in the next five, we'll have discovered all kinds of different planets.
02:15:46.000 That's an amazing thing.
02:15:48.000 Unquestionably.
02:15:49.000 Unquestionably.
02:15:50.000 And we also know that there's planets in our immediate vicinity that used to be able to harbor life like Mars.
02:15:57.000 We know that Mars was covered in water and Mars had a sustainable atmosphere.
02:16:01.000 So we know that this is not just a dream, that this is possible, that what we're experiencing here on Earth is temporary.
02:16:08.000 And if we get hit by something big—well, we know Earth was hit by a planet in its formation.
02:16:14.000 There was Earth 1 and Earth 2. The formation of the moon, the primary theory is that we were hit by another planet, and that's why we have such a large moon.
02:16:22.000 Is that right?
02:16:23.000 I didn't know that.
02:16:24.000 It's a quarter the size of Earth.
02:16:25.000 It's like keeping our atmosphere stable and keeping our—it's— A wild shooting gallery out there.
02:16:32.000 It really is.
02:16:33.000 And especially our particular solar system has a massive asteroid belt.
02:16:39.000 There's like 900,000 near-Earth objects.
02:16:42.000 But isn't that so inspiring?
02:16:44.000 This idea of discovering all these other questions that we don't know yet to even ask.
02:16:51.000 That is a life well lived.
02:16:54.000 Yes.
02:16:55.000 That's the most promising aspect of a hyper-intelligent AI, in my opinion: that it'll be able to solve problems that are intractable for us, and also offer us real hard data about how big of a problem this is and when it needs to be solved by, and then come up with actionable solutions.
02:17:18.000 Yeah.
02:17:19.000 And that seems to be something that might escape us as biological entities with limited minds, especially when we're not working together.
02:17:27.000 And you could get AI to have the accumulated power, mind power of everyone, you know, 10x.
02:17:35.000 The mental model is, if an alien showed up today, would humans by and large drop all of their internal issues and cooperate together?
02:17:50.000 Perhaps.
02:17:50.000 Perhaps.
02:17:51.000 I would hope that the answer would be yes.
02:17:54.000 It would have to be something that showed such overwhelming superiority that it shut down all of our military systems and did so openly to the point where we're like we're really helpless against this thing.
02:18:06.000 Well, so I think that one way to think about AI is that it is a supernatural system in some ways.
02:18:15.000 So if we can just find a way to cooperate and harness this and see the bigger picture, I think we'll all be better off.
02:18:24.000 Like, again, killing each other.
02:18:30.000 It's just so barbarically unnecessary.
02:18:33.000 It doesn't solve anything.
02:18:36.000 All it does is just make more anger.
02:18:38.000 It creates more hatred because what's left over is not positive.
02:18:45.000 And I think that we need to be reminded of that somehow without actually living the experience.
02:18:53.000 Yes.
02:18:54.000 My hope is that one of the things that comes out of AI and the advancement of society through this is the allocation of resources much more evenly.
02:19:03.000 And that we use AI, as I was saying before, the best way to keep people from entering into this country is to make all the other places as good as this country.
02:19:14.000 As good as this country.
02:19:14.000 Then you solve all the problems for everybody.
02:19:18.000 And you don't have this one place where you can go to get a job or you go over there and you get murdered.
02:19:24.000 Well, so I think that, you know, why are a lot of people coming to America?
02:19:29.000 A lot of the reasons, some are clearly political persecution, but a lot of the other reasons are economic, to your point.
02:19:36.000 And so if you can create economic abundance generally in the world, that's, I think, what people want.
02:19:44.000 Most people want, as you said before, a good job.
02:19:48.000 They want to come in and feel like they can point to something and say, I made that.
02:19:51.000 I want to feel proud of that.
02:19:53.000 They want to hopefully get married, have some kids, have fun with them, teach them what they were all about, and then it's our swan song and we all kind of, I don't know, get reborn or not.
02:20:05.000 Isn't it interesting that the idea of people not getting together in groups and killing people they don't know, that's utopia?
02:20:12.000 That is some sort of ridiculous pie-in-the-sky vision of the possibility of the future of humanity, where that's...
02:20:24.000 common in small groups, like even in cities.
02:20:27.000 I mean, there's individual murders and there's crimes in cities, but cities aren't attacking other cities and killing everybody.
02:20:35.000 So there's something bizarre about nations, and there's something bizarre about the uneven application of resources and possibilities and, you know, your...
02:20:49.000 your economic hopes, your dreams, your aspirations being achievable pretty much everywhere.
02:20:57.000 If we did that, I think that might be the way that we solve most violence or the most horrific nonsensical violence.
02:21:06.000 And you have this data point.
02:21:08.000 I said this before, but the most important thing that has happened is that in the last four or five years, we have severely curtailed the likelihood of war in the nominal sense.
02:21:22.000 I think Trump was able to basically draw a hard red line in the sand on that stuff.
02:21:28.000 And the underlying reason was because we had enough economic abundance where the incentives to go to war fell.
02:21:36.000 We had just a complete rebirth of domestic hydrocarbons in America.
02:21:41.000 Whether you agree with it or not, my point is it is quite clearly correlated in the data.
02:21:46.000 As we were able to produce more stuff, so economic abundance, we had less need to go and fight with external parties.
02:21:55.000 So I do think you're right, like this reduces it down to we need to find ways of allocating this abundance more broadly to more countries.
02:22:07.000 Meanwhile, that one crazy thing that you can't unwind and go back from, you can just never go there.
02:22:14.000 And you just have to make sure nobody believes that that is justified.
02:22:19.000 Because in a nuclear event, I think that that's not what happens.
02:22:24.000 Clearly.
02:22:25.000 I saw this brilliant discussion that you had where you were explaining that Trump is the wrong messenger, but many of the things that he did actually were very positive.
02:22:38.000 And I think that is a...
02:22:41.000 It's a very difficult thing to describe.
02:22:45.000 It's a very difficult thing to express to people because we're so polarized, particularly with a character like Trump that's so polarizing, it's very difficult to attribute anything to him that is positive, especially if you're a progressive,
02:23:03.000 or if you're on the left, or if you've been a lifelong Democrat, or if you're involved in tech.
02:23:08.000 I mean, it's this bizarre denial of basic reality.
02:23:15.000 The reality of what you can see based on what was put in place, what actions were taken, what were the net benefits?
02:23:26.000 I've always been a liberal.
02:23:29.000 And I think I should define what liberalism used to mean.
02:23:33.000 It used to mean absolutely no war.
02:23:36.000 And it used to mean free speech.
02:23:39.000 And it used to mean a government that was supportive of private industry.
02:23:46.000 Try your best.
02:23:47.000 Go out there.
02:23:47.000 We'll look out for you.
02:23:49.000 Come back to us if things go haywire.
02:23:51.000 That's an incredible view of the world.
02:23:54.000 And I think what happened was when I was given a choice, I would vote Democrat or I would support Democrats because I thought that that's what they stood for.
02:24:08.000 And I didn't really understand Trump.
02:24:11.000 And so what happened was I got too caught up in the messenger and I didn't focus enough on the message.
02:24:21.000 And I didn't even realize that.
02:24:23.000 I didn't realize it in 2016, but I don't think many people did.
02:24:28.000 And then in 2020, I got lost in it.
02:24:31.000 But probably by 21 or 22, I started to see all this data, and I said, hold on, I am not being a responsible adult the way that I define responsibility.
02:24:44.000 I am not looking at this data from first principles, and I need to do it.
02:24:50.000 And when I did, what I saw was a bunch of decisions that turned out to be pretty smart.
02:24:59.000 The problem is that because he's the vessel, he turns off so many people with his delivery.
02:25:07.000 And I think this is a moment where the stakes are so high, you have to try to figure out what the message is versus what the messenger is saying.
02:25:16.000 Or look to somebody else that can tell you the message in a way that maybe will allow you to actually listen to it.
02:25:24.000 That could be J.D. Vance, it could be Elon Musk, it could be RFK. There's all kinds of surrogates now, because I think that they have realized that there's a lot of value in these messages.
02:25:38.000 Yes.
02:26:00.000 It's like, very unique times create strange bedfellows.
02:26:04.000 It's sort of like one thing that always pops out at me: why are they working together?
02:26:09.000 Why are they cooperating?
02:26:10.000 I always think like, what's going on here?
02:26:12.000 And when I saw him and Bobby align, you know, Bobby has a very balanced view of Donald Trump.
02:26:20.000 Here's the good, here's the bad.
02:26:22.000 Even now, even with everything that's on the line for Bobby and Bobby's agenda, he's quite honest about Donald Trump's positives and negatives.
02:26:34.000 But they both get along.
02:26:37.000 And one of the things, and probably the most important thing, where they were sounding the drum from day one is, under no circumstance will the United States go to war.
02:26:49.000 I just think we should observe that.
02:26:51.000 People should have an opinion on that.
02:26:54.000 He's so polarizing that there have been two attempted assassinations on him and no one cares.
02:27:01.000 He's like Neo in The Matrix.
02:27:02.000 He's like dodging these bullets.
02:27:04.000 For now.
02:27:05.000 You know?
02:27:05.000 Yeah.
02:27:06.000 But listen, no one can dodge forever.
02:27:07.000 But the thing is, it's like no one seems to care that the rhetoric has ramped up so hard and has been so distorted.
02:27:16.000 The other thing that people need to, I think, think about is...
02:27:19.000 The domestic policy agenda of both the Democrats and the Republicans are within error bars.
02:27:26.000 And what I mean by that is when push comes to shove, they both, whether it's Kamala Harris or Donald Trump, they have to work through a very sclerotic Congress, which means that very little will ultimately get done if you just look at the track record of all these past presidents.
02:27:46.000 You typically get one piece of landmark legislation passed in your first two years, and it all just gets unwound.
02:27:55.000 It's happened from Clinton onwards.
02:27:57.000 You know, Bush had one bite at the apple.
02:28:00.000 Obama had one bite at the apple.
02:28:01.000 Trump had one bite at the apple.
02:28:03.000 Biden had one bite at the apple.
02:28:05.000 So the American political system has a really incredible way of, like, insulating itself.
02:28:14.000 So, if people would just take a step back and look at that, a lot of the policy agendas that both of them espouse are going to be very hard to get done.
02:28:25.000 There'll be one thing, you know, maybe they both do something on, you know, domestic taxation.
02:28:31.000 Maybe they both do something on the border.
02:28:34.000 But the likelihood, based on the past, is that they'll get one of these things done and then not much will be done.
02:28:41.000 This is why I think folks need to think about, okay, what are the super presidential powers where they can act alone?
02:28:53.000 One area where they can act alone is they can issue executive orders.
02:28:58.000 And that can direct the behavior of governmental agencies.
02:29:04.000 Okay, so people should decide what they think about that.
02:29:07.000 Do you want a muscular American bureaucracy?
02:29:11.000 Do you want a more slimmed down one?
02:29:13.000 Do you want one that has, you know, bigger ambitions, more teeth?
02:29:19.000 Do you want one that is zero-based budgeted?
02:29:22.000 They're pretty stark on those things.
02:29:25.000 And then on foreign policy, I think one camp is much more of the view that we are the world's policemen and there's a responsibility that comes with that.
02:29:38.000 And one says, we got a lot of problems at home.
02:29:41.000 We're not getting pulled into something abroad.
02:29:45.000 And I think people need to decide about that.
02:29:46.000 But other than those two threshold issues, my honest opinion is that we're in error bars between the two of them.
02:29:56.000 One will cut taxes by this much, one will increase taxes by that much.
02:30:02.000 But there are real decisions that have been made during the Biden administration about the border that are affecting people.
02:30:09.000 Or lack thereof.
02:30:10.000 I think it's a decision.
02:30:12.000 I don't think it's a lack thereof, especially with the flying people in and the utilization of an app.
02:30:18.000 To fly people in?
02:30:20.000 That seems insane.
02:30:22.000 The whole thing seems insane and I don't know what the motivation is.
02:30:26.000 I've talked to people that know a lot about the construction business and they believe the motivation is cheap labor.
02:30:30.000 I think that's part of it, and that a lot of the problem in many industries
02:30:36.000 is the lack of cheap labor and people who are willing to do jobs.
02:30:38.000 It's one of the things that I've heard, you know, there's a lot of criticism about all the Haitians that have moved to Springfield, Ohio, but one of the positive things that I've heard from people that live there is that these people are hard workers and they're willing to do jobs that the other people weren't willing to take on.
02:30:53.000 So you have pros and cons, but you have this incentivized effort to move people into this country illegally, which will undoubtedly bring in people that you don't want here.
02:31:05.000 Gang members, cartel members, terrorists.
02:31:08.000 That's real.
02:31:09.000 And we've documented that.
02:31:10.000 And there's people that have been arrested that were trying to come in that were terrorists.
02:31:13.000 And there's people that have gotten through for sure.
02:31:15.000 I think that if I give both of them the benefit of the doubt, I think both of them will have to act on the border.
02:31:21.000 I think that Donald Trump has had a clearer view of this issue for far longer.
02:31:27.000 I think that Kamala has had to shift her position to make herself more palatable to centrists.
02:31:34.000 But I do think that both of them will probably have to act, because I don't think what's happening today is sustainable.
02:31:42.000 I don't think it is either, but the fear, and Elon's talked about this, the real fear is that they're bringing these people in, give them a clear path to citizenship, which will allow them to vote, and then you've essentially bought their vote.
02:31:54.000 So if the Democrats bring them in, incentivize them to become Democrats and vote, and give them money, which they clearly are doing, they're giving them EBT cards, and they're giving them housing, and they're giving them things that they're not giving to veterans and poor people in this country.
02:32:09.000 That seems to be an incentive.
02:32:36.000 Mm-hmm.
02:32:39.000 I mean, I—yeah, I don't know whether it's a conspiracy, per se, but I do agree with the outcome, meaning I remember very vividly—
02:32:57.000 my parents took the whole family, the three of us, myself and my two sisters, to Niagara Falls, and then we crossed the border to Buffalo.
02:33:06.000 And we applied for refugee status in America as well.
02:33:12.000 We didn't get it.
02:33:13.000 We were rejected.
02:33:15.000 And when we went back, we got a tribunal hearing in Ottawa, where I grew up.
02:33:23.000 And I remember that it was in front of this magistrate judge, so the person comes in with the robes and the hair and everything, and you sit there, and they hear- They have the wigs up there?
02:33:31.000 All of it, yeah.
02:33:34.000 The wigs.
02:33:35.000 And then they sit there and they hear your case out.
02:33:40.000 And my father had to defend our whole family.
02:33:45.000 Here's, you know, what our life was like.
02:33:46.000 Here's what we did.
02:33:48.000 And I remember just crying from the minute it started.
02:33:52.000 That's all I did the whole time.
02:33:54.000 It seared in my mind because, like, you know, your life is right there.
02:33:58.000 It's like a crucible moment for your whole family.
02:34:00.000 If they're like, I don't buy it, off you go.
02:34:04.000 We go back and I don't know what would have happened.
02:34:08.000 Fortunately, obviously, it worked out.
02:34:11.000 And then, you know, you go through the process, I became a Canadian citizen.
02:34:14.000 Then I moved to the United States on a visa, and then I became an American citizen.
02:34:18.000 I have an enormous loyalty to this country.
02:34:23.000 And so when I think about, like, Americans not getting what they deserve before other folks, it really does touch me in a place—like, I get very agitated about that idea.
02:34:34.000 It's not that those folks shouldn't be taken care of in some way, shape, or form, because I was one of those people that needed a little bit of a safety net, right?
02:34:43.000 We needed welfare.
02:34:44.000 We needed the places to go in to get the free clothes and all that stuff.
02:34:51.000 But you have to sort of take care of all of the people that are putting in the effort and the time to be here and followed the rules and stood in line.
02:35:02.000 Like when I came to the United States, man, I came on a TN visa.
02:35:08.000 Every year, you had to get it renewed.
02:35:10.000 You had to show up.
02:35:11.000 And if the person that was looking at you said, Chamath, out, you're gone, Joe.
02:35:17.000 Then I had to transfer to an H-1B visa.
02:35:19.000 My company had to show that there wasn't an American that could do this job.
02:35:26.000 And then we were able to show that.
02:35:29.000 So I've lived this experience of an immigrant following the rules, and just methodically and patiently waiting and hoping, and the anxiety that comes with that.
02:35:40.000 Because it comes with tremendous anxiety.
02:35:42.000 If you ask people that were on H-1Bs in America, there was a website, I don't even know if it exists anymore, but we would check what, you know, because when you apply for a green card, you get an application date.
02:35:57.000 And man, I would sweat that website every other week.
02:36:00.000 Hey, did they update that?
02:36:01.000 And it would be like four years in the past and I'm like, I'm never going to get my green card.
02:36:07.000 My visa is going to expire.
02:36:08.000 I'm going to have to move back to Canada.
02:36:10.000 But I still play by the rules.
02:36:13.000 So I just think it's important to recognize that there are a lot of folks that play by the rules that are immigrants to this country.
02:36:19.000 There are a lot of people that were born here that have been playing by the rules.
02:36:22.000 And I think we owe it to them to do the right thing for them as well.
02:36:27.000 And then try to do the right thing for some folks that are coming across the border because there probably are some of them legitimately are escaping some really bad stuff.
02:36:37.000 Quite a lot of them.
02:36:37.000 Quite a lot of them.
02:36:38.000 And I'm sure most of those people are people that just want a better opportunity.
02:36:41.000 And that's a great thing.
02:36:43.000 And that's a great thing.
02:36:44.000 But you have to take care of all the people here, especially the veterans and especially these people that have been struggling in these inner cities that have dealt with the redlining and all the Jim Crow laws that have set them back for decades and decades and has never been corrected.
02:37:00.000 There's never been any effort to take these places that have been economically fucked since the beginning of the 20th century and correct it.
02:37:10.000 And instead you're dumping all this money into people that have illegally come here.
02:37:14.000 That to me is where it starts looking like a conspiracy.
02:37:17.000 I think that as long as people can explain what they're doing for these other folks that you just mentioned, then for the 50% of the population that leans red on this topic, you could at least explain it to them.
02:37:34.000 The problem is that there is no explanation.
02:37:36.000 There is a $150,000 home credit that Gavin Newsom was about to give.
02:37:43.000 I think he vetoed it.
02:37:44.000 I could be wrong.
02:37:45.000 He did.
02:37:45.000 It was wildly unpopular.
02:37:47.000 But that bill somehow gets to his desk.
02:37:50.000 And is there a bill that says we should have better food for the food deserts?
02:37:56.000 Did that bill get passed?
02:37:58.000 So there's clearly a way for state legislatures to do what's right on behalf of the folks in their state.
02:38:06.000 So if we just had a little bit more balance, and then if we were able to shine a light on those things...
02:38:14.000 A lot of the people that live here that contribute would feel better about where things were going and wouldn't feel like the whole thing is just rigged.
02:38:25.000 Right.
02:38:25.000 Well, that's one of the things that people are so excited about with this Trump union with Tulsi Gabbard and Robert Kennedy is that you're having these movements that seem to be almost impossible to achieve outside of an outsider,
02:38:41.000 like the Make America Healthy Again concept.
02:38:43.000 What are you talking about?
02:38:45.000 You're going to go up against these companies that have been donating to these political parties forever and have allowed them to have these regulations that are allowing them to have these dyes in food that's illegal in our neighboring Canada?
02:38:58.000 What?
02:38:59.000 No one's done that before, right?
02:39:01.000 So that's very exciting.
02:39:03.000 But again, messenger message.
02:39:05.000 Messenger message.
02:39:06.000 Just take a step back, though, and if you were just the average Joe citizen, I think an important thing to just notice is why are all these totally different people acting as a team?
02:39:27.000 I just think it's an interesting thought question for...
02:39:31.000 I don't have an answer.
02:39:32.000 And I'm not going to plant an answer.
02:39:35.000 But just ask yourself, why are all of these people cooperating?
02:39:40.000 And I think...
02:39:42.000 The 2024 election is a lot about the traditional approach to governance versus a very radical reimagining of government.
02:39:56.000 And I think that's what effectively will get decided.
02:39:58.000 The traditional approach says, we're going to create robust policies, we're going to work sort of top down, you know, this muscular foreign policy, muscular domestic policy, the government's going to play a large part in the economy, and we're going to try to right some wrongs.
02:40:15.000 The radical reimagining says, we're going to go back to a more founding notion of this country.
02:40:25.000 We're going to have a very light governmental infrastructure.
02:40:29.000 We are going to cut back a bunch of the rules.
02:40:34.000 And we're going to take a little bit of a step back on foreign policy so that we don't end up in a situation we can't pull back from.
02:40:44.000 In that lens, it's very different.
02:40:47.000 In the lens of actual policy, I honestly think that it's pretty much six of one, half a dozen of the other.
02:40:53.000 But in that first lens, they're really markedly different choices.
02:40:57.000 And we'll see.
02:40:59.000 But in the lens that you're describing, the thing that distorts everyone's vision is Donald Trump as a human being.
02:41:05.000 That's the thing.
02:41:06.000 And it's also the media's depiction of him, which has been grossly distorted.
02:41:10.000 And I think that, you know, I met him and spent time with him.
02:41:16.000 I've also had lunch with Kamala.
02:41:19.000 She was very kind, very nice person.
02:41:21.000 Donald Trump, very funny, very kind, very polite, like he talks to you.
02:41:33.000 And I just was like, wow, this is exactly what you said.
02:41:37.000 I was expecting something totally different.
02:41:43.000 And I think, though, that at the core, part of where the media goes crazy, I'm guessing, is that there's a part of him as well that's like an entertainer.
02:41:54.000 I mean, he's as good as any comedian.
02:41:59.000 Yeah.
02:41:59.000 He's on point.
02:42:00.000 He's got rhythm.
02:42:03.000 He knows how to land.
02:42:04.000 So there's a thing that he's doing when he's on stage, which for the audience, I think, is no different than going to a show or a revival or something.
02:42:13.000 You're seeing a star.
02:42:15.000 But then if you're looking at him as Donald Trump, the person, I think the media really gets tilted.
02:42:24.000 Yeah.
02:42:24.000 Well, not only that, they've distorted who he is.
02:42:28.000 Whatever flaws Donald Trump has are nothing in comparison to the media's depictions of him.
02:42:34.000 Everybody's got flaws.
02:42:36.000 His, I think, exist and are well described.
02:42:40.000 But I do think there are a couple of good examples.
02:42:46.000 You know, one example that bothered me was the Charlottesville press conference.
02:42:55.000 When I first heard the media depiction of it, I was really upset because of what I thought he said.
02:43:02.000 It turned out he didn't say it.
02:43:04.000 In fact, not only did he not say it, he said the exact opposite.
02:43:09.000 And then I was really frustrated and a little bit angry, because I realized he was never lying to me.
02:43:15.000 The filter was lying to me.
02:43:17.000 Right.
02:43:18.000 And I'm not paying for those people to lie to me.
02:43:23.000 I'm paying for them to actually give me the transcript of it so that I can decide for myself.
02:43:28.000 I think that's part of a responsibility of being a cogent adult.
02:43:32.000 And the only repercussion of their lying is the lack of trust that people have in them now.
02:43:41.000 I think the trust in the mainstream media is the lowest it's ever been.
02:43:46.000 I think way more people trust you.
02:43:48.000 Way more people trust us to tell a version of what we think is happening.
02:43:56.000 You're not going to lie.
02:43:57.000 And you're like interested in just showing the clips and then just debating.
02:44:01.000 What did he mean?
02:44:02.000 What did he say?
02:44:03.000 Why did he say this?
02:44:04.000 Why did he say that?
02:44:06.000 By the way, the same goes for Kamala because now, you know, the domestic political machinery is going to try to characterize her as well.
02:44:14.000 Cherry-picked comments that she makes.
02:44:16.000 So my point is, I think we have to suspend this.
02:44:19.000 Right, but it's not balanced.
02:44:20.000 Right.
02:44:20.000 In particular, look at the debate, where they fact-checked Trump multiple times but didn't fact-check her.
02:44:27.000 By the way, I read this.
02:44:29.000 Is this true or not: the co-host was her sorority sister?
02:44:33.000 Yes.
02:44:34.000 That is true?
02:44:35.000 Yes.
02:44:38.000 Not just that, but there's the affidavit by the person from ABC saying she was aware of the questions and was told that certain things were going to be off-topic or off-limits, like her record as a DA. And there's also some other person that she's attached to who's involved in something shady.
02:44:58.000 And then on top of it, there were things that she said that were absolutely untrue.
02:45:02.000 One of them, one of the grossest ones, was her saying that we don't have any troops deployed in combat zones.
02:45:08.000 Have you ever seen that one where the troops in the combat zones are going, what the fuck are we doing here?
02:45:14.000 And then Dan Crenshaw.
02:45:16.000 See if you can find Dan Crenshaw's post on Instagram.
02:45:19.000 He...
02:45:20.000 He's a Navy SEAL with one eye.
02:45:22.000 Yeah, yeah.
02:45:22.000 I mean, that's a guy who understands the consequences of war, right?
02:45:26.000 Clearly.
02:45:26.000 Paid the price personally for serving.
02:45:28.000 But on his Instagram, he laid out how many troops are in active combat zones.
02:45:34.000 It's tens of thousands of American citizens in multiple combat zones.
02:45:39.000 I mean, look, I was trying to be charitable when I said that.
02:45:44.000 Like, I'm going to assume that both people are smart.
02:45:52.000 They believe what they believe.
02:45:54.000 They're both trying to do what they think is right.
02:45:55.000 They both want to win.
02:45:56.000 Okay, let's take that off the table for a second.
02:46:00.000 The filters that then try to convey that message will try to pervert that truth for their own best interests.
02:46:07.000 And I think what they have decided, the mainstream media, is that their best interests are better served through one, i.e. Kamala, and less well served through the other, i.e. Donald Trump.
02:46:16.000 Yes.
02:46:17.000 So if there's ever been a moment where it's time for all of us to show up and be grown-ups, to try to get the source material, to try to think about things from first principles, it's now. I'm telling you, it is the most consequential election of our lifetime.
02:46:36.000 And the simplest reason why is that the president is the one who decides whether to hit the button on the nuclear football.
02:46:46.000 So just imagine for one second, irrespective of what your politics is, who do you want to hold that thing?
02:46:57.000 What do you want them to do under even the most excruciating pressure in the world?
02:47:05.000 And I think what we want to find is someone who, at least in that very narrow moment, will make a JFK-like decision:
02:47:16.000 I will block everybody else out.
02:47:19.000 It is entirely about my desire and wish for how I want America to be known.
02:47:24.000 And I'm going to protect my children and my grandchildren.
02:47:27.000 You cannot touch the button.
02:47:29.000 You can't get close to the button.
02:47:32.000 I think we're a lot closer than people think.
02:47:35.000 I think we are, too.
02:47:37.000 Thank you very much for being here, man.
02:47:38.000 That was really fun.
02:47:39.000 I really enjoyed it.
02:47:40.000 Thanks.
02:47:41.000 I know you're very busy, so I really appreciate your time.
02:47:43.000 It was a real honor.
02:47:44.000 It was an honor for me as well.
02:47:46.000 Thank you.
02:47:47.000 All right.
02:47:47.000 Bye, everybody.