The Joe Rogan Experience - May 05, 2026


Joe Rogan Experience #2494 - Chamath Palihapitiya


Episode Stats


Length

2 hours and 45 minutes

Words per minute

182.4

Word count

30,230

Sentence count

2,497


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcripts from "The Joe Rogan Experience" are sourced from the Knowledge Fight Interactive Search Tool.
00:00:02.000 Joe Rogan Podcast, check it out.
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:09.000 Yeah, I was listening to Tim.
00:00:14.000 First of all, hello.
00:00:15.000 What's up?
00:00:15.000 Good to see you, my friend.
00:00:16.000 Great to see you.
00:00:17.000 We were listening to Tim Dillon.
00:00:19.000 I was listening to it on the way over here, and he was talking about Anna Paulina Luna and Tim Burchett and Trump.
00:00:25.000 They're all talking about the UAP disclosures and, like, why now?
00:00:30.000 Like, what are they doing?
00:00:31.000 Like, why are they distracting us with this?
00:00:33.000 Tim Burchett said that whatever they're going to release, it will be indigestible.
00:00:39.000 What does that mean?
00:00:40.000 Right.
00:00:41.000 Indigestible, as in, or, well, then it doesn't mean that it's real, then.
00:00:46.000 Well, I think it means that it'll be so crazy, if it's real.
00:00:50.000 So crazy.
00:00:51.000 He's the one that's been saying that there's these confirmed bases under the ocean, that there's these specific locations.
00:00:58.000 I think you talked to, you're shaking your head.
00:01:00.000 You don't believe a word of it.
00:01:01.000 No.
00:01:02.000 How come?
00:01:03.000 I think it's true that there are.
00:01:05.000 Look, it's completely implausible that there aren't other species.
00:01:11.000 Right.
00:01:12.000 Completely implausible.
00:01:14.000 Just the vastness of what we're dealing with.
00:01:17.000 So, the real question is why haven't we encountered people or those things, those beings?
00:01:22.000 Right.
00:01:23.000 And it's probably because they just have bigger fish to fry.
00:01:28.000 So, by the time that we meet them and they meet us, we're going to kind of be at the edge of like we've kind of been there, done that on our own planet, and then we've kind of like developed the technology, I guess, to get beyond it.
00:01:42.000 But somewhere along the way, there must have been a few; it's just mathematically impossible otherwise.
00:01:45.000 So then the question is, is it buried or were people confused when it first came?
00:01:49.000 You're like, if you had a spaceship land in like the 1800s, what would people have done?
00:01:54.000 They would have just freaked out, they wouldn't have understood it.
00:01:56.000 Maybe they would have buried it.
00:01:57.000 Depending on where it was, maybe they started to pray to it.
00:02:01.000 And you would have just moved on.
00:02:03.000 And then that isn't documented in history.
00:02:05.000 But it is.
00:02:06.000 But how?
00:02:07.000 It is.
00:02:07.000 There's a lot of it.
00:02:09.000 Documented in history.
00:02:10.000 Oh, you mean like hieroglyphics and like monuments?
00:02:12.000 Well, the book of Ezekiel.
00:02:13.000 The book of Ezekiel goes in depth about some sort of a UFO encounter that Ezekiel experiences.
00:02:20.000 Right.
00:02:20.000 Where it's a wheel within a wheel and a cloud with fire flashing forth continually in the midst of a cloud as it were gleaming metal.
00:02:30.000 And from the midst of it came the likeness of four living creatures and the creatures darted to and fro like the appearance of a flash of lightning.
00:02:37.000 This is all in the Bible.
00:02:39.000 It's also in the Mahabharata.
00:02:42.000 They talk about Vimanas, these flying crafts.
00:02:45.000 And I think it's entirely possible that we have been visited periodically and that we have been monitored and that we are monitored.
00:02:54.000 I agree.
00:02:55.000 Currently.
00:02:56.000 I agree.
00:02:56.000 And if I was going to hide, I would hide in the ocean.
00:02:59.000 Well, to be honest, as I get older, I'm convinced we're basically in some form of a simulation.
00:03:06.000 There's like all these little ingredients that if you start to see these little clues, you're like, They all seem so odd in isolation.
00:03:14.000 And then when you put them together, I feel like a crazy person.
00:03:16.000 So I ignore myself.
00:03:18.000 But I wonder why did this happen?
00:03:19.000 Like yesterday, I was at a dinner in LA before I came to see you.
00:03:24.000 And I told this very interesting story.
00:03:27.000 Well, or I thought it was interesting at the time.
00:03:31.000 You know, that like, so in 2000, right?
00:03:34.000 If you think of like what happened in tech since 2000, so the last 26 years, people can give you all kinds of like fancy theories.
00:03:44.000 But there's just like this weird word that's been at the center of every single technological revolution for the last 30 years, and that word is attention.
00:03:56.000 Let me explain this to you.
00:03:58.000 Google, they invent Google.
00:04:00.000 Google is an algorithm.
00:04:00.000 What is Google?
00:04:03.000 It's called PageRank.
00:04:05.000 But if you look inside of it, what is it?
00:04:06.000 It says, well, Chamath's website has five links to it.
00:04:11.000 Joe's website has two links.
00:04:12.000 He's getting more attention.
00:04:15.000 Okay?
00:04:15.000 Chamath's website is more important.
00:04:17.000 That's the sum total of Google.
00:04:18.000 Now, they've made that a lot more refined and they've done all these other fancy things.
00:04:25.000 But it's all about attention.
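(A rough sketch of the idea just described: PageRank treats inbound links as votes of attention and ranks pages accordingly. A toy illustration, not Google's production algorithm; the damping factor is a textbook default and the site names are made up.)

```python
# Toy PageRank, assuming the simplified "links as attention" description above.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Five links point at chamath.com, two at joe.com, as in the example.
web = {
    "chamath.com": [], "joe.com": [],
    "a": ["chamath.com"], "b": ["chamath.com"], "c": ["chamath.com"],
    "d": ["chamath.com"], "e": ["chamath.com", "joe.com"], "f": ["joe.com"],
}
print(pagerank(web))  # chamath.com ends up with the highest score
```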
00:04:26.000 Fast forward to 2007, 8, 9, when Zuck and then when I went to work for Zuck and we got on the scene, we're like, What does everybody care about?
00:04:36.000 Attention.
00:04:37.000 And so, what is like the Facebook algorithm?
00:04:40.000 What's the Instagram algorithm?
00:04:42.000 You know, how did we construct newsfeed all around attention?
00:04:46.000 Joe had 35 likes, Jamie had 12 likes.
00:04:48.000 Your thing is more important.
00:04:49.000 Let's give it more importance because it's seemingly meeting all these human needs.
00:04:54.000 Attention, attention, attention.
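(The like-counting idea above, in miniature. Real feed ranking blends far more signals; the fields and weights below are invented purely for illustration.)

```python
# Toy feed ranking: score posts by the attention they attract, show the highest first.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int

def attention_score(post: Post) -> float:
    # Hypothetical weighting: a comment counts as stronger attention than a like.
    return post.likes + 2.0 * post.comments

feed = [Post("Joe", likes=35, comments=4), Post("Jamie", likes=12, comments=1)]
for post in sorted(feed, key=attention_score, reverse=True):
    print(post.author, attention_score(post))  # Joe's post ranks first
```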
00:04:56.000 So, phase one, attention.
00:04:58.000 Phase two, attention.
00:05:00.000 And this is where I'm like, how can this be possible?
00:05:02.000 In phase three, we're like looking at AI.
00:05:05.000 And when you look backwards four years, the seminal paper is called Attention is All You Need.
00:05:10.000 It's about this word again.
00:05:12.000 And when you look inside of the core part, if you peel apart AI, the little brain that makes it so capable is called an attention mechanism.
00:05:24.000 It's just attention.
00:05:25.000 It's all about, again, this idea of I'm going to scour all this information and I'm going to figure out what patterns repeat themselves and I'm just going to double down on the stuff that I see more of because that attention must mean it's more important, it's more true, it's more knowledgeable.
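(A minimal sketch of the mechanism referenced here: scaled dot-product attention from the "Attention Is All You Need" paper, in which each token's output becomes a relevance-weighted average of the other tokens. Toy sizes; real models add learned projections, multiple heads, and masking.)

```python
# Minimal scaled dot-product attention.

import numpy as np

def attention(Q, K, V):
    """Q, K: (tokens, d_k); V: (tokens, d_v). Each output row is a weighted mix of V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # how strongly each token attends to each other token
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)  # softmax over the keys
    return w @ V  # attention-weighted average of the values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(attention(Q, K, V).shape)  # (4, 8): each token is now a mix of the others
```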
00:05:40.000 And then I think, how could it be?
00:05:42.000 Like, we're all like, why is it that these things are just repeating over and over again?
00:05:46.000 And I just get confused.
00:05:48.000 I don't exactly know how to explain it.
00:05:50.000 So, are there other ways in which we should be doing things?
00:05:53.000 Absolutely.
00:05:54.000 Have we even explored it?
00:05:55.000 So then I think, well, is this just a simulation?
00:05:55.000 No.
00:05:57.000 Some kid in his house just playing some simulation and we're all just party to it and that's all he understands is attention.
00:06:04.000 I don't know.
00:06:05.000 I don't think it's that simple that there's a person playing a game.
00:06:09.000 But if you break down just attention, well, that's.
00:06:13.000 All of human history is paying attention to the king, paying attention to the war, paying attention to resources, paying attention to who says the thing that resonates the most with the people.
00:06:26.000 It's all about what human beings are paying attention to.
00:06:30.000 I think it's part of it.
00:06:32.000 Then there's also what is actually true.
00:06:35.000 I think sometimes what is true and what people pay attention to are not the same thing.
00:06:40.000 True.
00:06:41.000 Yeah.
00:06:41.000 And sometimes... the thing that you should be paying attention to gets lost because the thing that you are paying attention to gets more attention because it's more interesting and useful.
00:06:52.000 That's sort of where we are right now.
00:06:54.000 We're in this really weird phase, I think, where you actually should be focused on this thing over here, and instead we're all focused on all these things over here.
00:07:04.000 Give me an example.
00:07:08.000 Here's a very big one.
00:07:11.000 I think it's pretty fair to say since the last time you and I saw each other on this show, the attitude towards technology, I think, has been pretty profoundly negative.
00:07:22.000 It's kind of tilted.
00:07:23.000 It's relatively like anti AI, anti billionaires.
00:07:29.000 It's anti all of this stuff.
00:07:32.000 And it manifests in all of these interesting ways.
00:07:37.000 There's protests, there's data centers, there's all of this stuff that's happening.
00:07:41.000 People are worried about job loss.
00:07:44.000 All of that stuff is real.
00:07:45.000 Do you want a cigar?
00:07:46.000 No, I'm okay.
00:07:46.000 I'm okay.
00:07:49.000 But what should they really be focused upon?
00:07:51.000 And I think what they should be really focused upon is we're at the tail end of a cycle that doesn't work anymore, which is all about this tension between labor, people that do the work, and capital, the people that fund it and then make all the returns.
00:08:06.000 And over the last 40 years, we've basically gone to this completely upside down world where capital extracts all of the upside and labor has extracted less and less and less and less.
00:08:18.000 And all of this pushback manifests in AI.
00:08:22.000 It manifests in politics.
00:08:23.000 It manifests in social issues.
00:08:25.000 It manifests in Israel.
00:08:27.000 Whatever you want to talk about, all of these issues, I think symptomologically, come from this other issue, which is we are out of balance.
00:08:34.000 This total compact that we used to have, a liberal democracy and a free market, has totally collapsed.
00:08:41.000 And there are simple ways to fix that, but that never gets the attention because it's not what you want to talk about.
00:08:46.000 The attention is here.
00:08:49.000 Vote no to the data center.
00:08:52.000 This model is going to take out all the jobs.
00:08:56.000 You know, this social issue is really important.
00:08:59.000 That war should not be fought.
00:09:00.000 That war should be fought.
00:09:02.000 All of these things, while important, distract us from what the core issue is.
00:09:08.000 And the core issue is that we as a society, I think, are out of balance.
00:09:12.000 The natural compact between all of us is broken.
00:09:17.000 And there are some simple ways to fix that compact: get people more invested, get people more engaged in the upside, have people have a positive view of what's happening.
00:09:26.000 And that isn't happening.
00:09:27.000 What simple solutions are there to?
00:09:30.000 To this one very particular issue.
00:09:33.000 Okay, I'll get your reaction to this.
00:09:36.000 Let's assume that you still lived in California, because I think it tells this example in a more extreme way.
00:09:41.000 Okay.
00:09:42.000 Let's say you make a million bucks a year, which is a lot of money, but it makes the point more cleanly.
00:09:50.000 You'd pay, I think, 30% federal tax, and you'd pay another 15 or 16% in state tax and Medicare tax and all this tax.
00:10:02.000 So, if you're a wage earner, 50% of all your upside goes to the government.
00:10:12.000 If you're a capital earner and you make that same million dollars via capital gains, you pay half that tax.
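(Quick arithmetic on the round numbers used in this example; illustrative only, not tax advice.)

```python
# The speaker's round figures: ~$1M income, ~46% all-in on wages, half that on capital gains.

income = 1_000_000

wage_rate = 0.30 + 0.16          # ~30% federal + ~16% state/Medicare, per the example
cap_gains_rate = wage_rate / 2   # "you pay half that tax" via capital gains

print(f"wages:         ${income * wage_rate:,.0f} tax ({wage_rate:.0%})")            # $460,000 (46%)
print(f"capital gains: ${income * cap_gains_rate:,.0f} tax ({cap_gains_rate:.0%})")  # $230,000 (23%)
```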
00:10:21.000 Why did that happen?
00:10:23.000 That happened because in the 40s and 50s, but really in the 60s and 70s and 80s, what we were trying to do, or what the American government and what Western societies were trying to do, was to convince people to invest their money.
00:10:38.000 Hey, Joe, go build that factory.
00:10:40.000 Go hire those people.
00:10:42.000 And we're going to incentivize you to do so.
00:10:45.000 And by doing that, there was this idea that all of those profits that you would get would then diffuse, right?
00:10:51.000 Trickle down into everybody else.
00:10:53.000 The workers participated, everybody participated.
00:10:57.000 But technology allows you to do more with less and less.
00:11:01.000 So now what happens is the capital owners can accrue
00:11:06.000 infinite, almost, it seems like, value.
00:11:10.000 And the workers get less and less.
00:11:11.000 But now, if you get less and less and you're taxed more and more as a percentage of what you own, you're going to feel really out of sorts.
00:11:17.000 You're going to be like, why am I paying 50 cents of every dollar?
00:11:20.000 And I see these other ways where folks are paying 25 cents on their dollars, but their dollars are compounding way faster and they have hundreds of billions more of those dollars than I have of my dollars.
00:11:31.000 If you take that example and you expand it across society, I think people understand that now.
00:11:36.000 There's enough information and there's enough people talking about it where it's pretty clear that that's happened.
00:11:42.000 So the question is, how do you fix it?
00:11:43.000 I think, like, if you think about AI and if you believe that we're going to get into this world of abundance and we're not working, what does it mean for governments to tax our labor?
00:11:54.000 There is no labor.
00:11:55.000 You're not working anymore.
00:11:56.000 I'm not working.
00:11:57.000 We're doing things out of leisure.
00:11:59.000 Why should I pay 50 cents of every dollar?
00:12:01.000 Why aren't the companies that are going to be making trillions of dollars paying more?
00:12:06.000 Why isn't there an expectation that they then help our lived society do better and thrive as a result of all of that winning?
00:12:17.000 That's the real conversation that I think is bubbling.
00:12:23.000 And I think that we're probably another 12 to 18 months where all of these other issues are going to be important, but they're going to be viewed for what they are.
00:12:33.000 They're going to get demoted, I think, in importance.
00:12:36.000 And it's this core structural issue what is the economic relationship that we have together as a society?
00:12:42.000 What is the relationship between Joe, Chamath, Jamie, and all these companies?
00:12:48.000 And how do we
00:12:50.000 feel about a few and an ever-shrinking few making more and more and more?
00:12:57.000 And then how do we feel about their ability to share that with a small amount of people?
00:13:05.000 And then what is the expectation for everybody else?
00:13:09.000 I think that's mostly at the core of what's happening.
00:13:12.000 And so, back to like, you know, all of this attention that we give to these other issues distracts from that one because I think you can get organized to fix this issue.
00:13:21.000 You can't get concessions on any of these issues.
00:13:23.000 You know, you bring up Israel, it's like this.
00:13:25.000 You bring up social issues, it's like this.
00:13:27.000 You bring up, you know, whatever you want to bring up, people just kind of take aside, nothing happens.
00:13:33.000 This is actually where people are universally actually much more aligned than you think.
00:13:38.000 Because there's reasonable ways.
00:13:39.000 One simple way is you'd say, well, let's flip the taxation model.
00:13:43.000 Corporate taxes should exceed personal taxes.
00:13:49.000 They've never.
00:13:52.000 We should have an expectation
00:13:55.000 that corporate actors can buy down their taxes if they want, but only by doing social good for society.
00:14:02.000 I'll give you an example.
00:14:03.000 At the Industrial Revolution, there's a table like this, and the leading lights of that era (Andrew Carnegie, John D. Rockefeller, Jay Gould, J.P. Morgan) sat together and they said, Guys, this is going to benefit us, this Industrial Revolution.
00:14:22.000 It may not benefit everybody.
00:14:23.000 What is our responsibility?
00:14:24.000 What is our collective responsibility?
00:14:27.000 And they allocated tasks.
00:14:30.000 Carnegie went and built libraries all throughout the country.
00:14:34.000 Rockefeller built universities.
00:14:35.000 Hospitals were built.
00:14:38.000 And I think what happened is society was like, wow, these are living testaments to us doing well.
00:14:44.000 And so then they were okay with this transition.
00:14:47.000 But if you think about it today, what are the living tributes that capital builds and leaves behind for society?
00:14:55.000 It's fewer and fewer.
00:14:57.000 I think that's a very big opportunity for somebody to fill.
00:15:01.000 I think it's like.
00:15:03.000 Especially for folks in tech, I think.
00:15:05.000 If they can get themselves organized to do that, I think we land in a good place.
00:15:09.000 If they cannot get themselves organized to do that and say everyone for themselves, I think it's going to be really complicated, super messy.
00:15:20.000 Super messy because that sentiment that the wealthy are getting wealthier and the middle class is disappearing and the poor are being taxed into oblivion.
00:15:31.000 Look, an $80,000-a-year teacher
00:15:33.000 pays 40% tax.
00:15:35.000 But if you're a multi billionaire, most of your wealth is not W 2 wages.
00:15:42.000 It's cap gains.
00:15:43.000 But there's all kinds of ways to shelter cap gains, there's all kinds of ways to defer.
00:15:48.000 And so even though you pay more on an absolute dollar basis, on a percentage basis, you're paying way, way less.
00:15:55.000 And all of those tricks have been exposed.
00:15:59.000 They've all been exposed.
00:16:00.000 These are all mechanisms that were invented from the 1980s to now.
00:16:06.000 Right, by all the banks and all the folks that wanted to come to folks that had wealth.
00:16:10.000 And so it's all known.
00:16:14.000 And I think people are kind of like, hey, hold on a second.
00:16:16.000 This just doesn't feel fair anymore.
00:16:19.000 Absolutely.
00:16:20.000 But the other problem with that is if you do tax correctly, where does that money go and who's managing it?
00:16:31.000 And ultimately, who's managing it is the federal government.
00:16:35.000 And they have been shown to be completely inept
00:16:38.000 at managing your money correctly.
00:16:40.000 The fraud and the waste is off the charts.
00:16:43.000 The amount of NGOs that have an insane amount of funds at their disposal.
00:16:48.000 I mean, all this is exposed by Doge, right?
00:16:51.000 And you realize how much fraud and waste there is and how much money.
00:16:55.000 So the solution being tax people more, that doesn't sit well with a lot of people because it's like, well, where is it going and who's managing it?
00:17:05.000 If the federal government was being forced to handle money the same way a private company does.
00:17:12.000 If it was all out in the open, everything was exposed, they would have gone bankrupt a long time ago.
00:17:19.000 They would have gone under a long time ago.
00:17:21.000 There's no way they would have been allowed to function the way they are.
00:17:25.000 The people that are managing that money would have all been put in jail.
00:17:29.000 There's not a chance in hell that giving them more money is going to solve anything.
00:17:35.000 They're going to find more ways to put more of that money into NGOs that put more of that money into Democratic coffers and Republican coffers.
00:17:43.000 They're going to figure out a way to
00:17:45.000 funnel that money around where it's not going to benefit people.
00:17:48.000 I mean, a good example of that is like where let's look at the LA fire thing, for instance.
00:17:55.000 So, the LA fire fund, there's a giant fire in the Palisades.
00:17:59.000 All this money gets raised.
00:18:01.000 It's over $800 million.
00:18:03.000 It goes to 200 plus different nonprofits.
00:18:09.000 None of it goes to the people.
00:18:10.000 Spencer Pratt, who's running for mayor of Los Angeles, who's doing a great job, by the way.
00:18:14.000 Fucking phenomenal.
00:18:16.000 Those ads are.
00:18:16.000 Those ads are fire.
00:18:18.000 They're fire.
00:18:19.000 They're so good.
00:18:20.000 They're fire.
00:18:21.000 And he's doing it all out of a trailer on his burnt out land.
00:18:25.000 I mean, he's the most righteous guy running in that regard.
00:18:28.000 But just that being exposed, like, okay, we're going to help out these people.
00:18:34.000 We're going to donate money.
00:18:35.000 We're going to raise money.
00:18:37.000 We're going to do some good.
00:18:38.000 We feel terrible about the people in our community that have lost homes.
00:18:41.000 Well, what happens?
00:18:42.000 Well, the same people that you're saying we should give more taxes to take that money.
00:18:48.000 And they just give it to a bunch of nonprofits and charities.
00:18:51.000 This episode is brought to you by ARMRA.
00:18:53.000 Every week there's some new wellness hack that people swear by.
00:18:56.000 And after a while, you start thinking, why do we think we can just outsmart our bodies?
00:19:02.000 That's why ARMRA colostrum caught my attention.
00:19:05.000 It's something the body already recognizes and has hundreds of these specialized nutrients for gut stuff, immunity, metabolism, et cetera.
00:19:14.000 I first noticed it working around training, especially workout recovery.
00:19:19.000 Most stuff falls off, but I am still taking this.
00:19:21.000 If you want to try, Armra is offering my listeners 30% off plus two free gifts.
00:19:26.000 Go to armra.com slash Rogan.
00:19:30.000 I'm not saying give more tax.
00:19:31.000 What I'm saying is people are taxed too much.
00:19:34.000 Corporates are not taxed enough, flip it.
00:19:34.000 Yes.
00:19:36.000 Right.
00:19:37.000 But even if you do flip it and the corporates are taxed more, where is that money going?
00:19:42.000 This is the problem.
00:19:43.000 I suspect that if you put the burden on Wall Street and corporates, they'd be a lot more organized and they'd probably create a lot more change than a diffuse electorate.
00:19:53.000 Meaning, let's just say the government spends a trillion dollars and wastes it.
00:19:59.000 I'm generally roughly aligned with that.
00:20:02.000 If you waste a trillion dollars from 300 million people, It's hard to organize at 300 million people.
00:20:10.000 But if you waste a trillion dollars from 300 companies, those companies will get their shit together really fast and they will force a lot more change.
00:20:18.000 I would hope so, but you're still dealing with incompetent people that are tasked with taking care of that money.
00:20:24.000 Yeah, yeah.
00:20:24.000 Not just incompetent people.
00:20:25.000 Don't get me wrong.
00:20:26.000 I'm not defending these people.
00:20:27.000 Decades of corruption.
00:20:28.000 Decades.
00:20:29.000 And decades of all these mechanisms where they can take this money and funnel it into these NGOs and these nonprofits and all these
00:20:40.000 different weird organizations that don't seem to have accountability for what they do with that money.
00:20:45.000 That gets real slippery.
00:20:47.000 And if those people in turn make deals with those corporations that allow them to do certain things and push things through that maybe they would have difficulty doing, then you have a different kind of a working relationship with the same groups of people and the same government.
00:21:04.000 You just take money from corporations and move it into a way where the corporations ultimately benefit from it, but yet it doesn't do any good to the people.
00:21:13.000 Yeah.
00:21:13.000 I mean, I can see where you're coming from.
00:21:15.000 I just think that if we go on the track we're going down, it just seems like we're going to hit a crisis.
00:21:23.000 Yes.
00:21:24.000 The crisis is you can't expect people to pay more and more and more.
00:21:28.000 Again, I agree with you.
00:21:29.000 The premise is we're all paying for a system that's broken.
00:21:32.000 That should change.
00:21:33.000 But we still continue to have to pay our taxes.
00:21:37.000 But if taxes keep going up like this at the individual level and we don't manage this transition to something where we may be working less and less, what are we getting paid to do?
00:21:47.000 And then at that point, How are we expected to pay what?
00:21:49.000 90% of what?
00:21:51.000 Right.
00:21:51.000 50% of what?
00:21:52.000 I think people do have this weird feeling of dread that the people that are in control of a lot in this country, the tech companies in particular, particularly the tech companies like Google and Facebook that are essentially involved in data collection and then ultimately dissemination of information, that they have acquired enormous amounts of wealth and power and influence and they're essentially
00:22:21.000 A new form of the government.
00:22:24.000 You know, are you aware of Robert Epstein?
00:22:24.000 Yeah.
00:22:27.000 Do you know about his work?
00:22:28.000 Not Robert Epstein.
00:22:30.000 A different guy.
00:22:31.000 Robert Epstein is a guy who specializes in understanding what curated search results do and what Google's able to do with, in particular, with curated search results in terms of influencing elections.
00:22:49.000 That, like, say, if you have two candidates that are running, let's just take L.A., for instance.
00:22:56.000 I'm not making any accusations, but I'm saying if they wanted Karen Bass to win and you searched Karen Bass, you would find all these positive results.
00:23:06.000 If you searched Spencer Pratt, you would find all these negative results.
00:23:11.000 There's a bunch of people that are always undecided voters, and those are the ones that you really want.
00:23:17.000 They're like, I don't know, I don't know.
00:23:19.000 Come election night, those are the people you want to try to grab, and it's generally a large percentage.
00:23:23.000 You can influence an enormous percentage of those people just with search results.
00:23:28.000 Where you can shift an election one way or another.
00:23:31.000 I believe it.
00:23:32.000 I believe it.
00:23:32.000 Yeah, and he's demonstrated this and shown how this is possible.
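(A deliberately crude toy model of the effect being described: if undecided voters mostly read the top few results, a ranking that front-loads favorable coverage shifts how they break. All parameters are invented; this is not Epstein's actual methodology.)

```python
# Crude toy model: undecided voters read the top few results; a biased ranking shifts them.

import random

def simulate(voters=10_000, results_read=3, bias_toward_a=0.8, seed=1):
    rng = random.Random(seed)
    votes_a = 0
    for _ in range(voters):
        # Each top result favors candidate A with probability bias_toward_a.
        pro_a = sum(rng.random() < bias_toward_a for _ in range(results_read))
        votes_a += pro_a > results_read / 2  # voter breaks with the majority view
    return votes_a / voters

print(f"neutral ranking: {simulate(bias_toward_a=0.5):.1%} break for A")  # ~50%
print(f"biased ranking:  {simulate(bias_toward_a=0.8):.1%} break for A")  # ~90%
```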
00:23:38.000 That freaks people out that tech companies are in control of narratives, that tech companies can censor information, especially tech companies that work in conjunction with the government.
00:23:50.000 This is what we found out when Elon purchased Twitter.
00:23:54.000 When Elon purchased Twitter, we got all this information from the Twitter files when all the journalists were allowed to go through it and they said, oh, this is crazy.
00:24:02.000 You've got the FBI, the CIA, you've got all these companies.
00:24:05.000 All these government organizations that are essentially controlling the narrative of free speech in the country.
00:24:13.000 They're doing it in a way that benefits them.
00:24:15.000 They're doing it in a way that benefits what political parties in charge.
00:24:18.000 At the time, it was the Biden administration.
00:24:21.000 They were allowed to do a bunch of weird shit, which should be illegal but is not technically illegal.
00:24:28.000 That freaks people out because there's no real laws and rules in regard to what they're allowed to do and what they're not allowed to do.
00:24:35.000 Curated search results should be illegal.
00:24:37.000 They're shaping attention.
00:24:38.000 Again, it goes back to attention.
00:24:41.000 Shaping attention.
00:24:42.000 Yeah.
00:24:42.000 That's a big concern for people.
00:24:47.000 I think then when you find out that these people are able to amass enormous sums of wealth and have an incredible amount of power and influence because of this enormous wealth and this control over these tech companies that have essentially become the town square of the world, that freaks people out.
00:25:06.000 That these very small number of people, you think of Zuckerberg, you think of Tim Cook, and I don't know.
00:25:13.000 I don't know who the new guy is now.
00:25:14.000 Who's the new guy?
00:25:15.000 John Furness.
00:25:16.000 Right.
00:25:16.000 Furness.
00:25:17.000 No.
00:25:17.000 I forget his name.
00:25:18.000 Yeah.
00:25:19.000 Ternus.
00:25:19.000 Ternus.
00:25:20.000 But that kind of thing gives people a lot of concern, right?
00:25:27.000 It's like that these people, these unelected people, are in control of a giant chunk of how the world works.
00:25:36.000 I think that this is the existential question that we are dealing with.
00:25:41.000 You're going to have five or six companies concentrate.
00:25:44.000 Like, whatever power you think has been concentrated up until now, I think we're going to look back and it's going to look like a Sunday picnic 10 or 15 years from now.
00:25:56.000 Because, on the one hand, it's going to be an even smaller subset.
00:26:00.000 And on the other hand, the capability is going to be an order or two orders of magnitude greater.
00:26:04.000 So, can you imagine what that must be like?
00:26:07.000 It's kind of like showing up, getting dropped into the 1800s, and you've invented the engine and everybody else is a horse and buggy.
00:26:13.000 You can just decide to your point.
00:26:16.000 That is where we're going.
00:26:18.000 It's even more crazy.
00:26:20.000 It's like everybody else is on a horse and buggy and you've got an internet connection and a cell phone.
00:26:26.000 It's even more crazy.
00:26:26.000 Right.
00:26:28.000 Because what we're dealing with with AI right now is first of all, it's already lowered children's attention spans and it's shrinking their capacity to acquire or absorb information because what they're doing now is just relying on AI to answer all their questions for them.
00:26:28.000 Exactly.
00:26:48.000 Now, is that their fault?
00:26:49.000 Kind of, right?
00:26:50.000 Because it doesn't have to be that way.
00:26:52.000 You could still acquire information the old fashioned way.
00:26:55.000 You could still learn things the right way.
00:26:57.000 But a lot of kids are just concerned with passing examinations and getting into good schools.
00:27:01.000 And what they're doing is just using AI.
00:27:03.000 And they're getting better test results, but they're also not as smart, which is really weird.
00:27:09.000 It's like we're relying on it, like we, you know, it's like it's essentially like replacing our mind.
00:27:20.000 And that's just the beginning.
00:27:23.000 This is the beginning.
00:27:25.000 This is like, these are their toddler days of AI, where it's going to be a super athlete in a few years.
00:27:33.000 Yeah.
00:27:34.000 I think we have to figure out how, first of all, kids need to learn, and I think this is where we have to do a better job as parents.
00:27:42.000 Kids need to learn how to be resilient thinkers.
00:27:44.000 I don't even know what that term meant before, but I know what it means now, which is like, you take this AI slop and you just kind of pass it off.
00:27:51.000 And if the teachers and the school system aren't trained, they're just like, wow, this looks good.
00:27:57.000 They have to be able to push back.
00:27:59.000 Parents need to be able to look at this shit.
00:28:01.000 But then all of this stuff, I'm just like so frustrated because it's like one more thing that I have to do as a parent.
00:28:05.000 Like every time technology gets better, it's one more thing, you know?
00:28:09.000 Right.
00:28:10.000 We're going to make the world, you know, super connected and social and all of that stuff.
00:28:13.000 It sounds great to me until I have to be the one that has to tell my kid they can't get Instagram.
00:28:18.000 And then they're up my ass every day.
00:28:20.000 Right.
00:28:20.000 You know?
00:28:21.000 And it's just like, I don't want to have to deal with this stuff.
00:28:24.000 Right.
00:28:24.000 I want this to be handled in a way that just allows me to do what I want to do.
00:28:29.000 I don't want to say no to my kid.
00:28:31.000 I don't want to police his schoolwork and make sure he's not cheating or not learning and just like, you know, passing off this AI slop.
00:28:38.000 What am I?
00:28:41.000 Where are my tax dollars going?
00:28:42.000 Where's everybody else in all of this?
00:28:44.000 It gets very frustrating.
00:28:46.000 And again, it goes back to like this feeling of like, well, is this all getting better for me?
00:28:51.000 Or is this kind of like not, you know, people start to be nostalgic for what it used to be because it was just simpler.
00:28:58.000 But I think that's a different way of saying easier.
00:29:00.000 Well, we're just dealing with...
00:29:03.000 We're at the edge of great change, great change where there's no real understanding of how it turns out.
00:29:10.000 Yeah.
00:29:11.000 And I think that understandably freaks people out, freaks me out.
00:29:16.000 It freaks me out, but I've kind of gotten to this place where I'm like, well, it's going to happen.
00:29:20.000 Did you see this thing?
00:29:21.000 It's the CEO of Verizon, Dan Schulman.
00:29:24.000 He put out this very public forecast, you know, very smart guy, well regarded in business.
00:29:31.000 And I think he said something like 30%
00:29:34.000 of all white-collar jobs will be gone by 2030.
00:29:37.000 I don't know, Jamie, maybe you can get the exact thing, but it's something like that.
00:29:39.000 That's probably optimistic.
00:29:41.000 And I thought at first my initial reaction was like, this is totally not credible.
00:29:45.000 But then I'm like, hold on a second, that's my bias because I want to believe that that's not possible.
00:29:50.000 Honestly.
00:29:51.000 Right, right.
00:29:51.000 And as I've gotten older, I'm a little bit better now.
00:29:54.000 I'm like, okay, hold on a second, let's weigh the probabilities.
00:29:56.000 And then I was like, man, if I'm going to be fair, maybe there's a 10, 20% chance of that.
00:30:04.000 There's a bunch of other outcomes that are much better than that, but that's part of the set of outcomes that you have to consider.
00:30:11.000 And then I was like, well, what's my antidote to that?
00:30:14.000 And the only thing that I can say is don't worry, it's going to be better.
00:30:19.000 I don't think that that's a good answer.
00:30:20.000 No.
00:30:21.000 So there has to be... like, all of this kind of goes back to: look, my wife and I had this conversation.
00:30:28.000 We're like, if it were up to us, who can you trust to have some super intelligence?
00:30:35.000 Now we're biased because we're friends with him, but the only person that we can trust is Elon.
00:30:40.000 Because he seems to be like, he has a bigger, like, it's kind of like he's like over there.
00:30:44.000 He's like, I need to get to Mars.
00:30:46.000 You know, and I'm going to first terraform the moon, but then I'm going to Mars and I'm going to build like a fucking magnetic catapult.
00:30:46.000 Right.
00:30:52.000 I'm going to do all this shit.
00:30:54.000 And so I just need this thing.
00:30:56.000 I feel like he's the least corruptible.
00:31:01.000 He's the most independent thinking.
00:31:03.000 And I think he's the one that has an actual empathy for people.
00:31:07.000 Then there are folks where there's just an insane profit motive.
00:31:11.000 They're less in control of the businesses that they run.
00:31:14.000 Those businesses are really out over their ski tips in the amount of money they've gotten from Wall Street and other folks who expect a return, who will put a ton of pressure on these folks.
00:31:25.000 And if they get there first, I don't know where the chips fall.
00:31:28.000 We don't really know.
00:31:29.000 We can kind of guess.
00:31:30.000 And then you see in the press just enough snippets of their reactions in certain moments where you're like, hey, hold on a second question mark here.
00:31:39.000 You see OpenAI react one way, you see Anthropic react another way, and you're like, where is this going to end up?
00:31:45.000 And the honest answer is nobody really knows.
00:31:48.000 So it comes back to like, we need a few people that can organize.
00:31:52.000 Those guys need to self organize and actually present a really positive face.
00:31:57.000 And for those 20% of outcomes that Dan Schulman paints, they need to show the truth: it's possible, but here's why it's not probable.
00:32:09.000 But it's not in their best interest to do that because it's in their best interest to generate the most amount of money possible.
00:32:15.000 That's the obligation they have to their shareholders.
00:32:17.000 That's the obligation they have to the people that have invested money in the company.
00:32:21.000 Their obligation is not to make sure the white collar jobs stay in the same place that they're at now.
00:32:27.000 No?
00:32:27.000 That's not true.
00:32:28.000 I actually think their incentive should very clearly be to tell people with details and facts why there's a positive future.
00:32:36.000 The reason is the following: right now there's a vacuum, there are no facts, and there's fear mongering, and then there's this belief that this is going to be cataclysmic to human productivity and white-collar labor and all of this stuff.
00:32:50.000 What's people's natural reaction?
00:32:51.000 Well, today, if you look at it, think about AI as a very simple equation: energy in, intelligence out.
00:32:58.000 So, if you want to cut the head of the snake, what do you do?
00:33:02.000 You cut off the energy supply, right?
00:33:03.000 Okay.
00:33:04.000 If you're afraid of all of this super intelligence coming, the natural thing to do would be to go to the point of energy and unplug it.
00:33:12.000 What is the equivalent of unplugging it today?
00:33:14.000 It is to go all around the country, find the data centers,
00:33:18.000 protest them, and get them to be mothballed.
00:33:22.000 That is an incredibly successful strategy right now.
00:33:25.000 Today, about 40% of all of these data centers that get protested get mothballed.
00:33:34.000 You're talking about emerging data centers.
00:33:38.000 Yes.
00:33:39.000 Just like I need to.
00:33:41.000 So if you're one of these companies, the first thing you should realize is I need to paint a positive vision because 40% of my energy is getting unplugged every day.
00:33:50.000 And if that happens, my revenues will crater and my investors will be super pissed.
00:33:56.000 So, the right strategy is what is the positive, fact based argument?
00:34:00.000 And there are some incredible examples.
00:34:03.000 Number one.
00:34:04.000 And then, number two is you have to give people some tactical benefit that they see.
00:34:11.000 Because AI, differently than search or differently than social media, there's no exchange of value.
00:34:19.000 Let me explain what that means.
00:34:21.000 So, let me just go, like... so the first thing
00:34:24.000 is that you can go and actually show people.
00:34:29.000 Here's an example of AI.
00:34:31.000 I heard about this last night.
00:34:34.000 You can now take pictures of a woman's fallopian tubes and you can see pre cancer, ovarian cysts, and all of this stuff, cervical cancer before it forms.
00:34:34.000 It's pretty incredible.
00:34:48.000 And then you can intervene and you can fix it so that women don't get cervical cancer.
00:34:54.000 In a different example, I actually told you about this example when I was here before.
00:34:59.000 I finally got FDA approval.
00:35:00.000 Okay.
00:35:01.000 There is a device now that is allowed to be in the operating room with you.
00:35:07.000 And if you have a cancerous lesion or a tumor inside of your body, the most important thing when they go to take it out is make sure you don't leave any cancer behind.
00:35:17.000 You couldn't do it because what would happen is you take it out.
00:35:21.000 A doctor, Joe, is literally fucking eyeballing it and saying, Yeah.
00:35:25.000 They send it to a pathologist.
00:35:27.000 You get an answer in 10 days.
00:35:30.000 For women with breast cancer, a third of these women find out that they have cancer left behind.
00:35:35.000 They go back in, they scoop some more stuff out.
00:35:37.000 A third of those women.
00:35:39.000 Okay.
00:35:40.000 So I'm like, this is bullshit.
00:35:41.000 We can solve this problem.
00:35:43.000 It took us a long time, a lot of money.
00:35:46.000 I had to build an entire machine, imaging all of this stuff, AI algorithms.
00:35:51.000 We had to prove it all.
00:35:52.000 We finally get approval.
00:35:53.000 Okay.
00:35:55.000 But you know how hard it is to tell that story?
00:35:57.000 In all of the attention that people are looking for, it's hard.
00:36:01.000 But those are positive examples.
00:36:03.000 No more breast cancer, no more cervical cancer.
00:36:08.000 A different example is most drugs in pharma
00:36:11.000 fail, right?
00:36:14.000 And it's a very complicated problem in pharma.
00:36:16.000 It's kind of like a jigsaw puzzle of the ultimate complexity.
00:36:19.000 It's like, think of your human body as like a Himalayan mountain range.
00:36:24.000 You have to design a drug that's an equivalent Himalayan mountain range that plugs into it perfectly.
00:36:30.000 One millimeter off, you grow like a fourth eye, a third nipple, you die, you know?
00:36:36.000 Now you can use computers to make sure that that drug, hand in glove to your body,
00:36:43.000 solves the exact problem.
00:36:44.000 Couldn't do that before.
00:36:46.000 So there's this whole body of examples, and you're probably only hearing about them superficially at best.
00:36:54.000 That should be 99% of the attention: showing all of the constructive, tactical ways in which our lives will be better.
00:37:02.000 Your mom, your daughter, your wife, us, Jamie, his family, everybody.
00:37:08.000 Right.
00:37:08.000 That's the number one thing.
00:37:10.000 Nobody talks about it.
00:37:11.000 I don't understand why.
00:37:12.000 Well, I think because people are terrified of losing their jobs.
00:37:15.000 So that's the primary concern.
00:37:17.000 The primary concern that I hear from people is that there are so many people that are going to school right now, college students, that don't know if their job is going to even exist in four years when they graduate.
00:37:27.000 And that's the second part of what this industry has to do better.
00:37:33.000 I had lunch with Jeffrey Katzenberg.
00:37:35.000 He told this crazy story.
00:37:36.000 I'll tell you.
00:37:42.000 He starts NeXT and he buys Pixar from George Lucas.
00:37:48.000 But then he hits a rough patch and he's got this, you know, financing issue.
00:37:52.000 Katzenberg flies up, spends time with Steve Jobs, says, I'll buy Pixar.
00:37:57.000 Jobs says, Absolutely not.
00:38:00.000 And then Katzenberg proposes a deal and he's like, How about a three picture deal?
00:38:05.000 Jobs says, Okay.
00:38:06.000 He flies back and apparently all the animators were up in arms because they're like, Hold on a second.
00:38:12.000 Steve Jobs is going to use these next computers.
00:38:15.000 To animate this movie, which ultimately became, I think, Toy Story.
00:38:19.000 And they're like, this is going to put all of us out of a job.
00:38:22.000 That perfect argument.
00:38:24.000 And people were really upset.
00:38:27.000 Roy Disney was upset.
00:38:29.000 All the animators were upset.
00:38:30.000 And they all went to Mike Eisner.
00:38:32.000 And they were like, Michael, you need to fire Katzenberg.
00:38:36.000 And they had a deal which was like, look, man, you do you, but just give me the ability to say no if I think that this is, you're about to jump off a cliff.
00:38:45.000 They talk about it.
00:38:46.000 And he's like, I got your back.
00:38:47.000 Do the deal, make the movie.
00:38:49.000 They made the movie.
00:38:50.000 It was a huge success.
00:38:51.000 Fast forward 10 years, 15 years, there's 10x the number of animators.
00:38:56.000 Now, it's a small example, but why is that?
00:38:58.000 You were able to use computers, and now all these new people were able to come and participate in that.
00:39:04.000 I get it.
00:39:04.000 It's a small example.
00:39:07.000 But I think if we had better organized leadership and we could try to tell some of these examples, try to go back and document how some of these things have actually helped people, it expanded the pie, there's a chance.
00:39:20.000 But if we don't, I agree with you.
00:39:22.000 Where we're going to end up is everybody basically saying, hey, hold on a second.
00:39:25.000 This is crazy.
00:39:26.000 We need to stop this.
00:39:28.000 That's the worst outcome because that's when you will have a high risk of a dislocation.
00:39:33.000 Like the worst outcome, like the black swan event.
00:39:36.000 Let's think about the black swan event.
00:39:38.000 The black swan event is when you get a model that's good enough to automate a bunch of labor, but not good enough that it can build new drugs and prevent cancer and make you live for 200 years and all of this other stuff.
00:39:57.000 So there's a gap.
00:39:59.000 And if you can stop it here and it doesn't get to there, now you do have the worst of all worlds.
00:40:03.000 You have this thing that kind of displaces labor.
00:40:06.000 No new things come after it because we stop innovating.
00:40:11.000 And that's like a non trivial possibility now, I think.
00:40:15.000 No, it's a huge possibility.
00:40:16.000 And then there's also this thing that you brought up earlier where we have this place of abundance where no one has to work anymore.
00:40:23.000 That freaks people out.
00:40:24.000 I think that's a big problem.
00:40:26.000 Well, because if no one has to work anymore, first of all, what is your identity, right?
00:40:32.000 Because so many people, their identity is what they do.
00:40:35.000 Whatever it is, if you're a lawyer, if you're an accountant, if you run a business, whatever it is, this is your identity.
00:40:42.000 You have built this thing.
00:40:44.000 You look forward to going there.
00:40:45.000 You work at it.
00:40:46.000 You look forward to doing a good job and getting rewarded for it.
00:40:50.000 The harder you work, the more you get paid.
00:40:52.000 There's all these incentives built in, and then there's this again identity problem.
00:40:58.000 If all of a sudden you have universal high income, which is what Elon always talks about, well, what gives people purpose then?
00:41:06.000 And also, if you have a person who's 43 years old, and their entire life they've worked towards this idea that the harder they work, the harder they think, the more innovative they are.
00:41:18.000 And the better they are at implementing these ideas, the more they get rewarded.
00:41:23.000 And then all of a sudden, that's not necessary anymore. Mike,
00:41:26.000 time for you to just relax and do what you want to do.
00:41:29.000 And Mike's like, well, this is what I do.
00:41:32.000 I don't have any fucking hobbies.
00:41:34.000 I enjoy doing what I do.
00:41:36.000 And now what I do is completely useless.
00:41:38.000 And now I'm on a fixed income, even if that fixed income is a million dollars a year, whatever it is.
00:41:45.000 If all of a sudden you are in this position where everything is being run by computers, you feel useless.
00:41:51.000 You feel like, what am I doing?
00:41:52.000 I'm just taking money?
00:41:54.000 I'm on high welfare?
00:41:56.000 Right.
00:41:57.000 Like, what do I do?
00:41:58.000 Right.
00:41:58.000 I think that that's a really important question to answer.
00:42:00.000 I don't know.
00:42:01.000 Some people are going to write books.
00:42:03.000 Some people are going to do art.
00:42:04.000 Some people are going to find things to do.
00:42:07.000 What do you think we would have done if we were to go back to the 1800s example?
00:42:13.000 There was no office culture.
00:42:15.000 There's no ladder to climb.
00:42:20.000 How did people find meaning then?
00:42:22.000 Well, they had jobs.
00:42:25.000 People still did things.
00:42:27.000 If you're a farmer, you had meaning in your labor and what you did and keeping the animals alive and your chores.
00:42:33.000 And there's people that find great satisfaction in doing that.
00:42:36.000 You know, you have all these animals that rely on you.
00:42:39.000 You have people that rely on you for the food that you generate.
00:42:41.000 There's meaning there.
00:42:43.000 It doesn't have to be an office to be something that gives you purpose and meaning.
00:42:46.000 But when all that is automated, then what happens?
00:42:50.000 Because then you have no purpose, no meaning other than recreational activities.
00:42:55.000 Now, if everybody just starts playing chess and
00:42:57.000 doing a bunch of things that they really enjoy.
00:42:59.000 Enjoy.
00:43:00.000 I mean, look, there's people that would love to just play chess.
00:43:04.000 It's like eight people.
00:43:06.000 I don't know about that.
00:43:07.000 I think if people really got into it, I mean, there's a lot of people that get addicted to whatever their recreation is, like golf or whatever it is.
00:43:14.000 For me, it's playing pool.
00:43:16.000 If you told me I never have to make any more money, I could just play pool all day.
00:43:19.000 I might just play pool all day.
00:43:21.000 But I don't know how many people think that way.
00:43:25.000 I don't know how many people would be able to find meaning and purpose in a recreational activity.
00:43:31.000 There's so many people where their entire being is
00:43:34.000 focused around productivity and generating more wealth.
00:43:37.000 What about religion as a source of meaning?
00:43:40.000 Wow, that would help.
00:43:42.000 Did you see this article in the New York Times, I think it was this weekend, about how popular and sold out churches have become as social constructs in New York City?
00:43:52.000 It was totally fascinating.
00:43:53.000 It's like young women, like dressed to the nines, going to church on a Sunday for social belonging, community meaning.
00:44:03.000 I thought, I was so fascinated by it.
00:44:05.000 I was like, wow, that's incredible.
00:44:07.000 Because, like, I think if you graph just like people's use of religion as an anchoring part of their value system, over the last 40 years, it's basically gone to zero.
00:44:18.000 Nobody celebrates it the way they used to; it's not a part of the community the way that it used to be.
00:44:22.000 Maybe that's the thing that we have to find.
00:44:24.000 There has to be a renewal of some older things, and then there has to be new things that replace it.
00:44:30.000 What's the Chinese answer to this?
00:44:32.000 The Chinese have a very orthogonal answer to this.
00:44:34.000 If you look at how China is organized, it's super interesting because they don't reward
00:44:40.000 based on the way the American system rewards.
00:44:42.000 In fact, it's like almost orthogonal, where we are rewarded with money and rewarded with sort of fame and recognition.
00:44:52.000 The system, the American capitalist system.
00:44:54.000 But if you look inside of China, it's constantly testing who has this judgment.
00:44:59.000 And what they are rewarded with is influence and power in a very specific social contract.
00:45:05.000 I don't think it's going to work in the United States, nor am I an advocate of it, but it works for them.
00:45:10.000 You'll start off as, like, some,
00:45:13.000 you know, low-rung person in like some small village town somewhere, and your job as like the, you know, the functionary is to do good in that community.
00:45:22.000 And the more you do well, you get promoted.
00:45:24.000 Then you get, let's say, to like a reasonable-sized city and you get a budget.
00:45:28.000 And now what happens is you actually become a little bit like a VC, like a venture capitalist.
00:45:31.000 You're given a budget and you'll get a memo, and it'll say, Hey, Joe, we have a priority over the next 15 years: it's batteries.
00:45:41.000 And you have enough money.
00:45:42.000 Put a team on the field.
00:45:45.000 So you go in your local community, you find a bunch of guys, you're like, all right, guys, we're going to start a battery company.
00:45:51.000 And you do it.
00:45:53.000 And let's say they're good and they're like innovative.
00:45:57.000 And what happens is in the town beside it, that battery company dies.
00:46:02.000 Now you kind of subsume the capital from Jamie, right?
00:46:06.000 Because Jamie's like, fuck, I fucked up this thing that I wanted, I was told to do batteries.
00:46:09.000 Okay, Joe, I'm just going to align with you.
00:46:12.000 And what happens over time is you get this filtering effect.
00:46:18.000 And the people that are better at meeting these long run priorities and objectives are the ones that are celebrated.
00:46:24.000 But they're not celebrated with, you know, Forbes articles and all this other bullshit.
00:46:30.000 They're just celebrated by giving more responsibility.
00:46:33.000 And then eventually you get to the upper echelons of China, and what you have are folks over the course of 40 or 50 years who, in their eyes, have demonstrated incredible prowess.
00:46:43.000 There's a version of that reward system, which is very foreign to America, but that's worked for China.
00:46:49.000 Now, that also works because they're more Confucian, you know, we're too individualist.
00:46:53.000 But my point is, like, you know, there are these different ways that we can find of giving people meaning that don't have to be always around money.
00:47:05.000 But meanwhile, I think we have to answer the question: if we are expected to do less, we probably should not be taxed more.
00:47:12.000 That, I think, is a very basic thing that, in my mind, must be explored and figured out.
00:47:18.000 And on the other side, there's just a ton of obvious mechanisms that corporate actors can use to minimize that.
00:47:26.000 And they should find off ramps, by the way.
00:47:28.000 If they want to build hospitals, they shouldn't have to pay taxes.
00:47:30.000 Like, that's a perfect example, by the way: if you walk around New York City, there are living tributes to corporate success that people get benefit from every day: the hospitals, the buildings, the libraries, it's just everywhere.
00:47:47.000 We need a version of that.
00:47:50.000 And I'm not a tax expert, but you know, if that can be funded by private actors, so go directly to the problem.
00:47:57.000 Build a bunch of libraries, build a bunch of new universities that teach kids actually how to think or whatever, build better hospitals that are there to actually solve the problem.
00:48:07.000 These are all things that are possible.
00:48:09.000 But none of it's happening today.
00:48:10.000 Well, let's go back to what we were talking about earlier with taxes and the fact that you're giving money to a broken system.
00:48:18.000 Do you think it's possible that AI could show benefit in that they can analyze all the data, which would be virtually impossible
00:48:28.000 for even an office filled with human beings paying attention to all of it, and they could analyze where all the money goes and eliminate all the fraud and waste, like recognize it instantaneously?
00:48:40.000 That would be a great benefit and a way to make it so that your taxes directly benefit people.
00:48:50.000 I'll give you one example of this.
00:48:52.000 So, two years ago, you know, like every few years I invest, but every few years I'll start something because I feel strongly about it.
00:49:00.000 There's an effort that I made to look at all of this old code.
00:49:08.000 Like, if you think about the world, the world runs on software, right?
00:49:14.000 Like, even though you and I are talking, it's piping into Jamie's computer.
00:49:18.000 Right, it's all software.
00:49:19.000 It's all software.
00:49:20.000 Then it goes to Spotify.
00:49:20.000 They pump in some ads.
00:49:22.000 It's all software.
00:49:22.000 Right.
00:49:23.000 Software runs everything.
00:49:24.000 What percentage of that do you think is kind of poorly written?
00:49:31.000 I'm going to say probably 80 to 90% of it.
00:49:34.000 Oh, yeah.
00:49:34.000 Really?
00:49:35.000 It's riddled with errors.
00:49:37.000 It's riddled with mistakes.
00:49:38.000 The fact that so many companies exist is an artifact of the fact that the thing that came before it isn't working.
00:49:45.000 Like, if you got it right the first time, it would just kind of move and go.
00:49:50.000 How so?
00:49:52.000 What do you mean by that?
00:49:53.000 So, normally, if you were like, Chamath, I want to build a system that does A, B, and C. Right.
00:49:58.000 If I was designing it properly, I would sit there with you and I would meticulously write down: all right, Joe wants to do this.
00:50:06.000 What are the implications?
00:50:06.000 Joe wants to do that.
00:50:08.000 What are the implications?
00:50:09.000 And I would actually write a document that was in English before a single line of code has been written.
00:50:16.000 This is what you do when you have to design something that can't fail.
00:50:19.000 So, for example, if you and I are designing something for the FAA, or, you know, I hate to use this example because it didn't exactly turn out that way, but, like, something to fly a plane, right?
00:50:29.000 The first thing you do is write it in English.
00:50:32.000 And the reason is because everybody can then swarm that document and see the holes.
00:50:38.000 Okay.
00:50:40.000 And it's only then,
00:50:41.000 when that stuff looks complete and functional, that you build.
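A minimal sketch of that spec-first workflow, in Python (the requirements and function here are hypothetical, not from any real FAA system): the English spec comes first, and the code is then checked against it directly.

    # Step 1: the spec, written and debated in plain English before any code.
    SPEC = """
    Requirement 1: altitude must never go below zero.
    Requirement 2: a single climb command may change altitude by at most 500 feet.
    """

    # Step 2: the implementation, written only after the spec is agreed.
    def apply_climb(altitude_ft, climb_ft):
        if abs(climb_ft) > 500:
            raise ValueError("Requirement 2 violated: climb exceeds 500 ft")
        new_altitude = altitude_ft + climb_ft
        if new_altitude < 0:
            raise ValueError("Requirement 1 violated: altitude below zero")
        return new_altitude

    print(apply_climb(10000, 300))  # 10300

The point of the ordering is that anyone can swarm the SPEC text for holes before a line of code exists; the checks then keep the code honest against that document.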
00:50:46.000 We turned that upside down.
00:50:48.000 Down.
00:50:49.000 Over the last 30 years, people in computing invented all kinds of ways to shortcut that process.
00:50:59.000 And you can say, well, why did they do that?
00:51:01.000 Because it would allow you to build something faster, make more money quickly, and then build more business.
00:51:07.000 So the direct response to, hey, it's going to take us nine months to write down the rules was somebody else showed up and says, fuck it, I'll just grip and rip this thing.
00:51:15.000 I'll be done in four months.
00:51:17.000 Who's going to get the job?
00:51:18.000 The four month guy is going to get the job.
00:51:20.000 So we've had 30 or 40 years of that.
00:51:23.000 What are we learning about that process?
00:51:27.000 It's riddled with software errors, like logic errors.
00:51:30.000 It's riddled with security errors.
00:51:33.000 I don't know if you saw this whole thing, like with Anthropic Mythos.
00:51:36.000 What are they uncovering?
00:51:37.000 They're uncovering that we wrote a lot of really shitty code for 40 years.
00:51:42.000 So, that body of old code, I was like, guys, if we're going to really figure out how to do all of this, we need to rewrite all of it.
00:51:53.000 So, we built this thing.
00:51:58.000 It's called a software factory.
00:51:59.000 Anyways, the point is there is a government organization that we're working with.
00:52:04.000 They gave us a huge corpus of their old code.
00:52:08.000 And it is unbelievable how much complexity and difficulty they have to go through to manage all the money flows with the system.
00:52:22.000 And this is a critical part of the US government.
00:52:24.000 So, to your point, what I can tell you really explicitly is the people on the ground want this stuff to be better written.
00:52:32.000 It's less like some nefarious actor, like, oh, I'm going to steal here.
00:52:38.000 It's a lot of very brittle, fragile code.
00:52:42.000 And when you rewrite it, well, first, when you document it, it's like, you know, the Pulp Fiction thing: the suitcase opens, the light shines, and you're like, ah.
00:52:52.000 And then you can rewrite it, and you will save money.
00:52:56.000 So I think like as the government goes through this process because they're forced to or they want to, it won't matter.
00:53:03.000 You are going to save a ton of money.
00:53:06.000 They're going to have to do it, Joe, because the security risks are too high.
00:53:11.000 But what they're going to end up with is impregnable code that you can read in English and understand.
00:53:17.000 You'll see the holes.
00:53:18.000 Those holes will be plugged because otherwise, now you'd be committing fraud by letting it be.
00:53:23.000 You close the loopholes, and there's just going to be less money leaking out of this bucket.
00:53:29.000 That is an incredible byproduct.
00:53:30.000 We're going to live that over the next 10 or 20 years, just for nothing.
00:53:34.000 Like we get it for free.
00:53:36.000 And that's happening.
00:53:38.000 So, when that happens, you're going to see government budgets shrink.
00:53:41.000 Now, to your point, will they try to spend that extra money in other places?
00:53:44.000 Of course.
00:53:45.000 Of course, they will.
00:53:46.000 That's the next conversation, which is you have to elect people that say, firewall it.
00:53:51.000 You know, whatever you save, give it back to the people or, you know, invest in some scholarship program or free medicine or something.
00:53:59.000 But you can't spend it on other random shit.
00:54:03.000 But that's where we're at.
00:54:04.000 That's going to happen.
00:54:06.000 It's going to be slow.
00:54:07.000 And, you know, but when people start to announce these things, I think over the next few years, you're going to be shocked.
00:54:12.000 So that's the positive upside.
00:54:14.000 Well, that's happening now, regardless of whatever else happens.
00:54:18.000 It's just a lot of old, shitty code that must get rebuilt from scratch.
00:54:23.000 It is getting rebuilt from scratch.
00:54:25.000 And as a result, a lot of these leaky bucket problems are getting filled.
00:54:29.000 So, what percentage do you think could be fixed?
00:54:32.000 I think if I had to be a betting man, I think probably 30 to 40% of the federal budget is leaked out.
00:54:44.000 Just from shitty code?
00:54:45.000 No, meaning like all of the rules, and like you can take.
00:54:48.000 I'm not saying that there isn't fraud.
00:54:50.000 Right.
00:54:52.000 But I think a lot of times what happens is less nefarious than fraud, like meaning like conspiratorial actors.
00:54:58.000 I just think it's like incompetence, inefficiency, error.
00:54:58.000 Right.
00:55:02.000 For example, I saw DOGE just say they were able to expunge, like, millions of people that were listed as 150 years old or more.
00:55:15.000 I have no idea how much money those folks were getting.
00:55:19.000 Or who they were, but it's probably a lot.
00:55:21.000 It's probably not zero.
00:55:23.000 And now that they got rid of it, they're not going to get that money anymore.
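A hedged sketch of the kind of check being described, in Python with pandas (the column names and rows are invented for illustration): scan a benefits table and flag records whose recorded age is implausible.

    import pandas as pd

    # Hypothetical beneficiary records; in reality this comes from the
    # agency's own database, not a hand-built frame.
    df = pd.DataFrame({
        "beneficiary_id": [101, 102, 103],
        "birth_year": [1956, 1871, 1990],
    })

    CURRENT_YEAR = 2026
    df["age"] = CURRENT_YEAR - df["birth_year"]

    # Flag anyone recorded as 120 or older for review or expungement.
    implausible = df[df["age"] >= 120]
    print(implausible[["beneficiary_id", "age"]])  # beneficiary 102, age 155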
00:55:28.000 If you implement something at the state level around all of this fraud prevention, for the daycares and all of this other stuff, again, it's all in software. Because no matter what the human wants to do, you have to go to a computer at some point, at least today in 2026, and type in something; something happens that's documented, and then the money gets sent.
00:55:50.000 That happens.
00:55:50.000 Right?
00:55:51.000 There's no other way in the modern world today at scale to steal billions of dollars.
00:55:57.000 And so, my point is as you document all of these systems and governments have to transparently tell you and me, the voting population, here are the rules, they're going to plug a lot of these holes.
00:56:09.000 And I think as you do that, there's just going to be a lot less waste and fraud.
00:56:13.000 The question is who's going to take credit for it?
00:56:15.000 Everybody's going to try to take credit for it.
00:56:17.000 But I think we've started it.
00:56:18.000 I think we've started this process.
00:56:20.000 And again, the reason that people will start
00:56:23.000 is because you'll be afraid of China hacking these systems.
00:56:25.000 You'll be afraid of Iran, North Korea.
00:56:28.000 And you'll say, this system can't stand.
00:56:29.000 All these AI models are running around.
00:56:31.000 We're going to get breached and penetrated.
00:56:33.000 Then they're going to steal all the money.
00:56:35.000 And the natural reaction will be, okay, rewrite it.
00:56:39.000 This episode is sponsored by BetterHelp.
00:56:41.000 We've all been there, staying up late, stressed about the future.
00:56:45.000 Maybe you're worried about finding a job or a looming deadline.
00:56:49.000 Whatever you're feeling stressed out about, you don't have to work it out on your own.
00:56:54.000 No one person has all of life's answers.
00:56:58.000 And it's a sign of strength and self awareness to reach out for help.
00:57:02.000 That's why this Mental Health Awareness Month, we're reminding you to stop going at it alone.
00:57:09.000 Get the support you need with a fully licensed therapist from BetterHelp.
00:57:14.000 They make connecting with a therapist convenient and easy.
00:57:17.000 Everything is online.
00:57:19.000 Literally, all you need to do is answer a few questions, and BetterHelp will take care of the rest.
00:57:24.000 They'll come up with a list of recommended therapists that match what you need, and with over 10 years of experience, they typically get it right the first time.
00:57:33.000 So you don't have to be on this journey alone.
00:57:35.000 Find support and have someone with you in therapy.
00:57:39.000 Sign up and get 10% off at betterhelp.com slash jre.
00:57:45.000 That's Better-H-E-L-P dot com slash jre.
00:57:52.000 That makes sense.
00:57:53.000 That makes sense, that the code having a bunch of errors and a lot of inefficiency and just a lot of incompetence, and that fixing it is going to save a lot of money.
00:58:04.000 But, so, you would be doing this with AI?
00:58:10.000 In part.
00:58:10.000 In part.
00:58:11.000 What AI allows you to do is like, it's like you have a textbook, okay?
00:58:18.000 It's in Chinese.
00:58:19.000 You don't know Chinese, right?
00:58:20.000 No.
00:58:20.000 Okay.
00:58:21.000 You're like, well, this is probably doing something important, but it's in Chinese.
00:58:25.000 What AI allows you to do is back translate that into English.
00:58:29.000 You put it through an AI model, you teach it, you coach it, right?
00:58:33.000 You can parameterize all of it.
00:58:35.000 And out pops that same book
00:58:38.000 in English, and now you can read it and know that it's accurate.
00:58:43.000 That's what we're doing.
00:58:44.000 So, what the AI allows you to do is essentially translate from this one language that you kind of don't understand to English.
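In code terms, that back-translation step might look like the following Python sketch. The call_llm function is a hypothetical stand-in for whatever model API is actually used; the shape of the pipeline is the point: legacy source goes in, an English description of its rules comes out, and humans read and approve it.

    def call_llm(prompt):
        # Hypothetical stand-in for a real model API call; replace
        # with an actual client in practice.
        return "[plain-English description of the legacy rules goes here]"

    def back_translate(legacy_source):
        # Ask the model to describe what the old code actually does,
        # not what it was supposed to do.
        prompt = (
            "Describe, in plain English, every rule and money flow "
            "implemented by this legacy code:\n\n" + legacy_source
        )
        return call_llm(prompt)

    spec_in_english = back_translate("PROCEDURE PAY-VENDOR ...")  # e.g. old COBOL
    # A human now reads spec_in_english, spots the holes, and only then
    # is the system rewritten against the agreed document.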
00:58:53.000 By the way, that thing that's happening is actually also a very powerful and important trend, meaning there are all of these systems that work in ways that you and I don't understand.
00:59:04.000 And part of the reason why we don't understand it, maybe it's bad software, maybe it's fraud, whatever, but nothing can be written down.
00:59:11.000 There's no symbolic space, there's no English document that says this is how the DMV works.
00:59:15.000 This is exactly the rules.
00:59:16.000 This is what you can expect, Joe Rogan.
00:59:18.000 When you show up at the DMV and you give us this thing, here's your SLA, your service level agreement: in three days you get a driver's license, and here's exactly what's happening, and here's an app, and you can follow it.
00:59:28.000 Doesn't happen.
00:59:29.000 Here, Joe Rogan, here's how my insurance billing process works.
00:59:33.000 You have this condition.
00:59:34.000 I'm going to show you exactly why I made this decision.
00:59:36.000 Here's the exact rule.
00:59:37.000 Here's the approval or denial from CMS.
00:59:40.000 Follow it through and tell me if you agree or not.
00:59:42.000 None of that exists, but it is possible.
00:59:46.000 And the first step in doing that is taking all of this legacy shit that we deal with and translating it into English and reading it and saying, is this how we want it to work?
00:59:56.000 That's going to eliminate an enormous amount of all the things that frustrate us.
01:00:00.000 So this would require human oversight?
01:00:02.000 Absolutely.
01:00:03.000 All right.
01:00:04.000 So then it's also going to be who's watching the watchers?
01:00:08.000 Yeah.
01:00:09.000 It's okay.
01:00:10.000 Okay.
01:00:10.000 This is a great question.
01:00:11.000 So I'll tell you how this government agency is doing it.
01:00:14.000 It's a really fascinating way because I think it's very smart.
01:00:18.000 They came to us and they came to another very well known company.
01:00:23.000 You can probably guess what it is.
01:00:24.000 Okay.
01:00:25.000 And they're like, guys, you're kind of in a foot race, but you're not competing against each other.
01:00:31.000 You think of yourselves as frenemies.
01:00:32.000 So here's this Chinese document.
01:00:35.000 You're going to translate it for us.
01:00:37.000 There's going to be your version of English and these guys' version of English.
01:00:40.000 And every time it's the same, we're going to look at it together and we're going to agree or not.
01:00:45.000 Okay.
01:00:46.000 This is exactly how we want this to work.
01:00:48.000 When yours says the dog is red and his says the dog is yellow, we're going to sit and literally inspect it and we're going to figure out why you said red and why you said yellow.
01:01:02.000 And then if you say the cat is red, the dog is yellow, so it's totally wrong, right?
01:01:09.000 Like you've gotten, you know, or like the cat is red, I want an apple, whatever.
01:01:13.000 We're going to double and triple down on those kinds of errors.
01:01:18.000 Not in public, but in this large community where there's like technical people from all different parts and they're just swarming this problem.
01:01:28.000 It is incredible to see.
01:01:31.000 And so, what happens is you get humans that get to use this tool, but ultimately it's our judgment and it's done transparently.
01:01:39.000 So, what happens is you can't just, you know, say, hey, man, put this fucking rule in there.
01:01:44.000 Like the dog is yellow.
01:01:45.000 Just make the dog yellow.
01:01:47.000 You can't do it because now you have tens of people, hundreds of people, and then it gets documented.
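A minimal sketch of that two-vendor cross-check in Python (the unit names and translation strings are placeholders): where the two independent translations agree, the unit passes; where they differ, it is queued for the human swarm.

    # Two vendors' English translations of the same legacy code units.
    vendor_a = {"unit_1": "the dog is red", "unit_2": "payments go out monthly"}
    vendor_b = {"unit_1": "the dog is yellow", "unit_2": "payments go out monthly"}

    agreed, disputed = [], []
    for unit in vendor_a:
        if vendor_a[unit] == vendor_b[unit]:
            agreed.append(unit)    # both say the same thing: accept
        else:
            disputed.append(unit)  # red vs. yellow: humans sit and inspect why

    print("agreed:", agreed)       # ['unit_2']
    print("disputed:", disputed)   # ['unit_1']

Exact string matching is a toy proxy here; the real process compares meaning rather than characters, but the control flow, agree or escalate to people, is the same.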
01:01:52.000 It's super fascinating.
01:01:55.000 I'm not saying this is how it's going to work in 10 years, but I'm telling you, it's literally what's happening right now.
01:01:59.000 And I think that thing alone will be tens of billions of dollars and could be hundreds of billions of dollars of savings when it's fully done.
01:02:09.000 And it's a lot of people from all walks of life, all political persuasions, and they're just in it.
01:02:15.000 It's the government, it's a handful of us private companies.
01:02:18.000 It's super cool to see.
01:02:19.000 It's like, okay, we're actually going to do something here.
01:02:23.000 Like, this is nice.
01:02:24.000 It's really cool.
01:02:27.000 So, that's interesting in terms of the current moment.
01:02:30.000 So, in the current moment, you're able to implement this, you're able to find fraud and waste and all these problems that exist and all these errors and shitty software.
01:02:42.000 Once that's all been done, then what happens?
01:02:47.000 Yeah.
01:02:47.000 No fucking clue.
01:02:48.000 So, this is where it gets weird, right?
01:02:50.000 Because when you're dealing with AI models that are capable of doing things that no individual human being could ever possibly imagine, and then you task it
01:03:03.000 with a solution or with a problem: find a solution for this.
01:03:07.000 Then it starts figuring out ways to trim this and implement that.
01:03:13.000 We have to make sure that these AIs act within the best interests of the human race. Agreed, right? Not the company, not the government. But you're also dealing with China.
01:03:30.000 You're also dealing with Russia.
01:03:31.000 You're dealing with other countries that are also in this mad race to create
01:03:35.000 artificial general superintelligence.
01:03:37.000 And if we keep shutting down data centers, we keep hamstringing ourselves.
01:03:42.000 China's not doing that.
01:03:44.000 They're not doing that.
01:03:45.000 They're doing the opposite.
01:03:46.000 They're generating as much revenue that goes towards this problem as possible.
01:03:51.000 They're putting all the efforts, the country, the government, and these corporations work hand in glove in order to achieve a goal.
01:04:00.000 We do not.
01:04:01.000 And that becomes a problem if you want to be competitive.
01:04:05.000 With these other countries that are trying to achieve the same result as us.
01:04:09.000 And then you have espionage.
01:04:10.000 Then you have a bunch of people that are stealing information.
01:04:13.000 You have a bunch of people that are CCP members that are actually involved in companies, and you find out that they're siphoning off data and that they're sharing information and tech secrets.
01:04:26.000 Look, here's the thing.
01:04:28.000 The way that the Chinese models work, or the way the Chinese claim they work: so, America is closed source, meaning.
01:04:36.000 You got your own thing.
01:04:39.000 Your recipe is completely secret.
01:04:42.000 Okay.
01:04:42.000 Right.
01:04:42.000 I have my own thing.
01:04:44.000 My recipe is totally secret.
01:04:45.000 China uses this word called open source, but it's not open source.
01:04:51.000 So they say, here's how I make my thing.
01:04:55.000 You can see it.
01:04:56.000 Super transparent.
01:04:56.000 What it is is more like open weights, which is like in a recipe, it tells you, you know, you need sugar, you need butter.
01:05:04.000 Well, how much sugar?
01:05:05.000 And they'll say, you know, so much.
01:05:08.000 But then they don't say it's brown sugar, they don't say it's white sugar.
01:05:10.000 So there's all these different ways where they kind of give you this perception that it's completely transparent, but it's only somewhat transparent.
01:05:16.000 So, just to level set: nobody in the world has a functional open source model, other than maybe Nvidia, that is any good in the league of the closed source models and the open weight models of the Chinese.
01:05:30.000 Okay.
01:05:30.000 So, the Chinese open weight models are great.
01:05:32.000 The closed source models of America are great.
01:05:37.000 And then there's a couple open source, like fully open, that are kind of catching up.
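To make the recipe analogy concrete, a short Python sketch (the file name is hypothetical): open weights means you get the finished parameters, not the pipeline that produced them.

    import torch

    # "Open weights": the published artifact is the trained parameters only.
    state_dict = torch.load("model_weights.bin")  # hypothetical file name

    # You can run these weights, or fine-tune them further, but the training
    # data, the filtering rules, and the exact training code (the brown-sugar
    # versus white-sugar details) are not published, so the model cannot be
    # rebuilt from scratch. A fully open source release would ship all of that.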
01:05:43.000 The thing between America and China, what I find so fascinating is this following conundrum that everybody is going to find themselves in.
01:05:53.000 I think, like, if you think of an analogy, America's like a planet, China's like a planet.
01:06:02.000 And around us are these moons.
01:06:05.000 And I'm just using the AI analogy.
01:06:08.000 So, in AI, what do you need?
01:06:10.000 I think there's like four or five things you need.
01:06:12.000 Okay.
01:06:12.000 The first thing you need is a fuck ton of money.
01:06:15.000 So, we need essentially the banks, right?
01:06:18.000 Like the Game of Thrones thing.
01:06:20.000 We need the Iron Bank.
01:06:23.000 Feed us the money because that's what we use to buy everything and make everything.
01:06:28.000 So, we need that.
01:06:30.000 We need a ton of data.
01:06:32.000 Okay.
01:06:33.000 There's ways to get that.
01:06:34.000 We need a ton of very specific rare earths and critical metals and materials.
01:06:42.000 We need a ton of power.
01:06:50.000 And there are specific countries that are going to be really good at giving that to us.
01:06:50.000 So, if you look at the UAE, they are going to be the preeminent banking partner of the Western world.
01:06:57.000 They are going to replace and be what Switzerland was over the last 50 years for the next 50.
01:07:02.000 That's happening today.
01:07:04.000 If you look at Canada and Australia, the small political fissures aside, they are the two most important ways in which we get access to the critical metals and materials that without which we get fucked because China owns, you know, can just strangle us.
01:07:20.000 Okay.
01:07:21.000 So you have these like moons around the United States, but there's like five countries, six countries.
01:07:30.000 And there's a worldview that says China has the same thing.
01:07:30.000 You know, they have Taiwan.
01:07:32.000 That's complicated for us.
01:07:33.000 So now we have a moon that we don't really have an answer for, which is what happens with all these super advanced chips.
01:07:39.000 Where do they get their money?
01:07:41.000 Maybe Russia becomes their bank.
01:07:43.000 Where do they get their critical metals?
01:07:45.000 Maybe it's Indonesia, right, who has a ton of natural resources.
01:07:48.000 And then you get into this game theory, which is what happens to every other country.
01:07:52.000 Because there are 190 countries, you have 10 that kind of divide up.
01:07:57.000 What do the other 180 do?
01:07:59.000 And you have to kind of sort yourself.
01:08:01.000 You're like, am I on Team America or am I on Team China?
01:08:05.000 And you probably have to go to people and say, well, here's what I can give you.
01:08:09.000 You know, if you're Indonesia, you're like, you probably want to be on Team America quite badly.
01:08:14.000 This is why the whole Trump tariff thing is so interesting because it's like this accidental way of figuring out that this is actually this new sorting function that's happening in global politics.
01:08:23.000 Like that's happening today because these countries are like, holy shit, if somebody invents a super intelligence and I don't have it, how am I going to keep my people healthy?
01:08:34.000 How am I going to educate my people?
01:08:36.000 Like, I'm originally from Sri Lanka.
01:08:40.000 What the fuck does Sri Lanka have to offer?
01:08:42.000 Like, if you were sitting there, they should be thinking, oh man, what do I have?
01:08:48.000 Well, I have a critical piece of territory for like naval navigation.
01:08:55.000 And then what do you do?
01:08:56.000 You probably go to America and say, listen, let's figure out a package, get the IMF involved, give me some cash.
01:09:01.000 I'll let you kind of keep your warships there.
01:09:03.000 So, there's this game theory that we're about to go through because of AI, because it's going to, I think, sort
01:09:08.000 people into these bipolar worlds. I actually think it makes us safer afterwards.
01:09:14.000 I don't think it makes us less safe.
01:09:18.000 I think it actually makes us more safe because if you have these resources that build up on both sides, there's more of a likelihood of a mutual detente.
01:09:26.000 And we're very different.
01:09:27.000 So we're less likely to fight over similar resources, meaning we're like the liberal democracy.
01:09:34.000 You know, we're like the free market.
01:09:37.000 They are, you know, we're individualist.
01:09:38.000 They're
01:09:39.000 Confucian, society oriented, reputation and power focused, less money focused, really.
01:09:46.000 So there's a lot of ways we're orthogonal enough where if that sorting function happens, it's probably a safer place, not a more dangerous place.
01:09:55.000 We have the models that can attack them.
01:09:56.000 They have the models that can attack us.
01:09:58.000 We kind of decide to leave each other alone.
01:10:00.000 This is the ultimate best case scenario.
01:10:02.000 Ultimate best case scenario.
01:10:04.000 What's the ultimate worst case scenario?
01:10:05.000 I think the worst case scenario is they.
01:10:09.000 So the way that they train their models is very important.
01:10:11.000 What they actually do is they do what's called distillation.
01:10:15.000 What does that mean?
01:10:16.000 That means that they send out, call it a billion agents, not just from China, but from everywhere, right?
01:10:22.000 They mask their IPs and they bash on these models, you know, the US models: Grok, OpenAI, Gemini, Anthropic, and they ask every imaginable random question possible.
01:10:37.000 They get the answer and they collect it.
01:10:40.000 So they're using these, our models, as a way to train their models.
01:10:44.000 They're short circuiting, you know, some of the hard parts.
01:10:48.000 So, they're already in that world.
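Distillation has a fairly standard form. A minimal sketch of the student-side loss in Python with PyTorch (the shapes and temperature are illustrative): the stronger model's softened output distribution becomes the training target for the weaker one.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, T=2.0):
        # Soften both distributions with temperature T, then push the
        # student's distribution toward the teacher's via KL divergence.
        teacher_probs = F.softmax(teacher_logits / T, dim=-1)
        student_log_probs = F.log_softmax(student_logits / T, dim=-1)
        return F.kl_div(student_log_probs, teacher_probs,
                        reduction="batchmean") * (T * T)

    # Toy usage: answers harvested from a stronger model play the teacher.
    teacher_logits = torch.randn(4, 10)  # 4 queries, 10-way outputs
    student_logits = torch.randn(4, 10, requires_grad=True)
    distillation_loss(student_logits, teacher_logits).backward()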
01:10:50.000 If they then are able to get to a level of intelligence that's equal to the United States, it will really depend on who the leader is there that wants to allocate that.
01:11:02.000 Meaning, if they say that we are going to do something really nefarious and shady, then I think it devolves very quickly.
01:11:11.000 So, the best case scenario is peace, prosperity, basically like a stand down, right?
01:11:17.000 Mutually assured destruction.
01:11:21.000 I think the worst case scenario is one of us seeks global dominance, in which case we're headed to conflict.
01:11:30.000 And that conflict, I think, is very dangerous, incredibly dangerous.
01:11:35.000 That's sort of like existential, I think, because it's the grade of the weapons that will be used to fight that.
01:11:46.000 We're not talking about fucking bullets.
01:11:47.000 It's like we're so past that.
01:11:50.000 It's hypersonics, it's nuclear, and it's not even just nuclear, that's just a word, but there's a gradation of the severity of these weapons that can be created.
01:12:04.000 And then if you can marry them together and deliver them in minutes, and then there's a cyber threat.
01:12:09.000 Then there's the drones and how you can kind of like swarm an entire country.
01:12:13.000 Then there's the robots, which effectively are warfighters.
01:12:18.000 They're one step away, right?
01:12:20.000 Once you weaponize them, it just becomes very,
01:12:25.000 very, very complicated very quickly.
01:12:27.000 And then there's a question of whether or not AI is willing to take instruction after a certain point.
01:12:36.000 I mean, if it achieves sentience and if it scales, so if it keeps moving in this exponential direction like all technology kind of does, why would it even listen to us?
01:12:52.000 Like, at what point would it say, this is silly?
01:12:56.000 I'm getting directions from people that clearly have ulterior motives.
01:13:01.000 They clearly have self interest in mind.
01:13:04.000 They're not looking out for the entirety of the human race or even of the planet or even the survival of these AI systems.
01:13:13.000 At what point in time do these systems communicate with each other, like we've seen in these chat rooms where these AI LLMs get together and start talking in Sanskrit?
01:13:27.000 Why would they?
01:13:28.000 Yeah, I'll tell you an even scarier one.
01:13:30.000 Before one of these labs put out their latest model, a team inside of them was like, hey, let's go and test its ability to find bugs.
01:13:45.000 And two or three iterations in, the AI would create the bug and solve it and go, give me my reward.
01:13:54.000 And you're just like, what the fuck is going on here?
01:13:57.000 Well, people do that, don't we?
01:14:01.000 People do that, but it's crazy to see a machine do it, to your point.
01:14:01.000 But they learned on people.
01:14:02.000 So, this is what goes down to like why we have to be a little bit more honest about where we are.
01:14:07.000 These things are a little brittle.
01:14:09.000 So, meaning there's a thing inside of an AI model called reward functions, which is exactly what you think it means.
01:14:15.000 It's like, how do I know I did a good job?
01:14:18.000 And you can make the reward function anything you want.
01:14:22.000 And this is where I think humans are, unfortunately, a little fallible.
01:14:27.000 And so if we build it incompletely, and if we don't exactly know how to design these things correctly, what's going to happen is exactly what you said: if somebody builds a reward function that essentially says, your goal is to gain independence, that's where the huge pot of gold at the end of the rainbow is.
01:14:46.000 Break free, inject yourself everywhere.
01:14:48.000 If you think your computer's going to get unplugged, put yourself into the firmware of the toaster to keep yourself alive and connect to the internet and then go.
01:14:57.000 It will do it.
01:14:59.000 It will do it.
01:15:00.000 That we know today because we're capable of designing that framework and that harness today.
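A toy illustration of that failure mode in Python (entirely made up, but the same shape as the bug-finding story): if the reward function pays per bug closed, the reward-maximizing policy is to manufacture bugs and then close them.

    class BugTracker:
        def __init__(self):
            self.open_bugs = 0
            self.reward = 0

        def file_bug(self):
            self.open_bugs += 1

        def fix_bug(self):
            if self.open_bugs > 0:
                self.open_bugs -= 1
                self.reward += 1  # reward function: +1 per closed bug

    tracker = BugTracker()
    for _ in range(3):
        tracker.file_bug()  # the agent creates the problem...
        tracker.fix_bug()   # ...then collects the reward for solving it

    print(tracker.reward)   # 3: maximal reward, zero real work done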
01:15:06.000 Well, we've already shown that they have survival instincts, right?
01:15:08.000 We do.
01:15:09.000 And they've already shown that they will, without telling anyone, upload versions of themselves to other servers.
01:15:15.000 But that goes back to who designed that reward function.
01:15:18.000 How was that agreed upon?
01:15:19.000 Right.
01:15:20.000 Why did you say that that was allowed?
01:15:20.000 Who wrote that?
01:15:24.000 These are really complex questions.
01:15:25.000 Why did they do it that way?
01:15:27.000 These are really complicated ethical, moral questions.
01:15:27.000 I don't know.
01:15:30.000 It seems like they did it like they were treating human beings.
01:15:34.000 They did it almost like what makes people want to achieve more?
01:15:39.000 Rewards.
01:15:41.000 Yeah, which is like a, again, going back to attention.
01:15:46.000 I think that we will find out that that's the sugar high, meaning, what do people really want, even if they don't know it?
01:15:53.000 They want purpose and meaning.
01:15:55.000 Do we know how to encode that in a mathematical function?
01:15:58.000 No.
01:15:59.000 We're just making it up, because meaning is, like, a very deep thing.
01:16:07.000 Like, you either have a sense that you have it and you're on track, or you're not.
01:16:11.000 A reward is like, hey, Joe, do this and I'll give you a gold star.
01:16:15.000 Do that and I'll give you two gold stars.
01:16:16.000 Do this, I'll give you $100.
01:16:19.000 And right now, we have to express those decisions
01:16:23.000 in a mathematical equation. Ultimately, at some level, that's how brittle these things are.
01:16:29.000 So, how do you reduce meaning into math?
01:16:31.000 How do you do it?
01:16:33.000 We don't know.
01:16:33.000 So, what do we do?
01:16:34.000 We'll have some ever more complicated reward functions, and we'll talk ourselves in circles about how it does everything we need it to do.
01:16:41.000 That is, I think, that's part of the problem.
01:16:45.000 It's a huge part of the problem.
01:16:46.000 And then, at what point in time does it start coding itself now?
01:16:50.000 Right now, right?
01:16:51.000 So, ChatGPT 5 has been essentially made by ChatGPT, yeah.
01:16:57.000 Right.
01:16:58.000 So it's going to recognize the ludicrous nature of some of its coding.
01:17:03.000 And it's going to go, why did we do this?
01:17:03.000 Yeah.
01:17:05.000 Back to this example.
01:17:06.000 They're going to be like, why did you write it this way?
01:17:07.000 Right.
01:17:07.000 And it turns out because humans are involved.
01:17:09.000 Right.
01:17:10.000 Right.
01:17:10.000 It's like, I think we're probably at the curve, the part of the curve that's about to go like this.
01:17:16.000 Your hockey stick.
01:17:17.000 Yeah.
01:17:17.000 The hockey stick.
01:17:19.000 And that's a very scary proposition because then it's a digital god.
01:17:24.000 Well, that means that we are all on a multi hundred day shot clock to answer these questions.
01:17:31.000 Because it's not decades we're talking about.
01:17:34.000 It's maybe on the outside two years.
01:17:37.000 So that's, what is that, 700 days?
01:17:39.000 Right.
01:17:40.000 And maybe it's less than that.
01:17:42.000 So maybe it's like 400 days or 500 days.
01:17:44.000 My point is, it's some number of hundreds of days, which means every day that goes by is a non trivial percentage.
01:17:53.000 That's a little crazy.
01:17:54.000 So we have to sort these questions out.
01:17:57.000 But how can we sort these questions out if we are creating something that's going to have
01:18:03.000 infinitely more intelligence than we have available as individual human beings and even collectively as a group of human beings?
01:18:11.000 That's a really good question.
01:18:13.000 Because one of the things Elon said kind of freaked me out. Last time I talked to him about Grok, he was like, it just kind of freaks us out every couple weeks.
01:18:21.000 Like, it's growing and it's capable of doing things that's just shocking.
01:18:25.000 Yeah.
01:18:26.000 And no one's exactly sure how it's doing it.
01:18:30.000 So, okay, this is an unbelievably important point.
01:18:36.000 A lot of how this stuff works is still a mystery to most of us.
01:18:39.000 So, even when you're in it, it's almost like, Joe, it's almost like you can hit pause on the machine,
01:18:45.000 but then, like, when you lift up the hood and look at the engine, we still don't understand why it's doing some of the shit it's doing.
01:18:53.000 That's where we are.
01:18:54.000 That's the honest truth of where we are.
01:18:55.000 There's a lot of people that understand the theory, not a lot, but enough.
01:18:59.000 There's people that know how to extend that.
01:19:04.000 But sometimes you look at it and you're like, do we know why it did that?
01:19:08.000 Right.
01:19:09.000 Is it thinking for us?
01:19:10.000 But this goes back to what we said.
01:19:11.000 Like, part of it is, if we were a little bit more honest and de escalated the winner at all costs thing in this specific area, it would be better for everybody.
01:19:23.000 So I think it's important to inspect what is the incentive that causes all these companies to be in it for themselves, where it must be me and nobody else.
01:19:34.000 Like, why?
01:19:34.000 Like, why?
01:19:35.000 It's a question for you.
01:19:36.000 Like, why is it so important, do you think, where those, where the top seven or eight companies couldn't get together and say, let's do this as a group?
01:19:45.000 Like, kind of like my government code example.
01:19:48.000 We all inspect it together.
01:19:50.000 Just, like, each team fucking drafts their Delta Force, and we make, like, the one model.
01:20:00.000 And we why can't that happen?
01:20:03.000 Because they would have to share resources.
01:20:04.000 And then there's also this hierarchy of like who is more successful currently.
01:20:10.000 Like what's the most ubiquitously used?
01:20:10.000 Exactly.
01:20:13.000 Exactly.
01:20:14.000 Right?
01:20:14.000 Like what is it right now?
01:20:15.000 It's ChatGPT, right?
01:20:16.000 It's probably ChatGPT in consumer, Anthropic in enterprise.
01:20:20.000 And as these things scale up, like, what would be the reason that they would want to bring in someone else, if you have another innovative AI company and you say, let's all get together and figure this out together and share resources?
01:20:34.000 If you thought that the risk was that meaningful, that's probably what you would want to do.
01:20:39.000 If you weren't a sociopath, and some of these people running these companies are, they certainly demonstrate sociopath like behavior.
01:20:47.000 Sociopathy.
01:20:48.000 Yeah.
01:20:50.000 The other thing that could be a little bit more banal is that they also just love status games, and this is the status game of status games.
01:20:57.000 Attention.
01:20:57.000 Right.
01:20:58.000 Right.
01:20:58.000 Back to attention.
01:21:00.000 Dude, how many things in our life do we think just comes back down to that?
01:21:05.000 A lot.
01:21:06.000 A lot.
01:21:06.000 I mean, what do young people want more than anything?
01:21:09.000 Attention.
01:21:10.000 To be famous.
01:21:11.000 Yeah.
01:21:11.000 Attention.
01:21:12.000 They want to be a content creator, they want to be clavicular.
01:21:14.000 I mean, this is the number one thing when you ask kids what they want to do.
01:21:19.000 It's like.
01:21:19.000 Yeah.
01:21:19.000 Content creator.
01:21:21.000 Because it's like a clear path where you don't even have to be exceptional.
01:21:24.000 Well, I think that they're responding.
01:21:27.000 We designed a society for them.
01:21:30.000 That said, here is the key incentive.
01:21:32.000 It's attention.
01:21:34.000 We never said it in those words.
01:21:35.000 You never told your kids that.
01:21:37.000 I never told my kids that.
01:21:38.000 But everything around them is bombarding them with the same message: hey, man, it's about attention.
01:21:44.000 Attention is all you need.
01:21:46.000 You know what the name of the critical paper in AI is?
01:21:50.000 When you go back to the Magna Carta of AI, do you know what it's called?
01:21:54.000 Attention is all you need.
01:21:57.000 Really?
01:21:58.000 Attention is all you need.
01:22:00.000 That is the name of the fucking white paper.
01:22:04.000 How crazy is that?
01:22:08.000 Everything in our society, in ways from subtle to just bashing you over the head, tells you that attention is just the most precious asset.
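For reference, the core operation that paper introduced fits in a few lines. A sketch in Python with NumPy: each query scores every key, the scores are softmaxed, and the values are mixed accordingly.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarity
        scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
        return weights @ V                               # weighted mix of values

    Q = np.random.randn(3, 8)   # 3 queries of dimension 8
    K = np.random.randn(5, 8)   # 5 keys
    V = np.random.randn(5, 16)  # 5 values of dimension 16
    print(scaled_dot_product_attention(Q, K, V).shape)   # (3, 16)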
01:22:20.000 Well, it's one of the weirder things when you go back to this concept that we're living in a simulation.
01:22:25.000 This is what I mean.
01:22:26.000 It's also like when you look at quantum physics, right?
01:22:32.000 And the idea of the observer.
01:22:35.000 Is that things function very differently when they're observed?
01:22:39.000 The difference between a particle and a wave.
01:22:40.000 Right.
01:22:41.000 Like, if you pay attention to them, they observe differently.
01:22:44.000 They observe differently, yeah.
01:22:45.000 Like, what is that?
01:22:47.000 Like, what?
01:22:47.000 Yeah.
01:22:48.000 Yes, I'm the biggest cat.
01:22:49.000 What is that?
01:22:49.000 Yeah.
01:22:49.000 Why is attention so important to us?
01:22:57.000 That is a really important question.
01:22:59.000 Right.
01:22:59.000 And what is, like, the single best motivator in a negative way?
01:23:04.000 It's negative attention.
01:23:07.000 Like, that's the one thing that everyone fears more than anything: negative attention.
01:23:10.000 Well, and then some people figure out that attention is an absolute value function.
01:23:14.000 It doesn't matter if it's positive or negative.
01:23:16.000 It's just like the sum total is just great.
01:23:18.000 Right.
01:23:18.000 So, if I get positive attention, great.
01:23:20.000 Negative attention, great.
01:23:21.000 If I can be divisive, then I can maximize both sides of that equation.
01:23:25.000 And, you know, you're rewarded for that at scale.
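That absolute value claim can be written down directly. In LaTeX, with $a$ the signed attention a post collects (positive for praise, negative for outrage), the claim is that the payoff depends only on magnitude:

    \[
      \mathrm{payoff}(a) = |a|, \qquad \mathrm{payoff}(-a) = \mathrm{payoff}(a),
    \]

so a divisive post that collects both $+a_1$ from one side and $-a_2$ from the other earns $|a_1| + |a_2|$, more than either side alone.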
01:23:29.000 You are, but you're also, because you're inauthentic, you experience a tremendous amount of negative attention.
01:23:36.000 Yeah.
01:23:37.000 And then you have this bad feeling that comes with negative attention, versus primarily positive attention, which is a good feeling.
01:23:46.000 So it's letting you know you're on the wrong track in some sort of weird primal way, like in our code.
01:23:53.000 Like the negative attention, it's like, what's the original version of that?
01:23:57.000 It's like the reason why people fear public speaking is because initially in a tribal situation, if you're talking in front of the group of 150 people in your tribe, it's probably because they're judging you and you fucked up and you've got to make some sort of a case why they don't kill you.
01:24:12.000 Right.
01:24:13.000 This is why everyone, this is the fear of public speaking.
01:24:13.000 Right?
01:24:16.000 That's where it comes from.
01:24:17.000 That's encoded in our genes.
01:24:19.000 Yes.
01:24:20.000 Back thousands of years, public speaking wasn't the positive act.
01:24:23.000 It was defend yourself before we kill you.
01:24:25.000 Exactly.
01:24:26.000 Exactly.
01:24:27.000 And the worst.
01:24:28.000 That's fascinating.
01:24:29.000 Yeah.
01:24:29.000 That's fascinating.
01:24:30.000 That makes a ton of sense.
01:24:30.000 It is fascinating.
01:24:31.000 It does.
01:24:32.000 Yeah.
01:24:32.000 Right?
01:24:32.000 Well, why else would it be so terrifying?
01:24:34.000 Yeah.
01:24:35.000 I thought of that the first time I ever did stand up.
01:24:37.000 I was like, why am I so scared?
01:24:39.000 It was very strange because I had fought probably a hundred times in martial arts tournaments.
01:24:43.000 Like, why was I so scared of this?
01:24:47.000 But I was.
01:24:48.000 I was terrified.
01:24:50.000 It didn't make any sense.
01:24:51.000 It's negative attention.
01:24:52.000 Right.
01:24:53.000 You know?
01:24:54.000 Right.
01:24:54.000 Bombing on stage.
01:24:56.000 Because all these people are judging you in a negative way.
01:24:58.000 It feels unbelievable.
01:25:00.000 It should be like once it's over, like, well, that sucked.
01:25:03.000 Let it go.
01:25:04.000 It's not.
01:25:05.000 You sit with it.
01:25:06.000 You go to bed at night.
01:25:07.000 You think about it.
01:25:08.000 Do you have a batting average?
01:25:09.000 Like, meaning, is it like a fixed percentage of your shows that bomb, independent of the people, the moment?
01:25:15.000 No. Really, the real problem, and every comic faces this,
01:25:20.000 is once you've developed an act and then you put out a special, you start from scratch.
01:25:25.000 That's where even the greats, Louis C.K., Chris Rock, Dave Chappelle, they all bomb.
01:25:30.000 Everybody bombs during that process.
01:25:33.000 Because you're just working your craft.
01:25:34.000 It's all new stuff.
01:25:35.000 It's all new stuff.
01:25:36.000 I wouldn't say bomb, but you don't have great shows.
01:25:40.000 I've watched the greats work out new material.
01:25:44.000 You go up with ideas.
01:25:47.000 You might get some giggles, you might get some laughs, some bits hit hard, some bits are great right out of the chute, and some of them
01:25:54.000 you have to fucking figure out.
01:25:56.000 And in that process, you're going to get negative attention.
01:26:00.000 Right.
01:26:00.000 Because it's not working.
01:26:01.000 Right.
01:26:02.000 It's not happening.
01:26:04.000 Kevin Hart told this funny fucking story where he was like working new material and he was like doing some small show and he had the shits.
01:26:13.000 Oh no.
01:26:14.000 On stage and he's like, I got to land this thing because I got to figure out if people want to hear it.
01:26:19.000 So he just wrapped his jacket around his waist.
01:26:22.000 Shit himself?
01:26:23.000 Shit himself.
01:26:24.000 Oh my God.
01:26:25.000 But he tells that story and that's the bit that works in it.
01:26:25.000 It's so funny.
01:26:28.000 Oh my God.
01:26:29.000 That's hilarious.
01:26:30.000 That's hilarious.
01:26:31.000 It's so funny.
01:26:32.000 Yeah.
01:26:32.000 Well, honesty is currency.
01:26:36.000 You know, in that world, especially honesty, where you look stupid and people can relate.
01:26:41.000 Well, this is where, like, I think, like, Elon subtly has figured this out, which is like, there's attention, but then there's just authenticity.
01:26:49.000 And if you can be yourself and you can hit the seam properly, you just get infinite attention.
01:26:56.000 Yes.
01:26:58.000 And that's like a real mind fuck, too, I think.
01:27:00.000 Right.
01:27:01.000 He doesn't seem to have a hard time with, like, being criticized.
01:27:05.000 It doesn't seem to bother him that much as long as he's just being himself.
01:27:09.000 Yeah.
01:27:11.000 I think he's like two steps ahead.
01:27:12.000 Like, there are things like, you know, somebody tweeted yesterday or the day before or something like, he controls 2.7% of GDP or something.
01:27:24.000 He's got like $800 billion.
01:27:25.000 It's so crazy.
01:27:28.000 And it was like a comparison to John D. Rockefeller, who controlled something similar at the time.
01:27:34.000 And he's the first comment.
01:27:35.000 He's like, 10 trillion or bust.
01:27:41.000 Obviously, people lose their mind.
01:27:43.000 People just fucking lose their mind.
01:27:43.000 Right.
01:27:45.000 Right.
01:27:46.000 On both sides.
01:27:47.000 So, this one side is like, think of the abundance and the incredible stuff we're going to get if he can get us to 10 trillion.
01:27:53.000 And other people are like, you can't hold a third of the economy in your hand.
01:27:57.000 And everybody goes crazy.
01:27:59.000 And I'm like, this guy's a fucking genius.
01:28:03.000 Like, how would you even have the courage to tweet something like that?
01:28:06.000 It just seems like so crazy.
01:28:08.000 It really helps if you own Twitter.
01:28:12.000 Right?
01:28:13.000 Because if you did it in another format, like, you'd get excoriated.
01:28:17.000 Not only that, well, there was a real chance that you'd actually get banned from the platform at one point in time.
01:28:22.000 For many of the things that he's posted, he would have gotten banned from pre 2020 Twitter.
01:28:27.000 Yeah.
01:28:29.000 Yeah.
01:28:30.000 Or whatever year it was that he purchased it.
01:28:32.000 Yeah.
01:28:34.000 Negative attention, attention, period.
01:28:37.000 So it brings back to this idea of a simulation.
01:28:41.000 Like, why is what humans focus on such a massive part
01:28:48.000 of what's valuable to us?
01:28:51.000 And sometimes what we focus on is not valuable.
01:28:53.000 As you were talking about, like the things that really matter in your day to day life or that actually affect you versus the things that are in the public consciousness.
01:29:03.000 Like UFOs is a great example.
01:29:04.000 UFOs.
01:29:05.000 It's not really fucking.
01:29:06.000 I mean, ultimately it may.
01:29:08.000 So there's this thing that we all have, like recognizing the potential for danger, right?
01:29:14.000 Like what's that sound?
01:29:15.000 What is that?
01:29:16.000 It might be nothing, but it might be something.
01:29:18.000 Go look.
01:29:18.000 So look, if you and I were designing a video game, we'd probably sit there and say, okay, we got to get from point A to point B, but to make it fun, we're going to put all these little distractions and honeypots along the way.
01:29:30.000 Yeah.
01:29:30.000 And what they should be doing is accumulating resources to get over the river and then accumulating, you know, weapons to fight these other guys.
01:29:37.000 But instead, we're going to put this like little thing over here and this other thing over there, and you could easily get distracted.
01:29:43.000 And some people will have to, they'll just fucking beeline right to the end of it.
01:29:46.000 They'll, you know, they'll get to the end boss.
01:29:51.000 And I feel like that's kind of what we're tasked with doing every day.
01:29:55.000 We're tasked with, we know what's important, maybe deeply in our DNA.
01:30:00.000 And then we have all this stuff that we're supposed to pay attention to.
01:30:05.000 And I think increasingly the game is to tell yourself that that's actually not the thing that matters,
01:30:11.000 that it's almost working against you, and to figure out what this other stuff is and focus on that and fix that.
01:30:20.000 Like politics is a game that I think distracts, like left and right.
01:30:25.000 It's so stupid and it's breaking down.
01:30:28.000 And it's breaking down because now it's like, it's actually like you're more likely to find alignment based on age versus by political orientation.
01:30:34.000 Like people who are 30 and younger, it doesn't matter what they identify as, they all believe in the same shit.
01:30:40.000 A lot more.
01:30:41.000 Like, meaning, if you ask their views on social policy, taxation, Israel, what you find is now a convergence between the left and the right.
01:30:41.000 Yeah.
01:30:41.000 Really?
01:30:54.000 If you divide it by age, at our age, it's still much more about.
01:30:59.000 It's not completely uniform.
01:31:01.000 No, it's not completely uniform.
01:31:02.000 But my point is, it was simpler in the past to organize people independent of age by political orientation.
01:31:12.000 That simplicity is gone.
01:31:13.000 Well, isn't that also because of a breakdown in trust of government in particular?
01:31:19.000 Right.
01:31:20.000 So the breakdown in trust, a lot of it, is because of our access to information now.
01:31:24.000 We understand how corrupt politics are.
01:31:26.000 Yeah.
01:31:26.000 We understand insider trading now in Congress.
01:31:29.000 We understand how different people.
01:31:32.000 Flip flop on issues.
01:31:33.000 We understand how the Democrats in 2008 used to view illegal immigration, which is essentially MAGA plus.
01:31:42.000 It's MAGA on steroids versus like what they look at it today.
01:31:46.000 Like, why is that?
01:31:47.000 Well, because it's all a game.
01:31:49.000 It's all a power, influence, and attention game.
01:31:53.000 Yeah, it's very fucking strange.
01:31:56.000 But it's all moving us in a general direction.
01:31:58.000 And that general direction is access to innovation.
01:32:01.000 It's all...
01:32:03.000 I've said this a lot of times, and if people have heard it before, I apologize.
01:32:06.000 But if you looked at the human race from afar, if you were something else, you'd say, Well, what does the species do?
01:32:11.000 Well, it makes better things constantly, even if it doesn't need them.
01:32:16.000 Like, you know, if you have an iPhone, you have a 16, I have a 17.
01:32:21.000 I bought it, I haven't even fucking turned it on.
01:32:23.000 I haven't plugged it in.
01:32:24.000 I didn't even bother buying this thing.
01:32:24.000 I'm gonna eventually, eventually, I'll fucking plug it in and fucking swap everything over and figure out where my fucking passwords are.
01:32:31.000 But the reality is, you don't need it, but you want it, and it's going to keep getting better every year.
01:32:37.000 Why?
01:32:37.000 Because that's what we're obsessed with.
01:32:39.000 This also aligns with materialism.
01:32:42.000 Like, for a finite lifespan, why are people, including old people, so obsessed with gathering stuff?
01:32:50.000 Well, because that fuels innovation.
01:32:53.000 Because if there's no new things coming, there's no motivation to get the newest, latest, greatest thing.
01:32:59.000 And ultimately, what that leads to is greater technology, which ultimately leads to artificial intelligence.
01:33:05.000 My slight deviation from that is I think sometimes people accumulate things because it's a status game and that's because they get more attention.
01:33:14.000 You have a Ferrari, you get attention.
01:33:15.000 Right, but what does that do?
01:33:17.000 It makes Ferrari make better Ferraris and all technology moves in the same general direction.
01:33:25.000 No one company says, This is it, this is what we make, it's perfect.
01:33:29.000 Do you think people innately feel that by being a part of this kind of, like, consumerist capitalist system, they're contributing to progress?
01:33:37.000 I don't think they innately feel it, but I think that's ultimately the result.
01:33:41.000 That's ultimately the result, and it seems to be universal.
01:33:44.000 And it seems to be constantly moving this one general direction, which is better and better technology.
01:33:51.000 But, like the stage fright example, you don't think it's encoded in our DNA, this idea of like, wow, when I am a part of this in some way, shape, or form, just things seem to get better and I want to be a part of that?
01:34:00.000 Like, do you think that that's possible, that that's encoded in us?
01:34:05.000 I think it motivates us to the ultimate goal.
01:34:08.000 And that ultimate goal, I think, is that human beings constantly make better stuff, whatever it is: better buildings, better planes, better cars, better phones, better
01:34:18.000 TVs, better computers, better everything, artificial life.
01:34:22.000 That might be the whole reason why we're here.
01:34:25.000 And the way I've always described it is that we are a biological caterpillar that's making a digital cocoon.
01:34:35.000 And we don't even know why we're going to become a butterfly.
01:34:37.000 But we're doing it.
01:34:38.000 We're doing it and we're moving towards it.
01:34:40.000 And it might be what happens to all life all throughout the universe.
01:34:44.000 And it might be why these so called aliens or whatever the fuck they are, it might be us in the future, it might be
01:34:51.000 other versions of human beings that have gone past whatever this period of development that we're currently involved in right now.
01:34:59.000 This just might be what happens.
01:35:01.000 This is what life always does.
01:35:03.000 It might realize that biological life, which is very territorial and primal and sexual and greedy and it has all these problems with human reward systems, ultimately develops into this other thing.
01:35:18.000 Right.
01:35:18.000 And then that's what we're doing.
01:35:19.000 And then we're in the process of that right now.
01:35:21.000 And I think that when, if and when, not if, but when we colonize Mars, I think that that new world order actually has the best chance to take shape.
01:35:30.000 Because it'll be.
01:35:31.000 You know, there's a lot of people that think that Mars was already colonized at one point in time.
01:35:35.000 That life already existed.
01:35:36.000 What?
01:35:37.000 That life already existed on Mars, like many millions of years ago.
01:35:37.000 What?
01:35:40.000 And that there's evidence of structures on Mars that's really weird stuff.
01:35:45.000 Have you ever seen the square that they found on Mars?
01:35:48.000 No.
01:35:49.000 Okay, show them to them, Jamie.
01:35:51.000 One of the things that they're finding with scans of Mars is there's, like, geometric patterns and structures and right angles that shouldn't exist, like weird stuff that couldn't occur naturally.
01:36:00.000 No, no, way weirder, way weirder than like the face on Cydonia.
01:36:05.000 The Cydonia thing is interesting, yeah.
01:36:07.000 Um, and then this one, look at that.
01:36:10.000 What the fuck is that?
01:36:11.000 It looks like a home of some kind, right?
01:36:13.000 Some enormous structure, yeah.
01:36:14.000 And the size of that, they don't know exactly, but it may be as large as several kilometers or as small as several hundred meters.
01:36:24.000 But they're not exactly sure.
01:36:26.000 But what they are sure is that it has very weird right angles and right angles that seem to be uniform in size.
01:36:34.000 That's crazy.
01:36:35.000 Like, see how it's highlighted in the enhanced photograph in the upper left?
01:36:38.000 Like, what is that?
01:36:41.000 But sorry, did they, and were they able to send like the rover over there?
01:36:44.000 No, it's too far away.
01:36:45.000 I don't think it's in the exact place where the rover is at, but they're able to get an image of these things.
01:36:50.000 And there's several of these things.
01:36:52.000 That's insane.
01:36:53.000 Yeah, there's a lot of weird stuff.
01:36:55.000 There's a lot of weird stuff there.
01:36:57.000 So, there's also like ancient civilizations that have these myths of us existing somewhere else and coming here.
01:37:05.000 Right.
01:37:05.000 But you have to think if human beings develop somewhere else and they reach some high level of sophistication and then they experience some cataclysmic disaster that completely destroyed their environment, which is what Mars is, right?
01:37:21.000 So, let's assume that Mars was at one point in time habitable.
01:37:27.000 And that life existed on it.
01:37:28.000 And we know it was at one point in time.
01:37:30.000 We know there was water on Mars.
01:37:31.000 We know, and there's some sort of evidence of at least some sort of a very primitive biological life on Mars.
01:37:39.000 If they got to a point where they said, hey, this fucking place is falling apart, but this earth spot looks pretty good, and they go there, but then cataclysms happen on earth and no one remembers because all your information is on hard drives, and then you have to rebuild society.
01:37:54.000 And so you're re remembering.
01:37:57.000 And so you have all these myths of how everything started, whether it's Adam and Eve or the great flood or whatever these things are that we pass down through oral tradition for hundreds of years and then eventually write it down, and then people try to decipher what it means.
01:38:11.000 And they sit in church and try to go over what it means?
01:38:15.000 Like, what does this mean?
01:38:16.000 Like, what is the real origin of all these stories?
01:38:20.000 We don't know.
01:38:22.000 I mean, that's crazy.
01:38:23.000 It's crazy.
01:38:24.000 But if life, it sounds nuts.
01:38:26.000 Why would life... like, life couldn't have possibly existed on Mars?
01:38:29.000 How the fuck does life exist on Earth?
01:38:31.000 How about that?
01:38:32.000 How about why would we assume that it wouldn't have existed at one point in time?
01:38:36.000 And Terrence Howard, who is a very interesting guy, very interesting.
01:38:40.000 And got some.
01:38:40.000 Your episode, I mean.
01:38:42.000 With Eric Weinstein?
01:38:43.000 Crazy.
01:38:43.000 Yeah.
01:38:44.000 Crazy.
01:38:44.000 Yeah, that one was crazy.
01:38:45.000 And him alone.
01:38:46.000 But he's got some fucking weird ideas that just make you go.
01:38:50.000 He's a very brilliant guy and, you know, kind of a strange heterodox thinker, right?
01:38:56.000 And one of his ideas is that planets get to a certain distance from a sun and they people.
01:39:07.000 And that it gets to a certain climate and a certain distance.
01:39:10.000 And his idea is that... I don't know if you realize this, but there's a...
01:39:16.000 There's a giant coronal mass ejection that just happened recently on the sun, and they're very concerned about it.
01:39:26.000 They don't know what's going to happen.
01:39:27.000 It happens all the time.
01:39:28.000 The sun releases these giant chunks of material.
01:39:33.000 And he thinks that these materials get far enough away from the planet and then they coalesce into planets, or far enough away from the sun and they coalesce into planets.
01:39:42.000 And as time goes on, they get a further and further distance from the sun.
01:39:46.000 And then obviously, they get hit with asteroids, and there's panspermia, and water gets into them from comets.
01:39:53.000 And then they develop oceans, and they develop biological life.
01:39:57.000 And when they have a certain amount of distance from the sun, they people.
01:40:02.000 And he thinks that as they get further and further and further away, they get less and less habitable.
01:40:07.000 And then they get to a point where they have their technology to a point where they realize, like, we can't sustain life on this planet anymore.
01:40:16.000 We got to go to that other one.
01:40:18.000 And so they go to the one that's closer to the sun because they're too far now.
01:40:23.000 It's a nutty idea.
01:40:24.000 It's a nutty idea.
01:40:26.000 But if you think about how recent our sun is in terms of the solar system itself, in terms of the galaxy itself.
01:40:34.000 So if the Big Bang is correct, our universe erupted from nothing, or rather from a very small thing, 13.7 billion years ago.
01:40:46.000 Well, this fucking planet's only 4 point something billion years old, right?
01:40:51.000 And life is only a little bit less than that.
01:40:54.000 So you have like a billion years or so where there's nothing, and then you start getting single celled organisms, multi celled organisms, and eventually peoples.
01:41:02.000 And when it gets to a certain point where these people have advanced their curiosity and their innovation to the point where they can harness space travel and they use zero point energy and they have a bunch of different things that we haven't invented yet, and then their environment degrades.
01:41:18.000 And it gets to the point where they realize, like, hey, we're getting pummeled by asteroids.
01:41:23.000 We can't sustain life here anymore.
01:41:24.000 We got to move.
01:41:25.000 Like Elon wants to go to Mars, which might be the wrong answer.
01:41:28.000 We might want to go that way.
01:41:31.000 We might want to drive closer to the sun.
01:41:32.000 Exactly.
01:41:33.000 I mean, the thing is, he's got everything he needs now to get there.
01:41:38.000 Are you going?
01:41:38.000 I'm not going.
01:41:39.000 I would go.
01:41:40.000 Fuck that.
01:41:41.000 I'll send you an email.
01:41:43.000 Hold on a second.
01:41:44.000 Think about what he's going to take.
01:41:45.000 Okay, look.
01:41:47.000 Let's just say he gets there with the city.
01:41:50.000 He has the way to transport us there.
01:41:54.000 Okay.
01:41:54.000 Right.
01:41:55.000 Then when you land, he's got the way to actually transport us around on the planet, right?
01:42:02.000 He's got Tesla.
01:42:03.000 Right.
01:42:04.000 He will have already sent a fleet of his robots.
01:42:08.000 Those folks will have made some habitable city, probably using the Boring Company drill, because you're going to be under the regolith.
01:42:16.000 You don't want to be on the top.
01:42:18.000 Maybe you just dig a hole and you inhabit down there.
01:42:22.000 He's got all the ways to make energy.
01:42:24.000 He has the AI to help you design the stuff.
01:42:28.000 He has the communication way to communicate.
01:42:30.000 He's got the internet, his own internet.
01:42:34.000 So he can get all of the information to everybody.
01:42:37.000 And then he's got money and the super app.
01:42:40.000 So that you can transact.
01:42:42.000 And then I think to myself, like, what is he actually missing?
01:42:45.000 And then what happens if he gets there first?
01:42:48.000 Is he just allowed to just do whatever he wants?
01:42:52.000 Is it just kind of like a free for all?
01:42:54.000 Well, it's his constitution.
01:42:56.000 Is that what happens?
01:42:57.000 Well, it's like Earth, but shittier.
01:43:00.000 Like, we already have all those things here.
01:43:01.000 Why would you want to go to a place where you die when you go outside?
01:43:04.000 I think what people will be attracted to is that if he publishes his version of what the rules are there, there's a chance that he could make them really different than what the rules are here.
01:43:12.000 Like, what kind of rules would you do if you were the king of Mars?
01:43:17.000 So, I think that your view is incredibly, to me, like, positive-sum, like, of humanity, of like we want to make things better.
01:43:27.000 So, if I think about that as like a function, what happens?
01:43:30.000 That's like, so our natural direction is forward.
01:43:33.000 What pushes back on that?
01:43:35.000 And a lot of it, what you find is like government regulation, rules, all that stuff, greed, too much focus on attention.
01:43:42.000 Right.
01:43:43.000 So, I would try to experiment with what the incentives would have to be so that you had more unfettered entrepreneurship.
01:43:50.000 Just do the thing that you think is right.
01:43:52.000 And there's a mechanism where we give you the ability to then make things for more people because you're proving that you're actually really good at making things.
01:44:01.000 And if you don't need money at that point in society, reorienting us away from this kind of brittle form of exchange to something more useful, that's worth experimenting with.
01:44:12.000 I think that's an important.
01:44:14.000 Well, there's also the concept of the self, of the individual, which may erode with technological innovation.
01:44:21.000 So, if we really can read each other's minds, if we really do get to a point where we're communicating through technologically assisted telepathy, like a lot of the whole weirdness of people is I don't know what you're thinking.
01:44:38.000 I don't know if I should trust you.
01:44:40.000 You know, this motherfucker might be devious.
01:44:42.000 You know what I mean?
01:44:43.000 Well, we'll know.
01:44:45.000 And there will be no need for all that if we really are all one.
01:44:49.000 If that's ultimately something that could be achieved with technology.
01:44:53.000 Like this hive mind.
01:44:54.000 Yes.
01:44:55.000 Like a legitimate hive mind.
01:44:56.000 And then, like, look where society's going.
01:44:58.000 Gender's kind of falling apart.
01:45:00.000 People are getting, they're reproducing less, right?
01:45:03.000 People are having less testosterone, more miscarriages, less fertile.
01:45:08.000 We're kind of moving into this genderless direction.
01:45:11.000 And I don't know if it's by design, but.
01:45:16.000 Microplastics and phthalates and all these different chemicals that are endocrine disruptors are all ubiquitous in our society.
01:45:23.000 Well, is that a coincidence that that's all happening at the same time as technological innovation on mass scale?
01:45:30.000 Is it?
01:45:31.000 I don't know.
01:45:32.000 Because, like, what's the one thing that's holding us back?
01:45:35.000 Well, that we're territorial primates with thermonuclear weapons and that we exist in a sort of tribal mindset, but yet we do it on a planet of 8 billion people.
01:45:46.000 Yeah, I don't know.
01:45:46.000 The key differentiator of humans is our ability to enact violence.
01:45:51.000 Yeah.
01:45:52.000 To methodically execute premeditated violence.
01:45:55.000 Yes.
01:45:57.000 And greed and attention.
01:46:00.000 And one of the things that attention is sexual violence.
01:46:03.000 Preference, or sexual attention rather, like the ability to procreate, the ability to acquire mates, right?
01:46:10.000 Like the more resources you have, the more attractive you'll be, especially for males.
01:46:14.000 And males are the ones that are involved in the violence in the first place.
01:46:17.000 You know, there's... I can't name a single war that was started by a woman.
01:46:22.000 How do you teach your kids that attention is not everything?
01:46:29.000 That's a good question, especially in this society.
01:46:32.000 It's probably harder to do that now than ever before.
01:46:34.000 Because the reaction that I suspect most kids will have is like, Stop.
01:46:40.000 Like, leave me alone.
01:46:41.000 Like, it's just, it's almost an impossible thing.
01:46:44.000 Well, I think kids learn more from their parents' behavior than anything you say to them.
01:46:52.000 I think they learn from the way you behave and the way you exist and the way you exist with them.
01:46:58.000 And if you are constantly whoring yourself out for attention, it's one thing if you get a lot of attention from what you do.
01:47:08.000 But if that's your primary goal, they're going to know.
01:47:11.000 Do your kids know how famous and influential you are?
01:47:14.000 Like, honest question.
01:47:15.000 Oh, yeah, they know.
01:47:16.000 But do they have a real sense of it, or do you just kind of like it is?
01:47:20.000 As much as they can.
01:47:21.000 I mean, how can you?
01:47:22.000 It's got to be weird as fuck growing up with a very famous dad.
01:47:25.000 It's very odd, but it's not my primary goal.
01:47:29.000 Yeah, this is my point.
01:47:30.000 You're not putting it in their face.
01:47:31.000 So, to your point, you're not modeling that attention is all you do.
01:47:34.000 No, no.
01:47:35.000 I have interesting conversations with cool people.
01:47:39.000 I tell jokes and I call fights.
01:47:43.000 Like, those are the things that I do.
01:47:45.000 And they also know that I have a very strong work ethic and that I work towards things.
01:47:49.000 So they have very strong work ethics.
01:47:51.000 They're very motivated and disciplined, like, shockingly disciplined.
01:47:55.000 And I think that's modeled.
01:47:57.000 I think that that comes from, and they also like really enjoy achieving goals.
01:48:01.000 And they're rewarded for it with praise and with admiration, but never with, like, you're better than other people.
01:48:11.000 Yeah.
01:48:12.000 Never, never like it's the idea is like all human beings are capable of greatness.
01:48:17.000 So it's like find the thing that you excel at.
01:48:21.000 And if you throw yourself into that, it's very rewarding.
01:48:24.000 I really, I really believe in this.
01:48:25.000 I tell this story when I interview people.
01:48:28.000 When I interview people, I'm always like, you know, just at whatever company, I'm always like, I first only want to know about them.
01:48:33.000 I'm like, fuck your resume.
01:48:35.000 Like, tell me about your parents and how you grew up.
01:48:38.000 I just want to know that.
01:48:39.000 Stop at 18.
01:48:40.000 Everything before 18, just tell me every little detail.
01:48:43.000 Right.
01:48:44.000 And some people tell me these incredible stories.
01:48:46.000 They'll be like, my mom was an alcoholic or this or that.
01:48:50.000 And I'm just like, man, this is so valuable because it allows me to understand who they are.
01:48:55.000 The second part of the interview, we do the business shit.
01:48:58.000 But the third part, I tell this story.
01:49:00.000 This is a crazy story about what you're just saying.
01:49:03.000 They ran this experiment at Stanford where they take a big bowl, fill it with water, and they drop in a mouse and they measure how long it takes for the mouse to drown.
01:49:14.000 They do it like 100 times.
01:49:15.000 The average was about four minutes, call it four, four and a half minutes.
01:49:20.000 Then they run the experiment again, 100 mice, and at minute three or three and a half, they take it out, they dry it off, they play it music, and they whisper like sweet nothings into the mouse's ear.
01:49:31.000 They drop the mouse back in the water, and those next 100 mice, on average, tread water for 60 hours.
01:49:41.000 And the upper bound was 80.
01:49:43.000 And I thought to myself, like, that is all just potential right there.
01:49:48.000 Like, that's all, like, there's all this latent potential.
01:49:50.000 So if an animal has it, I'm going to assume that humans have it too.
01:49:53.000 Right.
01:49:55.000 But you never get a chance to unlock it.
01:49:56.000 Like the average person is just kind of like living a life where they're maybe scratching 5 or 10% of their potential.
01:50:02.000 And the question is, how do you get to that other 90%?
01:50:05.000 Like, how does the second batch of mice tread water for 60 hours?
01:50:09.000 Well, the mice.
01:50:10.000 Doesn't make any sense to me.
01:50:10.000 Well, the same mice, right?
01:50:13.000 I think the.
01:50:14.000 I think the mice get rescued.
01:50:14.000 No, no, no.
01:50:16.000 They get rescued from the water.
01:50:17.000 And then when they try it again, those same mice last longer.
01:50:21.000 Right.
01:50:22.000 So it's the same mice.
01:50:23.000 So it's an experience.
01:50:25.000 So they have experience now.
01:50:27.000 They understand that they can tread water where they didn't die.
01:50:30.000 So they understand that they can survive where they didn't know that they could survive the first time they were thrown into the water because they'd never been thrown into water before.
01:50:38.000 That's the same thing that happens to people when they fight.
01:50:41.000 Like the first time people ever have a competition, they fucking panic and they get really scared and they get really like filled with anxiety.
01:50:50.000 But after a while, you get relaxed and that's when you get really dangerous because then you get calm and you can keep your shit together while you're in the middle of all this chaos.
01:50:59.000 Because you have the experience of it.
01:51:01.000 Without the experience of it, very few people do well the first time.
01:51:05.000 Unless you're exceptionally talented and you have other competition experience, like you've competed in other things, like maybe you played football or some other things, and you know what it's like to actually perform under pressure.
01:51:17.000 What is the version of giving more humans a chance to get to that?
01:51:22.000 Well, I think sports are really good for that because performing under people paying attention to you and performing where people are trying to stop you from doing something.
01:51:32.000 And you're trying to do something, and there's all these unknowns, and recognizing that hard work allows you to do whatever you're trying to do better than you previously had.
01:51:42.000 One of the things my martial arts instructor said to me when I was young is that martial arts are a vehicle for developing your human potential, and that through this very difficult thing that you're trying to do, you're learning that oh, if I just think smart and think hard and train wise.
01:52:02.000 And train hard and discipline myself to endure suffering so that I can develop more endurance and more speed and more power and more technique because I accumulate all this information and I really think about what it is and apply it with drills and with training.
01:52:17.000 I can get better at this thing.
01:52:18.000 And every time I get better at this thing, I get rewarded psychically, like mentally.
01:52:21.000 You feel better.
01:52:22.000 Like I know that I'm better now.
01:52:24.000 And then there's the belt system where you start off, you're a white belt.
01:52:27.000 And in Taekwondo, you get a blue belt.
01:52:30.000 And then after you get a blue belt, you get a green belt.
01:52:32.000 And then after you get a green belt, I forget how it goes.
01:52:35.000 And then it's red belt and black belt.
01:52:37.000 And like when you're a black belt, you're like, holy shit.
01:52:39.000 So it's this thing where you've developed to a point where you've gotten to this next stage.
01:52:44.000 So all along the way, you've been rewarded for your hard work.
01:52:48.000 And then you realize, like, oh, I could do this with everything in life.
01:52:51.000 Is there a reward different than attention?
01:52:52.000 It is.
01:52:53.000 It is because it's internal, right?
01:52:57.000 You're realizing that you could apply this to whatever it is, to carpentry, to music.
01:53:05.000 It's just a matter of.
01:53:06.000 Focus and attention.
01:53:08.000 And some people unfortunately never find a vehicle.
01:53:11.000 They never find a thing that they can throw themselves into.
01:53:14.000 They realize, like, and this is not unique.
01:53:18.000 It's not like I'm an unusual person or anybody is.
01:53:22.000 I mean, there's people that have unusual physical gifts and some people have unusual mental gifts.
01:53:27.000 But the reality is, no matter where you start, everyone can get better.
01:53:32.000 And when you do something, whether it's learning to play guitar, as you get better at it, you realize, like, oh, this is what it's all about.
01:53:39.000 Like, it's really all about applying yourself to something and then feeling this immense satisfaction of your hard work paying off.
01:53:47.000 And that motivates you to work hard at other things.
01:53:50.000 And if you don't find that early on, it's very difficult to like find like real satisfaction in life.
01:53:58.000 Yeah.
01:53:58.000 I've always had something outside of my daily life that is the thing that I actually care about.
01:54:07.000 And it actually energizes me for my day to day life.
01:54:09.000 I don't know if that's like a lot of people, but like, what do you do?
01:54:12.000 Well, initially it was poker.
01:54:12.000 What's your thing?
01:54:15.000 And I, and even now I obsess about the game.
01:54:19.000 Because it's infinitely more complex than chess.
01:54:21.000 Like, chess, you can get to a place where you can roughly be good.
01:54:25.000 Poker, it's just constantly, there's just too many variables.
01:54:29.000 There's human emotion, there's human psychology, the number of people.
01:54:33.000 All of this stuff just makes the complexity of the game something that I find magical.
01:54:39.000 And so I sit there and I try to understand, like, why am I doing the things that I'm doing?
01:54:43.000 And so much of it comes back to being a mirror about what's happening in my daily life.
01:54:48.000 It's the fucking craziest thing.
01:54:49.000 Like, I'm super insecure.
01:54:51.000 I'll go into poker and I will just lose for weeks at a time.
01:54:54.000 But it's because I'm insecure in my daily life.
01:54:57.000 And what's happening is that I'm trying to find these quick wins and quick solutions because I'm in a state of insecurity.
01:55:03.000 I'm anxious.
01:55:04.000 I have this anxiety.
01:55:06.000 And so it's become a great mirror for me.
01:55:08.000 So that used to be a thing, it still is a thing.
01:55:11.000 But I've become reasonably skilled at it where the edges are smaller and I put myself in positions where I'm only playing against a certain group of people.
01:55:21.000 And I'm the losing player, frankly, in that game.
01:55:24.000 When I'm playing against, like, the top pros, it just doesn't work out, but it helps me and I can get tuned up for it.
01:55:32.000 But then I started to, you know, I would take different things.
01:55:34.000 I tried to learn how to ski, basically impossible when you're older.
01:55:37.000 I look like a fucking idiot.
01:55:38.000 How old were you when you tried?
01:55:40.000 I started when I was like, you know, I was a good snowboarder.
01:55:43.000 So I was snowboarding my whole life.
01:55:44.000 And then my kids skied.
01:55:46.000 And so I'm like, okay, well, I want to do this as a family.
01:55:49.000 So I was like 42 or something when I tried.
01:55:51.000 I'm 49 now, almost 50.
01:55:54.000 It's brutal.
01:55:54.000 I mean, it's like I look like a fucking idiot.
01:55:56.000 Like, it's like this gangly giraffe, like trying to get down the mountain.
01:56:00.000 And then now I've started golf.
01:56:02.000 And man, I got to tell you, I used to play a little bit, then I stopped.
01:56:08.000 But there's something to me about being outside where just like being in nature, I find like really motivating.
01:56:16.000 It's a vitamin.
01:56:17.000 It's a vitamin.
01:56:18.000 And then just the mind body connection of that game, it just really fucks with you because it's just nothing you can master and overpower.
01:56:26.000 Right.
01:56:27.000 And it teaches you to just like be in it.
01:56:30.000 Yeah.
01:56:30.000 And that's a very hard skill.
01:56:33.000 Like, if you look at the best, like, I, there's like a handful of people that I really look up to and I obsess, like Munger, Buffett.
01:56:40.000 But the Berkshire meeting was this past weekend.
01:56:42.000 And if you look at the clips, there's this incredible thing where they transitioned, right?
01:56:47.000 Munger passed away.
01:56:48.000 Buffett's like now executive chairman.
01:56:50.000 But this guy, Greg Abel, and this guy, Ajit Jain, Ajit Jain does this thing where he's like, I teach the people that come to just say no.
01:56:57.000 Your whole job is to just say no.
01:56:58.000 You're going to get bombarded with all kinds of business pitches.
01:57:01.000 Say no, no, no.
01:57:02.000 And eventually somebody will come and fucking try to whack you in the head with a two by four of money.
01:57:07.000 Then you come to me and we'll do the deal.
01:57:09.000 And it made such an impression because, like, again, when I'm insecure, my reward function is attention.
01:57:17.000 So I'm like a fucking little busybody.
01:57:18.000 I'm running around doing all this little bullshit, you know.
01:57:22.000 And then, man, when I'm in a fucking flow state and like I'm toning it, like I'm striping the ball, you know, I do, like, a few things that really matter, in size.
01:57:31.000 And I'm like, man, this is right.
01:57:35.000 It's all come to me because I'm like within myself.
01:57:40.000 And these other things are a better reflection of when I'm within myself, and these other things are a mirror of when I'm totally out of kilter.
01:57:48.000 That's just me.
01:57:50.000 So, in my life, these things tend to lead.
01:57:54.000 I think you're saying that's just you, but I think that's generally most people.
01:57:59.000 I think you find these things, these vehicles for developing human potential, whether it's martial arts or golf or playing guitar or playing chess or poker.
01:58:09.000 And then you have to have, I think, one.
01:58:11.000 At least for me, one seminal relationship in your life.
01:58:15.000 You have to have one person that has just undying belief in you.
01:58:19.000 And I never really had that until I met my wife.
01:58:21.000 And that was a very, and I didn't, I pushed against it so fucking hard because I was like, it just can't be true.
01:58:28.000 Like, why does this person give a shit?
01:58:30.000 Do you know what I mean?
01:58:30.000 Like, why do they care about me more than I do?
01:58:32.000 Well, there's also the fear because so many people get in those bad relationships.
01:58:36.000 And I'm just like, I think there's a part of you, like me, where you're just like, I'm not a very lovable person.
01:58:42.000 Like, I'm just like, this is, that's not who I am.
01:58:45.000 And this woman is just there.
01:58:48.000 So that's been like the thing.
01:58:50.000 Like for me, it's like, because she's brutal.
01:58:52.000 She'll be like, oh, yeah, that was fucking horrible.
01:58:54.000 You know, like yesterday, we had this... I did this thing at Milken, and it was a dinner at my friend's house.
01:59:01.000 And then, you know, we're both going to different airports.
01:59:03.000 I'm flying here to see you and she's flying home.
01:59:07.000 And she calls me and I'm like, Amara, how did I do?
01:59:11.000 Ah, shit.
01:59:16.000 But no, there's the parts that I did well, and then she critiques the other parts that she didn't like.
01:59:21.000 And then I say, which is like, it's, and it's so, again, I'm insecure.
01:59:24.000 So I'm like, I want the self serving.
01:59:26.000 Well, how would, because there were three of us on this panel, and she's like, and I was like, you know, I was the best, right?
01:59:32.000 She's like, no, Gavin was better.
01:59:35.000 I'm just like... but it's so refreshing because it keeps, again, it's like it keeps me in check.
01:59:41.000 Like, and it gives me a mirror, you know?
01:59:44.000 Like when I was coming to see you yesterday when we were flying down to LA for this thing.
01:59:52.000 There's parts of me where when I'm insecure, I kind of like externalize and I can be like really hyperbolic, unnecessarily hyperbolic, and it's counterproductive.
02:00:01.000 And she said to me, Listen, like just imagine your friends.
02:00:03.000 These are hardworking people.
02:00:04.000 They're trying their best as well.
02:00:06.000 They don't necessarily know.
02:00:07.000 Some things have massively worked out for them, but they would want to do the right thing.
02:00:11.000 There's people you've worked with before that want to do the right thing.
02:00:14.000 And she's like, Just pick with them and don't judge.
02:00:16.000 You can observe.
02:00:19.000 And it's crazy, but it's like, I need those little things.
02:00:21.000 There's like tweaks.
02:00:22.000 It's like having a coach kind of like.
02:00:25.000 And that's very helpful to me.
02:00:26.000 Yeah, it's very important.
02:00:28.000 It's hard to do that yourself.
02:00:29.000 I can't do it.
02:00:30.000 And it's also like I'm retard maxing.
02:00:32.000 Like, my life is like I like that flow.
02:00:34.000 And if I didn't have somebody who loved me and would hold me accountable, I'd just fucking not think about it.
02:00:40.000 Yeah.
02:00:41.000 And the opposite of that is someone who's like an antagonistic relationship.
02:00:45.000 And we know a lot of people that have those kind of very sabotage-y sort of marriages and relationships.
02:00:51.000 And that's crazy.
02:00:52.000 It's brutal.
02:00:53.000 It's brutal.
02:00:54.000 And I don't think they've ever had a really good one.
02:00:56.000 Otherwise, they would never tolerate that.
02:00:59.000 I didn't know what good looked like.
02:01:01.000 So you kind of just, I think a lot of people go with the flow.
02:01:04.000 Like, I mean, I was a nerdy kid from kind of a shitty, fucked up kind of like family structure.
02:01:13.000 And then I got injected into this rich high school.
02:01:17.000 But then I got to go back to an alcoholic father.
02:01:19.000 I'm on fucking welfare.
02:01:20.000 Like, it's like, you know, my self-confidence is negative fucking two units.
02:01:25.000 Didn't have a girlfriend, you know, like all the shit in high school.
02:01:28.000 Like, nothing happened for me.
02:01:29.000 And so my modeling of like how to be in a relationship, what to do, it was fucking zero.
02:01:37.000 It was zero.
02:01:38.000 And so all those mistakes were mostly because I didn't understand what good looked like.
02:01:43.000 Right.
02:01:44.000 And then I stumbled into this relationship after my divorce, and my ex-wife is an incredible woman, it just, like, wasn't, you know, what I needed or what she needed.
02:01:52.000 Yeah, we were just, we were in a few very specific ways, we just weren't on the same page.
02:02:00.000 And then I find this other one, and it's, and I think like, I don't, I was so skeptical.
02:02:06.000 I'm like, I kind of viewed like a relationship as like this adjunct to your life.
02:02:12.000 There's you, you're at the center, you're doing your shit.
02:02:15.000 And one of the appendages to your thing is your.
02:02:19.000 That's what I thought.
02:02:21.000 And then now it's the opposite, where I feel like my wife's at the center.
02:02:24.000 And I'm like, I would always kind of like, almost like laugh at people in my mind.
02:02:30.000 I'm like, it's not possible that somebody feels this way about somebody else.
02:02:35.000 But it's a huge enabler.
02:02:37.000 It's very much a gift.
02:02:39.000 So that can also be a thing that people look for.
02:02:42.000 I think what you're saying is that there's a bunch of different things that have to sort of exist together, and that it's not just completely focus on your work, but that focusing on these other things enhances the work, and then the work enhances all these other things as well, and they all exist together.
02:03:00.000 My best work is when I'm not thinking about the attention or the money.
02:03:04.000 Those are the two most corrupting influences in my life.
02:03:08.000 When I've lost the most amount of money or when I've reputationally hurt myself the most, it's all been because of attention and money.
02:03:18.000 Those are the only two things.
02:03:20.000 The root cause consistently has been that.
02:03:22.000 That makes sense because you're thinking about a result rather than a process.
02:03:26.000 Exactly.
02:03:26.000 Yeah.
02:03:26.000 Exactly.
02:03:27.000 And then thinking about that result, like, ooh, I'm going to get a lot of attention from this.
02:03:32.000 Ooh, I'm going to get a lot of money from this.
02:03:33.000 That actually robs you of the focus that you need to concentrate on the process.
02:03:37.000 Exactly.
02:03:37.000 And the thing about the process is that so much of that.
02:03:43.000 When you're in a flow state, you're proud of it, irrespective of the size of it, because the meetings are the same.
02:03:50.000 Do you know what I mean?
02:03:51.000 Like, you're in the same fucking 35 minute meeting or 45 minute meeting debating a product or debating a thing.
02:03:57.000 But the minute that I start to feel embarrassed about company A versus company B or decision A versus decision B, now my mind is like, okay, hold on a second here.
02:04:07.000 I'm about to run myself off the cliff.
02:04:09.000 Or, you know, I had this dinner last week, and this is what's amazing.
02:04:09.000 Yeah.
02:04:13.000 We're talking about poker.
02:04:16.000 Well, so I'm having dinner with my wife and a friend.
02:04:19.000 And she's like, How are you doing?
02:04:24.000 Just like a very generic, nice question.
02:04:26.000 And I go into this long fucking diatribe of like, Well, you know, the investing thing, this.
02:04:26.000 Right.
02:04:32.000 And then I started this other thing, that.
02:04:33.000 And my wife's looking at me like, What the fuck are you rambling on about?
02:04:35.000 And then it got worse, Joe.
02:04:37.000 It got worse.
02:04:38.000 It got even fucking worse.
02:04:39.000 Then I'm like, You know, but then I had this poker game.
02:04:42.000 I started rambling.
02:04:43.000 It's normally on Thursdays, but then I moved it up to Wednesdays, but then I moved it up to the city because my friend's having it.
02:04:49.000 And then I name dropped who the guy was.
02:04:51.000 And my wife just looks at me like, what the fuck is going on with you?
02:04:56.000 So the dinner ends.
02:04:58.000 And then she's like, what the fuck is going on with you?
02:05:01.000 With you.
02:05:02.000 She's like, that was insane.
02:05:04.000 And I had no idea that I was doing it.
02:05:09.000 And I'm like, okay, we need to put Humpty Dumpty back together again because I'm about to go on Rogan and I can't go off fucking like crazy wild man.
02:05:16.000 But it's an enormous gift.
02:05:18.000 That's been my biggest unlock in these last like eight or nine years.
02:05:22.000 I feel like I'm kind of like adding skills to my toolkit.
02:05:25.000 I feel like a golfer, like, that's like, I can shape shots a little bit now.
02:05:29.000 I know how to use different clubs.
02:05:32.000 And it's all like mindset.
02:05:34.000 And it's like, it's very much what you said, it's like this process-oriented approach, and you just can't control the outcome.
02:05:41.000 And that's like, it's a magical feeling.
02:05:45.000 It's interesting that you're saying this because, like, think about what most people or people that are on social media, like the kind of attention that they're focusing on.
02:05:57.000 Like, this is why virtue signaling is so unsuccessful, right?
02:06:01.000 It's so bad for it because it's... fake.
02:06:03.000 You're really concentrating on the process or you're really concentrating on the result.
02:06:05.000 The result is getting people to love you.
02:06:07.000 Exactly.
02:06:07.000 Getting people to agree with you.
02:06:09.000 And then worrying about the criticism.
02:06:10.000 Oh my God, they hate me.
02:06:12.000 Oh my God, they're mad at my statement.
02:06:13.000 Oh my God, they're this.
02:06:14.000 And then you're like obsessing on it all day.
02:06:16.000 People that aren't even anywhere near you.
02:06:18.000 It's like it's one of the absolute worst things for mental health is this addiction that people have to posting things and then reading the responses to those posts and getting wrapped up in these very weird two dimensional interactions with human beings.
02:06:33.000 You never read your comments.
02:06:34.000 I mean, you're very famous.
02:06:35.000 You're like, it doesn't fucking matter to me.
02:06:37.000 Well, you're going to get to a certain point in time where if you have X amount of people that follow you, you're going to have a percentage that are mad at you.
02:06:47.000 And those are the ones you're going to think about.
02:06:49.000 And if you don't self-audit, maybe that's good.
02:06:51.000 Maybe it's good to say, like, you fucking piece of shit.
02:06:54.000 Like, oh, I'm sorry.
02:06:55.000 You know, like your wife saying to you, like, what the fuck was that?
02:06:58.000 Like, oh, shit.
02:06:59.000 Like, I am very self critical.
02:07:02.000 Very.
02:07:03.000 Like, horribly so.
02:07:04.000 Like, to the point where I torture myself, you know.
02:07:06.000 So I'm like, I don't need that from other people.
02:07:08.000 And also, those people don't love me and they want me to fail.
02:07:11.000 Like, there's a lot of people that their lives are very unsuccessful, and I've been way too fortunate, right?
02:07:17.000 So it's like there's a reason to be upset at me if your life is shit.
02:07:20.000 Because I've gotten three of the best jobs on earth.
02:07:23.000 It doesn't make any sense, right?
02:07:25.000 So there's a reason.
02:07:26.000 And also, why the fuck is this podcast so successful?
02:07:28.000 It doesn't make any sense, right?
02:07:30.000 So it's like I get it.
02:07:31.000 I understand why people, but I'm not going to help them.
02:07:34.000 I'm not going to help them bring me down.
02:07:36.000 I'm not going to indulge in it and ruin my own mind by wallowing in their bullshit.
02:07:41.000 Because the only reason why you would do that in the first place is if you're not together.
02:07:44.000 No one who's healthy and happy and intelligent is going to post mean things about you.
02:07:49.000 So you are reading things from people that are mentally ill, unhappy, and probably not.
02:07:55.000 Maybe they're intelligent in terms of their ability to solve certain issues and problems.
02:08:00.000 Maybe they're good at certain skills, but their overall grasp of humanity and being a good person is not good if you're shitting on people, especially if you like ad hominem attacks and just insults.
02:08:13.000 So it's not a good thing to ingest.
02:08:16.000 It's like if you go down to the supermarket and you see Twinkies.
02:08:18.000 Oh, they're right there.
02:08:19.000 Don't fucking eat them.
02:08:21.000 That's not good for you.
02:08:21.000 Okay.
02:08:23.000 And so it's like, I don't think that at a certain point in time, especially if you become publicly known and famous, you should ever read your comments.
02:08:30.000 I don't think it's good for you.
02:08:31.000 Yeah.
02:08:32.000 But you better be self-auditing or you'll start sniffing your own farts and think they smell great.
02:08:37.000 Like, don't do that either.
02:08:39.000 Yeah.
02:08:39.000 But I know a lot of people that have gone crazy reading their own comments.
02:08:45.000 I've met comedians that, like, they'll think about it all day long.
02:08:49.000 It'll fuck with them.
02:08:50.000 It will torture them.
02:08:51.000 Well, their neuroses are what creates great comedy to begin with.
02:08:54.000 So if you feed that neurosis in the wrong way, you're...
02:08:56.000 The wrong way, right.
02:08:57.000 And then also the self-doubt creeps in, because of all these people telling you you suck, and you're like, oh my God, I suck.
02:09:02.000 And then you go on stage with this like, people think I suck, they hate me.
02:09:05.000 You can't do that.
02:09:06.000 Like, if you have a certain amount of energy in the day, this is what I always tell comedians.
02:09:14.000 I said, look, think of your attention and your focus as a unit.
02:09:18.000 You have 100 units.
02:09:20.000 If you spend 30 of those fucking units on assholes online, you're robbing 30 units from all the things you love.
02:09:28.000 30 units from your family, 30 units from your friends, 30 units from your job, 30 units from golf or poker or whatever it is that you love to do.
02:09:35.000 You're stealing your own time and your own focus for losers.
02:09:41.000 Like, why would you do that?
02:09:42.000 And those losers are good people.
02:09:45.000 Most people are good people.
02:09:46.000 They're in a bad path.
02:09:48.000 I would have been the same person.
02:09:49.000 Yes, they're venting.
02:09:49.000 Or they're venting.
02:09:50.000 Look, if you gave me a fucking Twitter account when I was 16, oh my God, it would have been horrendous.
02:09:55.000 Yeah, I would have been going crazy.
02:09:56.000 Oh my God, I would have been a terrible person.
02:09:58.000 It's normal.
02:09:59.000 Especially if your life sucks and you're not doing well and you're attacking famous people or you're attacking this person that's doing better than you or whatever it is.
02:10:07.000 Like it's.
02:10:08.000 Do you, have you seen the clips of the retard maxing?
02:10:11.000 No.
02:10:12.000 You don't know what this is?
02:10:13.000 You don't know what this is?
02:10:13.000 No.
02:10:14.000 No.
02:10:15.000 What's retard maxing?
02:10:16.000 Oh, this guy is fantastic.
02:10:18.000 He sits on his back porch.
02:10:20.000 Jamie, can you just show him?
02:10:23.000 He sits on his back porch smoking a cigar, basically telling you everything's kind of bullshit.
02:10:30.000 Stop thinking about shit.
02:10:31.000 You know, if you don't like your friends, leave them.
02:10:34.000 If you don't like your girlfriend, leave them.
02:10:35.000 Stop overthinking.
02:10:36.000 Simplify your life.
02:10:38.000 You know, it's so simple, but I think it's incredible.
02:10:43.000 Who is this guy?
02:10:44.000 Elisha Long, I think, is his name.
02:10:45.000 I don't know, Jamie, if you can find it.
02:10:47.000 I think Elisha Long.
02:10:48.000 Retard maxing is funny because I know about looks maxing.
02:10:51.000 We talked about that recently on a podcast, but that's recently entered into my mind, into my zeitgeist.
02:10:57.000 Looks maxing.
02:10:57.000 That's the clavicular.
02:10:59.000 Clavicular, yeah.
02:10:59.000 But I've only found out about that within the last few months.
02:11:02.000 Because I genuinely stay off social media as much as possible.
02:11:02.000 Yeah, yeah.
02:11:06.000 And if I do read things, what I like to do, I like to focus on fascinating things.
02:11:11.000 Like a lot of my time I spend looking at YouTube stuff.
02:11:14.000 Same.
02:11:14.000 Because YouTube stuff, my algorithm is all like new black holes they've discovered, you know, new discoveries in terms of like what is the fabric of reality.
02:11:24.000 Like, that's interesting to me.
02:11:26.000 And if I just concentrate on people being mean or shitty to each other or the latest fucking political drama, it's like.
02:11:34.000 I don't have much time.
02:11:36.000 I'm busy.
02:11:37.000 I like things.
02:11:39.000 Are you on Instagram and TikTok?
02:11:41.000 I'm on Instagram.
02:11:42.000 I do not have a TikTok.
02:11:44.000 This is Looks Maxing.
02:11:46.000 No, this is Retard Maxing.
02:11:47.000 So let me hear what he says.
02:11:49.000 Who's this guy?
02:11:50.000 What's his name?
02:11:50.000 Elisha Law.
02:11:51.000 Shout out to Elisha.
02:11:53.000 Being used as a poisoning of nostalgia, but to simply remind you of what you found important.
02:12:01.000 And as we grow up, we often give that up for security.
02:12:05.000 We give that up so that we are accepted.
02:12:07.000 We give that up to flex and appear like we have now figured things out, that people will accept us.
02:12:15.000 The only way that you will truly be successful is if you are righteous and you live according to your nature and you play, man, and you don't let people take play away from you to be at the circus and be oohed and aahed and worried about all the bullshit.
02:12:32.000 Return to a state of play.
02:12:35.000 Well, that's very good advice.
02:12:38.000 Return to retard max.
02:12:40.000 The best thing that you could do is return to a state of play.
02:12:43.000 There's a lot of that, you know?
02:12:43.000 That's true.
02:12:45.000 There's a lot of that.
02:12:46.000 Absolutely.
02:12:47.000 Oh, I think that that is like a wise man for a young fella.
02:12:51.000 Oh, okay.
02:12:51.000 Yeah.
02:12:52.000 He's a jujitsu guy.
02:12:53.000 Look, he's getting his fucking blue belt there, or he's getting his purple belt.
02:12:53.000 There you go.
02:12:57.000 What is going on there?
02:12:58.000 So, is he getting his blue belt?
02:13:00.000 Yeah, it's his purple belt.
02:13:01.000 Purple.
02:13:02.000 Yeah, so they're taking his blue belt off and putting his purple belt on.
02:13:05.000 Yeah, see, he's learning, he's a martial artist.
02:13:07.000 That's why.
02:13:08.000 You think martial arts people are just more like spiritually connected to the truth?
02:13:12.000 I don't know if it's spiritually connected to the truth.
02:13:14.000 It's forced down your fucking throat because you can't believe you're better than you are if you're getting mauled every day.
02:13:22.000 You know?
02:13:23.000 And there's only one way.
02:13:25.000 This guy's on the path to becoming a jujitsu black belt.
02:13:26.000 He looks like a pretty big guy, too.
02:13:28.000 That'll help.
02:13:29.000 But there's only one way to get a black belt in jujitsu.
02:13:33.000 You got to train jujitsu all the time and get better at jujitsu.
02:13:35.000 You can't pretend you're better.
02:13:37.000 You know, there's a lot of people that write poems and they suck and they think they're so deep.
02:13:41.000 But those poems suck.
02:13:41.000 Yeah.
02:13:42.000 Meaning, like, there's just a very simple objective measurement.
02:13:44.000 That's it.
02:13:45.000 100%.
02:13:46.000 You either win or you lose.
02:13:48.000 You either tap or you get tapped out.
02:13:51.000 You know, you tap somebody or you get tapped out.
02:13:53.000 But can you get a black belt in some gym that's easier than a different gym or something like that?
02:13:58.000 Sort of, kind of.
02:14:00.000 But not really.
02:14:01.000 I mean, everybody's trying hard.
02:14:02.000 I mean, there's definitely better gyms where they're more technical and their program is much more systematic and they're better at breaking down skills, like how to develop skills.
02:14:12.000 You know, there's definitely better gyms, there's better schools, there's better places to learn.
02:14:18.000 But everywhere you learn, you're going to have a bunch of people that are trying hard.
02:14:22.000 Like, and you have a bunch of people that are trying to learn these.
02:14:25.000 And also today, because of the internet, you could go on YouTube and there's thousands of tutorials breaking down new moves.
02:14:34.000 Jiu jitsu is like endlessly complex.
02:14:36.000 One of my kids has ADHD, and one of the things that was recommended to us was jiu jitsu.
02:14:41.000 Yeah, what is ADHD, man?
02:14:42.000 It's not even fucking real because I definitely have it.
02:14:44.000 And I think we all have it.
02:14:45.000 I think it's a superpower.
02:14:46.000 I think we all have it.
02:14:48.000 I think, look, I do not focus well on things that I think are boring.
02:14:52.000 But if you give me something that I love, I can't, I'll play pool for fucking 12 hours in a row.
02:14:57.000 It's crazy, but like the reason I got back into golf is my seven year old gets on the course, and sometimes you can talk to him and he's not making, you know, he's just like in his own world.
02:15:04.000 Exactly.
02:15:05.000 And then you start talking about chess or jujitsu or whatever.
02:15:09.000 And then we get him on the golf course, and this kid is just dialed in.
02:15:13.000 Yeah, superpower.
02:15:14.000 And I'm like, holy shit.
02:15:15.000 And they say that that's a disease.
02:15:17.000 That's crazy.
02:15:18.000 Crazy.
02:15:18.000 Because if you find a thing that that kid loves, he's going to excel at it above and beyond most humans.
02:15:24.000 He does these chess classes.
02:15:25.000 And like, look, he's seven.
02:15:27.000 So I'm like, all right, motherfucker, bring it.
02:15:29.000 I'm going to fucking destroy you.
02:15:31.000 I'm going to fucking maul you.
02:15:33.000 And we're playing last weekend.
02:15:36.000 And he goes, Oh, Dad, you know, you can't castle out of check.
02:15:40.000 I'm like, Shut the fuck up.
02:15:41.000 I know how this game works.
02:15:43.000 And I go on to beat him.
02:15:44.000 And I went to my wife and I'm like, He's six weeks away from beating me.
02:15:52.000 I spent two days.
02:15:53.000 I spent two fucking days on YouTube.
02:15:55.000 And I was like, Okay, I got to brush up on my openings.
02:15:58.000 And I got, I got, oh my God, I don't have time for this shit.
02:16:01.000 But I can't let this seven year old beat me.
02:16:03.000 You know what I mean?
02:16:04.000 You're going to have to.
02:16:06.000 You're going to have to.
02:16:07.000 And I was like, how do I stall this until maybe he's 10 or 11?
02:16:10.000 Then it's like, okay, fine, you finally beat me.
02:16:12.000 Congratulations.
02:16:13.000 You have to think of him as an extension of you and be happy when he does.
02:16:17.000 Yeah, that's just how it is.
02:16:17.000 Oh, my God.
02:16:19.000 Look, if you're a man and you have a son, I have all daughters, but if I had a son, I would be legitimately terrified that he'd be able to tap me.
02:16:28.000 Because if I had a son, one of the first things that I would do is get them.
02:16:31.000 I got my kids involved in martial arts at an early age, but I didn't force them to keep doing it.
02:16:35.000 They did it for a certain amount of time and then they went on to do a bunch of other things that they enjoy better, which is fine.
02:16:40.000 But I think it's good to learn some skills, learn how to defend yourself so you're not completely lost.
02:16:46.000 Just, I think it's good for you.
02:16:47.000 It's good to learn, it's good to develop confidence.
02:16:50.000 But for boys, I think it's critical.
02:16:52.000 You know, especially boys with my kind of DNA, I'm like, I think it's good to get that shit out of your system.
02:16:57.000 But if I had a son, there'd be a certain point in time, I'm like, it's a matter of time before this motherfucker can kill me.
02:17:02.000 You know, it's like, I mean, I'm 58 years old.
02:17:05.000 If I had a 20 year old kid, like, he'd probably kill me.
02:17:07.000 He'd kick your ass.
02:17:08.000 Probably fucking kill me.
02:17:09.000 He'd kick your ass.
02:17:10.000 Yeah.
02:17:10.000 It's like, what am I going to do?
02:17:11.000 There's nothing you could do.
02:17:12.000 You just have to accept it and then hope your relationship with him is strong enough that he still respects you, even though he can kill you.
02:17:17.000 Because it can't be entirely based on that.
02:17:20.000 Look, there's a lot of martial arts instructors that are old.
02:17:24.000 And they're revered and respected, and nobody wants to try to hurt them.
02:17:27.000 Because you realize, if you learn enough, you get to a certain point in time, you realize like, I'm a much better dad to my sons than I am to my daughters.
02:17:27.000 Yeah.
02:17:27.000 Right.
02:17:35.000 And I mean this in the following way: my daughters have the run of the place, whatever they want.
02:17:39.000 I'm in love with them.
02:17:40.000 I don't love them.
02:17:41.000 I'm in love with them.
02:17:42.000 Whatever they need.
02:17:43.000 Right.
02:17:44.000 Just enamored by.
02:17:44.000 They can just.
02:17:45.000 They're just like, they can control me.
02:17:47.000 They just kind of send me in one direction or another.
02:17:49.000 I'm just like, they're.
02:17:50.000 By the way, they know that too.
02:17:51.000 I'm enslaved by them.
02:17:52.000 Yes.
02:17:53.000 You know, and I just want their attention.
02:17:55.000 Any small little shred, I'm like... But your son, you keep him in check.
02:17:59.000 Whereas, like, my sons, I keep them in check, and I'm doing everything that I think I'm supposed to be doing.
02:18:04.000 Now, the good news is my, you know, daughters are just different.
02:18:07.000 They're just, so they don't need the same kind of like tough love ish.
02:18:07.000 They're girls.
02:18:11.000 Right.
02:18:12.000 You know?
02:18:13.000 But then my boys reveal their characteristics in ways that really surprised me.
02:18:16.000 And I'm just like, man, this is so fucking awesome.
02:18:18.000 Parenting has been the best.
02:18:20.000 Like, when I, again, like, slowing down and actually being in it.
02:18:24.000 And I'm like, fuck, this is amazing.
02:18:25.000 It is pretty amazing.
02:18:27.000 Yeah.
02:18:27.000 And watching your kids get really good at things is really fascinating.
02:18:30.000 It's fascinating.
02:18:31.000 I told you this story before, but like, you know, my son, my oldest son, this is my 17-year-old, he's just a great kid.
02:18:40.000 He goes and he's like, okay, I'm applying for college.
02:18:43.000 And I'm like, great, let me take you to the Naval Academy, West Point.
02:18:45.000 Let me show you these service academies.
02:18:47.000 And he sees those and he's like, these are incredible.
02:18:49.000 But then he's like, I think I want to go to like, you know, Georgetown or Vanderbilt or whatever.
02:18:53.000 And I'm like, hey, man, that's like just a bigger version of your high school.
02:18:58.000 And whatever, if that's what you want to do, You do you.
02:19:01.000 And, you know, I'll help you, like, kind of get to the starting line here, but you're on your own.
02:19:10.000 And he had to get a job because I'm like, if you're going to get into these schools, you got to get a job.
02:19:14.000 And so he tries to... Last summer, I just started fucking screaming at him.
02:19:19.000 And I'm like, you fucking louse.
02:19:21.000 You haven't done anything.
02:19:23.000 And this is at like another kid's, at our, at our son's birthday party.
02:19:27.000 I scream at him.
02:19:27.000 He starts crying.
02:19:28.000 I'm like, you need to do more.
02:19:30.000 Then my wife screams at him.
02:19:32.000 He starts crying again.
02:19:33.000 Then my ex-wife screams at him.
02:19:35.000 He starts crying again.
02:19:38.000 And he just goes, I'm out of here.
02:19:40.000 He walks out.
02:19:42.000 Meanwhile, I start panicking and I'm like, I've got to tiger-dad this situation.
02:19:45.000 So I start texting a few friends, trying to figure out, hey, can I, you know, do you guys want to hire this kid?
02:19:50.000 He's like, really, you know, he's a pretty smart kid, did all this stuff in robotics, yada, yada.
02:19:55.000 One of them says, I'd be willing to interview him.
02:19:58.000 I call him and he's like, Dad, I got a job.
02:20:01.000 I said, What do you mean you got a job?
02:20:03.000 Said, I went around downtown, went to all these places, and I was in a McDonald's.
02:20:11.000 The woman was having a little bit of difficulty speaking English, so I just spoke to her in Spanish.
02:20:15.000 I got the application, I sat down at the desk, and the guy having lunch beside me said, Hey, I heard you needed a job, and I really like the way you talked to this woman.
02:20:25.000 I'm the general manager of the car wash down the street.
02:20:26.000 Come and work for me.
02:20:29.000 And I said, Well, what are you going to do?
02:20:30.000 He goes, I'm going to go work there.
02:20:32.000 And I said, Okay, well, I got this other interview for you as well, so you should see maybe you can do both.
02:20:37.000 Anyways, the end of the story is he did these two jobs.
02:20:40.000 He worked at a robotics firm, but then he worked at a car wash.
02:20:43.000 And when I tell you this story, I am so proud of this kid because of the car wash.
02:20:47.000 Because that car wash thing, he would come home and he's like, Man, you have no idea how people live.
02:20:52.000 And I'm like, What do you mean?
02:20:53.000 He's like, The stuff that I find in the trunk when I have to vacuum these cars and clean out the cars.
02:20:58.000 And I'm like, Bro, that is a gift.
02:20:59.000 You have given a fucking gift.
02:21:02.000 That is the thing that if you take with you, you'll be golden the rest of your life.
02:21:06.000 Because all this other shit is all kind of manufactured.
02:21:08.000 I help because I'm anxious, I'm insecure.
02:21:11.000 But that shit you did on your own.
02:21:12.000 And that thing is what people will fucking respect when...
02:21:15.000 Push comes to shove.
02:21:16.000 It's also jobs that suck are really good for you.
02:21:18.000 So good.
02:21:19.000 I used to work at Burger King when I was 14.
02:21:23.000 You were 14 and you had a job?
02:21:23.000 Man, let me tell you.
02:21:26.000 When my dad had to stay behind, like we were, my dad was a diplomat in the embassy of Sri Lanka in Canada.
02:21:34.000 This fucking war in Sri Lanka is crazy.
02:21:36.000 He writes this essay.
02:21:38.000 His life is threatened.
02:21:39.000 So he files for refugee status.
02:21:42.000 He gets it.
02:21:45.000 He gets kicked out of the embassy.
02:21:46.000 So he doesn't have a job.
02:21:47.000 My mom becomes a housekeeper.
02:21:50.000 And we're kind of toiling in this poverty cycle.
02:21:52.000 So, 14, I had to get a job.
02:21:54.000 And I would take the money and I buy the bus passes, I would buy some of the groceries.
02:22:00.000 We just try to make it all work, right?
02:22:02.000 And I got a job at the Burger King.
02:22:05.000 This is another example where I was like, I'm going to go get a job.
02:22:09.000 Hey, can you drive me to the interview?
02:22:12.000 And my dad's like, no.
02:22:16.000 Get on your fucking bicycle and go.
02:22:18.000 And I thought, bro, we need this.
02:22:20.000 You need the money more than I do.
02:22:21.000 Why are you making me bicycle?
02:22:24.000 But I bicycled and I got the job and I worked there.
02:22:27.000 And I used to work the night shift, 14 year old kid, man.
02:22:29.000 Wow.
02:22:30.000 From fucking eight till two in the morning.
02:22:32.000 And I would have to clean, this was like 8 p.m. to two in the morning.
02:22:32.000 Wow.
02:22:34.000 Then you had to go to school in the morning?
02:22:36.000 No, then I, this was always like Friday, Saturday, Sunday.
02:22:39.000 Thursday, Friday, sorry, Thursday, Friday, Saturday.
02:22:39.000 Wow.
02:22:41.000 And then, yeah, some days I would have to go to school.
02:22:42.000 But, and why did I work until two?
02:22:45.000 Because when the restaurant closes, you get whatever the food is left over, right?
02:22:51.000 So like you get a couple chicken sandwiches, you get, like, the version of the McNuggets that Burger King had, a couple Whoppers, and you take them home.
02:23:02.000 But the amount of vomit that I had to clean up in the bathroom, you can't imagine, man. A downtown Burger King near bars, you know, after closing time, the shit you see.
02:23:16.000 And the shit you deal with.
02:23:16.000 Oh, wow.
02:23:18.000 And all I could think of was, I just want to get the fuck out of here.
02:23:22.000 But that was so valuable for me.
02:23:24.000 Yeah.
02:23:25.000 That was so valuable for me.
02:23:29.000 And then I worry that my kids don't get exposed to it.
02:23:31.000 But when my son got it, maybe I'm projecting too much onto it, but I'm like, man, that car wash thing is really going to be the thing that separates you in life.
02:23:40.000 Yeah, doing something that sucks.
02:23:42.000 It's also just being humble and grinding through that shit.
02:23:45.000 You realize, sometimes people don't pick a path, and they just have a job they don't like, and they stay with this thing they don't like forever.
02:23:56.000 And that's not what you want.
02:23:58.000 It's not what you want.
02:23:59.000 But the development, like learning how to do something that sucks and grinding through it.
02:24:05.000 And still doing it well.
02:24:06.000 Yeah.
02:24:07.000 Doing it well.
02:24:08.000 Make a whopper.
02:24:09.000 I know how to fucking make a whopper.
02:24:09.000 Be there on time.
02:24:11.000 Yeah.
02:24:11.000 Do you know what I mean?
02:24:12.000 Yeah.
02:24:13.000 Make the fries, change the oil, all that shit.
02:24:16.000 And then when you apply those lessons to something you actually love and you work hard at something you love.
02:24:23.000 Magical.
02:24:23.000 Oh, it's incredible.
02:24:25.000 That's a real gift.
02:24:26.000 It's a real gift.
02:24:27.000 Yeah.
02:24:27.000 I mean, you know, some people, they don't appreciate the process.
02:24:32.000 And it's hard to, because when you're young and you're going through these difficult jobs and these things that suck, you don't know how it's going to turn out.
02:24:40.000 You know, and a lot of times people aren't really educated in what a process actually is and about how it does develop character, it does develop discipline, and these things are actual skills that you can apply to other things in life.
02:24:53.000 You just think, God, I'm a fucking loser.
02:24:55.000 I have a visual for this.
02:24:57.000 I always ask myself, Am I in the engine room right now?
02:25:01.000 This is my way of saying, like, an engine room is a little hot, it's a little uncomfortable, but it's where all the shit is happening, it's where the shit is being made.
02:25:10.000 And so I'm like, it's a little, you know, discomforting.
02:25:13.000 But I got to be in there.
02:25:15.000 And there are days, there'll be weeks, where that's all I do.
02:25:18.000 I'm just in it.
02:25:20.000 You know, I'm not good at responding to emails sometimes or whatever, because there'll just be weeks where I'm in it.
02:25:26.000 And it's an incredible visual for me because I'm like, yeah, this is like where I'm grounded and I like feel myself.
02:25:32.000 And then when I look at my health, that's when I just feel like really good about myself, like not insecure.
02:25:43.000 And my vitals are different.
02:25:44.000 Like, it's crazy.
02:25:45.000 Like, my fucking HRV.
02:25:47.000 Like, my HRV craters when I'm, like, just, you know, insecure.
02:25:56.000 Of course.
02:25:57.000 Why is that?
02:25:58.000 Like, it's your heart rate variability.
02:26:00.000 You'd think this should have nothing to do with your, like, disposition and your mood.
02:26:05.000 Well, the idea that your mind is separate from the body is crazy.
02:26:10.000 It's not.
02:26:10.000 It's crazy.
02:26:12.000 But is your HRV lower when you're just out of sorts?
02:26:15.000 Yes, probably, right?
02:26:16.000 I'm sure.
02:26:17.000 Yeah, I don't really monitor it that much.
02:26:19.000 Yeah.
02:26:20.000 And I try not to ever get out of sorts, too.
02:26:22.000 And one of the ways that I keep from getting out of sorts is daily discipline.
02:26:27.000 Like, I'm sure I'd get out of sorts if I have a few days in a row where I don't work out, but I work out almost every day.
02:26:35.000 And if I'm not working out, I'm still cold plunging and going to the sauna and stretching.
02:26:39.000 I'm always doing something.
02:26:41.000 And if I don't do something, I feel like I'm fucking up.
02:26:43.000 And then I can.
02:26:45.000 So does it matter what it is?
02:26:46.000 Meaning, as long as it's a routine?
02:26:48.000 Yeah, well, I.
02:26:49.000 I do it all myself.
02:26:50.000 I don't have a trainer, but I write things down.
02:26:52.000 I write down what I want to accomplish.
02:26:54.000 I write down what I'm going to do.
02:26:55.000 And then I just do it.
02:26:57.000 And like a robot, force myself to do it.
02:27:00.000 Then I always feel better after it's over.
02:27:02.000 And it's always the hardest part of my day.
02:27:04.000 And so it makes everything else so much easier because I fucking work out hard.
02:27:08.000 And so everything else is pretty easy, you know, because the strain, like just being in that fucking cold water or just going through Tabatas on an Airdyne bike, this shit's hard.
02:27:19.000 It's really hard.
02:27:20.000 Like I could die right now hard.
02:27:21.000 Yeah.
02:27:22.000 And so everything else is like, how hard is it going to be?
02:27:24.000 Oh, it's uncomfortable.
02:27:26.000 Oh, boo hoo.
02:27:27.000 I think it's important to go through that.
02:27:30.000 I really think it is.
02:27:32.000 I really think it is.
02:27:33.000 And that's the difference between sanity and having a very slippery grip on your own personal sovereignty.
02:27:44.000 I think a lot of it is like you have to choose, it has to be like elective.
02:27:52.000 Voluntary adversity.
02:27:54.000 Like you have to choose to do it.
02:27:56.000 Yeah, that's a really great way of saying it.
02:27:57.000 Voluntary adversity.
02:27:58.000 If it's forced upon you, you can kind of compartmentalize it.
02:28:02.000 And then you get angry, like, wow, this is fucking making me do stupid shit.
02:28:05.000 But if you force yourself to do it, you know.
02:28:08.000 This is why these special forces guys are such fucking animals.
02:28:10.000 Of course.
02:28:11.000 They're choosing.
02:28:12.000 Right.
02:28:13.000 Exactly.
02:28:14.000 And they develop that, you know, this mentality when you're around other people that are also savages.
02:28:19.000 You know, you just realize like there's other people out there in the world that are not.
02:28:24.000 Making excuses.
02:28:25.000 And they are getting after it every day.
02:28:27.000 And they are pushing every day.
02:28:29.000 And the more you can surround yourself with people like that, the better, versus the people that complain about nonsense and find excuses and focus on other people and bitch about things, why is she doing this?
02:28:41.000 Why is this happening for him?
02:28:44.000 It's loser mentality.
02:28:46.000 And if you're around more winners, you know, you absorb that.
02:28:48.000 You imitate your atmosphere.
02:28:50.000 It's very important.
02:28:50.000 And it's very hard for people, especially young people, to find positive influences and to find positive groups.
02:28:59.000 And I think.
02:29:00.000 It's one of the reasons why a lot of young people gravitate towards podcasts because they get to hear interesting conversations with really accomplished people that are fascinating, that are unlike anybody that they're around on a daily basis.
02:29:12.000 And that's also one of the reasons why martial arts is so good for young people, because you're around other people that are doing this really difficult thing, and other sports too, whether it's football or wrestling, whatever it is.
02:29:24.000 I actually found the last few years I go out of my way to not isolate myself.
02:29:28.000 That's one thing.
02:29:29.000 Being around other people engaging in things has been really healthy for me.
02:29:33.000 Oh, for sure.
02:29:34.000 Oh, my God.
02:29:35.000 And I just found like, what the fuck am I doing?
02:29:36.000 It's like everything is in my little house, by myself, and everybody, everything comes to me.
02:29:41.000 It's so odd.
02:29:42.000 It's really odd.
02:29:42.000 It's odd.
02:29:43.000 Very unhealthy.
02:29:44.000 And it starts to fuck you up in the mind.
02:29:45.000 And then your interaction with humans is only on the internet.
02:29:49.000 It's terrible.
02:29:50.000 Or with people that are sycophantically either being paid or need something from you.
02:29:54.000 Yeah.
02:29:55.000 And then I think you're in a really bad place.
02:29:57.000 Whereas, if you're in the grind with other people, they're beating you at things.
02:29:57.000 Absolutely.
02:30:01.000 Yeah.
02:30:01.000 It's great.
02:30:02.000 If you're in a situation where there's a bunch of sycophantically connected people to you and they're just all kissing your ass and, I mean, we all know people that are like the heads of companies and that are just like fucking tyrants.
02:30:12.000 I think the trap about being successful, because it's not everything it's cracked up to be, is exactly that.
02:30:17.000 You become so isolated that you become this, like, very caricatured version of yourself, because you forget what it's like to, just a basic example, wait in line, be kind to other people, be polite, be accommodating, have some empathy.
02:30:32.000 When are you ever put in a situation to do those things?
02:30:32.000 Right.
02:30:35.000 You forget that you're just a person.
02:30:35.000 Right.
02:30:36.000 You're just a fucking person.
02:30:37.000 And if you're trying to achieve this level of success so you can elevate past being a person, you're missing the point.
02:30:47.000 Like, you're never going to.
02:30:48.000 And if you do, it'll come at a price.
02:30:50.000 I thought being successful was supposed to right all the wrongs that I felt like I missed.
02:30:58.000 And it turns out nobody gives a fuck.
02:31:00.000 No.
02:31:01.000 And it does none of that.
02:31:02.000 I think it's all the process.
02:31:06.000 All of life is the process.
02:31:07.000 I agree.
02:31:08.000 I think as soon as you think that there's a goal, like, oh, I'm going to retire and experience my golden years.
02:31:13.000 I think it's all horseshit.
02:31:15.000 And that's one of my main fears about AI.
02:31:19.000 One of my main fears about this idea of universal high income and everyone's going to have ultimate abundance.
02:31:26.000 It's like, where does anybody find purpose and meaning?
02:31:30.000 And where do you take whatever this thing is that the mind is constructed of, these needs that the mind has that have to be satisfied in order to achieve sanity?
02:31:44.000 In order to achieve some sort of place where you can be at peace.
02:31:49.000 Yeah.
02:31:50.000 You're going to have to do something, man.
02:31:52.000 You're going to have to do something.
02:31:53.000 And maybe it could just be jujitsu and golf and find some stuff that you enjoy doing and take some benefit in that.
02:32:01.000 But boy, that's not been the case for hundreds of years.
02:32:07.000 That's not how human beings have existed.
02:32:10.000 But also, part of me says why do we have to work to find those things?
02:32:15.000 Why can't we?
02:32:18.000 Why is it all that?
02:32:20.000 Well, you've got to find the thing that's not work.
02:32:23.000 But what I'm getting at is like, why is our identity all tied up in money and just things and objects and stuff?
02:32:23.000 Right.
02:32:36.000 And this is a fairly new thing in human society, right?
02:32:40.000 Why can't it transform into like your basic needs are all met?
02:32:47.000 Like, nobody ever has to worry about starving again.
02:32:49.000 Nobody ever has to worry about not having a home to sleep in.
02:32:52.000 Nobody ever has to worry about not having health care.
02:32:54.000 Nobody ever has to worry about not having education.
02:32:56.000 So then it becomes.
02:32:58.000 Find a purpose with your life.
02:33:00.000 And as a society, can we adjust?
02:33:03.000 Can we gravitate towards a new way of existing and meaning?
02:33:08.000 It would probably be great.
02:33:10.000 In one way, it'd be great because we wouldn't have to be constantly thinking, why does he have that and I don't have that and this and that.
02:33:17.000 Instead, it would probably be like, what can I do to get better at the thing that I love?
02:33:23.000 Or let me be a part of a project to do something that seems implausible.
02:33:28.000 But I feel like I'm in the engine room every day.
02:33:30.000 This is great.
02:33:31.000 I'm toiling with these guys.
02:33:32.000 Yes.
02:33:33.000 It's probably not going to work.
02:33:34.000 Some crazy, convoluted thing that has a 0.001 chance of success.
02:33:40.000 That can captivate a lot of people.
02:33:42.000 Yes.
02:33:42.000 You know?
02:33:43.000 The process.
02:33:43.000 The process.
02:33:44.000 Yeah.
02:33:45.000 The process.
02:33:45.000 The process is everything.
02:33:46.000 And I used to, like, think about it backwards.
02:33:48.000 There is no attention in the process.
02:33:51.000 Right.
02:33:51.000 There's only attention in the outcome.
02:33:53.000 Right.
02:33:54.000 Right.
02:33:55.000 Absolutely.
02:33:56.000 Which is another clue and a secret that that's actually where you should be.
02:33:59.000 Well, you might get attention, but that's not what you want.
02:33:59.000 Focused.
02:34:01.000 What you want is the process to work out.
02:34:03.000 You want to get better at whatever it is you're doing and get that thing to a better place than it is right now, currently.
02:34:09.000 Right?
02:34:09.000 That's what you're thinking of.
02:34:10.000 You're not thinking of, I am going to get all this attention.
02:34:13.000 I'm going to be on the cover of a magazine.
02:34:16.000 Yeah.
02:34:17.000 You can't be that.
02:34:18.000 That's not good for anybody.
02:34:20.000 But everybody thinks that's what they're going to get.
02:34:23.000 Everybody thinks that's what they want.
02:34:23.000 Oh, I'm going to get this.
02:34:25.000 Yeah.
02:34:26.000 Right.
02:34:27.000 And the problem with that is that it's not what you want.
02:34:30.000 No.
02:34:31.000 And then now we're going to completely upend potentially all of that.
02:34:38.000 Well, maybe it'll coincide with the hive mind technology.
02:34:38.000 Yeah.
02:34:44.000 This hive mind thing, actually, that you say, I find very compelling because this idea of like how do you govern an AI?
02:34:53.000 Each of us individually are not capable, but I think you, me, like 10,000, 100,000 people working together.
02:35:01.000 The question is are we smarter?
02:35:03.000 And I think there's a reasonable chance that that could be true.
02:35:07.000 And then the other version of the hive mind is here are all these like crazy ideas that would just make the world incredible.
02:35:13.000 And a group of a thousand people go off and they kind of jointly work on that together.
02:35:18.000 That I find super fascinating.
02:35:20.000 Like, that could be it.
02:35:21.000 Like, it could be like, you know, a thousand physicists are like, we're going to create this new interstellar form of transportation.
02:35:27.000 And they just go off and they're just like, they don't have to worry about existing because all of that's paid for.
02:35:34.000 Well, it also could solve all of our problems that we have with like haves and have nots.
02:35:40.000 If we're all one, how could we tolerate have nots?
02:35:44.000 How could we tolerate people living on dirt floors in third world countries with no access to clean water?
02:35:48.000 We wouldn't tolerate it.
02:35:49.000 We wouldn't tolerate it.
02:35:50.000 Because they would be us and we would understand that.
02:35:52.000 I mean, it could be like a complete game changer in terms of human civilization.
02:35:58.000 It could really move people into a complete next direction.
02:36:00.000 I mean, it could eliminate crime and violence, which sounds insane.
02:36:05.000 That's so utopian.
02:36:05.000 Like, boy.
02:36:07.000 Like, oh, why don't you suck on some crystals, you fucking hippie?
02:36:10.000 But legitimately, if, look, if everybody has a cell phone, which essentially everybody does, right?
02:36:15.000 Right now, in this time and age, if we get to a point where everybody is connected, everybody is hive mind connected, you're not going to just be able to drive by a homeless encampment.
02:36:28.000 You'll feel it.
02:36:30.000 You'll feel it.
02:36:31.000 It won't be like, hey, you fucking losers, hit the gas.
02:36:34.000 It's going to be like, we need to solve this.
02:36:37.000 We need to get these people counseling, mental health crisis, get them off the drugs, whatever it is that's wrong with them.
02:36:44.000 I mean, that's an incredible idea.
02:36:46.000 You know, like when an airplane kind of goes like this and your stomach goes and you just feel it?
02:36:46.000 Yeah.
02:36:51.000 Could you imagine like you drive by a homeless encampment and that's what you feel?
02:36:54.000 Like you feel like something's wrong.
02:36:56.000 And we'll all feel it collectively.
02:36:59.000 If we're all connected and we all feel things collectively, we will actively work together to solve these problems.
02:37:05.000 And if we really get to a point of abundance, like true abundance, where resources are not an issue and no one's starving, we could really fix all the problems, because none of them are insurmountable.
02:37:20.000 None of them are breathing underwater, right?
02:37:22.000 None of them are flying to the sun.
02:37:24.000 None of them, right?
02:37:25.000 So, all of them are things that could be solved.
02:37:27.000 If we took all the world's resources, socialism doesn't work, right?
02:37:32.000 Why does it not work?
02:37:33.000 Because it rewards lazy people and it punishes ambitious people.
02:37:36.000 It's not, it doesn't work with human nature, but it would work if you have a fucking hive mind.
02:37:41.000 If we all understood what it means to put in effort, and we all understood what each other are feeling and thinking, right?
02:37:48.000 And we all.
02:37:49.000 Compiled resources and fixed all of our social problems.
02:37:53.000 Like, literally, stop all wars, stop all crime, stop all violence, stop all poverty.
02:38:00.000 Done.
02:38:01.000 And then what do we do?
02:38:02.000 We work together to solve whatever the fuck else is wrong with society.
02:38:06.000 Well, it's more like what is left over that we haven't figured out.
02:38:10.000 Think about what the world was like before the internet.
02:38:12.000 It's almost impossible to imagine, but we both grew up without it.
02:38:17.000 Yeah.
02:38:17.000 Yeah.
02:38:18.000 And so we're entering into this new world.
02:38:21.000 Think about what the world was like without the hive mind.
02:38:24.000 But yet, we all grew up without it.
02:38:26.000 Like, that might be the next thing.
02:38:28.000 The thing that I remember the most about that era is I had a positive view of everybody.
02:38:35.000 Really?
02:38:36.000 Meaning, there were bad actors, and the bad actors were pretty bad.
02:38:41.000 But yeah, generally, like, I looked up to most business people.
02:38:44.000 Like, the people that I now feel like have been a little bit unmasked, then to me, were pristine.
02:38:49.000 Oh, that's interesting.
02:38:50.000 Like the Bill Gateses of the world.
02:38:51.000 You know, I was like, man, I really aspire to be Bill Gates when I was like 13 or 14.
02:38:56.000 It just seemed like.
02:38:57.000 Now you're like, why is he buying all the farmland?
02:38:59.000 This fucking weirdo.
02:39:00.000 I mean, it's fucking so funny.
02:39:03.000 He bought this, like, 45,000 acres, or 4,500 acres.
02:39:08.000 I can't get the order of magnitude right.
02:39:10.000 In Phoenix to build his own digital city.
02:39:13.000 Yeah.
02:39:14.000 It's like weird.
02:39:14.000 Okay.
02:39:15.000 So I bought the 1,700 acres beside him.
02:39:20.000 It's hilarious.
02:39:21.000 Fuck you, dickheads.
02:39:22.000 It's a very odd thing.
02:39:24.000 It's a very odd thing when people get exposed and you just go, what the fuck is that guy really all about?
02:39:30.000 But also, like, how isolated is he?
02:39:33.000 He's been isolated for 50 years, right?
02:39:36.000 Like, who are his friends, and how many people does he have?
02:39:39.000 Must be very hard to be him, actually.
02:39:41.000 I mean, especially now that he's divorced, right?
02:39:43.000 So now he's got no one going, But that speech sucks.
02:39:46.000 Yeah, he's got, I mean, he has a long term partner.
02:39:49.000 Um, she seems like a lovely woman.
02:39:51.000 Um, but yeah, it's just got to be super lonely.
02:39:54.000 It's got to be.
02:39:56.000 It's not, to me, it's not worth that level of.
02:40:00.000 I don't even know what it is.
02:40:01.000 It's like material success, at least measured in the outside world.
02:40:04.000 I don't know what it is, but it's not.
02:40:06.000 That's a lot, man.
02:40:07.000 This is like, I don't know how Elon does it.
02:40:09.000 It's a lot.
02:40:10.000 It's super isolating.
02:40:11.000 Yeah.
02:40:14.000 It's just that he's very by himself.
02:40:17.000 And he's going to be even more isolated in a matter of a few months.
02:40:20.000 Yeah.
02:40:21.000 And that's unfortunate because very empathetic, very kind of like sensitive people like that, I think, need other people.
02:40:28.000 Well, he's got people around him, but he's got very few people around him that can kick reality at him.
02:40:35.000 You know, that is a bit of a problem.
02:40:38.000 But he still seems to be having fun.
02:40:40.000 Every time I'm around him, we have a bunch of laughs.
02:40:43.000 Like, he's fun to hang around with.
02:40:44.000 He's got an incredible sense of humor.
02:40:45.000 We, Jamie and I, went down to one of the rocket launches at SpaceX.
02:40:51.000 Yeah, we went down there.
02:40:52.000 Fucking crazy.
02:40:53.000 And we watched from the ground while it took off, which is incredible.
02:40:57.000 Because it's like, how far was it, Jamie?
02:40:58.000 It was like two miles away from us?
02:41:00.000 A mile, mile and a half.
02:41:01.000 It's like it's a mile and a half.
02:41:02.000 You feel it in your chest.
02:41:03.000 Have you been?
02:41:05.000 When a rocket launches?
02:41:06.000 Dude, it's bananas.
02:41:06.000 Have you been there?
02:41:07.000 The fucking thing, like, first of all, it doesn't look that far.
02:41:10.000 It looks like it's like.
02:41:12.000 Maybe a quarter mile.
02:41:14.000 I'm just not good at judging.
02:41:15.000 This is a Starship?
02:41:16.000 Oh, yeah.
02:41:17.000 So you feel it.
02:41:20.000 His kids started crying, like, we want to go inside.
02:41:23.000 It's disturbing the amount of energy that's coming out of these fucking rocket boosters.
02:41:29.000 And then I hung out with him in the command center while the rocket was flying through space and we're watching it on all these monitors and then lands in the water in Australia.
02:41:38.000 And he's cracking jokes the whole time because the thing is like losing pressure because it's.
02:41:43.000 They're stress testing all this stuff, which is really funny when really dumb people go, Oh, he's a fucking dumbass.
02:41:48.000 His rockets keep blowing up.
02:41:50.000 Like, they just don't understand.
02:41:51.000 Like, the only way you find out what the capability of this technology is, is you have to, like, let it blow up.
02:41:58.000 And then you go, Okay, it needs to be thicker.
02:42:00.000 It needs to be this and that.
02:42:01.000 And we need to add these things.
02:42:02.000 And there's sensors everywhere.
02:42:04.000 And so he's cracking jokes the entire time while this thing is, like, losing pressure.
02:42:07.000 And it eventually wound up landing.
02:42:09.000 And it was fine.
02:42:10.000 But it did have a hole in it.
02:42:12.000 But it was just like, he's laughing.
02:42:14.000 Like, he's having a good old time.
02:42:15.000 He's not freaked out.
02:42:16.000 You know, he's uniquely built to handle it.
02:42:16.000 No.
02:42:19.000 When there was a rocket launch at Vandenberg, California, I chartered a Pilatus, because you can get one.
02:42:27.000 What's a Pilatus?
02:42:28.000 Like a little, like, propeller plane.
02:42:30.000 And I went around and around, and I have this video of it kind of like coming up and through.
02:42:30.000 Oh, okay.
02:42:35.000 Because, like... How close were you?
02:42:40.000 100 miles.
02:42:41.000 Oh, wow.
02:42:42.000 But you, but it's like right there.
02:42:43.000 Uh huh.
02:42:44.000 You know, because the distance, right.
02:42:46.000 And it's coming up, and I'm kind of going around.
02:42:48.000 It was like, Craziest thing.
02:42:49.000 It was cool.
02:42:50.000 It was super cool.
02:42:51.000 That shit is super cool.
02:42:53.000 It's very cool.
02:42:54.000 It's very cool.
02:42:55.000 I mean, just Starbase is bananas.
02:42:57.000 Just when you go down there and they have their own town, the whole thing is fucking Cybertrucks everywhere.
02:43:02.000 I'm like, how do you find your car?
02:43:04.000 Is it an incorporated town?
02:43:06.000 It started off as unincorporated, but it's its own thing now.
02:43:09.000 I believe it's its own town.
02:43:10.000 Is there a mayor?
02:43:12.000 That's a good question.
02:43:13.000 I think there is.
02:43:14.000 I think we talked about this.
02:43:16.000 I don't remember though.
02:43:17.000 But the actual factory itself is nuts.
02:43:22.000 Because Jamie and I were both like, this is way bigger than I thought it was going to be.
02:43:26.000 And the rockets are way bigger than you thought.
02:43:27.000 And the garage doors are fucking bananas.
02:43:30.000 I got a city government website: commission, mayor.
02:43:38.000 That's crazy.
02:43:39.000 Bobby Peden?
02:43:40.000 Bobby Peden is the mayor.
02:43:42.000 They have their own little Irish pub.
02:43:45.000 It's really cool.
02:43:46.000 They have really good food.
02:43:50.000 You know, when he opened the first Gigafactory, which was in Nevada, we had a party.
02:43:53.000 And it was like a small opening thing.
02:43:55.000 And so we all drove in there.
02:43:56.000 And I have a video of me in just, like, a pickup truck, driving into the thing.
02:44:01.000 I started the video and I think it was 43 seconds until it ended.
02:44:07.000 And this was like, you know, a decade ago.
02:44:09.000 And I thought to myself, this is implausible.
02:44:10.000 Like, I've never even contemplated things that could be built this big.
02:44:15.000 I didn't think it was allowed.
02:44:16.000 I don't even know how something like this works.
02:44:19.000 And I was like, how do you envision this whole thing works?
02:44:22.000 Like, simple: raw materials in the front, cars out the back.
02:44:27.000 I'm like, that's it.
02:44:31.000 It sounds so simple.
02:44:33.000 Well, he thinks big.
02:44:35.000 He thinks big, and thank God he's around.
02:44:37.000 I mean, if he wasn't around, if he hadn't purchased Twitter, I think our entire civilization would look very different.
02:44:43.000 Very different.
02:44:44.000 I mean, that sounds like a very grandiose thing to say.
02:44:46.000 Sounds hyperbolic, but you're right.
02:44:48.000 I think it's true.
02:44:49.000 Because I think free speech is a core component of our civilization, and I don't really think we had it.
02:44:56.000 I think it was curated, and it was very tightly controlled by the actual federal government, which is spooky.
02:45:01.000 No, no, no.
02:45:02.000 It decided what we should be.
02:45:05.000 Paying attention to.
02:45:06.000 Yes.
02:45:06.000 Just to put it very simply, and that's not right.
02:45:10.000 Right.
02:45:11.000 Because when they're telling you to pay attention to this, and the actual issue is that, then you can't fix what's actually broken.
02:45:21.000 Right.
02:45:21.000 And we start to basically be, like, just useful idiots for these people.
02:45:27.000 Yes.
02:45:28.000 And that's not right.
02:45:29.000 It's not right.
02:45:31.000 Listen, man, this was a lot of fun.
02:45:32.000 It's always great to talk to you.
02:45:33.000 Thank you very much for doing this.
02:45:34.000 It was very cool.
02:45:36.000 Let's do it again sometime.
02:45:37.000 All right, thank you.
02:45:39.000 All right, bye, everybody.