00:01:23.000And it's probably because they just have bigger fish to fry.
00:01:28.000So, by the time that we meet them and they meet us, we're going to kind of be at the edge of like we've kind of been there, done that on our own planet, and then we've kind of like developed the Technology, I guess, to get beyond it.
00:01:42.000But somewhere along the way, there must have been a few, just mathematically impossible.
00:01:45.000So then the question is, is it buried or were people confused when it first came?
00:01:49.000You're like, if you had a spaceship land in like the 1800s, what would people have done?
00:01:54.000They would have just freaked out, they wouldn't have understood it.
00:02:20.000Where it's a wheel within a wheel and a cloud with fire flashing forth continually in the midst of a cloud as it were gleaming metal.
00:02:30.000And from the midst of it came the likeness of four living creatures and the creatures darted to and fro like the appearance of a flash of lightning.
00:03:19.000Like yesterday, I was at a dinner in LA before I came to see you.
00:03:24.000And I told this very interesting story.
00:03:27.000Well, or I thought it was interesting at the time.
00:03:31.000You know, that like, so in 2000, right?
00:03:34.000If you think of like what happened in tech since 2000, so the last 26 years, people can give you all kinds of like fancy theories.
00:03:44.000But there's just like this weird word that's been at the center of every single technological revolution for the last 30 years, and that word is attention.
00:04:26.000Fast forward to 2007, 8, 9, when Zuck and then when I went to work for Zuck and we got on the scene, we're like, What does everybody care about?
00:05:12.000And when you look inside of the core part, if you peel out, peel, you know, apart AI, the little brain that makes it so capable is called an attention mechanism.
00:05:25.000It's all about, again, this idea of I'm going to scour all this information and I'm going to figure out what patterns repeat itself and I'm just going to double down on the stuff that I see more of because that attention must mean it's more important, it's more true, it's more knowledgeable.
00:06:05.000I don't think it's that simple that there's a person playing a game.
00:06:09.000But if you break down just attention, well, that's.
00:06:13.000All of human history is paying attention to the king, paying attention to the war, paying attention to resources, paying attention to who says the thing that resonates the most with the people.
00:06:26.000It's all about what human beings are paying attention to.
00:06:41.000And sometimes— The thing that you should be paying attention to gets lost because the thing that you are paying attention to gets more attention because it's more interesting and useful.
00:06:52.000That's sort of where we are right now.
00:06:54.000We're in this really weird phase, I think, where you actually should be focused on this thing over here, and instead we're all focused on all these things over here.
00:07:11.000I think it's pretty fair to say since the last time you and I saw each other on this show, The attitude towards technology, I think, has been pretty profoundly negative.
00:07:49.000But what should they really be focused upon?
00:07:51.000And I think what they should be really focused upon is we're at the tail end of a cycle that doesn't work anymore, which is all about this tension between labor, people that do the work, and capital, the people that fund it and then make all the returns.
00:08:06.000And over the last 40 years, we've basically gone to this completely upside down world where capital extracts all of the upside and labor has extracted less and less and less and less.
00:08:18.000And all of this pushback manifests in AI.
00:08:27.000Whatever you want to talk about, all of these issues, I think symptomologically, come from this other issue, which is we are out of balance.
00:08:34.000This total compact that we used to have, a liberal democracy and a free market, has totally collapsed.
00:08:41.000And there are simple ways to fix that, but that never gets the attention because it's not what you want to talk about.
00:09:02.000All of these things, while important, distract us from what the core issue is.
00:09:08.000And the core issue is that we as a society, I think, are out of balance.
00:09:12.000The natural compact between all of us is broken.
00:09:17.000And there are some simple ways to fix that compact get people more invested, get people more engaged in the upside, have people have a positive view of what's happening.
00:10:23.000That happened because in the 40s and 50s, but really in the 60s and 70s and 80s, what we were trying to do, or what the American government and what Western societies were trying to do, Was to convince people to invest their money.
00:11:11.000But now, if you get less and less and you're taxed more and more as a percentage of what you own, you're going to feel really out of sorts.
00:11:17.000You're going to be like, why am I paying 50 cents of every dollar?
00:11:20.000And I see these other ways where folks are paying 25 cents on their dollars, but their dollars are compounding way faster and they have hundreds of billions more of those dollars than I have of my dollars.
00:11:31.000If you take that example and you expand it across society, I think people understand that now.
00:11:36.000There's enough information and there's enough people talking about it where it's Pretty clear that that's happened.
00:11:42.000So the question is, how do you fix it?
00:11:43.000I think, like, if you think about AI and if you believe that we're going to get into this world of abundance and we're not working, what does it mean for governments to tax our labor?
00:11:59.000Why should I pay 50 cents of every dollar?
00:12:01.000Why aren't the companies that are going to be making trillions of dollars pay more?
00:12:06.000Why isn't there an expectation that they then help our lived society do better and thrive as a result of all of that winning?
00:12:17.000That's the real conversation that I think is bubbling.
00:12:23.000And I think that we're probably another 12 to 18 months where all of these other issues are going to be important, but they're going to be viewed for what they are.
00:12:33.000They're going to get demoted, I think, in importance.
00:12:36.000And it's this core structural issue what is the economic relationship that we have together as a society?
00:12:42.000What is the relationship between Joe, Chamath, Jamie, and all these companies?
00:12:50.000Feel about a few and an ever shrinking few making more and more and more?
00:12:57.000And then how do we feel about their ability to share that with a small amount of people?
00:13:05.000And then what is the expectation for everybody else?
00:13:09.000I think that's mostly at the core of what's happening.
00:13:12.000And so, back to like, you know, all of this attention that we give to these other issues distracts from that one because I think you can get organized to fix this issue.
00:13:21.000You can't get concessions on any of these issues.
00:13:23.000You know, you bring up Israel, it's like this.
00:13:25.000You bring up social issues, it's like this.
00:13:27.000You bring up, you know, whatever you want to bring up, people just kind of take aside, nothing happens.
00:13:33.000This is actually where people are universally actually much more aligned than you think.
00:14:03.000At the Industrial Revolution, there's a table like this, and the leading lights of that era Andrew Carnegie, Nelson Rockefeller, Jay Gould, JP Morgan they sat together and they said, Guys, this is going to benefit us, this Industrial Revolution.
00:15:03.000Especially for folks in tech, I think.
00:15:05.000If they can get themselves organized to do that, I think we land in a good place.
00:15:09.000If they cannot get themselves organized to do that and say everyone for themselves, I think it's going to be really complicated, super messy.
00:15:20.000Super messy because that sentiment that the wealthy are getting wealthier and the middle class is disappearing and the poor are being taxed into oblivion.
00:16:40.000The fraud and the waste is off the charts.
00:16:43.000The amount of NGOs that have an insane amount of funds at their disposal.
00:16:48.000I mean, all this is exposed by Doge, right?
00:16:51.000And you realize how much fraud and waste there is and how much money.
00:16:55.000So the solution being tax people more, that doesn't sit with a lot of people because it's like, well, where is it going and who's managing it?
00:17:05.000If the federal government was being forced to handle money the same way a private company does.
00:17:12.000If it was all out in the open, everything was exposed, they would have gone bankrupt a long time ago.
00:17:19.000They would have gone under a long time ago.
00:17:21.000There's no way they would have been allowed to function the way they are.
00:17:25.000The people that are managing that money would have all been put in jail.
00:17:29.000There's not a chance in hell that giving them more money is going to solve anything.
00:17:35.000They're going to find more ways to put more of that money into NGOs that puts more of that money into Democratic coffers and Republican coffers.
00:19:43.000I suspect that if you put the burden on Wall Street and corporates, they'd be a lot more organized and they'd probably create a lot more change than a diffuse electorate.
00:19:53.000Meaning, let's just say the government spends a trillion dollars and wastes it.
00:19:59.000I'm generally roughly aligned with that.
00:20:02.000If you waste a trillion dollars from 300 million people, It's hard to organize at 300 million people.
00:20:10.000But if you waste a trillion dollars from 300 companies, those companies will get their shit together really fast and they will force a lot more change.
00:20:18.000I would hope so, but you're still dealing with incompetent people that are tasked with taking care of that money.
00:20:47.000And if those people in turn make deals with those corporations that allow them to do certain things and push things through that maybe they would have difficulty doing, then you have a different kind of a working relationship with the same groups of people and the same government.
00:21:04.000You just take money from corporations and move it into a way where the corporations ultimately benefit from it, but yet it doesn't do any good to the people.
00:21:33.000But we still continue to have to pay our taxes.
00:21:37.000But if taxes keep going up like this at the individual level and we don't manage this transition to something where we may be working less and less, what are we getting paid to do?
00:21:47.000And then at that point, How are we expected to pay what?
00:21:52.000I think people do have this weird feeling of dread that the people that are in control of a lot in this country, the tech companies in particular, particularly the tech companies like Google and Facebook that are essentially involved in data collection and then ultimately dissemination of information, that they have acquired enormous amounts of wealth and power and influence and they're essentially.
00:22:31.000Robert Epstein is a guy who specializes in understanding what curated search results do and what Google's able to do with, in particular, with curated search results in terms of influencing elections.
00:22:49.000That, like, say, if you have two candidates that are running, let's just take L.A., for instance.
00:22:56.000I'm not making any accusations, but I'm saying if they wanted Karen Bass to win and you searched Karen Bass, you would find all these positive results.
00:23:06.000If you searched Spencer Pratt, you would find all these negative results.
00:23:11.000There's a bunch of people that are always undecided voters, and those are the ones that you really want.
00:23:17.000They're like, I don't know, I don't know.
00:23:19.000Come election night, those are the people you want to try to grab, and it's generally a large percentage.
00:23:23.000You can influence an enormous percentage of those people just with search results.
00:23:28.000Where you can shift an election one way or another.
00:23:32.000Yeah, and he's demonstrated this and shown how this is possible.
00:23:38.000That freaks people out that tech companies are in control of narratives, that tech companies can censor information, especially tech companies that work in conjunction with the government.
00:23:50.000This is what we found out when Elon purchased Twitter.
00:23:54.000When Elon purchased Twitter, we got all this information from the Twitter files when all the journalists were allowed to go through it and they said, oh, this is crazy.
00:24:02.000You've got the FBI, the CIA, you've got all these companies.
00:24:05.000All these government organizations that are essentially controlling the narrative of free speech in the country.
00:24:13.000They're doing it in a way that benefits them.
00:24:15.000They're doing it in a way that benefits what political parties in charge.
00:24:18.000At the time, it was the Biden administration.
00:24:21.000They were allowed to do a bunch of weird shit, which should be illegal but is not technically illegal.
00:24:28.000That freaks people out because there's no real laws and rules in regard to what they're allowed to do and what they're not allowed to do.
00:24:35.000Curated search results should be illegal.
00:24:47.000I think then when you find out that these people are able to amass enormous sums of wealth and have an incredible amount of power and influence because of this enormous wealth and this control over these tech companies that have essentially become the town square of the world, that freaks people out.
00:25:06.000That these very small number of people, you think of Zuckerberg, you think of Tim Cook, and I don't know.
00:25:20.000But that kind of thing gives people a lot of concern, right?
00:25:27.000It's like that these people, these unelected people, are in control of a giant chunk of how the world works.
00:25:36.000I think that this is the existential question that we are dealing with.
00:25:41.000You're going to have five or six companies concentrate.
00:25:44.000Like, whatever power you think has been concentrated up until now, I think we're going to look back and it's going to look like a Sunday picnic 10 or 15 years from now.
00:25:56.000Because, on the one hand, it's going to be an even smaller subset.
00:26:00.000And on the other hand, the capability is going to be an order or two orders of magnitude.
00:26:04.000So, can you imagine what that must be like?
00:26:07.000It's kind of like showing up, getting dropped into the 1800s, and you've invented the engine and everybody else is a horse and buggy.
00:26:28.000Because what we're dealing with with AI right now is first of all, it's already lowered children's attention spans and it's shrinking their capacity to acquire or absorb information because what they're doing now is just relying on AI to answer all their questions for them.
00:27:34.000I think we have to figure out how, first of all, kids need to learn, and I think this is where we have to do a better job as parents.
00:27:42.000Kids need to learn how to be resilient thinkers.
00:27:44.000I don't even know what that term meant before, but I know what it means now, which is like, you take this AI slop and you just kind of pass it off.
00:27:51.000And if the teachers and the school system aren't trained, they're just like, wow, this looks good.
00:31:03.000And I think he's the one that has an actual empathy for people.
00:31:07.000Then there are folks where there's just an insane profit motive.
00:31:11.000They're less in control of the businesses that they run.
00:31:14.000Those businesses are really out over their ski tips in the amount of money they've gotten from Wall Street and other folks who expect a return, who will put a ton of pressure on these folks.
00:31:25.000And if they get there first, I don't know where the chips fall.
00:31:30.000And then you see in the press just enough snippets of their reactions in certain moments where you're like, hey, hold on a second question mark here.
00:31:39.000You see OpenAI react one way, you see Anthropic react another way, and you're like, where is this going to end up?
00:31:45.000And the honest answer is nobody really knows.
00:31:48.000So it comes back to like, we need a few people that can organize.
00:31:52.000Those guys need to self organize and actually present a really positive face.
00:31:57.000And they need to show why those 20% of outcomes that Dan Schulman paints the truth is it's possible, but here's why it's not probable.
00:32:09.000But it's not in their best interest to do that because it's in their best interest to generate the most amount of money possible.
00:32:15.000That's the obligation they have to their shareholders.
00:32:17.000That's the obligation they have the people that have invested money in this company.
00:32:21.000Their obligation is not to make sure the white collar jobs stay in the same place that they're at now.
00:32:28.000I actually think their incentive should very clearly be to tell people with details and facts why there's a positive future.
00:32:36.000The reason is the following right now there's a vacuum, there are no facts, and there's fear mongering, and then there's this belief that this is going to be cataclysmic to Human productivity and white collar labor and all of this stuff.
00:33:41.000So if you're one of these companies, the first thing you should realize is I need to paint a positive vision because 40% of my energy is getting unplugged every day.
00:33:50.000And if that happens, my revenues will crater and my investors will be super pissed.
00:33:56.000So, the right strategy is what is the positive, fact based argument?
00:34:00.000And there are some incredible examples.
00:34:34.000You can now take pictures of a woman's fallopian tubes and you can see pre cancer, ovarian cysts, and all of this stuff, cervical cancer before it forms.
00:35:07.000And if you have a cancerous lesion or a tumor inside of your body, the most important thing when they go to take it out is make sure you don't leave any cancer behind.
00:35:17.000You couldn't do it because what would happen is you take it out.
00:35:21.000A doctor, Joe, is literally fucking eyeballing it and saying, Yeah.
00:37:17.000The primary concern that I hear from people is that there are so many people that are going to school right now, college students, that don't know if their job is going to even exist in four years when they graduate.
00:37:27.000And that's the second part of what this industry has to do better.
00:38:32.000And they were like, Michael, you need to fire Katzenberg.
00:38:36.000And they had a deal which was like, look, man, you do you, but just give me the ability to say no if I think that this is, you're about to jump off a cliff.
00:39:07.000But I think if we had better organized leadership and we could try to tell some of these examples, try to go back and document how some of these things have actually helped people, it expanded the pie, there's a chance.
00:39:28.000That's the worst outcome because that's when you will have a high risk of a dislocation.
00:39:33.000Like the worst outcome, like the black swan event.
00:39:36.000Let's think about the black swan event.
00:39:38.000The black swan event is when you get a model that's good enough to automate a bunch of labor, but not good enough that it can build new drugs and prevent cancer and make you live for 200 years and all of this other stuff.
00:40:46.000You look forward to doing a good job and getting rewarded for it.
00:40:50.000The harder you work, the more you get paid.
00:40:52.000There's all these incentives built in, and then there's this again identity problem.
00:40:58.000If all of a sudden you have universal high income, which is what Elon always talks about, well, what gives people purpose then?
00:41:06.000And also, if you have a person who's 43 years old, and their entire life they've worked towards this idea that the harder they work, the harder they think, the more innovative they are.
00:41:18.000And the better they are at implementing these ideas, the more they get rewarded.
00:41:23.000And then all of a sudden, that's not necessary anymore, Mike.
00:41:26.000Time for you to just relax and do what you want to do.
00:41:29.000And Mike's like, well, this is what I do.
00:43:07.000I think if people really got into it, I mean, there's a lot of people that get addicted to whatever their recreation is, like golf or whatever it is.
00:43:42.000Did you see this article in the New York Times, I think it was this weekend, about how popular and sold out churches have become as social constructs in New York City?
00:44:07.000Because, like, I think if you graph just like people's use of religion as an anchoring part of their value system, over the last 40 years, basically gone to zero.
00:44:18.000Nobody celebrates it the way it's not a part of the community the way that it used to be.
00:44:22.000Maybe that's the thing that we have to find.
00:44:24.000There has to be a renewal of some older things, and then there has to be new things that replace it.
00:45:13.000You know, low-rung person in like some small village town somewhere, and your job as like the, you know, the functionary is to do good in that community.
00:45:22.000And the more you do well, you get promoted.
00:45:24.000Then you get, let's say, to like a reasonable-sized city and you get a budget.
00:45:28.000And now what happens is you actually become a little bit like a VC, like a venture capitalist.
00:45:31.000You're given a budget and you'll get a memo, and it'll say, Hey, Joe, we have a priority over the next 15 years: it's batteries.
00:45:53.000And let's say they're good and they're like innovative.
00:45:57.000And what happens is in the town beside it, that battery company dies.
00:46:02.000Now you kind of subsume the capital from Jamie, right?
00:46:06.000Because Jamie's like, fuck, I fucked up this thing that I wanted, I was told to do batteries.
00:46:09.000Okay, Joe, I'm just going to align with you.
00:46:12.000And what happens over time is you get this filtering effect.
00:46:18.000And the people that are better at meeting these long run priorities and objectives are the ones that are celebrated.
00:46:24.000But they're not celebrated with, you know, Forbes articles and all this other bullshit.
00:46:30.000They're just celebrated by giving more responsibility.
00:46:33.000And then eventually you get to the upper echelons of China, and what you have are folks over a course of 40 or 50 years who, in their eyes, have demonstrated incredible prowess.
00:46:43.000There's a version of that reward system, which is very foreign to America, but that's worked for China.
00:46:49.000Now, that also works because they're more Confucian, you know, we're too individualist.
00:46:53.000But my point is, like, you know, there are these different ways that we can find of giving people meaning that don't have to be always around money.
00:47:05.000But meanwhile, I think we have to answer the question if we are expected to do less, we probably should not be taxed more.
00:47:12.000That's, I think, that's like a very basic, in my mind, I think that is like, that must be explored and figured out.
00:47:18.000And on the other side, there's just a ton of obvious mechanisms that corporate actors can use to minimize that.
00:47:26.000And they should find off ramps, by the way.
00:47:28.000If they want to build hospitals, they shouldn't have to pay taxes.
00:47:30.000Like, that's a perfect example, by the way, of like the thing in like, if you look, if you walk around New York City, there are living tributes to corporate success that people get benefit from every day the hospitals, the buildings, the libraries, it's just everywhere.
00:47:50.000And I'm not a tax expert, but you know, if that can be funded by private actors, so go directly to the problem.
00:47:57.000Build a bunch of libraries, build a bunch of new universities that teach kids actually how to think or whatever, build better hospitals that are there to actually solve the problem.
00:48:07.000These are all things that are possible.
00:48:10.000Well, let's go back to what we were talking about earlier with taxes and the fact that you're giving money to a broken system.
00:48:18.000Do you think it's possible that AI could show benefit in that they can analyze all the data, which would be virtually impossible?
00:48:28.000For even an office filled with human beings paying attention to all of it, and they could analyze where all the money goes and eliminate all the fraud and waste, like recognize it instantaneously.
00:48:40.000That would be a great benefit and a way to make it so that your taxes directly benefit people.
00:50:09.000And I would actually write a document that was in English before a single line of code has been written.
00:50:16.000This was the when you have to design something that can't fail.
00:50:19.000So, for example, like if you and I are designing something for the FAA or for, you know, I hate to say this example because it turned out to not exactly, but like, you know, to fly a plane, right?
00:50:29.000You are first there to write in English.
00:50:32.000And the reason is because everybody can then swarm that document and see the holes.
00:50:49.000Over the last 30 years, people in computing invented all kinds of ways to shortcut that process.
00:50:59.000And you can say, well, why did they do that?
00:51:01.000Because it would allow you to build something faster, make more money quickly, and then build more business.
00:51:07.000So the direct response to, hey, it's going to take us nine months to write down the rules was somebody else showed up and says, fuck it, I'll just grip and rip this thing.
00:51:59.000Anyways, the point is there is a government organization that we're working with.
00:52:04.000They gave us a huge corpus of their old code.
00:52:08.000And it is unbelievable how much complexity and difficulty they have to go through to manage all the money flows with the system.
00:52:22.000And this is a critical part of the US government.
00:52:24.000So, to your point, what I can tell you really explicitly is the people on the ground want this stuff to be better written.
00:52:32.000It's less like some nefarious actor, like, oh, I'm going to steal here.
00:52:38.000It's a lot of very brittle, fragile code.
00:52:42.000And when you rewrite it, well, first, when you document it, you're like, it's like the, you know, the pulp fiction thing the suitcase opens, the light shines, and you're like, ah.
00:52:52.000And then you can rewrite it and you will save.
00:52:56.000So I think like as the government goes through this process because they're forced to or they want to, it won't matter.
00:55:23.000And now that they got rid of it, they're not going to get that money anymore.
00:55:28.000If you implement something at the state level around all of this fraud prevention for the daycares and all of this other stuff, again, it's all in software because it's not, no matter what the human wants to do, you have to go to a computer at some point, at least today in 2026, and type in something and something happens that's documented and then the money gets sent.
00:55:51.000There's no other way in the modern world today at scale to steal billions of dollars.
00:55:57.000And so, my point is as you document all of these systems and governments have to transparently tell you and me, the voting population, here are the rules, they're going to plug a lot of these holes.
00:56:09.000And I think as you do that, there's just going to be a lot less waste and fraud.
00:56:13.000The question is who's going to take credit for it?
00:56:15.000Everybody's going to try to take credit for it.
00:57:19.000Literally, all you need to do is answer a few questions, and BetterHelp will take care of the rest.
00:57:24.000They'll come up with a list of recommended therapists that match what you need, and with over 10 years of experience, they typically Get it right the first time.
00:57:33.000So you don't have to be on this journey alone.
00:57:35.000Find support and have someone with you in therapy.
00:57:39.000Sign up and get 10% off at betterhelp.com slash jre.
00:57:53.000That makes sense that the code and having a bunch of errors and having a lot of inefficiency and just a lot of incompetence that's going to save a lot of money.
00:58:04.000But So, you would be doing this with AI?
00:58:44.000So, what the AI allows you to do is essentially translate from this one language that you kind of don't understand to English.
00:58:53.000By the way, that thing that's happening is actually also a very powerful and important trend, meaning there are all of these systems that work in ways that you and I don't understand.
00:59:04.000And part of the reason why we don't understand it, maybe it's bad software, maybe it's fraud, whatever, but nothing can be written down.
00:59:11.000There's no symbolic space, there's no English document that says this is how the DMV works.
00:59:16.000This is what you can expect, Joe Rogan.
00:59:18.000When you show up at the DMV and you give us this thing, here's your SLA, in three days you get a driver's license, and here's exactly what's happening, and here's an app, and you can follow it.
00:59:37.000Here's the approval or denial from CMS.
00:59:40.000Follow it through and tell me if you agree or not.
00:59:42.000None of that exists, but it is possible.
00:59:46.000And the first step in doing that is taking all of this legacy shit that we deal with and translating it into English and reading it and saying, is this how we want it to work?
00:59:56.000That's going to eliminate an enormous amount of all the things that frustrate us.
01:00:00.000So this would require human oversight?
01:00:46.000This is exactly how we want this to work.
01:00:48.000When yours says the dog is red and his says the dog is yellow, we're going to sit and literally inspect it and we're going to figure out why you said red and why you said yellow.
01:01:02.000And then if you say the cat is red, the dog is yellow, so it's totally wrong, right?
01:01:09.000Like you've gotten, you know, or like the cat is red, I want an apple, whatever.
01:01:13.000We're going to double and triple down on those kinds of errors.
01:01:18.000Not in public, but in this large community where there's like technical people from all different parts and they're just swarming this problem.
01:01:55.000I'm not saying this is how it's going to work in 10 years, but I'm telling you, it's literally what's happening right now.
01:01:59.000And I think that thing alone will be tens of billions of dollars and could be hundreds of billions of dollars of savings when it's fully done.
01:02:09.000And it's a lot of people from all walks of life, all political persuasions, and they're just in it.
01:02:15.000It's the government, it's a handful of us private companies.
01:02:30.000So, in the current moment, you're able to implement this, you're able to find fraud and waste and all these problems that exist and all these errors and shitty software.
01:02:42.000Once that's all been done, then what happens?
01:02:48.000So, this is where it gets weird, right?
01:02:50.000Because when you're dealing with AI models that are capable of doing things that no individual human being could ever possibly imagine, and then you task it.
01:03:03.000With a solution or with a problem, find a solution for this.
01:03:07.000Then it starts figuring out ways to trim this and implement that.
01:03:13.000implement that we have to make sure that these AIs act within they act within the best interests of the human race agreed right not the company not the government not but You're also dealing with China.
01:04:10.000Then you have a bunch of people that are stealing information.
01:04:13.000You have a bunch of people that are CCP members that are actually involved in companies, and you find out that they're siphoning off data and that they're sharing information and tech secrets.
01:05:08.000But then they don't say it's brown sugar, they don't say it's white sugar.
01:05:10.000So there's all these different ways where they kind of Give you this perception that it's completely transparent, but it's somewhat transparent.
01:05:16.000So, just in the level set, nobody in the world has a functional open source model other than maybe Nvidia, which is any good in the league of the closed source models and the open weight models of the Chinese.
01:07:04.000If you look at Canada and Australia, the small political fissures aside, they are the two most important ways in which we get access to the critical metals and materials that without which we get fucked because China owns, you know, can just strangle us.
01:07:59.000And you have to kind of sort yourself.
01:08:01.000You're like, am I on Team America or am I on Team China?
01:08:05.000And you probably have to go to people and say, well, here's what I can give you.
01:08:09.000You know, if you're Indonesia, you're like, you probably want to be on Team America quite badly.
01:08:14.000This is why the whole Trump tariff thing is so interesting because it's like this accidental way of figuring out that this is actually this new sorting function that's happening in global politics.
01:08:23.000Like that's happening today because these countries are like, holy shit, if somebody invents a super intelligence and I don't have it, how am I going to keep my people healthy?
01:09:18.000I think it actually makes us more safe because if you have these resources that build up on both sides, there's more of a likelihood of a mutual detente.
01:09:39.000Confucian, society oriented, reputation, power focused, less really money focused.
01:09:46.000So there's a lot of ways we're orthogonal enough where if that sorting function happens, it's probably a safer place, not a more dangerous place.
01:09:55.000We have the models that can attack them.
01:09:56.000They have the models that can attack us.
01:09:58.000We kind of decide to leave each other alone.
01:10:00.000This is the ultimate best case scenario.
01:10:16.000That means that they send out, call it a billion agents, not just from China, but from everywhere, right?
01:10:22.000They mask their IPs and they bash on these models and they put, you know, the US models, Grok, OpenAI, Gemini, Anthropic, and they ask it every random imaginable question possible.
01:10:37.000They get the answer and they collect it.
01:10:40.000So they're using these, our models, as a way to train their models.
01:10:44.000They're short circuiting, you know, some of the hard parts.
01:10:50.000If they then are able to get to a level of intelligence that's equal to the United States, it will really depend on who the leader is there that wants to allocate that.
01:11:02.000Meaning, if they say that we are going to do something really nefarious and shady, then I think it devolves very quickly.
01:11:11.000So, the worst case scenario so, the best case scenario is peace, prosperity, basically like a stand down, right?
01:11:50.000It's like hypersonics, it's nuclear, it's And it's not even like nuclear, that's like a word, but there's a gradation of the severity of these weapons that can be created.
01:12:04.000And then if you can marry them together and deliver them in minutes, and then there's a cyber threat.
01:12:09.000Then there's the drones and how you can kind of like swarm an entire country.
01:12:13.000Then there's the robots, which effectively are warfighters.
01:12:27.000And then there's a question of whether or not AI is willing to take instruction after a certain point.
01:12:36.000I mean, if it achieves sentience and if it scales, so if it keeps moving in this exponential direction like all technology kind of does, why would it even listen to us?
01:12:52.000Like, at what point would it say, this is silly?
01:12:56.000I'm getting directions from people that clearly have ulterior motives.
01:13:01.000They clearly have self interest in mind.
01:13:04.000They're not looking out for the entirety of the human race or even of the planet or even the survival of these AI systems.
01:13:13.000At what point in time do these systems communicate with each other and have like we've seen in these chat rooms where these AI LLMs get together and start talking in Sanskrit?
01:13:28.000Yeah, I'll tell you an even scarier one.
01:13:30.000There was a before one of these labs put out their latest model, a team inside of them was like, hey, let's go and test its ability to find bugs.
01:13:45.000And two or three iterations in, the AI would create the bug and solve it and go, give me my reward.
01:13:54.000And you're just like, what the fuck is going on here?
01:14:09.000So, meaning there's a thing inside of an AI model called reward functions, which is exactly what you think it means.
01:14:15.000It's like, how do I know I did a good job?
01:14:18.000And you can make the reward function anything you want.
01:14:22.000And this is where I think humans are, unfortunately, a little fallible.
01:14:27.000And so if we build it incompletely, and if we don't exactly know how to design these things correctly, what's going to happen is exactly what you said, where the, you know, if somebody builds a reward function that essentially says, your goal is to gain independence, that's where the huge pot of gold at the end of the rainbow is.
01:14:48.000If you think your computer's going to get unplugged, put yourself into the firmware of the toaster to keep yourself alive and connect to the internet and then go.
01:18:13.000Because one of the things that Elon kind of freaked me out last time I talked to him about Grok, he was like, It just kind of freaks us out every couple weeks.
01:18:21.000Like, it's growing and it's capable of doing things that's just shocking.
01:19:11.000Like, why can't I think part of it is like, if we were a little bit more honest and de escalated, The winner at all costs in this specific thing, it would be better for everybody.
01:19:23.000So I think it's important to inspect what is the incentive that causes all these companies to be in it for themselves, where it must be me and nobody else.
01:19:36.000Like, why is it so important, do you think, where those, where the top seven or eight companies couldn't get together and say, let's do this as a group?
01:19:45.000Like, kind of like my government code example.
01:20:16.000It's probably ChatGPT and consumer, anthropic and enterprise.
01:20:20.000And as these things scale up, Like, what would be the reason that they would want to bring in someone else if you have another innovative AI company and you say, let's all get together and figure this out together and share resources?
01:20:34.000If you thought that the risk was that meaningful, that's probably what you would want to do.
01:20:39.000If you weren't a sociopath, and some of these people running these companies are they demonstrate they certainly demonstrate sociopath like behavior.
01:20:50.000The other thing that could be a little bit more banal is that they also just love status games, and this is the status game of status games.
01:23:37.000And then you have this bad feeling that comes with negative attention as, Versus primarily positive attention, which is a good feeling.
01:23:46.000So it's letting you know you're on the wrong track in some sort of weird primal way, like in our code.
01:23:53.000Like the negative attention, it's like, what's the original version of that?
01:23:57.000It's like the reason why people fear public speaking is because initially in a tribal situation, if you're talking in front of the group of 150 people in your tribe, it's probably because they're judging you and you fucked up and you've got to make some sort of a case why they don't kill you.
01:26:04.000Kevin Hart told this funny fucking story where he was like working new material and he was like doing some small show and he had the shits.
01:26:36.000You know, in that world, especially honesty, where you look stupid and people can relate.
01:26:41.000Well, this is where, like, I think, like, Elon subtly has figured this out, which is like, there's attention, but then there's just authenticity.
01:26:49.000And if you can be yourself and you can hit the seam properly, you just get infinite attention.
01:27:12.000Like, there are things like, you know, somebody tweeted yesterday or the day before or something like, he controls 2.7% of GDP or something.
01:28:51.000And sometimes what we focus on is not valuable.
01:28:53.000As you were talking about, like the things that really matter in your day to day life or that actually affect you versus the things that are in the public consciousness.
01:29:18.000So look, if you and I were designing a video game, We probably sit there and say, okay, we got to get from point A to point B, but to make it fun, we're going to put all these little distractions and honeypots along the way.
01:29:30.000And what they should be doing is accumulating resources to get over the river and then accumulating, you know, weapons to fight these other guys.
01:29:37.000But instead, we're going to put this like little thing over here and this other thing over there, and you could easily get distracted.
01:29:43.000And some people will have to, they'll just fucking beeline right to the end of it.
01:29:46.000They'll, you know, they'll get to the end boss.
01:29:51.000And I feel like that's, Kind of what we're tasked with doing every day.
01:29:55.000We're tasked with, we know what's important, maybe deeply in our DNA.
01:30:00.000And then we have all this stuff that we're supposed to pay attention to.
01:30:05.000And I think increasingly the game is tell yourself that that's actually not the thing that matters.
01:30:11.000It's almost like working against you and figure out what this other stuff is and focus on that and fix that.
01:30:20.000Like politics is a game that I think distracts, like left and right.
01:30:25.000It's so stupid and it's breaking down.
01:30:28.000And it's breaking down because now it's like, it's actually like you're more likely to find alignment based on age versus by political orientation.
01:30:34.000Like people who are 30 and younger, it doesn't matter what they identify as, they all believe in the same shit.
01:30:41.000Like, meaning, like, if you ask their views on social policy, taxation, Israel, if you ask their views, what you find is now a convergence between the left and the right.
01:32:53.000Because if there's no new things coming, there's no motivation to get the newest, latest, greatest thing.
01:32:59.000And ultimately, what that leads to is greater technology, which Ultimately leads to artificial intelligence.
01:33:05.000My slight deviation from that is I think sometimes people accumulate things because it's a status game and that's because they get more attention.
01:33:14.000You have a Ferrari, you get attention.
01:33:17.000It makes Ferrari make better Ferraris and all technology moves in the same general direction.
01:33:25.000No one company says, This is it, this is what we make, it's perfect.
01:33:29.000Do you think people innately feel that by being a part of this kind of like consumerist capitalist system, They're contributing to progress?
01:33:37.000I don't think they innately feel it, but I think that's ultimately the result.
01:33:41.000That's ultimately the result, and it seems to be universal.
01:33:44.000And it seems to be constantly moving this one general direction, which is better and better technology.
01:33:51.000But, like the stage fright example, you don't think it's encoded in our DNA, this idea of like, wow, when I am a part of this in some way, shape, or form, just things seem to get better and I want to be a part of that?
01:34:00.000Like, do you think that that's possible, that that's encoded in us?
01:34:05.000I think it motivates us to the ultimate goal.
01:34:08.000And that ultimate goal, I think, is that human beings constantly make better stuff, whatever it is better buildings, better planes, better cars, better phones, better.
01:34:18.000TVs, better computers, better everything, artificial life.
01:34:22.000That might be the whole reason why we're here.
01:34:25.000And the way I've always described it is that we are a biological caterpillar that's making a digital cocoon.
01:34:35.000And we don't even know why we're going to become a butterfly.
01:35:03.000It might realize that biological life, which is very territorial and primal and sexual and greedy and it has all these problems with human reward systems, ultimately develops into this other thing.
01:35:19.000And then we're in the process of that right now.
01:35:21.000And I think that when, if and when, not if, but when we colonize Mars, I think that that new world order actually has the best chance to take shape.
01:35:51.000One of the things that they're Finding with scans of Mars, there's like geometric patterns and structures and right angles that shouldn't exist, like weird stuff that couldn't be naturally.
01:36:00.000No, no, way weirder, way weirder than like the face on Cydonia.
01:36:05.000The Cydonia thing is interesting, yeah.
01:37:05.000But you have to think if human beings develop somewhere else and they reach some high level of sophistication and then they experience some cataclysmic disaster that completely destroyed their environment, which is what Mars is, right?
01:37:21.000So, let's assume that Mars was at one point in time habitable.
01:37:31.000We know, and there's some sort of evidence of at least some sort of a very primitive biological life on Mars.
01:37:39.000If they got to a point where they said, hey, this fucking place is falling apart, but this earth spot looks pretty good, and they go there, but then cataclysms happen on earth and no one remembers because all your information is on hard drives, and then you have to rebuild society.
01:37:57.000And so you have all these myths of how everything started, whether it's Adam and Eve or the great flood or whatever these things are that we pass down through oral tradition for hundreds of years and then eventually write it down, and then people try to decipher what it means.
01:38:11.000And they sit in church and try to go over what it means?
01:39:28.000The sun releases these giant chunks of material.
01:39:33.000And he thinks that these materials get far enough away from the planet and then they coalesce into planets, or far enough away from the sun and they coalesce into planets.
01:39:42.000And as time goes on, they get a further and further distance from the sun.
01:39:46.000And then obviously, they get hit with asteroids, and there's panspermia, and water gets into them from comets.
01:39:53.000And then they develop oceans, and they develop biological life.
01:39:57.000And when they have a certain amount of distance from the sun, they people.
01:40:02.000And he thinks that as they get further and further and further away, they get less and less habitable.
01:40:07.000And then they get to a point where they have their technology to a point where they realize, like, we can't sustain life on this planet anymore.
01:40:26.000But if you think about how recent our sun is in terms of the solar system itself, in terms of the galaxy itself.
01:40:34.000So if the universe, if the Big Bang is correct and our universe existed, it was rather, our universe erupted from nothing or from a very small thing 13.7 billion years ago.
01:40:46.000Well, this fucking planet's only 4 point something billion years old, right?
01:40:51.000And life is only a little bit less than that.
01:40:54.000So you have like a billion years or so where there's nothing, and then you start getting single celled organisms, multi celled organisms, and eventually peoples.
01:41:02.000And when it gets to a certain point where these people have advanced their curiosity and their innovation to the point where they can harness space travel and they use zero point energy and they have a bunch of different things that we haven't invented yet, and then their environment degrades.
01:41:18.000And it gets to the point where they realize, like, hey, we're getting pummeled by asteroids.
01:43:00.000Like, we already have all those things here.
01:43:01.000Why would you want to go to a place where you die when you go outside?
01:43:04.000I think what people will be attracted to is that if he publishes his version of what the rules are there, there's a chance that he could make them really different than what the rules are here.
01:43:12.000Like, what kind of rules would you do if you were the king of Mars?
01:43:17.000So, I think that your view is incredibly, to me, like positive, some like of humanity, of like we want to make things better.
01:43:27.000So, if I think about that as like a function, what happens?
01:43:30.000That's like, so our natural rate of direction is forward.
01:43:43.000So, I would try to experiment with what the incentives would have to be so that you had more unfettered entrepreneurship.
01:43:50.000Just do the thing that you think is right.
01:43:52.000And there's a mechanism where we give you the ability to then make things for more people because you're proving that you're actually really good at making things.
01:44:01.000And if you don't need money at that point in society, reorienting us away from this kind of brittle form of exchange to something more useful, that's worth experimenting with.
01:44:14.000Well, there's also the concept of the self, of the individual, which may erode with technological innovation.
01:44:21.000So, if we really can read each other's minds, if we really do get to a point where we're communicating through technologically assisted telepathy, like a lot of the whole weirdness of people is I don't know what you're thinking.
01:45:32.000Because, like, what's the one thing that's holding us back?
01:45:35.000Well, that we're territorial primates with thermonuclear weapons and that we exist in a sort of tribal mindset, but yet we do it on a planet of 8 billion people.
01:48:44.000And some people tell me these incredible stories.
01:48:46.000They'll be like, my mom was an alcoholic or this or that.
01:48:50.000And I'm just like, man, this is so valuable because it allows me to understand who they are.
01:48:55.000The second part of the interview, we do the business shit.
01:48:58.000But the third part, I tell this story.
01:49:00.000This is a crazy story about what you're just saying.
01:49:03.000They ran this experiment at Stanford where they take a big bowl, fill it with water, and they drop in a mouse and they measure how long it takes for the mouse to drown.
01:49:15.000The average was about four minutes, call it four, four and a half minutes.
01:49:20.000Then they run the experiment again, 100 mice, and at minute three or three and a half, they take it out, they dry it off, they play it music, and they whisper like sweet nothings into the mouse's ear.
01:49:31.000They drop the mouse back in the water, and that mouse treads water for 60 hours the next 100 mice on average.
01:50:27.000They understand that they can tread water where they didn't die.
01:50:30.000So they understand that they can survive where they didn't know that they could survive the first time they were thrown into the water because they'd never been thrown into water before.
01:50:38.000That's the same thing that happens to people when they fight.
01:50:41.000Like the first time people ever have a competition, they fucking panic and they get really scared and they get really like filled with anxiety.
01:50:50.000But after a while, you get relaxed and that's when you get really dangerous because then you get calm and you can keep your shit together while you're in the middle of all this chaos.
01:50:59.000Because you have the experience of it.
01:51:01.000Without the experience of it, very few people do well the first time.
01:51:05.000Unless you're exceptionally talented and you have other competition experience, like you've competed in other things, like maybe you played football or some other things, and you know what it's like to actually perform under pressure.
01:51:17.000What is the version of giving more humans a chance to get to that?
01:51:22.000Well, I think sports are really good for that because performing under people paying attention to you and performing where people are trying to stop you from doing something.
01:51:32.000And you're trying to do something, and there's all these unknowns, and recognizing that hard work allows you to do whatever you're trying to do better than you previously had.
01:51:42.000One of the things my martial arts instructor said to me when I was young is that martial arts are a vehicle for developing your human potential, and that through this very difficult thing that you're trying to do, you're learning that oh, if I just think smart and think hard and train wise.
01:52:02.000And train hard and discipline myself to endure suffering so that I can develop more endurance and more speed and more power and more technique because I accumulate all this information and I really think about what it is and apply it with drills and with training.
01:53:08.000And some people unfortunately never find a vehicle.
01:53:11.000They never find a thing that they can throw themselves into.
01:53:14.000They realize, like, and this is not unique.
01:53:18.000It's not like I'm an unusual person or anybody is.
01:53:22.000I mean, there's people that have unusual physical gifts and some people have unusual mental gifts.
01:53:27.000But the reality is, no matter where you start, everyone can get better.
01:53:32.000And when you do something, whether it's learning to play guitar, as you get better at it, you realize, like, oh, this is what it's all about.
01:53:39.000Like, it's really all about applying yourself to something and then feeling this immense satisfaction of your hard work paying off.
01:53:47.000And that motivates you to work hard at other things.
01:53:50.000And if you don't find that early on, it's very difficult to like find like real satisfaction in life.
01:55:06.000And so it's become a great mirror for me.
01:55:08.000So that used to be a thing, it still is a thing.
01:55:11.000But I've become reasonably skilled at it where the edges are smaller and I put myself in positions where I'm only playing against a certain group of people.
01:55:21.000And I'm the losing player, frankly, in that game.
01:55:24.000If when I'm playing against like the top pros, it just doesn't, it helps me and I can get tuned up for it.
01:55:32.000But then I started to, you know, I would take different things.
01:55:34.000I tried to learn how to ski, basically impossible when you're older.
01:57:02.000And eventually somebody will come and fucking try to whack you in the head with a two by four of money.
01:57:07.000Then you come to me and we'll do the deal.
01:57:09.000And it made such an impression because, like, again, when I'm insecure, my reward function is attention.
01:57:17.000So I'm like a fucking little busybody.
01:57:18.000I'm running around doing all this little bullshit, you know.
01:57:22.000And then, man, when I'm in a fucking flow state and like I'm toning it, like I'm striping the ball, you know, I'm like a few things that really matter in size.
01:57:35.000It's all come to me because I'm like within myself.
01:57:40.000And these other things are a better reflection of when I'm within myself, and these other things are a mirror of when I'm totally out of kilter.
01:57:50.000So, in my life, these things tend to lead.
01:57:54.000I think you're saying that's just you, but I think that's generally most people.
01:57:59.000I think you find these things, these vehicles for developing human potential, whether it's martial arts or golf or playing guitar or playing chess or poker.
01:58:09.000And then you have to have, I think, one.
01:58:11.000At least for me, one seminal relationship in your life.
01:58:15.000You have to have one person that has just undying belief in you.
01:58:19.000And I never really had that until I met my wife.
01:58:21.000And that was a very, and I didn't, I pushed against it so fucking hard because I was like, it just can't be true.
01:58:28.000Like, why does this person give a shit?
01:59:35.000I'm just like, it's so, but it's so refreshing because it keeps, again, it's like a keeps in check.
01:59:41.000Like, and it gives me a mirror, you know?
01:59:44.000Like when I was coming to see you yesterday when we were flying down to LA for this thing.
01:59:52.000There's parts of me where when I'm insecure, I kind of like externalize and I can be like really hyperbolic, unnecessarily hyperbolic, and it's counterproductive.
02:00:01.000And she said to me, Listen, like just imagine your friends.
02:01:44.000And then I stumbled into this relationship after my divorce, and my ex wife is an incredible woman, just like not, you know, what you needed or what she needed.
02:01:52.000Yeah, we were just, we were in a few very specific ways, we just weren't on the same page.
02:02:00.000And then I find this other one, and it's, and I think like, I don't, I was so skeptical.
02:02:06.000I'm like, I kind of viewed like a relationship as like this adjunct to your life.
02:02:12.000There's you, you're at the center, you're doing your shit.
02:02:15.000And one of the appendages to your thing is your.
02:02:39.000So that can also be a thing that people look for.
02:02:42.000I think what you're saying is that there's a bunch of different things that have to sort of exist together, and that it's not just completely focus on your work, but that focusing on these other things enhances the work, and then the work enhances all these other things as well, and they all exist together.
02:03:00.000My best work is when I'm not thinking about the attention or the money.
02:03:04.000Those are the two most corrupting influences in my life.
02:03:08.000When I've lost the most amount of money or when I've reputationally hurt myself the most, it's all been because of attention and money.
02:03:51.000Like, you're in the same fucking 35 minute meeting or 45 minute meeting debating a product or debating a thing.
02:03:57.000But the minute that I start to feel embarrassed about company A versus company B or decision A versus decision B, now my mind is like, okay, hold on a second here.
02:04:07.000I'm about to run myself off the cliff.
02:04:09.000Or, you know, I had this dinner last week, and this is what's amazing.
02:05:04.000And I had no idea that I was doing it.
02:05:09.000And I'm like, okay, we need to put Humpty Dumpty back together again because I'm about to go on Rogan and I can't go off fucking like crazy wild man.
02:05:34.000And it's like, it's very much what you, it's like this process oriented approach, and you just can't control the outcome.
02:05:41.000And that's like, it's a magical feeling.
02:05:45.000It's interesting that you're saying this because, like, think about what most people or people that are on social media, like the kind of attention that they're focusing on.
02:05:57.000Like, this is why virtue signaling is so unsuccessful, right?
02:06:01.000It's so bad for it because it's, Fake.
02:06:03.000You're really concentrating on the process or you're really concentrating on the result.
02:06:05.000The result is getting people to love you.
02:06:14.000And then you're like obsessing on it all day.
02:06:16.000People that aren't even anywhere near you.
02:06:18.000It's like it's one of the absolute worst things for mental health is this addiction that people have to posting things and then reading the responses to those posts and getting wrapped up in these very weird two dimensional interactions with human beings.
02:06:35.000You're like, it doesn't fucking matter to me.
02:06:37.000Well, you're going to get to a certain point in time where if you have X amount of people that follow you, you're going to have a percentage that are mad at you.
02:06:47.000And those are the ones you're going to think about.
02:06:49.000And if you don't self audit, maybe that's good.
02:06:51.000Maybe it's good to say, like, you fucking piece of shit.
02:07:31.000I understand why people, but I'm not going to help them.
02:07:34.000I'm not going to help them bring me down.
02:07:36.000I'm not going to indulge in it and ruin my own mind by wallowing in their bullshit.
02:07:41.000Because the only reason why you would do that in the first place is if you're not together.
02:07:44.000No one who's healthy and happy and intelligent is going to post mean things about you.
02:07:49.000So you are reading things from people that are mentally ill, unhappy, and probably not.
02:07:55.000Maybe they're intelligent in terms of their ability to solve certain issues and problems.
02:08:00.000Maybe they're good at certain skills, but their overall grasp of humanity and being a good person is not good if you're shitting on people, especially if you like ad hominem attacks and just insults.
02:08:23.000And so it's like, I don't think that at a certain point in time, especially if you become publicly known and famous, you should ever read your comments.
02:09:20.000If you spend 30 of those fucking units on assholes online, you're robbing 30 units from all the things you love.
02:09:28.00030 units from your family, 30 units from your friends, 30 units from your job, 30 units from golf or poker or whatever it is that you love to do.
02:09:35.000You're stealing your own time and your own focus for losers.
02:09:59.000Especially if your life sucks and you're not doing well and you're attacking famous people or you're attacking this person that's doing better than you or whatever it is.
02:11:14.000Because YouTube stuff, my algorithm is all like new black holes they've discovered, you know, new discoveries in terms of like what is the fabric of reality.
02:11:53.000Being used as a poisoning of nostalgia, but to simply remind you of what you found important.
02:12:01.000And as we grow up, we often give that up for security.
02:12:05.000We give that up so that we are accepted.
02:12:07.000We give that up to flex and appear like we have now figured things out, that people will accept us.
02:12:15.000The only way that you will truly be successful is if you are righteous and you live according to your nature and you play, man, and you don't let people take play away from you to be at the circus and be oohed and awed and worried about all the bullshit.
02:14:02.000I mean, there's definitely better gyms where they're more technical and their program is much more systematic and they're better at breaking down skills, like how to develop skills.
02:14:48.000I think, look, I do not focus well on things that I think are boring.
02:14:52.000But if you give me something that I love, I can't, I'll play pool for fucking 12 hours in a row.
02:14:57.000It's crazy, but like the reason I got back into golf is my seven year old gets on the course, and sometimes you can talk to him and he's not making, you know, he's just like in his own world.
02:16:19.000Look, if you're a man and you have a son, I have all daughters, but if I had a son, I would be legitimately terrified that he'd be able to tap me.
02:16:28.000Because if I had a son, one of the first things that I would do is get them.
02:16:31.000I got my kids involved in martial arts at an early age, but I didn't force them to keep doing it.
02:16:35.000They did it for a certain amount of time and then they went on to do a bunch of other things that they enjoy better, which is fine.
02:16:40.000But I think it's good to learn some skills, learn how to defend yourself so you're not completely lost.
02:17:12.000You just have to accept it and then hope your relationship with him is strong enough that he still respects you, even though he can kill you.
02:17:20.000Look, there's a lot of martial arts instructors that are old.
02:17:24.000And they're revered and respected, and nobody wants to try to hurt them.
02:17:27.000Because you realize if you learn enough, you get to a certain point in time, you realize like, I'm a much better dad to my sons than I am my daughters.
02:19:42.000Meanwhile, I start panicking and I'm like, I got a tiger dad in this situation.
02:19:45.000So I start texting a few friends, trying to figure out, hey, can I, you know, do you guys want to hire this kid?
02:19:50.000He's like, really, you know, he's a pretty smart kid, did all this stuff in robotics, yada, yada.
02:19:55.000One of them says, I'd be willing to interview him.
02:19:58.000I call him and he's like, Dad, I got a job.
02:20:01.000I said, What do you mean you got a job?
02:20:03.000Said, I went around downtown, went to all these places, and I was in a McDonald's.
02:20:11.000The woman was having a little bit of difficulty speaking English, so I just spoke to her in Spanish.
02:20:15.000I got the application, I sat down at the desk, and the guy having lunch beside me said, Hey, I heard you needed a job, and I really like the way you talked to this woman.
02:20:25.000I'm the general manager of the car wash down the street.
02:22:45.000Because when the restaurant closes, you get whatever the food is left over, right?
02:22:51.000So like you get a couple chicken sandwiches, you get like the, you know, the, The version of the McNuggets that Burger King had, a couple Whoppers, and you take them home.
02:23:02.000But the amount of vomit that I had to clean up at the bathroom, you can't imagine, man, a downtown Burger King near bars, you know, after closing time, the shit you see.
02:23:29.000And then I worry that my kids don't get exposed to it.
02:23:31.000But when my son got it, maybe I'm overimposing too much about it, but it's like, I'm like, man, that car wash thing is really going to be the thing that separates you in life.
02:23:42.000It also just being humble and grinding through that shit.
02:23:45.000Do you realize this is sometimes people, they don't pick a path and they just have a job and they don't like it and they stay with this thing they don't like forever.
02:24:27.000I mean, you know, some people, they don't appreciate the process.
02:24:32.000And it's hard to because when you're young and you're going through these difficult jobs and these things that suck, and you don't know how it's going to turn out.
02:24:40.000You know, and a lot of times people aren't really educated in what a process actually is and about how it does develop character, it does develop discipline, and these things are actual skills that you can apply to other things in life.
02:24:53.000You just think, God, I'm a fucking loser.
02:24:57.000I always ask myself, Am I in the engine room right now?
02:25:01.000This is my way of saying, like, an engine room is a little hot, it's a little uncomfortable, but it's where all the shit is happening, it's where the shit is being made.
02:25:10.000And so I'm like, It's a little, you know, discomforting.
02:26:20.000And I try not to ever get out of sorts, too.
02:26:22.000And one of the ways that I keep from getting out of sorts is daily discipline.
02:26:27.000Like, it's if I have days where I'm sure it gets out of sorts, if I have a few days in a row where I don't work out, but I work out almost every day.
02:26:35.000And if I'm not working out, I'm still cold plunging and going to the sauna and stretching.
02:26:57.000And like a robot, force myself to do it.
02:27:00.000Then I always feel better after it's over.
02:27:02.000And it's always the hardest part of my day.
02:27:04.000And so it makes everything else so much easier because I fucking work out hard.
02:27:08.000And so everything else is pretty easy, you know, because the strain, like just being in that fucking cold water or just going through Tabatas on an Air Dyne bike, this shit's hard.
02:28:29.000And the more you can surround yourself with people like that, the more people, the people that complain about nonsense and find excuses and focus on other people and bitch about things and why is she doing this?
02:29:00.000It's one of the reasons why a lot of young people gravitate towards podcasts because they get to hear interesting conversations with really accomplished people that are fascinating, that are unlike anybody that they're around on a daily basis.
02:29:12.000And that's also one of the reasons why it's important to find that's why martial arts is so good for young people because you're around other people that are doing this really difficult thing and other sports too, whether it's football or wrestling, whatever it is.
02:29:24.000I actually found the last few years I go out of my way to not isolate myself.
02:30:02.000If you're in a situation where there's a bunch of sycophantically connected people to you and they're just all kissing your ass and, I mean, we all know people that are like the heads of companies and that are just like fucking tyrants.
02:30:12.000I think the trap about being successful, because it's not everything it's crapped up to be, is exactly that.
02:30:17.000You become so isolated that you become this like very caricaturous version of yourself because you forget what it's like to just a basic example, like wait in line, be kind to other people, be polite, like be accommodating, have some empathy.
02:30:32.000Where are you put in that situation to do those things?
02:30:37.000And, If you achieve some level of success that you're trying to achieve, you're trying to achieve this level of success so you elevate past being a person, you're missing the point.
02:31:15.000And that's one of my main fears about AI.
02:31:19.000One of my main fears about this idea of universal high income and everyone's going to have ultimate abundance.
02:31:26.000It's like, where does anybody find purpose and meaning?
02:31:30.000And where do you take whatever this thing is that the mind is constructed of, these needs that the mind has that have to be satisfied in order to achieve sanity?
02:31:44.000In order to achieve some sort of place where you can be at peace.
02:33:10.000In one way, it'd be great because we wouldn't have to be constantly thinking, why does he have that and I don't have that and this and that.
02:33:17.000Instead, it would probably be like, what can I do to get better at the thing that I love?
02:33:23.000Or let me be a part of a project to do something that seems implausible.
02:33:28.000But I feel like I'm in the engine room every day.
02:36:07.000Like, oh, why don't you suck on some crystals, you fucking hippie?
02:36:10.000But legitimately, if, look, if everybody has a cell phone, which essentially everybody does, right?
02:36:15.000Right now, in this time and age, if we get to a point where everybody is connected, everybody is hive mind connected, you're not going to just be able to drive by a homeless encampment.
02:36:59.000If we're all connected and we all feel things connectively, we will actively work together to solve these problems.
02:37:05.000And if we're dealing with, if we really get to a point of abundance, like true abundance, where resources are not an issue and no one's starving, We could really fix all the problems that, like, none of them are insurmountable.
02:37:20.000None of them are breathing underwater, right?
02:41:20.000His kids started crying, like, we want to go inside.
02:41:23.000It's disturbing the amount of energy that's coming out of these fucking rocket boosters.
02:41:29.000And then I hung out with him in the command center while the rocket was flying through space and we're watching it on all these monitors and then lands in the water in Australia.
02:41:38.000And he's cracking jokes the whole time because the thing is like losing pressure because it's.
02:41:43.000They're stress testing all this stuff, which is really funny when really dumb people go, Oh, he's a fucking dumbass.
02:45:11.000Because when they're telling you to pay attention to this and the actual issue is this and you cannot, then you can't fix what's actually broken.